Oregon Attorney General issues AI Guidance for businesses

In a significant move to regulate the growing impact of artificial intelligence, Oregon lawmakers recently passed Senate Bill 1571, which requires campaigns to disclose when they use AI to manipulate audio or video, including deepfakes, to influence voters. Although SB 1571 applies only to political campaigns, the Attorney General has issued guidance that may help businesses minimize their legal risks when using AI.

In an official Guidance document issued on December 24, Oregon Attorney General Ellen Rosenblum acknowledged the benefits and risks associated with AI. She recognized the ability of AI to streamline tasks and analyze data, but also highlighted the significant concerns surrounding privacy, discrimination, and accountability. Oregon businesses should be vigilant about how AI can intersect with the Oregon Unlawful Trade Practices Act, the Oregon Consumer Privacy Act, Oregon’s data breach notification statute, and the Oregon Equality Act. Below, we break down how these laws apply to AI and discuss ways businesses can reduce their risk of legal exposure.

Oregon’s Unlawful Trade Practices Act

The UTPA was designed to protect consumers against unfair and deceptive business practices, including misrepresentations in consumer transactions. The UTPA mirrors Section 5 of the Federal Trade Commission Act and is intended to apply to emerging technologies like AI. The Guidance issued by AG Rosenblum addresses several examples in which the UTPA would directly apply to AI technology. These include the following:

  • Known material defects. AI developers, as well as companies that deploy AI in their operations (which are referred to in the Guidance as “deployers”), may be held liable under ORS 646.608(1)(t) if their product regularly generates false or misleading information and fails to disclose these limitations to purchasers and users.
  • Misrepresentation of characteristics, benefits, uses, or sponsorship. Developers or deployers who falsely claim that their AI has specific characteristics, uses, benefits, or qualities, or who falsely claim sponsorship, approval, affiliation, or connection, may be held liable under ORS 646.608(1)(e). For example, using deepfakes to create a false celebrity endorsement or affiliation may directly violate this statute.
  • False urgency. Developers or deployers who use AI to falsely claim that a "flash sale" discount is available only for a limited time, when a similar discount is offered year-round, may violate ORS 646.608(1)(j).
  • Price gouging. Companies that use AI to set unconscionable prices during a state of emergency may violate ORS 646.607(3).
  • AI-generated robocalls. AI-generated voice calls or robocalls containing false information may violate ORS 646.608(1)(ff).
  • Unconscionable tactics. Use of AI to knowingly take advantage of a consumer's ignorance, or to knowingly permit a consumer to enter into a transaction with no material benefit, may subject a company to liability under ORS 646.607(1).

Oregon Consumer Privacy Act

The OCPA imposes strict requirements on the use of consumer data and is particularly applicable when that data is used to train AI systems. The OCPA requires clear and conspicuous privacy notices, and AG Rosenblum makes clear that this is especially pertinent for developers and deployers of AI seeking to train their AI with consumer data. Liability under the OCPA may arise in the following areas:

  • Notice and consent. AI developers and deployers must disclose the use of personal data in a clear and conspicuous privacy notice. The notice must clearly disclose that the business intends to use personal information to train its AI and must clearly explain to consumers their statutory rights. These rights include (1) the right to know whether the company is processing their data, (2) the right to request access to their data, (3) the right to amend and correct inaccuracies, (4) the right to delete their personal data, and (5) the right to opt out of the use of AI models for profiling in decisions that have legal or similarly significant impacts like housing, education, or lending.
  • Sensitive data. AI developers and deployers who use sensitive data as specified under the OCPA must first obtain explicit consent before using the data to train their AI model.
  • Controller liability. Developers that purchase or use another company's data set for model training may be considered a "controller" under the OCPA, meaning they are treated as the entity that determines the purposes and means of processing personal data.
  • Retroactive privacy notices. Under the OCPA, retroactive privacy notices that purport to legitimize the use of previously collected personal data to train AI models are prohibited. Developers must instead obtain explicit, affirmative consent for the secondary use of previously collected data and must provide the consumer with a mechanism to withdraw previous consent.
  • Data protection assessments. Oregon businesses must conduct a data protection assessment before processing personal data for the purposes of profiling or other activities that are considered “heightened risk.” Notably, AG Rosenblum considers the use of personal data to train AI models to create a heightened risk.

Oregon Consumer Information Protection Act

The Oregon Consumer Information Protection Act is the state’s data breach notification statute. It requires businesses to notify consumers in the event of a breach and to comply with statutory baseline safeguards for protecting consumer personal information. Developers or deployers of AI may therefore need to notify affected individuals and the Oregon Attorney General if there is a security breach that affects personal data within AI systems.

Oregon Equality Act

The Oregon Equality Act prohibits discrimination based on identity characteristics including race, color, religion, sex, sexual orientation, gender identity, national origin, marital status, age, or disability. AI systems are trained on data generated by human beings and thus may inadvertently produce discriminatory results. For example, an AI loan approval system that consistently denies loans to qualified applicants from certain ethnic backgrounds may violate the Oregon Equality Act. AG Rosenblum hopes to head this off by calling on developers and deployers to address these concerns during the development process by scrutinizing potentially discriminatory inputs and biased outcomes.

Looking ahead: Increased legal scrutiny and evolving liability

Undoubtedly, AI and emerging technologies will continue to evolve and shape new business standards in Oregon and across the nation, and businesses must stay up to date to ensure compliance. If your business deals with AI or other emerging technologies, Constangy's highly experienced Cybersecurity and Data Privacy Team is here to assist.
