Adobe Responds to ‘Terms of Use’ Controversy, Says It Doesn’t Spy on Users

Adobe has released a new blog post explaining changes to its Terms of Use, when Adobe applications can access user content, and whether user content will be used to train Adobe’s artificial intelligence (AI) models and services.

The need for clarification came after a number of users, including some established creative professionals, received pop-up notifications in Adobe applications that stated, among other things, that Adobe may access user content through automated and manual methods. The resulting anger among the creative community is easy to understand.

The popup, which required a person’s consent to continue using Adobe software, failed to explain exactly what was updated in the terms of use and how Adobe can access someone’s content. Adobe’s lack of transparency has left the door open to speculation, confusion and fear.

Among the most prominent concerns was that Adobe would essentially be spying on a person's work. That worry applies broadly, but especially to those who use Adobe software on projects covered by non-disclosure agreements (NDAs). Given Adobe's dominance in creative software, the fear is widespread.

There have also been concerns that Adobe is claiming ownership of people's work, a point that is easy to misread given how some of Adobe's Creative Cloud services operate. Ultimately, no: Adobe does not own the work that someone creates in Creative Cloud applications or uploads to Adobe platforms, but it must hold some form of license to provide specific services. The rather standard legalese tucked into terms of use and end-user license agreements, while common to many asset-hosting and sharing services, can be daunting to those who actually read it.

It wouldn’t be an Adobe controversy without people wondering whether Adobe is training its Firefly AI on customer content, and those understandable concerns have been raised once again.

“We recently updated our Terms of Use to provide more clarity in a few specific areas and pushed a routine re-acceptance of those terms to Adobe Creative Cloud and Document Cloud customers. We’ve received a number of questions arising from this update and want to provide some clarification,” Adobe writes in a new blog post. “We remain committed to transparency, protecting the rights of creators and enabling our customers to do their best work.”

As noted in the pop-up that sparked outrage this week, Adobe has updated the language in sections two and four of its Terms of Use. The exact changes are detailed in its blog post, but the main revisions concern Adobe being able to access, view, or listen to user content “in a limited manner and only in accordance with law.” Reasons for doing so include responding to customer feedback and support requests, detecting and preventing legal and technical issues, and enforcing content terms, such as those prohibiting the use of Adobe software to create child sexual abuse material (CSAM).

Adobe further details its content moderation policies in a separate transparency section of its website.


“To be clear, Adobe requires a limited license to access Content solely to operate or improve the Services and Software and to enforce our Terms and comply with law, such as protecting against inappropriate content,” Adobe continues.

The company lists three instances where Adobe apps and services can access user content. These include when access is required to provide basic services and functions, such as opening and editing files for users or creating thumbnails or previews for sharing.

Access is also required to provide some cloud-based features, including Photoshop’s Neural Filters, Liquid Mode, and Background Removal. People can learn more about how content can be viewed and analyzed in these cases in Adobe’s Content Analysis FAQ. For those working on sensitive, confidential material, it is worth understanding the limited situations in which Adobe, including human reviewers, may view that content.

Finally, Adobe may access content that is processed or stored on Adobe servers. In these cases, Adobe may screen, through automated or human review, for certain types of illegal content (such as CSAM).

Adobe reaffirms that it “does not train Firefly Gen AI models on customer content.” Firefly is trained on licensed content, such as Adobe Stock media, and public domain content.

Further, Adobe says it “never takes ownership of a customer’s work.”

“Adobe hosts content to enable customers to use our apps and services,” the tech giant explains. “Customers own their content and Adobe takes no ownership of customer work.”

“We appreciate our customers reaching out and asking us these questions, which gave us the opportunity to clarify our terms and our commitments. We will refine the terms of acceptance that customers will see when they open apps,” Adobe concludes.

Hopefully these changes reach customers sooner rather than later, because it’s easy to see how this situation escalated so quickly. Without the context that Adobe failed to include in its pop-up message, some of its standard terms of use seemed anything but.


Image credits: Adobe
