Artists accuse Adobe of tracking their design process to power its AI
A curious setting in Adobe Photoshop’s privacy preferences has the artistic community on edge this week.
BY CHRIS STOKEL-WALKER | 5 MINUTE READ
A recent viral moment highlights just how nervous the artist community is about artificial intelligence (AI). It started earlier this week, when French comic book author Claire Wendling posted a screenshot of a curious passage in Adobe’s privacy and personal data settings to Instagram. It was quickly reposted on Twitter by another artist and campaigner, Jon Lam, where it subsequently spread throughout the artistic community, drawing nearly 2 million views and thousands of retweets. (Neither Wendling nor Lam responded to requests to comment for this story.)
The fear among those who shared the tweet was simple: that Photoshop and other Adobe products are tracking the artists who use them, watching how they work and, in essence, harvesting the processes and techniques that graphic designers have developed over decades to feed Adobe’s own automated systems. The concern is that a complicated, convoluted artistic process could become possible to automate, meaning “graphic designer” or “artist” could soon join the long list of jobs at risk of being replaced by robots.
Watch out for Adobe automatically Opting you In for “Machine learning” aka Ai. Also, tech companies that glorify “Opting out” options are using this to shift responsibility of Data mining onto US. Sneaky. Meanwhile Ai never forgets. It’s theatre. #adobe #OptOut pic.twitter.com/pMmdM4SBq6
— Jon Lam #CreateDontScrape (@JonLamArt) January 3, 2023
The reaction was predictable: One commenter accused Adobe of having “predatory business practices against artists”; another worried that “machine working overlords… steal from you while you work.” A third called it “reason number 32405585382281858428 on why you shouldn’t use Adobe products.”
The reality may be more complex. An Adobe spokesperson says that the company is not using customer accounts to train AI. “When it comes to Generative AI, Adobe does not use any data stored on customers’ Creative Cloud accounts to train its experimental Generative AI features,” said the spokesperson in a written statement to Fast Company. “We are currently reviewing our policy to better define Generative AI use cases.” Meanwhile, Adobe’s FAQ on its machine learning content analysis cites examples of how the company may use machine learning-based object recognition to auto-tag photographs of recognizable subjects, such as dogs and cats. “In Photoshop, machine learning can be used to automatically correct the perspective of an image for you,” the company says. Machine learning can also be used to suggest context-aware options: If the company’s apps believe you’re designing a website, they might suggest relevant buttons to include.
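To make concrete what this kind of content analysis looks like in practice, here is a minimal, generic sketch of image auto-tagging with an off-the-shelf classifier. It is illustrative only and says nothing about Adobe’s actual pipeline; the model choice (a pretrained ResNet-50 from torchvision) and the auto_tag helper are assumptions made for the example.

```python
# A generic sketch of ML-based image auto-tagging -- NOT Adobe's system,
# just an illustration of the technique its FAQ describes.
# Assumes torch, torchvision, and Pillow are installed.
import torch
from torchvision import models
from torchvision.models import ResNet50_Weights
from PIL import Image

# Load a classifier pretrained on ImageNet, plus its matching preprocessing.
weights = ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()
preprocess = weights.transforms()

def auto_tag(path: str, top_k: int = 3) -> list[tuple[str, float]]:
    """Return the top-k predicted labels (e.g. 'tabby') for one image."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, H, W)
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]  # 1,000 class probabilities
    top = probs.topk(top_k)
    labels = weights.meta["categories"]
    return [(labels[int(i)], float(p)) for p, i in zip(top.values, top.indices)]

# Example: auto_tag("photo.jpg") might return [("golden retriever", 0.62), ...]
```

The point of the sketch is the distinction the article turns on: a system like this reads pixels to attach labels, which is different in kind from training a generative model on a customer’s stored files.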
The fear of design processes being tracked by technology and then used to train artificial intelligence taps into a broader discomfort with the way in which artists are treated by generative AI apps. David Holz, the founder of Midjourney, an AI image generator, said in a September interview with Forbes that his organization had not sought the permission of the artists on which its AI was trained—which caused consternation among the art community. Some artists have created tools designed to allow their colleagues to opt out of having their work used to train AI. The European Union has also questioned the legality of AI tools hoovering up vast amounts of artwork to train their machine learning models.
“For me, it’s astonishing that a paid service assumes it’s okay to violate users’ privacy at such a scale,” says Andrey Okonetchnikov, a front-end developer and UI and UX designer from Vienna, Austria, who uses Adobe products to sync photographs. “It’s troublesome because companies who offer to store data in the cloud assume that they own the data. It violates intellectual property and privacy of millions of people and it’s assumed to be ‘business as usual’. This must stop now.”
Yet not everyone is quite as concerned. As with many social media firestorms, some think a legitimate concern has been overhyped and misconstrued. Part of the confusion stems from prior Adobe controversies: the company announced on December 5 that it would allow AI-generated images in its stock library, a move some saw as directly harmful to stock artists, and an earlier dispute with Pantone left some users unable to access the Pantone colors they had used in past design projects made with Adobe software. “People saw that little checkbox for sharing data used for machine learning and conflated it with all the current AI image generation drama currently underway,” says Daniel Landerman, a Los Angeles-based creative director and illustrator.
To Landerman’s eyes, the data-sharing setting for machine learning has been present in Adobe apps for years, and it applies only to files stored in the Adobe cloud, which he says “any professional shouldn’t be doing anyway.” Landerman has long made sure to uncheck any options that share data with app makers, a habit formed while working with clients who often require him to sign non-disclosure agreements.
“Everything is moving so fast with all the AI stuff: Artists trying to get regulations to catch up, AI engineers and NFT bros trying to outpace the artists,” Landerman says. “I’m not surprised some other non-issues get caught up in the turmoil.”
But beyond artists’ concerns, data protection experts say they’re worried about the way Adobe has handled the process. “Under European ePrivacy law, Adobe needs opt-in consent before reading data from individuals’ devices for purposes not necessary for the service the user requested,” says Michael Veale, a University College London professor who specializes in digital rights.
“Sharing data in this way isn’t just unnecessary,” says Veale. “It’s way beyond users’ expectations, many of whom may have signed NDAs with clients ensuring that the content they’re editing doesn’t go anywhere.” Veale believes that Adobe’s opt-out, rather than opt-in, approach could draw investigations under European ePrivacy law, similar to the one that saw Apple fined $8.5 million earlier this week.
“We give customers full control of their privacy preferences and settings,” an Adobe spokesperson told Fast Company in a statement. “The policy in discussion is not new and has been in place for a decade to help us enhance our products for customers.” The spokesperson directed any customer who prefers their content be excluded from analysis to the options on the privacy page.
Update, January 6, 2023: This story has been updated with a comment from Adobe.