
Responsible AI in stock photography – protecting creators

AI, whether you call it machine ‘learning’ or, as some fear, machine ‘scraping’, is going to be a problem for image creators.
Some of the big stock photo agencies are still trying to get to grips with it, but Adobe have jumped straight in to try to bring the situation under control for image creators – watch how in their explainer video below:

Adobe say: "In this video, watch how digital storyteller Hallease Narvaez uses the Adobe Firefly beta, a new family of creative generative AI models, in her workflow. Adobe automatically attaches the CAI feature Content Credentials to content created with Firefly to bring more transparency to digital content and indicate that generative AI was used—allowing us to experience content and context together. From generating images to editing a final thumbnail, Content Credentials in Firefly and Photoshop gives you easy access to the open standard for content authenticity and is coming soon to more Creative Cloud apps.

The ability to determine who created digital content, how it was made and edited enables trust and authentic storytelling online. Since its founding in 2019, the Content Authenticity Initiative has grown to a global community of more than 900 members—media and technology companies, NGOs, developers, designers, photographers—all focused on building an ecosystem of attribution and transparency in the digital content we create and consume."
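For the technically curious, Content Credentials are built on the open C2PA standard, and the provenance data embedded in an image can be inspected with the Content Authenticity Initiative's open-source c2patool. Below is a minimal sketch in Python, assuming c2patool is installed and prints its manifest report as JSON; the field names checked (active_manifest, claim_generator, c2pa.actions, digitalSourceType) are assumptions based on one version of the manifest format and may differ in yours:

# Sketch: read an image's Content Credentials via the c2patool CLI
# and report who produced it and whether generative AI is declared.
# Assumes c2patool is on PATH and emits a JSON manifest report by default;
# the JSON field names below are assumptions and may vary by tool version.
import json
import subprocess
import sys

def read_manifest(image_path: str) -> dict | None:
    """Run c2patool on the image and parse its JSON report, if any."""
    result = subprocess.run(
        ["c2patool", image_path],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        return None  # no Content Credentials found, or tool not installed
    return json.loads(result.stdout)

def summarize(report: dict) -> None:
    """Print the claim generator and flag AI-generated content if declared."""
    active = report.get("active_manifest")
    manifest = report.get("manifests", {}).get(active, {})
    print("Claim generator:", manifest.get("claim_generator", "unknown"))
    for assertion in manifest.get("assertions", []):
        if assertion.get("label") == "c2pa.actions":
            for action in assertion.get("data", {}).get("actions", []):
                if "trainedAlgorithmicMedia" in action.get("digitalSourceType", ""):
                    print("Declared as generative-AI content")

if __name__ == "__main__":
    report = read_manifest(sys.argv[1])
    if report:
        summarize(report)
    else:
        print("No Content Credentials found in this file.")

Run it against a Firefly export (for example, python check_credentials.py image.jpg) and, if the credentials are intact, it should name the app that signed the file and note that generative AI was used.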


• Related on PAN:
Adobe launches its AI generator – meet Firefly
Adobe unveils content attribution tool – helps prove image authenticity
