Adobe embraces conversational AI editing, marking a ‘fundamental shift’ in creative work


Adobe’s new Firefly AI assistant can use Creative Cloud apps for you.


You don’t need to understand any fancy editing terms — just describe what changes you want to make.
Image: Adobe

Adobe is fully embracing AI tools that enable creators to edit their work using descriptive prompts, instead of manually using specific Creative Cloud apps. The software giant’s new Firefly AI Assistant allows users to describe what they want to change by typing their own words into a conversational interface.

Adobe says this marks a “fundamental shift in how creative work is done” by removing skill barriers and laborious tasks, while still giving creatives full control over their work. The assistant will be “available soon” on the Firefly AI studio platform, according to Adobe, though no specific launch date was provided in the announcement.

The unified AI interface, which builds on the Project Moonlight experiment that Adobe introduced at its Max conference last year, automatically performs “complex, multi-step workflows” to edit projects, utilizing specific tools and apps (including Firefly, Photoshop, Premiere, Lightroom, Express, Illustrator, and more) on the user’s behalf.

Users of the Firefly AI Assistant can instruct the chatbot to “retouch this image” or “resize this for social media,” for example, with Adobe’s AI agent then providing a selection of edits to choose from alongside surfacing specific tools or sliders that allow creators to fine-tune the results. For more detailed adjustments, creatives can also open the edited results in Creative Cloud apps to finish the project.

The Firefly AI Assistant will learn the user’s preferences over time, such as preferred tools, workflows, and aesthetic choices, to help make the results feel more personalized and consistent. Adobe’s AI chief Alexandru Costin told The Verge that creatives will be able to choose whether to enable this feature, and can select specific projects for the AI assistant to learn from. Creatives can also create “Creative Skills” — tools that provide specific and consistent presets — for the AI assistant to execute, or select from a library of pre-made skills at launch.

The Firefly AI Assistant is designed to understand natural language commands, allowing you to adjust content without expertise in Adobe’s professional editing tools.
Image: Adobe

This is Adobe’s latest push into the world of AI agents, having already launched specific AI assistants for apps like Adobe Acrobat, Express, and Photoshop. Adobe says it will also bring these agentic features to third-party AI apps like Anthropic’s Claude, allowing those users to access Adobe tools outside of its own Firefly and Creative Cloud platforms.

This announcement comes alongside some new image, video, and audio editing capabilities for Adobe’s Firefly platform, which are rolling out starting today. The Firefly Video Editor is now integrated with Adobe Stock for easy access to B-roll footage, and allows users to access new features for improving color adjustments and the clarity of spoken dialogue. New editing features are also available in the Firefly image editing tool — Precision Flow, which enables creators to make and compare a wider range of generated images without adjusting their prompts, and a new AI Markup tool that lets users control where edits should be made using brush and rectangle tools or reference images.


Source: theverge
