Firefly Now Runs the Whole Creative Process, Not Just Parts of It

Published on 30.04.2026

GENERAL

TLDR

Adobe Firefly just stopped being a generator and became a creative operator. The new Firefly AI Assistant takes a single prompt, maps out a multi-step workflow, and returns finished assets, from product photos to social sets to brand moodboards. It runs across 60+ Adobe tools, coordinates multiple AI models, and does it without you switching contexts or opening a second app.

From One Photo to a Full Product Set

Here's what got my attention. Someone dropped a backyard squirrel photo into Firefly and described what they wanted. The assistant turned it into a cartoon, generated a sticker sheet, built a tote bag mockup, produced a studio product photo, sized social assets for seven platforms, and assembled a full brand moodboard. All inside one conversation. No separate apps, no format juggling, no opening a new file per step.

That's the difference this update makes. Creative work normally lives across five tools and three folders. The "final" version is buried in a file named something-final-v3-ACTUAL.psd. Firefly AI Assistant collapses that chain. You describe the direction; it maps the execution, asks clarifying questions at the forks, and brings back finished assets.

Eight built-in Creative Skills power this. Each one is a complete pre-built workflow: batch edit photos, build a moodboard, convert to vector, create mockups, generate social variations, prepare product photos, remove or replace objects, retouch portraits. You can call them directly with a slash command or let the assistant route to them based on what you asked for.
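To make the two invocation modes concrete, here's a minimal Python sketch of how that dispatch could work: an explicit slash command maps straight to a skill, and anything else falls back to keyword routing. The command names and the routing logic are my own illustrative assumptions, not Adobe's actual implementation.

```python
# Hypothetical slash commands for the eight Creative Skills named above;
# the dispatch logic is an illustrative sketch, not Adobe's actual API.
CREATIVE_SKILLS = {
    "/batch-edit": "Batch edit photos",
    "/moodboard": "Build a moodboard",
    "/vectorize": "Convert to vector",
    "/mockup": "Create mockups",
    "/social": "Generate social variations",
    "/product-photo": "Prepare product photos",
    "/remove-object": "Remove or replace objects",
    "/retouch": "Retouch portraits",
}

def dispatch(prompt: str) -> str:
    """Explicit slash command wins; otherwise fall back to naive keyword routing."""
    words = prompt.lower().split()
    if words and words[0] in CREATIVE_SKILLS:
        return CREATIVE_SKILLS[words[0]]
    # Fallback: match on distinctive words shared with a skill name.
    for skill in CREATIVE_SKILLS.values():
        keywords = {w for w in skill.lower().split() if len(w) > 3}
        if keywords & set(words):
            return skill
    return "Ask a clarifying question"

if __name__ == "__main__":
    print(dispatch("/mockup tote bag with the cartoon squirrel"))  # Create mockups
    print(dispatch("build a moodboard for the launch"))            # Build a moodboard
    print(dispatch("what should I do next?"))                      # Ask a clarifying question
```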

The Orchestration Layer Nobody Noticed Was Missing

Under the hood, Firefly AI Assistant coordinates over 60 Adobe tools. It's not a single model. It uses a mix, including GPT Image 2 alongside models from Google, Runway, and ElevenLabs, routed automatically based on the task. You're not picking the model. The assistant decides.

What I find interesting about this architecture is that it finally separates the work of choosing tools from the work of using them. Most AI tools make you operate them. You pick the tool, write the prompt, get an output, carry that output to the next tool, repeat. Firefly AI Assistant takes the routing step off your plate entirely. You stay at the direction level. The system handles execution.

The multi-model routing also matters more than it sounds. Image generation, video, and audio are not the same problem space. Using specialized models per task instead of one generalist model for everything is the right call, and it's the kind of decision that would normally land on you.
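As a rough mental model of what per-task routing means, here's a small Python sketch: each step in a workflow declares what kind of asset it needs, and a routing table picks a specialized model for it. The Task class and the routing table are hypothetical; the model names come from the announcement, but the mapping is mine.

```python
from dataclasses import dataclass

# Illustrative task-to-model routing table. The model names are the ones mentioned
# in the announcement; the mapping itself is my own assumption.
ROUTING_TABLE = {
    "image": "GPT Image 2",
    "video": "Runway",
    "audio": "ElevenLabs",
}

@dataclass
class Task:
    kind: str    # "image", "video", "audio", ...
    prompt: str

def route(task: Task) -> str:
    """Pick a specialized model per task instead of one generalist model for everything."""
    model = ROUTING_TABLE.get(task.kind)
    if model is None:
        raise ValueError(f"No model registered for task kind {task.kind!r}")
    return model

if __name__ == "__main__":
    workflow = [
        Task("image", "studio product photo of the tote bag"),
        Task("video", "six-second loop of the sticker sheet"),
        Task("audio", "voiceover for the launch teaser"),
    ]
    for step in workflow:
        print(f"{step.kind:>5} -> {route(step)}")
```

The point of the sketch is the division of labor: the workflow only says what each step needs, and the routing layer decides which model does it.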

Templates Are the Real Power Move

There's one piece of this I think is being undersold. Adobe's connector for Claude and ChatGPT lets you point an AI at an Express template you already built. You lock the structure, leave the variable fields open, and the AI only touches what you left editable. Titles, dates, product names. The layout and branding stay intact.

This flips how AI-generated design work usually goes. The common failure mode is getting outputs that look different every time, because the AI is guessing layout and visual structure from scratch. Templates remove that guess. You build the system once, define what can change and what can't, and the AI operates inside those rules. Same assistant, consistent assets, because you gave it a playbook instead of a blank canvas.
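Here's a minimal Python sketch of that rule, assuming a template is just a set of named fields flagged as locked or editable: AI-supplied values land only in the editable fields, and anything the model tries to change elsewhere is ignored. The field names and structure are hypothetical, not Adobe Express's actual data model.

```python
# Hypothetical template: locked fields carry the brand structure, editable fields
# are the only ones an AI-supplied draft is allowed to overwrite.
TEMPLATE = {
    "layout":       {"value": "product-launch-v1",    "editable": False},
    "brand_colors": {"value": ["#1A1A2E", "#E94560"], "editable": False},
    "title":        {"value": "",                     "editable": True},
    "launch_date":  {"value": "",                     "editable": True},
    "product_name": {"value": "",                     "editable": True},
}

def apply_ai_draft(template: dict, draft: dict) -> dict:
    """Copy AI-generated values into editable fields only; locked fields pass through untouched."""
    filled = {}
    for name, field in template.items():
        if field["editable"] and name in draft:
            filled[name] = {**field, "value": draft[name]}
        else:
            filled[name] = field
    return filled

if __name__ == "__main__":
    draft = {
        "title": "Backyard Squirrel Tote",
        "launch_date": "2026-05-15",
        "product_name": "Squirrel Tote Bag",
        "layout": "something-the-model-invented",  # ignored: layout is locked
    }
    for name, field in apply_ai_draft(TEMPLATE, draft).items():
        print(f"{name}: {field['value']}")
```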

Pair that with a naming convention, something like "use the product template for launch assets" or "use the e-book template for lead magnets," and the AI stops improvising structure entirely. That's the kind of setup that makes AI actually useful for production work rather than experimentation.

Key Takeaways

  • Firefly AI Assistant orchestrates 60+ Adobe tools from a single prompt, handling multi-step creative workflows end to end
  • Eight built-in Creative Skills cover the most common production workflows, from mockups to social variations to product photos
  • The assistant uses multiple AI models (GPT Image 2, Google, Runway, ElevenLabs) routed per task, not one model for everything
  • An Adobe connector for Claude and ChatGPT lets you run Firefly workflows from wherever you already work
  • Building templates in Adobe Express and pairing them with AI is the fastest way to get consistent, production-ready outputs at scale
  • The shift is from operating tools to giving direction: specificity in your prompt determines how close the first output lands