A speaker at our TEDAI event last year said something that stayed with us: The best AI gives you access to forms of creativity, insight, and comprehension you didn't have before. AI thinks differently than we do, and that difference is useful. As a design partner, it brings a new kind of material to work with: it reasons, generates, and connects ideas in ways humans don't. Just as Steve Jobs called the computer "a bicycle for the mind," AI is another new cognitive tool, and we're still figuring out the metaphors that explain what it can make possible.
To uncover these metaphors, we need to ask ourselves some potentially uncomfortable questions: What happens to creative work when making gets easier? What changes when nearly everyone can also build the whole thing themselves? And what happens to human creativity when the baseline gets handled by a machine? By partnering with something that doesn't think like us, we think the answers might be wilder than we expect.
Co-conspirator creativity: AI as a design partner
Unlike earlier computational tools that sort or combine information in mostly predictable ways, LLMs do something unusual: they reason step by step and make sudden associative leaps. They can be methodical and surprising in the same conversation. That dual nature, combined with access to a compressed map of human knowledge, is what makes them feel less like a tool and more like a thinking partner. And that matters for creativity specifically. A partner that can both reason carefully and leap associatively gives you more surface area to work with, more directions to explore, more edges to push against. They can go both deep and broad, depending on how you engage them, helping you expand the perspectives and information you're considering, or interrogating the quality of your work with expert, pointed questions. The effect is an expansion of what you can reach from where you're standing: a different architecture of thought, neither the analysis of earlier statistical methods nor the stochastic insight of tools like the I Ching.
NotebookLM is an example we often come back to. It created a new way to understand dense, complicated information. Suddenly, you can hold the shape of big ideas in your hands, rearrange them, question them, and make more sense of them, instead of skimming endlessly across disconnected documents. Before LLMs, there really wasn't a tool that made this kind of sense-making possible, at scale, for so many people.
Tangible thinking: Bringing ideas from vision to reality across mediums
There's a whole wave of tools right now that give designers more reach. Agentic coding tools like Claude Code and Cursor, creative suites like OpenStudio and Flora, and all the rapid prototyping environments that have emerged over the last couple of years let us move from idea to expression much faster than before. Sometimes that means getting a feeling out of your head and onto a screen in minutes. Sometimes it means exploring a direction you wouldn't have bothered with when the cost of making was higher.
One of our favorite early signals of this shift came from an IDEO project in 2024. Our colleague Tomocini Mori was prototyping possibilities for new, more sustainable materials for children's toys. Instead of showing kids sketches or static prototypes, he spent under two hours using a mix of tools to create short trailers of what the toys could be. ChatGPT helped him shape the scripts, Runway generated the visuals, and ElevenLabs produced the voiceover. The result was a set of small cinematic worlds that kids could react to. They got to go beyond just looking at a toy: they experienced the toy's personality, imagined the story around it, and gave feedback that went way deeper than what we'd get from a traditional prototype. It gave them a real sense of the future object, and it gave us research insights with a rigor we probably wouldn't have reached with our previous methods.
Tomocini was already a creative thinker. AI didn't do that for him. Instead, it collapsed the distance between his idea and something kids could really experience. When AI is built well and used well, it makes human ideas easier to reach.
Fast food creativity: Making more with less substance
Of course, more people making more things isn't automatically good. Some of what AI enables does create meaning or value: people finding expression across languages, mediums, and skill gaps they couldn't cross before. But too often, people use these tools just for volume. "Fast food creatives" are creating, but they're not creative. If we want the next generation of technology to support the next generation of society, the difference matters. Supporting people in evolving their craft and doing better, deeper, more imaginative work is the point. Strip-mining labor and creativity to produce more isn't. It also isn't a particularly good strategy for expanding our society's capabilities and economic future.
That doesn't mean every part of design needs to be preserved or protected. Some tasks are repetitive or simply tedious (like summarizing, formatting, and documentation). If a tool can take care of those, great. But let's be real: a freelancer whose reliable output has just become automatable is not experiencing this moment as liberation. The point isn't to hold onto the past. It's to position how these tools are valued as a widening and deepening of creativity, both for the craftsperson and for the businesses that might employ them. Current media narratives often paint a picture of cost-cutting in service of building that next data center versus defending some last bastion of the value of human labor. This zero-sum thinking will not lead us to any of the abundant futures that so clearly could be supported by these new capabilities. We need to build access to these tools in a way that doesn't leave people behind, something our current trajectory is failing to do. As the designer's toolkit expands, our role changes: we become builders, and with that, we bring judgment, taste, and curiosity to a much wider range of problems.
Abundant innovation: Accelerating breakthroughs
Deeper, more imaginative work might matter more than ever. There are growing problems to solve in education, medicine, climate resilience, outer space, materials, communication, and dozens of emerging fields that didn't exist even 10 years ago. There's no shortage of things to invent. There's only a shortage of time, bandwidth, and incentives.
AI can be a useful multitool, helping people with ideas but no way to act on them augment themselves enough to make those ideas real. If we use and design AI tools intentionally, they make design more informed. They also make room for slower, deeper thinking; when the superficial parts are handled, you can choose to stay with an idea longer. A designer can now think alongside the depth of materials science, cognitive research, or ecological systems in ways that weren't previously practical. In a recent example, an interaction designer and a behavioral scientist built a simulation environment, using LLM agents informed by real behavioral research, to explore how social dynamics impact how customers might interact with some of our exploratory new feature designs. By observing these limited models of human behavior, we gained a different kind of insight into the kinds of services our client might credibly offer their customers.
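The simulation described above can be sketched in miniature. This is a toy illustration, not the project's actual system: simple rule-based agents stand in for the LLM agents, and the `openness` trait and adoption rule are our own invented stand-ins for the real behavioral research that informed the original work.

```python
import random
from dataclasses import dataclass


@dataclass
class Agent:
    """A stand-in for an LLM agent, reduced to one behavioral trait."""
    name: str
    openness: float  # 0..1, illustrative only

    def react(self, peer_adoption: float, rng: random.Random) -> bool:
        # Social dynamics in miniature: an agent's chance of adopting
        # a new feature rises with both its own openness and the
        # fraction of peers who have already adopted it.
        p = 0.5 * self.openness + 0.5 * peer_adoption
        return rng.random() < p


def simulate(agents: list[Agent], rounds: int = 5, seed: int = 0) -> set[str]:
    """Run a few rounds and return the names of agents who adopted."""
    rng = random.Random(seed)
    adopted: set[str] = set()
    for _ in range(rounds):
        rate = len(adopted) / len(agents)
        for a in agents:
            if a.name not in adopted and a.react(rate, rng):
                adopted.add(a.name)
    return adopted
```

Even a model this crude surfaces the dynamic the passage describes: early adopters shift the social context, which shifts everyone else's behavior in later rounds.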
That’s the opportunity in front of us: AI that raises our ceiling. Tools that help us understand more, see more, create more, and build a future that feels worth living in.
Strange environments: Giving LLMs a weird world to live in
Expecting brilliance, or honestly even helpfulness, from LLMs out of the box is a recipe for disappointment. After all, LLMs are prediction machines (arguably, we are too). They're rewarded when their predictions are reasonable, which means if you give them average inputs, you get average outputs. "Garbage in, garbage out" is only half the story. The next most likely token will only be boring if what came before is boring as well. The better half is: strange in, strange out.
To get divergent creative output, your responsibility is to build a weird world for the model to live in. That means pushing context further than feels necessary, adding texture, contradiction, and the considered detail and constraints that paint a rich picture of your intentions for the model. It may also mean swapping out large parts of the frame entirely, much like Brian Eno and Peter Schmidt's classic Oblique Strategies cards, which force creatives to abandon their default moves.
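One way to make this concrete is to assemble that "weird world" programmatically before it ever reaches a model. The sketch below is a minimal illustration: the function name is ours, and the sample constraints are written in the spirit of Oblique Strategies, not quoted from the original cards.

```python
import random

# Hand-written prompts in the spirit of Eno and Schmidt's Oblique
# Strategies deck (illustrative, not the original cards).
OBLIQUE_PROMPTS = [
    "Honor thy error as a hidden intention.",
    "Work from a different material's point of view.",
    "Answer the opposite brief, then invert the answer.",
    "Remove the most important element.",
]


def build_strange_context(brief: str, textures: list[str], seed=None) -> str:
    """Wrap a plain brief in texture, contradiction, and an oblique
    constraint, so the model's most likely next token is no longer
    the average one."""
    rng = random.Random(seed)
    lines = [
        f"Brief: {brief}",
        f"Oblique constraint: {rng.choice(OBLIQUE_PROMPTS)}",
        "Texture and contradiction:",
    ]
    lines += [f"- {t}" for t in textures]
    lines.append("Respond from inside this world, not from the average case.")
    return "\n".join(lines)
```

The result can be passed as the system or opening message to whatever model you're working with; the point is that the strangeness is authored deliberately rather than hoped for.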
In some ways these divergences were more common, if less controlled, in earlier AI systems. When DALL-E first came out, one of us used it to generate images of kids playing with electronics for a hardware project. The results were janky: weird form factors, strange proportions, nothing polished. But those hallucinations were the point. The odd shapes pushed thinking beyond how we assumed hardware could be designed or used, and led to ideas we wouldn't have reached from a blank page.
Models that work well aren't going to surprise you on their own. But give them an environment rich enough, and they can help you go somewhere you couldn't have predicted. And if that's what it takes to coax some new ideas out of a model, it might also be what it takes to coax something new out of ourselves.
Becoming wild: When the floor is raised, the ceiling shatters
There's another idea we keep circling, and it sits next to all of this. It's not comfortable, because we know what this moment actually feels like for a lot of our creative community. It doesn't feel like expansion. It feels like the ground is moving. But as AI gets really good at producing the clean, average version of creativity—the competent logo, the solid first draft, the serviceable ad—human creativity is going to move somewhere else entirely. We'll use the machine to explore the edges.
When the generative AI boom started, we were obsessed with the idea of becoming wild. We were thinking about how much of our lives are still shaped by industrial-era patterns. Wake up, sit at a computer, produce, consume, repeat. AI threatens those patterns, but has also opened a door.
As the ground shifts, those of us deep in this technology are getting wilder. We’re experimenting. We’re following curiosity. We’re wandering without a plan because the tools can keep up. We’re surrendering to a direction before we know where it leads. We’re spending more of our energy on the parts of creativity that feel alive. In this way, AI gives us room to express the parts of humanity we've had to compress just to keep up.
You can see this woven throughout our annual AI Variety Show. In Grishma Rao's Entire History of You project, she fed participants' real Instagram feeds and personal goals into a generative system that produced future-facing social media timelines—possible versions of your life played out in posts. It was unsettling and compelling in equal measure, the kind of thing that makes you reconsider your own choices. Interaction designer and ceramicist Julia Grandfield used generative AI to design new patterns and structures for her pieces. Dave Vondle created a game with friends out of animals they see in the real world.
Maybe that's one of the futures available to us. A future where the tools ground us and connect us to the understanding of the past, and we use that room to be more imaginative, more instinctive, more playful. A little more wild. If AI can help us get there, that future of creativity is something worth building toward.
Thank you to Danny DeRuntz, Angela Kochoska, Thomas Overthun, and Ridima Ramesh for their many edits and additions to this piece.
(AI was used in the generation of these images.)