November 13, 2025

AI Can Now See Its Own Work: How Codex Changes Design Forever

by Sasha Shumylo, Associate Partner and Product Strategist

OpenAI has just unveiled the next generation of Codex, now powered by GPT-5's multimodal capabilities. For the first time, AI can not only write code — it can see, understand, and visually check its own work.

To be clear, Codex isn’t a new “design tool” — it’s a developer IDE that now happens to understand visuals. But as more designers move closer to code, it opens new ways to build, test, and validate ideas directly — without waiting for full dev cycles.

This isn’t just faster coding. It’s a redefinition of how design and development collaborate.

From prompt to prototype in minutes

In the demo, Codex received a rough wireframe — a 3D globe for a travel app. Within minutes, it built a responsive screen, animated the globe using Three.js, and verified both mobile and desktop layouts. No manual QA, no handoff, no waiting for developer capacity.

It even validated dark mode and suggested adjustments like tooltips or spacing tweaks. That’s not magic — it’s fast automation with “average but good-enough” output, ideal for testing before design teams step in to refine.
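To make the idea of "visually checking its own work" concrete: at its simplest, an automated visual check can be reduced to comparing a rendered screenshot against a reference, pixel by pixel. The toy sketch below is purely illustrative — it is not how Codex actually works internally, and the function name and threshold are our own invention — but it captures the basic loop of render, compare, flag.

```python
def pixel_diff_ratio(img_a, img_b):
    """Fraction of pixels that differ between two equally sized images,
    each represented here as a flat list of (r, g, b) tuples."""
    if len(img_a) != len(img_b):
        raise ValueError("images must have the same number of pixels")
    changed = sum(1 for a, b in zip(img_a, img_b) if a != b)
    return changed / len(img_a)

# A reference layout vs. a fresh render: 1 of 4 pixels changed.
reference = [(255, 255, 255)] * 4
rendered = [(255, 255, 255)] * 3 + [(0, 0, 0)]

ratio = pixel_diff_ratio(reference, rendered)
print(ratio)  # 0.25

# A hypothetical tolerance: flag the layout for review if more
# than 10% of pixels drifted from the reference.
needs_review = ratio > 0.10
print(needs_review)  # True
```

Real pipelines do this with rendered screenshots and perceptual diffing rather than raw pixel equality, but the principle — the machine looks at its own output and compares it to an expectation — is the same.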

For design teams, it means less time lost between idea and implementation. For businesses, it means faster MVPs, quicker validation, and shorter time-to-market. Let’s dive in a little deeper.

What this means for the design industry

Typical product development process

Until now, design agencies have worked across three main layers: research, UX/UI design, and implementation. Codex, along with other AI tools, blends these layers into one continuous loop.

  1. From static to interactive design — teams can test flows directly in the browser, not just on Figma boards.
  2. From handoff to co-creation — developers and designers now share one AI teammate instead of passing files back and forth.
  3. From weeks to hours — interactive prototypes can be generated and tested the same day.
AI-driven product development

This shift makes AI-assisted UX/UI design a new standard rather than an experiment.

How it transforms our process at The Gradient

At The Gradient, we were exploring AI-powered workflows long before they became fashionable — from AI-accelerated product discovery to rapid prototyping. As early adopters of AI in design, we’ve built our own way of working — one that merges design, engineering, and AI experimentation into a single, fluid process.

Prompts > Pixels

Prompting for Norvana, an AI-first health assistant

Beautiful UIs matter, but in AI products, prompts do the heavy lifting. So we focus heavily on writing, testing, and evolving the right ones.

We call it “first, we build.”

Instead of spending months on screens and presentations, we start coding from day one. In the first week of collaboration, our team already produces a live, playable prototype — not for show, but to test value. Because ideas that sound great in theory often collapse in practice — and we’d rather learn that early.

Take our Norvana project as an example. Initially, we designed an “activity score” feature — a daily tracker that showed users how close they were to reaching a target score (say, 78 out of 85). On paper, it sounded motivating. But once we built and tested it with real data, it turned out no one cared. The metric felt abstract, disconnected from real behaviour — users didn’t feel progress or purpose.

Norvana AI interface

That moment became a turning point: we realised you can’t design AI experiences in theory; you have to live them. The only way to know if something works is to make it real, interact with it, and see where it breaks. This philosophy now defines how we build. We don’t chase perfect flows — we chase validated value.

Our designers don’t stop at Figma; they work hands-on with AI tools like Cursor, Lovable, and Replit. That allows us to feel the product, not just visualise it — to experience how AI responds, what it asks, and whether the interaction feels natural. It’s design through experimentation, not assumption.

Here’s how this process changes outcomes for our clients:

1. Faster design validation

Codex and similar AI tools allow us to assemble interface components and test them across devices within hours. This cuts the validation cycle by up to 60%, giving clients a working version to explore almost immediately.

2. Real-time iteration and value testing

Our teams iterate in real time — testing hypotheses, user reactions, and flows in living prototypes instead of static mock-ups. It’s no longer about making things look perfect; it’s about finding what genuinely works and improves user experience.

3. From chaos to clarity

Every project starts with a messy mix of ideas and hypotheses. AI helps us turn that chaos into something tangible in days. By bringing prototypes to the first client meeting, we make collaboration faster, cheaper, and more engaging — clients can see, click, and react.

The result: fewer assumptions, more evidence, and a strong sense of momentum from day one.

4. Continuous co-creation

Designers, PMs, and engineers now collaborate inside shared AI environments. Instead of lengthy handoffs, we exchange prompts, sketches, and lightweight code snippets. This makes enterprise design cycles leaner, clearer, and far more human-centred.

5. The lower cost of error

Because everything moves faster, mistakes are cheaper. When testing takes days, not months, teams are more open to feedback and experimentation — and products evolve faster.

The human factor still matters

While AI tools redefine speed and automation, great design still requires empathy, judgement, and taste — things AI can’t replicate.

We use AI to generate quick MVPs, then jump into Figma to polish, craft, and infuse visual identity. Design remains the key differentiator — the layer that turns “AI-made” into “human-loved.”

In a nutshell

Vibecoding brings speed to design — not perfection. It helps teams test ideas earlier, validate faster, and bridge the gap between concept and code.

At The Gradient, this isn’t a shift we’re adapting to — it’s the space we’ve already been building in. We design AI-native products by combining intelligent automation, rapid prototyping, and human-level craft — using tools like Codex not to replace creativity, but to scale it.

About author
Sasha Shumylo
A Product Strategist with over 13 years of experience in marketing, product strategy, and branding. His love for analytics, funnels, and a structured approach ensures that the digital products we craft aren't just functional—they impress.