Vercel introduced the new design mode in v0 — quickly tweak generated copy, typography, layout, colors, styling, and mode. These changes do not require spending credits or waiting for an LLM. Tailwind and shadcn/ui are supported out of the box.
Designer Advocate Alexia Danton shares the team’s favorite prompts, pro tips, and best practices to help you get the most out of Figma Make, the recently launched prompt-to-code feature.
Romina recorded a quick walkthrough on how to build clickable prototypes using Figma MCP and Cursor.
Elie Majorel shares a playbook for prototyping with AI tools that helps designers spend less time on appearance and more time on impact. “One Sunday I opened Miro, sketched a few boxes for a new agent search, and copied the flow into Claude. Claude wrote a clear spec. I pasted that prompt into Lovable, pressed generate, and two hours later a working React repo ran in a sandbox. Engineers forked the code the week after. Leaders clicked the demo and said keep going. Two hours from idea to running product. No Figma layers. No endless handoff.”
Gary Simon recorded a video tutorial on how to set up and use the new Figma MCP server. He shares tips on setting up Cursor and preparing your designs for agentic coding tools with Auto Layout, variables, and named layers.
Figma announced the beta release of the Dev Mode MCP server, which brings Figma directly into the developer workflow, helping LLMs generate design-informed code. Jake Albaugh shared a sneak peek at the GitHub × Figma Dev community event during Config, and I’m excited to finally give it a try.
“[MCP server] allows developers to bring context from Figma into agentic coding tools like Copilot in VS Code, Cursor, Windsurf, and Claude Code. Whether it’s creating new atomic components with the proper variables and stylings or building out multi-layer application flows, we believe this server will provide a more efficient and accurate design-to-code workflow.”
“If you’ve already invested in a design system and leverage patterns like components, variables, and styles that are aligned between design and code, the Dev Mode MCP server is a multiplier — we want to make sure that the LLM can benefit from these patterns, too. Agentic search techniques can take quite a bit of time to locate the right patterns, especially in large codebases. They may also find valid patterns that stray from those referenced in a design. By providing references to specific variables, components, and styles, the Dev Mode MCP server can make generated code more precise, efficient, and reduce LLM token usage. […] If Figma knows which components you’re using, it can share the exact path to the code file the agent needs with Code Connect.”
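Hooking the server up to a client is a small config change. As a rough sketch (assuming Cursor as the client, and the beta’s default local endpoint, which may change): enabling the Dev Mode MCP server in the Figma desktop app exposes a local SSE endpoint that you register in `.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "Figma": {
      "url": "http://127.0.0.1:3845/sse"
    }
  }
}
```

After restarting the client, the agent can pull selection context, variables, and Code Connect mappings from Figma when generating code.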
“The State of AI in Design report, created by Foundation Capital and Designer Fund, is based on a survey of 400+ designers and conversations with leaders at Stripe, Notion, Anthropic, and more. It explores the real impact of AI on design today, in 2025.”
Meng To first wrote a comprehensive guide on UI prompting, and has now recorded this 44-minute tutorial as well.
(Read without a paywall.) The Verge interviews Dylan Field about “how he sees AI fitting into Figma after a rough start to integrating the technology last year, the new areas he’s targeting to grow the platform, and more.”
A preview of an interesting new “design system tool for the AI era”. It imports designs from Figma, creates a new component, and makes it instantly installable with the shadcn CLI.
Meng To shows how to generate designs in Aura and bring them to Figma. He includes a Figma file with 57 examples and they look pretty good!
John Maeda: “Even with these shifts, I don’t believe AI is replacing designers. If anything, it’s forcing us to focus on what only humans can provide: judgment, empathy, ethics, and the ability to ask the right questions. AI lets us scale and experiment in ways that weren’t possible before, but meaning, care, and resonance still come from human insight and intent.”
Andrew Hogan, Head of Insights at Figma: “With AI and the momentum around ‘just doing things,’ we’re embracing experimentation and building at an eye-watering pace. Still, it’s up to us to steer these tools in the right direction—and if history is any guide, the most valuable innovations may be just around the corner.”
Figma explores five key takeaways from the report, and what they say about the state of design and development: agentic AI is the fastest-growing product category; design and best practices are even more important for AI-powered products than traditional ones; smaller companies are going all in; designers are less satisfied with the output of AI tools than developers; there are still questions about how to use AI to make people better at their role.
If you’re curious about the new gpt-image‑1 model, check out this announcement from OpenAI: “Today, we’re bringing the natively multimodal model that powers this experience in ChatGPT to the API via gpt-image‑1, enabling developers and businesses to easily integrate high-quality, professional-grade image generation directly into their own tools and platforms. The model’s versatility allows it to create images across diverse styles, faithfully follow custom guidelines, leverage world knowledge, and accurately render text—unlocking countless practical applications across multiple domains.”
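To make the integration concrete, here is a stdlib-only Python sketch of calling the OpenAI Images API with gpt-image-1. The endpoint and payload fields follow OpenAI’s documented Images API; the prompt, size, and output filename are placeholder choices, and an `OPENAI_API_KEY` environment variable is assumed.

```python
# Sketch: generate an image with gpt-image-1 via the OpenAI REST API.
# Assumes OPENAI_API_KEY is set; prompt and filename are placeholders.
import base64
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/images/generations"

def build_request(prompt: str, size: str = "1024x1024") -> urllib.request.Request:
    """Assemble the HTTP request; gpt-image-1 returns base64-encoded image data."""
    payload = json.dumps({"model": "gpt-image-1", "prompt": prompt, "size": size})
    return urllib.request.Request(
        API_URL,
        data=payload.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
    )

def save_first_image(response_body: bytes, path: str) -> None:
    """Decode the first image in the API response and write it to disk."""
    data = json.loads(response_body)
    image_bytes = base64.b64decode(data["data"][0]["b64_json"])
    with open(path, "wb") as f:
        f.write(image_bytes)

if __name__ == "__main__":
    req = build_request("A watercolor icon of a paper airplane")
    with urllib.request.urlopen(req) as resp:
        save_first_image(resp.read(), "airplane.png")
```

The network call is kept behind the `__main__` guard so the request-building and decoding helpers can be reused or tested on their own.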
The new Edit Image feature allows changing an image using prompts, powered by gpt-image‑1. The Make an Image feature got an AI model picker so users can choose between gpt-image‑1, Gemini Imagen 3, or Titan V2. Additionally, the AI beta was rolled out to all Professional, Organization, and Enterprise plans. If you’re not seeing AI features, check that your Admin has your AI access toggle turned on.
After reading Lenny’s Newsletter for a few years, I recently switched to an annual subscription to benefit from the incredible value of this bundle. In addition to free annual plans for great productivity tools (Linear, Notion, Perplexity Pro, Superhuman, and Granola), the bundle now also includes the hottest AI tools: Bolt, Lovable, Replit, and v0.
“Superflex helps you write front-end code from Figma, images and prompts while matching your coding style and utilizing your UI components.”
Bold moves from Shopify’s CEO Tobi Lütke, shared in an internal memo. On general AI usage: “Using AI effectively is now a fundamental expectation of everyone at Shopify. It’s a tool of all trades today, and will only grow in importance.”
On prototyping: “AI must be part of your GSD Prototype phase. The prototype phase of any GSD project should be dominated by AI exploration. Prototypes are meant for learning and creating information. AI dramatically accelerates this process. You can learn to produce something that other team mates can look at, use, and reason about in a fraction of the time it used to take.”
AI skills will be part of performance reviews and will affect future hiring. Highly recommend reading the entire thing.
Karri Saarinen from Linear: “Prompting is essentially like writing a spec, sometimes it’s hard to articulate exactly what you want and ultimately control the outcome. Two people looking for the same thing might get wildly different results just based on how they asked for it, which creates an unprecedented level of dynamism within the product. This shift from deterministic traditional UI to something more unbridled raises a challenge for designers: with no predictable journeys to optimize, how do you create consistent, high-quality experiences?”