Nick Babich explores his process of turning design into code using Lovable and Anima and shares the pros and cons of each tool.
Karri Saarinen: “The idea that AI might ruin visual quality feels like a non-issue since there wasn’t much quality to ruin in the first place. […] My general view of AI is that it will just let us do more things, not take away things.”
Great post by industry veteran Mike Davidson, offering a few suggestions to those already feeling behind the AI wave: “When it comes down to it, your future in design is the sum of all of your actions that got you here in the first place. The skills you’ve built, the artifacts demonstrated in your portfolio, your helpfulness as a teammate, your reputation as a person, and now more than ever, your curiosity to shed your skin and jump into an undiscovered ocean teeming with new life, hazards, and opportunity. Someone will invent the next CSS, the next Responsive Design, the next sIFR, the next TypeKit, the next IE6 clearfix, and the next Masonry for the AI era. That someone might as well be you.”
“This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.”
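If you want to try it, wiring an MCP server into Cursor usually comes down to a small entry in `.cursor/mcp.json`. A minimal sketch, with the package name and token as placeholders rather than values taken from this project’s README:

```json
{
  "mcpServers": {
    "figma": {
      "command": "npx",
      "args": ["-y", "<figma-mcp-server-package>"],
      "env": { "FIGMA_API_KEY": "<your Figma personal access token>" }
    }
  }
}
```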
Tia Sydorenko argues that our interactions with digital systems “are not just changing; they are shifting in their very essence.” She builds her argument on this insight from Jakob Nielsen: “With the new AI systems, the user no longer tells the computer what to do. Rather, the user tells the computer what outcome they want.”
“Unlike straightforward direct manipulation — such as dragging a file between folders, where actions unfold step by step — AI interactions demand a more fluid, iterative process. Users articulate their goals, but instead of executing every step manually, they collaborate with the system, refining inputs and guiding the AI as it interprets, adjusts, and responds dynamically.”
Visual Electric is now available as a Figma plugin! It’s the first image generator built for designers, so you can ditch stock photography and generate precisely what you need. Requires an account; the free plan includes 20 image generations per month.
“Join Anton Osika (Lovable co-founder), Nad Chishtie (Design @ Lovable) & Steve (Builder.io co-founder) on a livestream where they’ll talk about Builder.io’s new Lovable integration that lets you turn Figma designs into Lovable apps.”
Xinran Ma walks through the creation of an AI automation that instantly categorizes Figma comments and generates a structured summary in Google Docs.
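The core idea maps onto a short script: pull comments from the Figma REST API, have an LLM assign categories, and write out a structured summary. Here’s a rough TypeScript sketch (not the article’s actual implementation) with the LLM step and the Google Docs export replaced by placeholders:

```ts
// Rough sketch: fetch comments from a Figma file and bucket them by category.
// FIGMA_TOKEN, FILE_KEY, and categorize() are placeholders — in the real
// automation an LLM assigns categories and the summary lands in Google Docs.

type FigmaComment = {
  id: string;
  message: string;
  user: { handle: string };
  created_at: string;
};

async function fetchComments(fileKey: string, token: string): Promise<FigmaComment[]> {
  const res = await fetch(`https://api.figma.com/v1/files/${fileKey}/comments`, {
    headers: { "X-Figma-Token": token },
  });
  if (!res.ok) throw new Error(`Figma API error: ${res.status}`);
  const data = await res.json();
  return data.comments;
}

// Placeholder for the LLM call that assigns each comment a category.
function categorize(message: string): "bug" | "copy" | "visual" | "question" | "other" {
  if (/crash|broken|error/i.test(message)) return "bug";
  if (/copy|wording|typo/i.test(message)) return "copy";
  if (/color|spacing|align/i.test(message)) return "visual";
  if (message.trim().endsWith("?")) return "question";
  return "other";
}

async function summarize(fileKey: string, token: string): Promise<string> {
  const comments = await fetchComments(fileKey, token);
  const buckets = new Map<string, FigmaComment[]>();
  for (const c of comments) {
    const key = categorize(c.message);
    buckets.set(key, [...(buckets.get(key) ?? []), c]);
  }
  // Structured summary as markdown; the original workflow pushes this into a Google Doc.
  return [...buckets.entries()]
    .map(([category, items]) =>
      [`## ${category} (${items.length})`, ...items.map((c) => `- ${c.user.handle}: ${c.message}`)].join("\n")
    )
    .join("\n\n");
}
```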
Carly Ayres asks the Figma community to weigh in on Andrej Karpathy’s “vibe coding.” “Perhaps the question isn’t whether vibe coding will replace traditional development—it won’t—but rather how it might expand who can build software and how we build it.”
“There’s a lot of buzz about AI agents. Robots that do more with less supervision—what could go wrong? We asked our community how this might shake up how we think about UX.”
“WaveMaker AutoCode is an AI-powered WaveMaker plugin crafted for Figma, enabling product teams to jump from design to code in an instant. AutoCode produces a fully-functional app, with pixel-perfect translation of design elements, app navigations, and intended interactions.” (See the official press release for more details.)
Anima has been working on design-to-code tools since before the recent AI craze. A few months ago, they added support for shadcn/ui components, which I tried last week on my current project designed with this library.
Unlike v0, they parse the Figma file and get a lot of details right. I was impressed with how accurately it selected shadcn/ui components, even if layers weren’t explicitly named or instances were detached in the mockup. It becomes obvious that parsing a file is the right approach when different components look the same on the surface. For example, the trigger for opening a dropdown or date picker uses the same button, but they are different Figma components under the hood, and Anima chose their counterparts in code correctly.
Exporting custom colors and typography variables to a Tailwind config is also a nice touch. I ran into a few issues with excessive Tailwind styling and newer shadcn/ui components like the Sidebar not being recognized, but overall, this clearly feels like the right direction.
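For a sense of what that export looks like, the result is roughly a `tailwind.config.ts` with your Figma variables mapped into the theme. The token names below are illustrative, not Anima’s actual output:

```ts
import type { Config } from "tailwindcss";

// Illustrative only — real token names come from your Figma variables.
const config: Config = {
  content: ["./src/**/*.{ts,tsx}"],
  theme: {
    extend: {
      colors: {
        brand: { DEFAULT: "#2563eb", muted: "#93c5fd" },
        surface: "#f8fafc",
      },
      fontFamily: {
        sans: ["Inter", "sans-serif"],
      },
      fontSize: {
        "heading-lg": ["2rem", { lineHeight: "2.5rem", fontWeight: "600" }],
      },
    },
  },
  plugins: [],
};

export default config;
```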
Vercel shares best practices on importing your designs from Figma to v0 and working with shadcn/ui. I was excited about this integration until I realized it simply exports the Figma frame as an image and passes it to v0’s AI vision. Information about Auto Layout, spacing, color tokens, and typography is not preserved from Figma but inferred from the image. That’s fine for rough prototypes, but there is a better way.
Two new AI features: quickly search top Community files to find the assets you need, and increase the resolution and clarity of your images with one click in the editor.
The new Lovable and Builder.io integration lets you turn Figma designs into full applications. Lovable is a full-stack AI software engineer and editing environment. It’s designed to let you quickly create and iterate on your projects so you can move from an idea to a real application, deployed live, without wrangling complex tools or environments. AI-Powered Figma to Code is Builder.io’s toolchain that leverages AI models to convert Figma designs into React, Vue, Svelte, Angular, HTML, and other code in real time. It can even reuse your existing components in the output and use any styling library or system you need.
So, by using the integration, you can convert Figma mockups into code with Builder.io and then open them in Lovable, where you can add new functionality or connect them to real data from Supabase. Soon, you’ll be able to update your app in Lovable whenever designs change in Figma. AI will merge the design changes while keeping all your custom code intact. (Unrelated, but this combo was the most recommended in answers to this question about the best AI tool for turning designs into a website.)
As part of the SaaStr AI Summit panel, Vincent van der Meulen, Design Engineer at Figma, talks about Figma’s approach of complementing designers rather than replacing them. They follow four key AI principles: improving existing user behaviors, embracing frequent iteration, maintaining systematic quality control, and fostering cross-functional collaboration.
Speaking of shadcn/ui, Matt Wierzbicki published a new plugin using Claude 3.5 Sonnet (requires an API key) to convert Figma designs into production-ready shadcn/ui and Tailwind CSS code. It’s tailored to work best with his commercial shadcn/ui kit for Figma, but I’d expect it to work with Luis’ kit as well.
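For a sense of what “production-ready shadcn/ui and Tailwind CSS code” looks like in practice, the output style is JSX that imports the library’s components and layers Tailwind utility classes on top. An illustrative snippet, not the plugin’s actual output:

```tsx
// Illustrative shadcn/ui-style output, not generated by the plugin.
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";

export function SignInCard() {
  return (
    <Card className="w-full max-w-sm">
      <CardHeader>
        <CardTitle className="text-xl">Sign in</CardTitle>
      </CardHeader>
      <CardContent className="flex flex-col gap-4">
        <Input type="email" placeholder="Email" />
        <Input type="password" placeholder="Password" />
        <Button className="w-full">Continue</Button>
      </CardContent>
    </Card>
  );
}
```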
Moty Weiss shares his experience capturing and preserving brand consistency with AI illustrations. The idea of analyzing existing brand illustrations with ChatGPT to create a foundational prompt for Midjourney really stood out to me. The resulting illustrations adhere to the brand style and have a unique voice, looking very different from the AI-generated images flooding the internet. While they still need some work, the new tools are truly empowering: “While Midjourney’s results may still require final touches — such as vector conversion, line refinement, detail enhancement, and final polish from a professional illustrator — it represents a significant step toward independence for designers who struggle with illustration.”
A new plugin from Meng To turns Figma designs into production-level code with the power of Claude AI and GPT-4o. I mentioned it in the last newsletter, and it looks very promising so far. The plugin is free, but you’ll need to bring your own API keys.
Watch the video where Meng explains his Figma to SwiftUI code workflow.
AI is a big help in developing software, but this plugin takes it to another level: “Artifig is an AI-powered Figma plugin that empowers anyone to build their own Figma plugins using just natural language. No coding needed — simply describe what you want, and watch as your idea transforms into a fully functional, real-time plugin.” See examples in a thread from one of the authors.
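For perspective, a “fully functional” plugin can be only a handful of lines against the Figma Plugin API. Something like this sketch, which simply drops a styled rectangle on the canvas, is the scale of code such a tool would generate from a one-sentence prompt (hypothetical example, not Artifig’s actual output):

```ts
// Minimal Figma plugin sketch: create a rectangle, color it, and zoom to it.
const rect = figma.createRectangle();
rect.resize(200, 100);
rect.fills = [{ type: "SOLID", color: { r: 0.15, g: 0.45, b: 0.9 } }];
figma.currentPage.appendChild(rect);
figma.viewport.scrollAndZoomIntoView([rect]);
figma.closePlugin("Rectangle created");
```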