Figma opens the canvas to agents. The use_figma MCP tool lets Claude Code and Codex generate and modify designs grounded in your actual design system. The key distinction from earlier code-to-design experiments: agents work with what your team has already built, making design system quality a direct input to AI output quality.
Mike Davidson runs the largest design team at Microsoft AI and shares tips on making it through 2026. The assembly layer of design is being absorbed — button states, data processing, detailed specs. What remains, and what companies are hiring for, is orchestration: running AI and human teams toward a shipping goal. Specific advice on portfolio, job search strategy, and skills worth building.
Worth noting Mike’s skepticism about data in the above report: “Perhaps the data reflects reality, or perhaps design jobs aren’t accurately tracked by this company, but either way, this is not what I or a lot of my colleagues at other companies are seeing. If anything, most cross-functional teams are more underwater on design than on other functions.”
Luis on what the “shadcn-ification” debate is actually about — not the visual uniformity, but the organizational misread: “The mistake isn’t in the ingredient. It’s in thinking that having access to good ingredients is the same as knowing how to cook.”
Stakeholders are concluding that the design system infrastructure is done because a great foundation exists. The teams that have spent years practicing design systems understand exactly why that conclusion is dangerous.
TK Kong shares a detailed guide to his workflow with Claude Code and Paper, the design tool built on native HTML/CSS rather than a WebGL canvas. The workflow of agent writing HTML into Paper frames, designer editing on canvas, and agent implementing code is similar to the Figma MCP workflow covered above, but also allows working with existing designs.
The Paper Snapshot Chrome plugin — which copies live web UIs directly into Paper as editable layers — is exactly what I wanted while wondering why Figma won’t make “a universal ‘Send to Figma’ browser extension”.
Jakub Krehel’s collection of small interface details that compound into a significantly better experience: text-wrap balance, concentric border radius, contextual icon animations, tabular numbers, interruptible animations, optical vs. geometric alignment, and shadows instead of borders. Each one has a live interactive demo. The kind of small details that separate a polished interface from one that just functions.
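A couple of the details from the list fit in a few lines of plain CSS. A minimal sketch for illustration — the selectors are hypothetical, and this is not taken from Jakub’s demos:

```css
/* Balance ragged line breaks in multi-line headlines */
h1, h2 {
  text-wrap: balance;
}

/* Fixed-width digits, so updating numbers don't jitter horizontally */
.stat, td.amount {
  font-variant-numeric: tabular-nums;
}
```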
These details also exist as an installable skill for Claude Code, Codex, and Cursor. Once installed, your AI coding agent automatically applies these principles when building UI.
Yann-Edern Gillet, design engineer at Linear, revisits his “Rosetta Stone” metaphor for design engineering translation through the lens of AI. The central argument: when translation is cheap and instantaneous, the bottleneck shifts from execution to meaning — and the new craft is preserving intent while everything accelerates.
An upcoming interactive exhibit restoring the design moments that shaped software history, from Xerox Alto’s Smalltalk-76 interface to the original “slide to unlock” on the iPhone. Each restoration requires reverse-engineering details, motion, and behavior from low-quality assets.
Patrick Morgan makes a clean distinction that vibe-coding discourse keeps blurring: prototype code is for exploration, production code is for endurance. He is building a protected prototyping environment using Claude Code, a place where his team can move fast and then deliberately port the right assets across the boundary into production.
There is a clear parallel with how the design team at Notion works. In a recent episode of How I AI, Brian Lovin showed their collaborative “prototype playground,” where the entire team can create, share, and iterate on functional prototypes.
That also reminded me of how my team worked a decade ago, back when front-end development was a tad simpler. We had a separate “mockups” directory inside the Rails monorepo, where designers prepared static HTML mockups with production-ready CSS and JS. By the time designs were handed off to engineers in a feature branch, all polish and design details were already baked in. The design team has to be fairly technical, but after working this way, there is no going back to handing off Figma files.
The new Workflow Lab format, showing an end-to-end process, is a smart way to frame the new AI image tools in context. The three new tools (erase object, isolate object, expand image) are genuinely useful for anyone who’s had to leave Figma to do basic cleanup in Photoshop, and Vectorize finally removes a step that’s been a quiet annoyance for years.
A first look at Config 2026 speakers — AI artist Holly Herndon, creator of the world’s first 3D-printed fashion collection Danit Peleg, designer and author Vicki Tan, designer and founder Matthew Ström-Aw, and mathematician and educator Grant Sanderson from 3Blue1Brown. Config returns to San Francisco on June 23–25. Virtual registration is free.
Figma explains how its new MCP server lets Codex generate Figma Design files from live code and, in the other direction, use Figma frames as structured context for agentic code generation.