At its annual I/O 2025 conference, Google unveiled Stitch, a new AI-driven design assistant that helps developers create web and mobile app interfaces using simple text prompts or images. Built on top of Google’s powerful Gemini 2.5 models—Pro and Flash—Stitch can instantly generate front-end code, including HTML and CSS, tailored to the user’s vision.
Whether you’re sketching out an idea or refining a functional UI, Stitch aims to fast-track the design process by turning vague concepts into clean, editable code. In a few words or with just an image, developers can generate responsive layouts and component-ready designs, which can be exported directly to tools like Figma or opened in traditional IDEs for further customization.
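To make that concrete, here is a purely illustrative example (not actual Stitch output) of the kind of clean, editable HTML/CSS a prompt-to-UI tool could produce from a request like "a dashboard landing page with a hero section and a call-to-action button" — the class names, colors, and copy below are invented for illustration:

```html
<!-- Hypothetical sketch of prompt-generated front-end code; not real Stitch output. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Hive Dashboard</title>
  <style>
    /* Responsive hero section: stacked on mobile, more padding on wider screens */
    .hero { display: flex; flex-direction: column; align-items: center;
            padding: 4rem 1rem; text-align: center; }
    .cta  { background: #1a73e8; color: #fff; border: none;
            padding: 0.75rem 1.5rem; border-radius: 8px; font-size: 1rem; }
    @media (min-width: 768px) { .hero { padding: 6rem 2rem; } }
  </style>
</head>
<body>
  <section class="hero">
    <h1>Track Your Hives</h1>
    <p>Monitor hive health, honey yield, and local weather in one place.</p>
    <button class="cta">Get Started</button>
  </section>
</body>
</html>
```

A fragment like this is trivially editable in an IDE, which is the workflow the article describes: generate a starting point, then refine by hand or re-prompt.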
What makes Stitch stand out isn’t just speed but adaptability. Designers can modify specific elements, fine-tune layouts, and even annotate screenshots to request visual changes. This image-to-UI workflow is expected to roll out soon after the I/O showcase, according to Google’s product lead Kathy Korevec.
In a live demo, Korevec presented two examples: one was a mobile app concept for book lovers, and the other was a web-based dashboard for beekeepers. In both cases, Stitch demonstrated its ability to provide a strong design foundation while allowing room for human creativity and refinement.
Korevec emphasized that Stitch isn’t trying to replace robust platforms like Figma or Adobe XD. Instead, it serves as a launchpad—a fast, approachable way to build the first iteration of an app’s design. It’s ideal for developers who want to move quickly from idea to prototype without needing to code every detail from scratch.
Although Stitch doesn’t offer all the advanced features of professional design suites, it marks a key step in the growing movement known as “vibe coding.” This trend is all about using AI to handle the heavy lifting in programming and design so creators can focus on innovation. Startups like Anysphere (creator of Cursor), Cognition, and Windsurf are also targeting this space. Meanwhile, OpenAI’s recently launched Codex assistant and Microsoft’s enhanced GitHub Copilot show just how quickly this segment is heating up.
Alongside Stitch, Google also announced broader access to Jules, its AI coding agent now available in public beta. Designed to assist developers with tasks like debugging, managing pull requests, and tackling project backlogs, Jules uses Gemini 2.5 Pro to understand and interact with complex codebases.
In a separate demo, Korevec showcased Jules upgrading a legacy Node.js 16 site to version 22. Jules cloned the site, proposed an upgrade plan, executed the migration in a clean environment, and ran post-upgrade tests to ensure everything worked seamlessly.
Although Stitch and Jules aren’t yet full replacements for advanced design or development tools, they represent Google’s broader push toward an AI-first development experience, one where collaboration between human creativity and machine intelligence isn’t just possible but effortless.