My first real-world use of vibe coding to improve a product design workflow
Not all vibe coding is useful. But sometimes, it actually helps you do the job better, faster.
Over the past year, I explored a lot of AI tooling during my sabbatical, mostly for personal creative work. I was excited but skeptical about the benefits for my professional product design workflow.
Tools like Bolt.new, Lovable, and Figma Make are fun to experiment with, but in a real product environment, especially one like cybersecurity, there’s no way I’m handing off that code and expecting it to reach production.
Then I joined Sublime Security. And within a couple weeks, I found a use case for vibe coding that actually made sense. Not as an engineering shortcut, but as a clearer, faster way to communicate design intent inside a constrained, technical system.
The quick-turnaround data viz challenge
Shortly after I started, our team was preparing for RSA Conference in San Francisco, a major security industry event. We wanted to showcase a dashboard visualizing the value of our new AI security agent.
This posed some immediate challenges:
- As a new employee, I hadn’t yet built out many system resources in Figma
- Creating static mockups of data visualizations is always a pain
- We use Apache ECharts for our visualization library, which comes with its own constraints and capabilities
I had a choice: either spend a bunch of time reverse engineering ECharts components into Figma mockups, or find another approach.
Why static mockups fall short for data viz
If you’ve designed dashboards before, you know the pain of communicating data visualizations through static design files:
- Interaction details like hover states and tooltips are easy to miss
- Realistic data distributions are time-consuming to fake
- Ignoring library constraints leads to design–dev mismatches
- Engineers still have to translate designs back into library-compatible configurations
All of this is tedious at best.
In my case, since we were already committed to ECharts, I would effectively be recreating assets that already existed in code, using up my limited time and introducing more opportunities for error.
Ultimately, I decided that wrangling Figma to communicate these charts would’ve cost more time than it was worth.
A better approach: Safe environments for vibe prototyping
Instead of fighting these limitations, I decided to work directly with the tools our engineers use.
ECharts has a built-in playground where you can prototype specific chart types right in the browser. And since it’s a well-known, publicly documented library, major LLMs like those from OpenAI and Anthropic know its capabilities. This let me set up a simple vibe coding workspace: ChatGPT on one side, the ECharts sandbox on the other.
With this setup, I could:
- Ask ChatGPT about ECharts capabilities and syntax
- Describe what I wanted to achieve from a design perspective
- Generate code that I could immediately test in the sandbox (see the sketch after this list)
- Iterate rapidly until the visualization matched my design intent
- Share working examples directly with the engineering team using the sandbox’s share link
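To make that loop concrete, here’s the shape of snippet it produced. This is a minimal, hypothetical sketch: the chart type, labels, and numbers are placeholders I’m inventing for illustration, not the actual RSA dashboard, but the `option` object is the standard contract the ECharts playground expects.

```js
// Hypothetical playground iteration — the data and labels are placeholders,
// not the real dashboard. In the ECharts playground you assign a plain
// `option` object and the chart re-renders when you run the code.
option = {
  title: { text: 'Threats flagged per day (placeholder data)' },
  tooltip: { trigger: 'axis' }, // hover behavior: hard to convey in a static mock
  xAxis: { type: 'category', data: ['Mon', 'Tue', 'Wed', 'Thu', 'Fri'] },
  yAxis: { type: 'value' },
  series: [
    {
      name: 'Flagged',
      type: 'bar',
      data: [12, 31, 18, 44, 27],
      itemStyle: { borderRadius: [4, 4, 0, 0] }, // rounded bar tops: a design detail spelled out in code
    },
  ],
};
```

Because this is just the library’s own configuration format, every design decision in it (axis types, tooltip triggers, corner radii) is already expressed in terms the engineers could use directly.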
This approach greatly reduced the design execution gap.
Engineers could see and feel exactly what I wanted, within the constraints of the library they’d actually be building with.
What made this valuable
The most important outcome was that I communicated the design intent more clearly and quickly by leaning into a tool that was better suited for the job than static mockups.
I don’t know how much of the code I generated was actually used by the engineers on my team. But that’s not what mattered.
The value came from:
- Staying within actual technical constraints rather than designing fantasy interfaces
- Exploring possibilities more efficiently than reading documentation
- Prototyping in the actual medium that would be used in production (sketched below)
- Communicating design intent more clearly through interactive prototypes
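That production-medium point is worth making concrete. The same `option` object iterated on in the sandbox is what an app feeds to the library; a minimal sketch, with a made-up container id rather than Sublime’s actual code, might look like this:

```js
// Hypothetical wiring, not Sublime's actual code: the option object from
// the sandbox drops straight into an app via the standard ECharts API.
import * as echarts from 'echarts';

// assume `option` is the object iterated on in the playground sketch above
const chart = echarts.init(document.getElementById('detections-chart'));
chart.setOption(option);
window.addEventListener('resize', () => chart.resize());
```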
Most importantly, this approach respected everyone’s role. I wasn’t trying to be an engineer — I was using code as a tool to express intent, nothing more.
When vibe coding actually makes sense in production product design environments
This experience helped me crystallize when the current AI vibe coding tools are genuinely valuable for product designers:
- When the goal is communication, not production: Using code to express intent, not commit to implementation details
- When there’s a safe sandbox environment: Places to experiment where nothing mission-critical can break
- When working within well-defined, constrained systems: Charting libraries, design systems, UI frameworks with clear documentation
- When the traditional design-to-development handoff creates friction: Highly interactive elements that static mockups can’t adequately represent
- When the design execution translation cost is high: Complex experiences and interactions where misalignment is expensive
Vibing beyond data visualization
The point of all of this is not to turn myself into a person who pushes code to production (I don’t want that responsibility).
The point is to communicate my intent as clearly as possible to the people responsible for building it, maximizing the chances we build something great, with quality and speed.
Data visualization is a familiar example that nearly every product designer has grappled with at some point. But this workflow will only become more important: as AI products grow more non-deterministic and interactive, we’ll need tools that communicate designs for those fluid experiences better than static mockups can.
This project wasn’t a grand statement about the future of product design. It was simply the best option to get the job done well.
Product design exists to help the team create the thing we’re all trying to build together.
And I’ll happily use any tool that improves the odds we get there.
Patrick Morgan is the creator of Unknown Arts and lead product designer at Sublime Security. If you enjoyed this post, subscribe to his newsletter or follow him on LinkedIn for weekly insights.