Many told us it was impossible - too ambitious, not realistic.
I now know this decision was the right one. It was a bet, and we won. I can state with certainty: the direction we took with Zadig&Voltaire allowed us to secure many things essential for the business.
A front-end rebuild is no small feat. Typically, you bring in an external agency, teams get duplicated, and there’s a massive project to manage - we all know the risks, delays, and costs involved.
What we know now is that with 1 Technical Project Manager, 2 Product Owners, and 3 developers, it’s possible to do a complete internal rebuild with AI.
We started in July 2025. Today, we’re done. And we did it in style.
I even have the luxury of writing this blog to describe what was accomplished - before the new site even goes live.
Yes, we saved time. We believe it’s time to share this experience, because we see that the industry hasn’t yet grasped the stakes and the gains. There’s fear, hesitation, a lack of boldness.
Be bold. Take the leap. You won’t be able to go back.
That’s our message, in February 2026.
What we had: technical debt
Our legacy frontend ran on Vue Storefront, a solution that had proven itself but was showing its limits.
Vue 2.7 with a monolithic Vuex store: a 104-line wrapper orchestrating over 20 modules. Apollo Client v2 with manual cache management that looked more like archaeology than modern code.
The most painful part? Three separate Webpack configurations. One for the client, one for the SSR server, one for the service worker. Each with its own overlay system for themes. Modifying a build alias meant touching three files and hoping nothing broke.
The modular structure looked organized on paper: 20+ modules in src/modules/, each with its own graphql/, store/, helpers/ tree. In practice, it was massive pattern duplication and accumulating technical debt. All sprinkled with Vue 2 mixins - that antipattern we’d been carrying for years.
The timeline: our AI stack evolution
This is where our story diverges from classic migrations. We didn’t just change frameworks - we changed how we work.
We explore Cursor and Windsurf. Both tools promise AI assistance integrated into the IDE. We test, compare, hesitate. The team is skeptical but curious.
The decision is made: we rebuild cleanly from scratch. And we do it with AI as our copilot.
Migration kicks off with the full team. Everyone codes with AI assistance. The first Nuxt 3 composables are born. GraphQL Codegen is installed from day one - no more manual types ever again.
It’s hard. The legacy code serves as reference but the shift to Composition API is a complete paradigm change. The team does things right and co-builds an extensible architecture.
We go full Claude Code. The change is radical.
Where Cursor and Windsurf allowed us to co-build, Claude Code offers contextual understanding of the entire codebase. We’re no longer talking about code suggestions - we’re talking about conversations with a developer who knows the entire project.
The SSR rules we were learning the hard way? Claude Code knows them. The Nuxt 3 patterns we were searching for in the docs? It applies them directly.
- The explosion of MCPs helps our build - they contribute to overall project quality
- We integrate specialized agents
- We migrate from `.windsurf-rules` to `CLAUDE.md`
Opus 4.5 releases. The acceleration is immediate.
The model understands intentions better, generates cleaner code, makes fewer mistakes. AI code reviews become reliable. Complex refactorings pass on the first try.
We start delegating entire features, moving from co-development to agentic development and orchestration.
Late November, we integrate SKILLs: zv-commit, zv-code-review, zv-jira, zv-jira-qa… as well as community skills from skills.sh.
Team productivity explodes. x2, x3 compared to our initial estimates.
What was supposed to take a week gets done in two days. Bugs are detected before they’re even merged.
We finish the migration ahead of schedule. We have time to polish, optimize, document, add features that were initially postponed.
- Shift to agentic workflow: agents code, developers orchestrate
- Global consolidation of all projects into a single shared workspace
Technical choices
Nuxt 3 over another framework? The team had already mastered the Vue ecosystem, and Nuxt offered native SSR. Plus, it’s French (acquired by Vercel in the meantime).
GraphQL Codegen? Non-negotiable. After maintaining manual TypeScript interfaces for years, auto-generation from the schema was a no-brainer.
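For readers who haven’t used it, a minimal `codegen.ts` looks something like this - a sketch only, with an illustrative endpoint and paths rather than our actual setup:

```typescript
// codegen.ts — minimal GraphQL Code Generator config (illustrative paths/endpoint).
import type { CodegenConfig } from '@graphql-codegen/cli'

const config: CodegenConfig = {
  // Assumption: your GraphQL endpoint (or a local schema file) goes here.
  schema: 'https://example.com/graphql',
  // Pick up operations written in components and composables.
  documents: ['app/**/*.{vue,ts}'],
  generates: {
    // One generated file replaces every hand-maintained interface.
    'types/graphql.ts': {
      plugins: ['typescript', 'typescript-operations'],
    },
  },
}

export default config
```

Run it in `watch` mode during development and every query change regenerates its types immediately.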
MCP / SKILLs are mandatory if you’re working with an agent today - the gain is real and paves the way toward automating your workflow.
We learned the SSR rules the hard way:

- `useState()` is mandatory, never `ref()` at module level. Otherwise, state is shared between requests - bugs impossible to reproduce locally.
- Never call a composable inside a `watch()`. The Nuxt context is no longer available, and it fails silently.
- Clean up watchers in `onBeforeUnmount()`. Memory leaks guaranteed otherwise.
- Plugins cannot touch `console.*` or `window.*` server-side.
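To make the first rule concrete, here is a framework-free sketch of the bug - an assumption-laden simulation, not Nuxt code: `useRequestState` stands in for what `useState()` does on the server.

```typescript
// BAD: module scope is created once per server process, so this array
// survives across requests — exactly the cross-request leak described above.
const moduleCart: string[] = [];

// GOOD: state keyed to a per-request context, the way useState() scopes
// state to one app instance per request on the server.
type RequestContext = { state: Map<string, unknown> };

function useRequestState<T>(ctx: RequestContext, key: string, init: () => T): T {
  if (!ctx.state.has(key)) ctx.state.set(key, init());
  return ctx.state.get(key) as T;
}

// Simulate two independent SSR requests hitting the same server process.
function handleRequest(ctx: RequestContext, item: string) {
  moduleCart.push(item); // leaks: request 2 sees request 1's item
  const cart = useRequestState<string[]>(ctx, "cart", () => []);
  cart.push(item); // isolated: each request gets a fresh cart
  return { leaked: moduleCart.length, scoped: cart.length };
}

const r1 = handleRequest({ state: new Map() }, "tee");
const r2 = handleRequest({ state: new Map() }, "dress");
// r2.leaked is 2 (state shared across requests), r2.scoped is 1 (isolated)
```

Locally there is only one “request”, which is exactly why these bugs never reproduce on a dev machine.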
Results
Code volume
| Metric | Vue Storefront | Nuxt 3 |
|---|---|---|
| Application code lines | ~72,000 | ~48,000 (-33%) |
| Stores | 1 Vuex monolith | 8 Pinia |
| Manual GraphQL types | ~20 files | 0 (auto-generated) |
| Composables | 0 (mixins) | 75 |
| TypeScript coverage | ~50% | ~96% |
Less code for the same functional coverage. The Composition API and Nuxt 3 composables eliminate the boilerplate from mixins and the monolithic Vuex store.
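The structural difference is easy to show. This is a deliberately simplified, framework-free sketch (no Vue reactivity, and `useCounter` is a made-up example): where a mixin merges its state and methods into a component implicitly, a composable is an explicit, typed function call.

```typescript
// A composable is just a function: its inputs and outputs are visible
// and typed, unlike a mixin's silently merged options.
// (In real Vue code, `count` would be ref(initial).)
function useCounter(initial = 0) {
  let count = initial;
  const increment = () => ++count;
  const get = () => count;
  return { increment, get };
}

const a = useCounter();
const b = useCounter(10);
a.increment();
// a and b are independent instances — no shared merged state,
// no name collisions, nothing implicit to trace through.
```

Multiply that by 75 composables and the boilerplate savings in the table above follow naturally.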
Lighthouse Performance (mobile)
HP (Homepage)
| Metric | VSF | Nuxt | Delta |
|---|---|---|---|
| LCP | 18.9s | 7.0s | -63% |
| TBT | 910ms | 200ms | -78% |
| CLS | 0.131 | 0.011 | -92% |
| FCP | 11.4s | 4.0s | -65% |
| Speed Index | 18.8s | 6.1s | -68% |
| TTI | 19.0s | 16.8s | -12% |
PLP (Product Listing Page)
| Metric | VSF | Nuxt | Delta |
|---|---|---|---|
| LCP | 19.4s | 8.1s | -58% |
| TBT | 1,620ms | 640ms | -60% |
| CLS | 0.074 | 0 | perfect |
| FCP | 9.4s | 1.8s | -81% |
| Speed Index | 9.4s | 5.3s | -44% |
| TTI | 19.4s | 8.3s | -57% |
PDP (Product Detail Page)
| Metric | VSF | Nuxt | Delta |
|---|---|---|---|
| LCP | 18.8s | 8.0s | -57% |
| TBT | 1,380ms | 330ms | -76% |
| CLS | 0.057 | 0.003 | -95% |
| FCP | 3.7s | 1.5s | -59% |
| Speed Index | 7.1s | 4.0s | -44% |
| TTI | 24.0s | 10.5s | -56% |
Team productivity
| Period | MRs merged/week | AI-assisted ratio |
|---|---|---|
| July 2025 | ~7 | 30% |
| November 2025 | ~15 | 70% |
| January 2026 | ~27 | 90%+ |
x4 between July and January.
Technical comparison
| Aspect | Vue Storefront | Nuxt 3 |
|---|---|---|
| Framework | Vue 2.7 | Vue 3.5 |
| State Management | Vuex (104-line wrapper) | Pinia (8 stores) |
| Build | 3 Webpack configs | 1 Vite config |
| SSR | Manual Express.js | Native Nuxt |
| Middleware | Manual Express.js | Native Nuxt |
| Routing | Manual Vue Router | Automatic file-system |
| GraphQL Types | Manual | Auto-generated (Codegen) |
| Composables | 0 (mixins) | 75 |
What we learned
On migration
- GraphQL Codegen from day 1. The ROI is immediate.
- Don’t migrate, rebuild. Copying legacy code means copying its debt - but keeping it in the repo as reference documentation is essential.
- Share insights and resources. It prevents each dev from discovering pitfalls alone.
On AI
- Adopt early, iterate fast. We changed tools three times in 6 months. Each pivot saved us time.
- AI doesn’t replace expertise, it amplifies it. You need to know what to ask and validate what comes out.
- Context is king. A tool that understands the entire codebase is worth ten times an intelligent autocompleter.
- Train the team together. AI adoption is a team sport, not an individual initiative.
- Secondary tasks finally get done. AI never gets tired - the result is generally higher-quality code, because we no longer take shortcuts.
On the mental shift
We’ve entered a new era. Developers no longer just code - they now wear all the hats.
We feel like we have superpowers, and we do. It’s as if we developed a cheat code that lets us unlock and do everything.
No problem seems out of reach anymore, no topic seems blocking. In short, we leverage our base capabilities and AI multiplies them to a level we struggle to describe.
One team developer sums it up well: “Now I couldn’t go back.”
The message
Six months. Three developers. Three project managers.
We did it when everyone said it was impossible.
AI is not a gadget. It’s a force multiplier. And those waiting to see “how it evolves” are falling behind every day.
The industry hasn’t shifted yet. There’s fear, hesitation, a lack of boldness.
We made the choice. We took the risk. And today, we can state: it works.
Be bold. Take the leap.
You won’t be able to go back.
The new site launches soon, but we’re not spoiling the date!
“We went from ‘how do we make it work’ to ‘how do we make it good’. That’s the whole difference.” — The Z&V Tech Team