Most SEO agencies in London will audit your meta tags, restructure your heading hierarchy, and hand you a PDF full of recommendations. That is table stakes. The real performance gains — the kind that move you from page two to position three, from position three to the featured snippet — live deeper in the stack. They live in your build tooling, your rendering strategy, and your framework architecture.
At DubSEO, we are a dev-heavy agency by design. We do not just advise on Technical SEO — we ship it. Today, we are pulling back the curtain on three React-ecosystem tools that define how we architect fast, crawlable, revenue-generating websites: Vite, Next.js, and Remix.
The Performance-SEO Connection Most Agencies Ignore
Google has made its position unambiguous. Core Web Vitals — Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS) — are ranking signals. Yet the majority of SEO companies in London treat performance as a frontend concern, something to hand off to "the developers."
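To make one of these metrics concrete: CLS is not a simple sum of every layout shift on the page. It is the largest "session window" of shifts, where a window lasts at most five seconds and closes after a one-second gap between shifts. Below is a minimal sketch of that aggregation, assuming the individual shift scores and timestamps have already been collected (in the browser that collection is done with a PerformanceObserver; the numbers here are illustrative):

```typescript
interface LayoutShift {
  value: number;     // layout shift score reported by the browser
  startTime: number; // timestamp in milliseconds
}

// CLS session-window rule: a new window starts when there is a gap of
// 1 s or more since the previous shift, or the window exceeds 5 s.
// CLS is the largest window sum observed.
function cumulativeLayoutShift(shifts: LayoutShift[]): number {
  let worst = 0;
  let windowSum = 0;
  let windowStart = 0;
  let prevTime = -Infinity;
  for (const s of shifts) {
    if (s.startTime - prevTime >= 1000 || s.startTime - windowStart > 5000) {
      windowSum = 0;
      windowStart = s.startTime;
    }
    windowSum += s.value;
    prevTime = s.startTime;
    worst = Math.max(worst, windowSum);
  }
  return worst;
}
```

This is why a single large shift late in the page load can matter more than several tiny ones early on: the metric reports the worst burst, not the lifetime total.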
That separation is the problem. When your SEO strategist cannot read a Lighthouse trace and your developer does not understand crawl budget, nobody owns the intersection. We do.
Every technical recommendation we make at DubSEO is deployed, measured, and iterated on by engineers who understand both the search algorithm and the JavaScript runtime. Here is how each framework fits into that philosophy.
Vite: The Build Tool That Eliminates the First Bottleneck
What It Is
Vite is not a framework — it is a next-generation build tool created by Evan You (the creator of Vue.js) that has rapidly become the default bundler across the React ecosystem. It has displaced Webpack in most modern stacks, and since the deprecation of Create React App, the React documentation points to it as a primary option for starting a React project without a full framework.
Why It Matters for SEO
Slow builds create slow feedback loops. Slow feedback loops mean performance regressions ship to production unchecked. Vite solves this at the root:
- Native ES Modules in Development: Vite serves source files over native ESM, which means your dev server starts in milliseconds regardless of application size. Engineers can test rendering changes — the kind that affect LCP and CLS — without waiting 30 seconds for a rebuild.
- Rollup-Powered Production Builds: In production, Vite uses Rollup under the hood, producing highly optimised, tree-shaken bundles. Smaller bundles mean less JavaScript to parse and execute on the main thread, which improves both Time to Interactive (TTI) and INP.
- CSS Code Splitting: Vite automatically code-splits CSS alongside JavaScript chunks. This eliminates render-blocking stylesheets that inflate LCP on pages Google is actively evaluating.
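By way of illustration, here is a minimal `vite.config.ts` showing where these production levers live. The `manualChunks` split is a hypothetical example of one optimisation we might apply, not a prescription:

```typescript
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  build: {
    // On by default: emit a stylesheet per async chunk so route-level
    // CSS is not render-blocking on unrelated pages.
    cssCodeSplit: true,
    rollupOptions: {
      output: {
        // Illustrative split: keep framework code in its own
        // long-cached vendor chunk, separate from app code.
        manualChunks: {
          vendor: ["react", "react-dom"],
        },
      },
    },
  },
});
```

Because this file is version-controlled alongside the application, bundle-level SEO decisions are reviewable in the same pull request as the feature that prompted them.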
How We Use It
For headless CMS architectures — Sanity, Contentful, Storyblok — we use Vite as the foundational build layer. When a client's content team publishes a new page, the rebuild and deploy pipeline completes in seconds, not minutes. That velocity means we can run Technical SEO experiments at a pace traditional agencies simply cannot match.
Next.js: The Full-Stack React Framework for Search Dominance
What It Is
Next.js, developed by Vercel, is the most widely adopted React framework for production websites. It provides server-side rendering (SSR), static site generation (SSG), incremental static regeneration (ISR), and — as of the App Router — React Server Components (RSC).
Why It Matters for SEO
Next.js is, in many respects, purpose-built for search engine optimisation:
- Server-Side Rendering: Googlebot can render JavaScript, but it does so on a deferred queue. SSR removes that dependency entirely. When Googlebot requests a URL, it receives fully rendered HTML immediately. No hydration delay. No content invisibility window.
- Incremental Static Regeneration (ISR): For large-scale ecommerce and publishing sites, ISR lets you statically generate pages at build time and then revalidate them on a per-page, time-based cadence. You get the speed of static with the freshness of dynamic — critical for product pages, pricing pages, and location landing pages.
- React Server Components: RSCs execute exclusively on the server, and their component code never ships in the client bundle. For content-heavy pages — blog posts, service pages, case studies — this means dramatically smaller client bundles and faster LCP scores.
- Built-In Metadata API: The App Router's `generateMetadata` function allows dynamic, type-safe meta tag generation that lives alongside the page component. No more orphaned SEO plugins. No more metadata drift.
- Image Optimisation: The `next/image` component handles responsive sizing, lazy loading, format conversion (WebP/AVIF), and priority hints out of the box. Misconfigured images are the single most common LCP offender we encounter in audits, and Next.js eliminates the problem architecturally.
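As an illustration of the metadata pattern, here is a minimal, framework-free sketch. The `Metadata` type below stands in for the one exported by `next`, and the content lookup is a hypothetical in-memory stub rather than a real CMS call:

```typescript
// Stand-in for the `Metadata` type that "next" exports.
type Metadata = {
  title: string;
  description: string;
  alternates?: { canonical: string };
};

// Hypothetical content store; in a real project this would be
// a CMS or database query keyed by the route's slug.
const pages: Record<string, { title: string; summary: string }> = {
  "technical-seo": {
    title: "Technical SEO Services",
    summary: "Engineering-led search optimisation.",
  },
};

// App Router convention: export generateMetadata from a page file so
// meta tags are derived from the same data as the page itself.
export async function generateMetadata({
  params,
}: {
  params: { slug: string };
}): Promise<Metadata> {
  const page = pages[params.slug];
  return {
    title: `${page.title} | DubSEO`,
    description: page.summary,
    alternates: { canonical: `https://www.dubseo.co.uk/${params.slug}` },
  };
}
```

In a real App Router project this function lives in a `page.tsx` file next to the component, and adding `export const revalidate = 3600` to the same file opts the route into ISR.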
How We Use It
Next.js is our default recommendation for businesses that need a balance of dynamic functionality and search performance. We deploy on Vercel's edge network, which places server-rendered responses within milliseconds of London-based users — and every other geographic market our clients target.
For a recent London-based SaaS client, migrating from a client-rendered Create React App to a Next.js App Router architecture reduced LCP from 4.2 seconds to 1.1 seconds across their top 50 landing pages. Organic traffic increased 38% over the following quarter, with no changes to content or backlink strategy. That is the power of Technical SEO executed at the framework level.
Remix: The Web Standards Framework That Rethinks Data Loading
What It Is
Remix, originally created by Ryan Florence and Michael Jackson (the team behind React Router), is a full-stack React framework built on web standards. It was acquired by Shopify in 2022 and now powers Shopify's Hydrogen framework for headless commerce. Its features have since been folded into React Router v7, but the architectural principles remain distinct and powerful.
Why It Matters for SEO
Remix takes a fundamentally different approach to data loading and rendering, and the SEO implications are significant:
- Nested Routing with Parallel Data Loading: Remix loads data for all nested route segments in parallel. In a traditional client-rendered React app, data fetching cascades — the parent component fetches and renders, then the child fetches, then the grandchild. Remix eliminates the waterfall on the server. For deeply nested pages (category > subcategory > product), this shaves hundreds of milliseconds off server response time, directly improving Time to First Byte (TTFB).
- Progressive Enhancement by Default: Remix forms and navigations work without JavaScript. If a bot, a user on a slow connection, or a user with JavaScript disabled hits your page, it still functions. This is not an accessibility afterthought — it is the architectural default. Googlebot encounters a fully functional page every single time.
- HTTP-Native Caching: Remix leverages standard HTTP `Cache-Control` headers rather than framework-specific caching abstractions. This means CDNs, browser caches, and intermediary proxies all understand and respect your caching strategy natively. The result: faster repeat visits, reduced server load, and fresher content delivery — all without custom infrastructure.
- Error Boundaries Per Route Segment: If a single component on a page fails, Remix isolates the error to that route segment. The rest of the page renders normally. This prevents the catastrophic blank-page scenarios that tank crawlability and user experience simultaneously.
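The parallel-loading point can be sketched without any framework code at all. The segment loaders and timings below are hypothetical stand-ins for the loaders of three nested route segments:

```typescript
// Hypothetical per-segment loaders (layout > category > product),
// each simulating a ~100 ms data source.
const delay = <T,>(ms: number, value: T): Promise<T> =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

const loadLayout = () => delay(100, { nav: ["Home", "Shop"] });
const loadCategory = () => delay(100, { name: "Trainers" });
const loadProduct = () => delay(100, { sku: "TR-001" });

// Waterfall: each segment waits for its parent. Roughly 300 ms total.
async function waterfallLoad() {
  const layout = await loadLayout();
  const category = await loadCategory();
  const product = await loadProduct();
  return { layout, category, product };
}

// Remix-style parallel loading: every matched segment's loader
// starts at once. Roughly 100 ms total for the same data.
async function parallelLoad() {
  const [layout, category, product] = await Promise.all([
    loadLayout(),
    loadCategory(),
    loadProduct(),
  ]);
  return { layout, category, product };
}
```

In Remix itself, each route module simply exports a `loader`, and the framework runs the loaders for every matched segment concurrently; a loader can also return a standard `Response` carrying a `Cache-Control` header, which is the HTTP-native caching described in the point above.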
How We Use It
Remix is our framework of choice for ecommerce clients — particularly those on Shopify — and for high-traffic content platforms where data loading performance is the primary bottleneck. Its web-standards-first philosophy aligns with how we think about Technical SEO: build for the platform (the browser, the HTTP spec, the search engine crawler), not around it.
Choosing the Right Architecture: Our Decision Framework
Not every project demands the same tool. Here is the decision matrix we apply at DubSEO:
| Consideration | Vite + SPA/SSR | Next.js | Remix |
|---|---|---|---|
| Content-heavy sites (blogs, magazines) | ⚠️ Requires custom SSR setup | ✅ RSC + ISR ideal | ✅ Strong with nested layouts |
| Ecommerce (Shopify, headless) | ❌ Not recommended alone | ✅ Excellent with App Router | ✅ Purpose-built via Hydrogen |
| SaaS dashboards with public marketing pages | ✅ Vite for app, framework for marketing | ✅ Hybrid static + dynamic | ✅ Parallel loading excels |
| Large-scale multi-market SEO | ⚠️ Limited i18n tooling | ✅ Mature i18n + middleware | ✅ URL-based i18n natural fit |
| Core Web Vitals priority | ✅ Minimal bundle overhead | ✅ Image/font optimisation built in | ✅ No client JS by default |
The common thread: every option is evaluated through an SEO lens first, an engineering lens second. That dual fluency is what separates a dev-heavy agency from an agency that occasionally talks to developers.
Why "Dev-Heavy" Is Not a Buzzword — It Is a Methodology
Most SEO agencies outsource development. They write a specification document, hand it to a third-party team or the client's internal developers, and hope the implementation matches the intent. Inevitably, something is lost in translation. The canonical tags are malformed. The structured data fires on the wrong template. The lazy-loading threshold is set too aggressively and LCP regresses.
At DubSEO, the strategist and the engineer are often the same person — or they sit in the same standup, review the same pull request, and deploy from the same pipeline. Our Technical SEO recommendations do not live in slide decks. They live in version-controlled code, behind automated Lighthouse CI checks, deployed through preview environments where we verify Googlebot rendering before a single URL touches production.
This is what it means to be a dev-heavy SEO company in London. It is not about having developers on staff. It is about engineering being the delivery mechanism for search strategy.
The Bottom Line
The frameworks have matured. Vite has made build tooling nearly invisible. Next.js has made server rendering the default. Remix has made web standards the foundation. The technical barriers to a perfectly optimised, blazing-fast, fully crawlable website are lower than they have ever been.
The barrier that remains is expertise — the ability to select the right architecture for the business problem, implement it without compromise, and measure the SEO impact with precision.
That is the work we do at DubSEO. If your current agency is still handing you PDFs, it might be time to work with one that ships pull requests.
Ready to architect your site for speed and search visibility? Get in touch with our technical team to discuss your stack.