Part 3 of 6

What is Technical SEO?

Technical SEO refers to optimizing your website's infrastructure to help search engines crawl, index, and understand your content more effectively. While on-page SEO focuses on content and HTML elements, technical SEO deals with the behind-the-scenes aspects that make your site accessible to search engines and fast for users.

Think of it this way: you can have the best content in the world, but if search engines can't access it, or if your site is so slow that users bounce before it loads, that content won't rank well.

Technical SEO ensures your site's foundation is solid, making everything else—content, links, user experience—more effective.

The Technical SEO Stack

1. Site Speed & Core Web Vitals

Page speed has been a ranking factor since 2010, and Google raised the stakes with Core Web Vitals, a set of metrics announced in 2020 that measure real-world user experience and became ranking signals with the 2021 page experience update.

Core Web Vitals: The Three Key Metrics

  1. Largest Contentful Paint (LCP) – Measures how long it takes for the largest content element (like a hero image or text block) to become visible. Target: under 2.5 seconds.
  2. Interaction to Next Paint (INP) – Measures responsiveness: the time from a user interaction (click, tap, keystroke) to the next visual update. INP replaced First Input Delay (FID) as a Core Web Vital in March 2024. Target: under 200 milliseconds.
  3. Cumulative Layout Shift (CLS) – Measures unexpected layout shifts that occur as the page loads (like when content suddenly moves because an ad loads). Target: under 0.1.
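
Beyond lab tools, you can also log these metrics from real visits on the page itself. A minimal sketch using Google's open-source web-vitals library (the unpkg URL and version pin are illustrative; one common way to load its module build):

<script type="module">
  // Load the web-vitals module build from a CDN (version pin is illustrative)
  import {onCLS, onINP, onLCP} from 'https://unpkg.com/web-vitals@4?module';

  // Log each metric when it's ready; in production, send these to your analytics
  onCLS(console.log);
  onINP(console.log);
  onLCP(console.log);
</script>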

How to Improve Page Speed

  • Optimize images – Compress images, use modern formats (WebP, AVIF), and lazy-load below-the-fold media (see the sketch after this list)
  • Minify CSS, JavaScript, and HTML – Remove unnecessary characters and whitespace
  • Leverage browser caching – Store static resources in users' browsers
  • Use a Content Delivery Network (CDN) – Serve content from servers closer to users
  • Reduce server response time – Upgrade hosting, optimize database queries, use caching
  • Eliminate render-blocking resources – Defer non-critical CSS and JavaScript
  • Reduce redirects – Each redirect adds latency
  • Enable compression – Use Gzip or Brotli to compress text-based resources
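
Several of these fixes live directly in your HTML. A minimal sketch (file names are placeholders) that serves modern image formats with a fallback, reserves layout space to avoid CLS, lazy-loads a below-the-fold image, and defers non-critical JavaScript:

<!-- Serve AVIF/WebP where supported, fall back to JPEG -->
<picture>
  <source srcset="product-photo.avif" type="image/avif">
  <source srcset="product-photo.webp" type="image/webp">
  <!-- width/height reserve space so content doesn't shift (CLS);
       loading="lazy" defers below-the-fold images -->
  <img src="product-photo.jpg" alt="Product photo" width="800" height="600" loading="lazy">
</picture>

<!-- defer downloads the script without blocking rendering -->
<script src="analytics.js" defer></script>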

Use tools like Google PageSpeed Insights, Lighthouse, and WebPageTest to measure your Core Web Vitals and get specific optimization recommendations.

2. Mobile Optimization & Mobile-First Indexing

Google began moving sites to mobile-first indexing in 2018 and has since made it the default, meaning it predominantly uses the mobile version of your content for indexing and ranking. If your mobile site is lacking, your rankings will suffer.

Mobile Optimization Checklist

  • Use responsive design – Your site should adapt to any screen size (the viewport tag shown after this list is the starting point)
  • Ensure tap targets are large enough – Buttons and links should be at least 48x48 pixels
  • Avoid intrusive interstitials – Pop-ups that cover content hurt mobile UX and SEO
  • Use legible font sizes – At least 16px for body text on mobile
  • Optimize for mobile speed – Mobile networks are often slower and less reliable than fixed connections
  • Test on real devices – Not just desktop browser resize
  • Ensure parity – Mobile and desktop versions should have the same content and structured data
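
The starting point for responsive design is the viewport meta tag, which tells mobile browsers to render the page at the device's width instead of a zoomed-out desktop layout:

<meta name="viewport" content="width=device-width, initial-scale=1">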

Google retired its standalone Mobile-Friendly Test tool in late 2023; use Lighthouse or Chrome DevTools device emulation to check whether your pages meet mobile usability standards.

3. HTTPS & Security

HTTPS (the secure version of HTTP) has been a ranking signal since 2014. Sites using HTTPS encrypt data between the server and browser, protecting user privacy and security.

Why HTTPS matters for SEO:

  • It's a direct ranking factor (small boost)
  • Browsers display warnings for non-HTTPS sites, reducing trust and traffic
  • HTTPS is required for modern web features like Progressive Web Apps (PWAs) and HTTP/2
  • Users expect it—seeing a "Not Secure" warning increases bounce rates

Implementing HTTPS:

  1. Purchase an SSL/TLS certificate (or get a free one from Let's Encrypt)
  2. Install the certificate on your server
  3. Update all internal links to HTTPS
  4. Redirect HTTP traffic to HTTPS with 301 (permanent) redirects (see the sketch after this list)
  5. Update your sitemap with HTTPS URLs
  6. Update external resources (images, scripts) to use HTTPS
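
For step 4, here is a minimal sketch of the redirect, assuming an nginx server (Apache and most hosting control panels offer equivalents):

server {
    listen 80;
    server_name example.com www.example.com;

    # Permanently redirect every HTTP request to its HTTPS equivalent
    return 301 https://$host$request_uri;
}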

4. Crawlability & Indexability

For your content to rank, search engines must first be able to crawl (discover and access) it and then index (analyze and store) it.

Robots.txt

The robots.txt file lives at the root of your site (e.g., example.com/robots.txt) and tells search engines which pages or sections of your site they can or cannot crawl.

Example robots.txt:

# One group for all bots: crawl everything except the directories below
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Disallow: /temp/

# Point to your XML sitemap
Sitemap: https://example.com/sitemap.xml

Important notes:

  • Blocking a URL in robots.txt prevents crawling but doesn't prevent indexing (the URL might still appear in results)
  • To prevent indexing, use a noindex meta tag instead; note that the page must remain crawlable, or crawlers will never see the tag
  • Don't block CSS or JavaScript files—Google needs these to render your page properly

XML Sitemaps

An XML sitemap is a file that lists all the important pages on your site, making it easier for search engines to discover your content.
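
A minimal sitemap looks like this (URLs and dates are placeholders; <loc> is required, <lastmod> is optional):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/seo/technical-seo</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/product/blue-shoes</loc>
  </url>
</urlset>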

Best practices:

  • Include all important pages (but not low-value pages like thank you pages or login pages)
  • Keep each sitemap under 50 MB (uncompressed) and 50,000 URLs; if you exceed either limit, split it into multiple sitemaps tied together with a sitemap index file
  • Update your sitemap when you add, remove, or significantly update pages
  • Submit your sitemap to Google Search Console and Bing Webmaster Tools
  • Reference your sitemap in robots.txt

Meta Robots Tags

The meta robots tag gives you page-level control over crawling and indexing:

<!-- Allow indexing and following links (default) -->
<meta name="robots" content="index, follow">

<!-- Prevent indexing but allow following links -->
<meta name="robots" content="noindex, follow">

<!-- Prevent indexing and following links -->
<meta name="robots" content="noindex, nofollow">

Common use cases for noindex:

  • Thank you pages
  • Duplicate content (filtered/sorted product pages)
  • Staging or development pages
  • Thin or low-value pages

Canonical Tags

A canonical tag tells search engines which version of a page is the "master" when you have duplicate or very similar content.

<link rel="canonical" href="https://example.com/product/blue-shoes">

This is crucial for e-commerce sites where the same product might appear under multiple URLs (categories, filters, sorting, etc.).
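
For example, a filtered or sorted variant can point back to the clean product URL (both URLs are illustrative):

<!-- On https://example.com/product/blue-shoes?color=navy&sort=price -->
<link rel="canonical" href="https://example.com/product/blue-shoes">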

5. Site Architecture & URL Structure

A well-organized site structure helps search engines understand your content hierarchy and helps users find what they need.

Best practices:

  • Keep it shallow – Important pages should be 3 clicks or fewer from the homepage
  • Use logical hierarchy – Group related content together
  • Create a clear navigation structure – Main nav, breadcrumbs, footer links
  • Use descriptive URLs – URLs should indicate content hierarchy (e.g., /blog/seo/technical-seo)
  • Implement breadcrumb navigation – Helps users and search engines understand context (see the markup sketch after this list)
  • Build an HTML sitemap – User-facing page listing all important content
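
A breadcrumb trail can be as simple as an ordered list of links (paths are illustrative):

<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/blog/seo/">SEO</a></li>
    <li><a href="/blog/seo/technical-seo" aria-current="page">Technical SEO</a></li>
  </ol>
</nav>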

6. Structured Data & Rich Results

We touched on schema markup in the on-page SEO section, but it's worth emphasizing from a technical perspective. Implementing structured data correctly is a technical task that can significantly impact your search appearance.

Technical implementation tips:

  • Use JSON-LD, Google's preferred structured data format
  • Validate your markup with Google's Rich Results Test
  • Monitor schema errors in Google Search Console
  • Implement schema site-wide where appropriate (Organization, WebSite, etc.); a sketch follows this list
  • Keep schema up to date with content changes
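
As a sketch, a site-wide Organization snippet in JSON-LD looks like this (the name and URLs are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png"
}
</script>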

7. Internationalization & Hreflang

If you have content in multiple languages or for different regions, use hreflang tags to tell search engines which version to show to which users.

<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/page" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page" />

This prevents duplicate content issues and ensures users get the right version for their location and language.

Key Takeaways

  • Technical SEO focuses on optimizing your site's infrastructure to help search engines crawl, index, and understand your content.
  • Core Web Vitals (LCP, INP, CLS) are critical user experience metrics that impact rankings. Focus on speed and stability.
  • Mobile-first indexing means Google primarily uses your mobile site for ranking. Ensure mobile optimization is a priority.
  • HTTPS is a ranking factor and user trust signal. All sites should use SSL/TLS certificates.
  • Robots.txt controls crawling, meta robots tags control indexing, and canonical tags manage duplicate content.
  • XML sitemaps help search engines discover your content efficiently.
  • A logical site architecture with shallow depth and clear hierarchy improves both SEO and user experience.
  • Structured data (schema markup) can earn rich results in search and improve visibility.
  • Use hreflang tags for international or multilingual sites to serve the right content to the right users.

With solid technical foundations in place, you're ready to develop a comprehensive content strategy that attracts and serves your target audience.