Search Engine Optimisation (SEO)

Technical SEO involves optimising a website to improve search engine performance and user experience.

Technical SEO: Optimising Your Website for Search Engines

Why is Technical SEO Important?

Technical SEO is essential to ensure your website appears in search results, which in turn drives traffic, revenue and business growth. It directly affects how your website performs on Google and helps improve the user experience.

Common Tasks Associated with Technical SEO:

To optimise your website for technical SEO, some common tasks include:

  • Submitting your sitemap to Google
  • Creating an SEO-friendly site structure
  • Improving your website's speed
  • Making your website mobile-friendly
  • Finding and fixing duplicate content issues

And much more

Understanding Crawling:

To optimise your website for technical SEO, it is crucial to understand crawling. Crawling is the process by which search engines discover and index web pages.

Crawling happens when search engines follow links on pages they already know about to find pages they haven't seen before.

For example, every time we publish new blog posts, we add them to our blog archive page.

So the next time a search engine like Google crawls our blog page, it sees the recently added links to new blog posts.

And that's one of the ways Google discovers our new blog posts.

If you want your pages to show up in search results, you first need to ensure that they are accessible to search engines.

There are a few ways to do this:

Create SEO-Friendly Site Architecture

Site architecture, also called site structure, is the way pages are linked together within your site.

An effective site structure organises pages in a way that helps crawlers find your website content quickly and easily.

So when structuring your site, ensure all the pages are just a few clicks away from your homepage.

For example:

Homepage → category pages → individual subpages

In a structure like this, all the pages are organised in a logical hierarchy: the homepage links to the category pages, and the category pages link to the individual subpages on the site.

This structure also reduces the number of orphan pages.

Orphan pages are pages with no internal links pointing to them, making it difficult (or sometimes impossible) for crawlers and users to find those pages.
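
In practice, this mostly comes down to ordinary internal links. As a rough sketch, a homepage navigation block like the one below (the page names are made up) gives crawlers a direct path to every category page, and each category page then links on to its own subpages:

<nav>
  <a href="/services/">Services</a>
  <a href="/blog/">Blog</a>
  <a href="/about/">About</a>
  <a href="/contact/">Contact</a>
</nav>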

68% of online experiences begin with a search engine.

Submit Your Sitemap to Google

Using a sitemap can help Google find your webpages.

A sitemap is typically an XML file containing a list of important pages on your site. It lets search engines know which pages you have and where to find them.

This is especially important if your site contains a lot of pages, or if they're not well linked together.

Here's what TICG's sitemap looks like:

Your sitemap is usually located at one of these two URLs:

  • yoursite.com/sitemap.xml
  • yoursite.com/sitemap_index.xml
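
If you've never looked inside one, a bare-bones sitemap file looks something like this (the URLs and date are just placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/blog/</loc>
  </url>
</urlset>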

Once you locate your sitemap, submit it to Google via GSC (Google Search Console).

To submit your sitemap to Google, go to GSC and click “Indexing” > “Sitemaps” from the sidebar.

Then, paste your sitemap URL in the blank and hit “Submit.”

After Google is done processing your sitemap, you should see a confirmation message like this:

Understanding Indexing

Once search engines crawl your pages, they then try to analyse and understand the content of those pages.

And then the search engine stores those pieces of content in its search index—a huge database containing billions of webpages.

The pages of your site must be indexed by search engines to appear in search results.

The simplest way to check if your pages are indexed is to perform a “site:” search. For example, typing site:yoursite.com into Google shows the pages from your domain that Google has indexed.

A few things can keep Google from indexing your webpages:

Noindex Tag

The “noindex” tag is an HTML snippet that keeps your pages out of Google's index.

It's placed within the <head> section of your webpage and looks like this:

<meta name="robots" content="noindex">

Ideally, you would want all your important pages to get indexed. So use the “noindex” tag only when you want to exclude certain pages from indexing.

These could be:

  • “Thank you” pages
  • PPC landing pages

To learn more about using “noindex” tags and how to avoid common implementation mistakes, read our guide to robots meta tags.

Canonicalization

When Google finds similar content on multiple pages on your site, it sometimes doesn't know which of the pages to index and show in search results.

That's when canonical tags come in handy.

The canonical tag (rel="canonical") identifies the preferred (original) version of a page, which tells Google which URL it should index and rank.

The tag is nested within the <head> of a duplicate page and looks like this:

<link rel="canonical" href="https://example.com/original-page/" />

Google (plus Google Images) currently holds 92.58% of the total search engine market share, followed by Bing, Yahoo!, Baidu and Yandex.

Technical SEO Best Practices

Creating an SEO-friendly site structure and submitting your sitemap to Google should get your pages crawled and indexed.

But if you want your website to be fully optimised for technical SEO, consider these additional best practices.

1. Use HTTPS

HTTPS is a secure version of HTTP.

It helps protect sensitive user information like passwords and credit card details from being compromised.

And it's been a ranking signal since 2014.

You can check whether your site uses HTTPS by simply visiting it.

Just look for the “lock” icon in the address bar to confirm.

If you see the “Not secure” warning, you're not using HTTPS.

In this case, you need to install an SSL certificate.

An SSL certificate authenticates the identity of the website and establishes an encrypted connection when users access it.

You can get an SSL certificate for free from Let's Encrypt.

Important: Once your website moves over to HTTPS, make sure you add redirects from the HTTP version to the HTTPS version of your website, so that anyone who visits an HTTP URL is sent to the secure HTTPS version of your site.

2. Make Sure Only One Version of Your Website Is Accessible to Users and Crawlers

Users and crawlers should only be able to access one of these two versions of your site:

  • https://yourwebsite.com
  • https://www.yourwebsite.com

Having both versions accessible creates duplicate content issues.

And reduces the effectiveness of your backlink profile—some websites may link to the “www” version, while others link to the “non-www” version.

This can negatively affect your performance in Google.

So only use one version of your website. And redirect the other version to your main website.

3. Improve Your Page Speed

Page speed is a ranking factor both on mobile and desktop.

So make sure your site loads as fast as possible.

You can use Google's PageSpeed Insights tool to check your website's current speed.

It gives you a performance score from 0 to 100. The higher the number, the better.

Here are a few ideas for improving your website's speed:

  • Compress your images – Images are usually the biggest files on a webpage. Compressing them with image optimisation tools like ShortPixel reduces their file size so they take as little time to load as possible.
  • Use a CDN (content delivery network) – A CDN stores copies of your webpages on servers around the globe. It then connects visitors to the nearest server, so there's less distance for the requested files to travel.
  • Minify HTML, CSS and JavaScript files – Minification removes unnecessary characters and whitespace from code to reduce file sizes, which improves page load time (see the example after this list).
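
To illustrate what minification actually does, here's a small, made-up HTML fragment before and after the comments and extra whitespace are stripped out:

Before:

<!-- Main navigation -->
<ul class="nav">
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>

After:

<ul class="nav"><li><a href="/blog/">Blog</a></li><li><a href="/contact/">Contact</a></li></ul>

The browser renders both versions identically, but the minified one is fewer bytes to download.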

92.1% of internet users access the internet using a mobile phone.

4. Ensure Your Website Is Mobile-Friendly

Google uses mobile-first indexing. This means that it looks at mobile versions of webpages to index and rank content.

So make sure your website is compatible with mobile devices.
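
One basic building block worth checking, for example, is the viewport meta tag in your pages' <head> section. It tells browsers to scale the page to the width of the device's screen, which responsive layouts depend on:

<meta name="viewport" content="width=device-width, initial-scale=1">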

To check if that's the case for your site, head over to the “Mobile Usability” report in Google Search Console.

The report shows you how many of your pages have mobile usability issues, along with the specific problems affecting them.

If you don't have Google Search Console, you can use Google's Mobile-Friendly Test tool.

5. Implement Structured Data

Structured data helps Google better understand the content of a page.

And by adding the right structured data markup code, your pages can win rich snippets.

Rich snippets are more appealing search results with additional information appearing under the title and description.

The benefit of rich snippets is that they make your pages stand out from others. Which can improve your CTR (click-through rate).

Google supports dozens of structured data markups, so choose one that best fits the nature of the pages you want to add structured data to.

For example, if you run an ecommerce store, adding product structured data to your product pages makes sense.
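
As a rough sketch of what that looks like, product structured data is usually added as a JSON-LD snippet in the page's HTML (every value below is a placeholder):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "description": "A short description of the product.",
  "image": "https://yoursite.com/images/example-product.jpg",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>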

There are plenty of free structured data generator tools, so you don't have to write the code by hand.

And if you're using WordPress, you can use the Yoast SEO plugin to implement structured data.

6. Optimise for Core Web Vitals

Core Web Vitals are speed metrics that Google uses to measure user experience.

These metrics include:

  • Largest Contentful Paint (LCP) – Measures how long it takes for the largest content element on a page to load for the user
  • First Input Delay (FID) – Measures how long it takes the page to respond to a user's first interaction
  • Cumulative Layout Shift (CLS) – Measures how much the page's layout unexpectedly shifts while it loads

To ensure your website is optimised for Core Web Vitals, you need to aim for the following scores:

  • LCP – 2.5 seconds or lower
  • FID – 100 ms or lower
  • CLS – 0.1 or lower
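
To give one concrete example of a Core Web Vitals fix (for CLS in particular), adding explicit width and height attributes to images lets the browser reserve the right amount of space before the file loads, so the surrounding content doesn't jump around (the file name below is a placeholder):

<img src="hero-image.jpg" alt="Hero image" width="800" height="450">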

You can check your website’s performance for Core Web Vitals metrics in Google Search Console.

To do this, visit the Core Web Vitals report in your Search Console.

Need Help With Your Technical SEO?

Get in contact and fill in our form, and we can review your website's technical SEO.