Ahrefs: how to score 100% in tech SEO

We recently tackled technical SEO optimization using Ahrefs. Here is how we scored 100% in tech SEO.

Fidel Chaves
10 min read

What is tech SEO?

What does tech SEO, or technical SEO, mean? It is the practice of optimizing a site’s architecture, code, and content to help search engines find it, understand its pages, and index them.

Tech SEO is not a discipline set in stone. As search engines’ parameters and preferences change over time, sites need regular maintenance and fresh audits to surface new issues.

The basics of tech SEO focus on letting search engines access, read, understand, link, and index pages and content. Both how content is presented and what it contains are relevant to getting a healthy score in a tool like Ahrefs.

I know that tuning results for a robot is not ideal when you plan to post something of value online. But hear me out: if a site’s page does not rank or is not searchable, there will be no humans around to judge its content.

Why tech SEO?

As I said before, tech SEO is about getting the best site with the best content. It is the marriage of these two pillars that makes a site stand out from the rest.

How do you start with tech SEO? First, make sure the website appears on search engines at all. If tech SEO is not implemented correctly, Google and other search engines will not even find it; how would customers, readers, users, humans do any better?

Because of this, the easier it is for bots, the better. They are not especially intelligent, you know, so we have to help them out. That is, until they take over.

If you are interested in getting the best SEO results for a website, I will tell you all about our journey to get a top health score in Ahrefs, a dedicated tool for tech SEO.

First, a basic SEO audit

If you are in charge of SEO and analytics at your company and want to step up your knowledge, or if you are a solo entrepreneur who has to do everything yourself, this guide is for you.

First, you should do a basic SEO audit. I am not talking about paying for a specific tool; you can do a lot just by looking at search results in Google and at your HTML code (if you work with developers, reach out to them for help).

Before doing anything else, you want to be sure your pages are getting indexed. If they are not, nothing else really matters. For this purpose, take any search engine (I will use Google) and search for your site by typing site:yourdomain.com. Let’s try it out:

Awkbit SEO search for optimization

As you can see, we get 169 results, meaning those pages are indexed. If nothing comes up when you look for your site, there are a few things to consider.

If nothing comes up, check these five things to be sure you have not removed your website from Google’s index or diluted your pages’ backlinks (this step might also require fetching a front-end developer):

1. No index metatag

If you find a tag that looks like this in your HTML (<meta name="robots" content="noindex">), it tells robots (the search engine crawlers) not to index that specific page. It is an immediate blocker, so be sure to assign it only to pages you really want kept out of the index.
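To make it concrete, here is a minimal sketch of where the tag lives; the page and its title are hypothetical:

```html
<!-- A page that should stay out of the index, e.g. a post-signup page -->
<head>
  <title>Thank you for subscribing</title>
  <!-- Tells crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
```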

2. Robots.txt

There is a file at your root domain called “robots.txt”. Like the noindex metatag, it works as a guideline that tells robots which pages, or the whole site, to crawl or exclude. Be sure to allow them to see the pages you want indexed and to block them from private or data-sensitive pages.
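As a sketch, a minimal robots.txt along those lines might look like this (the /admin/ path and the sitemap URL are hypothetical):

```text
# Applies to all crawlers
User-agent: *
# Keep bots out of private or data-sensitive areas
Disallow: /admin/
# Everything else may be crawled
Allow: /

# Help crawlers find the sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```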

3. Sitemaps

Sitemaps are XML files that list the important URLs on your website (pages, images, videos, and other files). They help search engines like Google better crawl your site and ideally should not need to be maintained by hand; with a growing number of pages, that can get taxing.
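A bare-bones sitemap for two pages could look like this; the URLs and dates are made up for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/blog/tech-seo</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
</urlset>
```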

4. Redirects

A redirect takes visitors and bots from one URL to another. Its purpose is to consolidate ranking signals and collect them all in one place. For example, it is common to redirect HTTP to HTTPS (secure), or “yourdomain.com” to “www.yourdomain.com”.
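How you set this up depends on your stack. Assuming, purely for illustration, that the site is served by Nginx, a permanent redirect from HTTP and the bare domain to the canonical HTTPS host could be sketched as:

```nginx
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;
    # 301 = permanent redirect, so all signals consolidate on one URL
    return 301 https://www.yourdomain.com$request_uri;
}
```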

5. Canonical tag

Canonical tags (<link rel="canonical" href="https://yourdomain.com/slug" />) let robots know which is the original content piece on your site. They help when you have duplicate content that crawlers could interpret as plagiarism; without them, you leave the decision up to the search engine, and it might make the wrong call. Canonical tags became an issue for us when we started posting our articles on Medium and on our blog simultaneously.

If all of this checks out, your pages should be indexed in Google soon. To be prepared for organic traffic, let us review some recommendations that set the foundations for growth.

Second, the pillars of good SEO

You probably know that content is king; at least, the content writer on your team should. It is a pretty common catchphrase, but king how? Having a keyword-optimized site that does not say anything relevant to users, or paying for backlinks while you still have a nightmare-inducing UI, is not the way.

Assuming that you (or the devs on your team) have checked the issues we talked about before (go on, fix them, I will wait), let us review some content and format considerations.

1. Create awesome content

Content is the most challenging part of having a site. Consider copy, content, and UX writing (a little self-promotion), and be sure to put original and useful content out there.

2. Create awesome content

Have we talked about this before? Create awesome content. There is no way around it. Otherwise, you can be another cheap internet site full of embedded tweets and gifs. Nobody will stop you.

3. Keyword-optimize your content

Keywords are the building blocks that get your relevant content out there. Be sure to check what people want to read and work some of it into your planning. There will be plenty of topics that suit your company’s North Star.

4. Optimize your content for users

Great content is all well and good, but how you present it can make a difference. An inspiring user experience and user interface are necessary alongside awesome content. Balanced, as all things should be. You might want to get a UI and UX designer for this.

5. Build backlinks

Backlinking is one of the most consistent ways to get noticed out there. When authoritative sites point at you, expect more traffic. Link building is probably one of the biggest challenges we face at Awkbit, as we want to grow organically.

6. Use unique media

Unique media is favored by Google and users alike. We all love to use Office gifs and pull images from Unsplash, but creating your own media, whether photos or screenshots, can make the difference.

So this is pretty much it in terms of content. There are probably other things I missed, but that is how we manage at Awkbit. We are learning these things just as much as you are, so be sure to message us with any recommendations or corrections.

Before wrapping up, let us look at how content and code, developers and creatives, can work together for better SEO performance.

Third, developers and marketing working together

Developers and marketing often feel miles apart, but good communication and a joint effort are necessary to create the best software projects out there. We’ve talked about bot-friendly HTML; both the development and the marketing teams have to deal with this language. Just press Ctrl+Shift+I to open your browser’s developer tools, and let us see what is inside:

Awkbit site's HTML for SEO optimization

When hitting those keys, you might quickly notice some things:

  • The HTML markup on the right
  • Inside, the “head” separated from the “body”

Invisible to the user, essential to bots.

That is why, inside the head and the body, you will define and include a lot of metadata that makes it easier for search engines to interpret the page. You can do this with structured data vocabulary from schema.org and generate rich snippets.
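For example, a blog post could describe itself to crawlers with a schema.org Article snippet in JSON-LD; this is a sketch, and the publication date is a placeholder, not our actual markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Ahrefs: how to score 100% in tech SEO",
  "author": { "@type": "Person", "name": "Fidel Chaves" },
  "datePublished": "2021-01-01"
}
</script>
```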

Adding metadata to content not only helps your SEO but also makes the content more accessible: alt text on images, for example, serves search engines and screen readers alike.

Be sure that all the needed information is in the “head”. This data is not displayed directly, but choose it wisely: things like the “Title” will show up on the SERP (Search Engine Results Page). Other metatags, such as the description, feature image, author, canonical URL, and social media tags, should be optimized for SEO performance, as all of this can affect your CTR.
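Put together, a head along those lines might look like this; the URLs and copy are illustrative, not our actual markup:

```html
<head>
  <title>Ahrefs: how to score 100% in tech SEO | Awkbit</title>
  <meta name="description" content="How we took our Ahrefs health score from 2/100 to 100/100.">
  <link rel="canonical" href="https://yourdomain.com/blog/tech-seo">
  <!-- Social media (Open Graph) tags -->
  <meta property="og:title" content="Ahrefs: how to score 100% in tech SEO">
  <meta property="og:image" content="https://yourdomain.com/images/tech-seo-cover.png">
</head>
```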

If you are in charge of SEO, you are probably familiar with this terminology, but just in case, let us review the terms.

SEO performance is measured with several stats. One of them is CTR, the click-through rate: out of your impressions (how many people see your page as a result in a search engine), it measures how many clicks go through. Higher is better.
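As a quick worked example (the numbers are invented), CTR is simply clicks divided by impressions, expressed as a percentage:

```python
# Hypothetical SERP stats for one page
impressions = 2000  # times the page appeared in search results
clicks = 60         # times someone clicked through to the page

ctr = clicks / impressions * 100
print(f"CTR: {ctr:.1f}%")  # prints "CTR: 3.0%"
```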

In second place, there is the bounce rate, which indicates whether users click and then immediately leave your site. High bounce rates stem from poor UI and UX, thin content, or misleading titles. The lower, the better, as search engines read a high bounce rate as a mark of irrelevant content.

Lastly, there are metrics for dwell time, session duration, and pages viewed per session. These indicate how much people engage with your site. Ideally, these should be as long and as high as possible, making visitors stay forever and ever. Yeah, maybe those were not the best metrics to design the internet upon, especially looking at the state of addictive social media.

What other things are crucial to consider when working as a multidisciplinary team of developers, creatives, and operations? Let me list them quickly.

  • Load content fast: speed can change according to who renders the HTML. Be sure to check whether you will use a client-side, server-side, cache, or CDN solution. Each of them has pros and cons that would require a separate article.
  • Ensure that your site structure follows a logical hierarchy: think of how you organize the content on your website like a mind map. Every page on your site should have inbound and outbound internal links to avoid orphan pages. Site structure helps search engines crawl more efficiently, but it can get tricky with blog posts, category pages, or product pages.
  • Run scheduled audits on your sites: we used Ahrefs as an overview to fix many of our mistakes, such as 404s. With an audit tool, you can review issues in detail and learn how to fix them.

Awkbit 404 errors in Ahrefs for SEO optimization


For Awkbit’s site, there were several minor issues like these that we had to fix to go from a 2/100 score (really) to 100/100: title tags, backlinks, anchor text, internal links, h1, h2, and h3 tags, alt text, keywords, rich snippets, missing tags, tags to spare, contradicting tags, language and hreflang optimization (as we publish content in more than one language), interlinks, canonical tags…

It is a long and winding road, and I cannot even speak in the past tense, as we have had deployments that broke all of our work overnight. SEO optimization is a never-ending story. Our journey went through code, content, and marketing issues, so having a multidisciplinary team at your service is life-changing.

At Awkbit, we consider that optimizing technical SEO is the way toward delivering an outstanding product. That is why we maintain a growth mindset and invest time and effort in learning new technologies, techniques, and tools. Do you want to lay the foundations for an SEO-optimized site? Are you willing to rise to Google’s top?

Reach Out!

Sources & further reading