If your site isn’t fully compatible with Google’s crawlers and free of errors, how can you expect search traffic to grow? If search engines can’t easily understand your site, all of your SEO and link-building efforts will be seriously held back.
If a search engine can crawl, index, and render your web pages correctly, your chances of ranking in search results increase.
The first step in improving your technical SEO is knowing where you stand by performing a site audit. The second step is to create a plan to address the areas where you fall short. We’ll cover these steps in depth below.
Improve your website with effective technical SEO. Start with one of the best technical SEO guides you can find on the internet, or order your complete SEO audit.
What is Technical SEO?
Technical SEO refers to optimizing all the technical elements to make your site easier for search engines to crawl, index and render. Technical SEO, content strategy, and link-building strategies all work in tandem to help your pages rank highly in search.
But that’s just scratching the surface. Even if Google DOES index all of your site’s content, that doesn’t mean your job is done. That’s because, for your site to be fully optimized for technical SEO, your site’s pages need to be secure, mobile optimized, free of duplicate content, fast-loading… and a thousand other things that go into technical optimization.
That’s not to say that your technical SEO has to be perfect to rank. It doesn’t. But the easier you make it for Google to access your content, the better chance you have to rank.
Technical SEO vs. On-Page SEO vs. Off-Page SEO
Many people break down search engine optimization (SEO) into three different buckets: on-page SEO, off-page SEO, and technical SEO. I will quickly cover what each means.
On-page SEO (also called on-site SEO) is the process of optimizing webpages and their content for both search engines and users. It can help rank pages higher on Google and drive more organic traffic. Common tasks associated with on-page SEO include optimizing for search intent, title tags, internal links, and URLs.
Off-page SEO covers anything you can optimize outside of your site (or externally) in an attempt to boost your rankings. Backlinks are arguably the biggest off-page SEO factor: their quality and quantity can boost a page’s ranking. Other examples include social media and PR.
Technical SEO is within your control as well, but it’s a bit trickier to master since it’s less intuitive. Technical SEO is about making sure that your website’s code is clean and effective. It also improves your website’s user experience, directly affecting your rankings and conversions.
The Elements of Technical SEO
Technical SEO is a beast that is best broken down into digestible pieces. If you’re like me, you like to tackle big things in chunks and with checklists. Believe it or not, everything we’ve covered to this point can be placed into one of five categories, each of which deserves its own list of actionable items:
- Crawlability
- Indexability
- Renderability
- Rankability
- Clickability
1. Crawlability
Crawlability is where technical SEO starts: if search bots can’t crawl your pages efficiently, they can’t index or rank them. The elements below determine how easy your site is to crawl.
Create an XML sitemap
Your site structure should be captured in an XML sitemap, which helps search engines understand and index your web pages. It functions like a map of your website. Once it’s finished, submit your sitemap to Google Search Console and Bing Webmaster Tools.
Stick to the following best practices when implementing an XML Sitemap:
- Keep the XML Sitemap up to date with your website’s content.
- Make sure only indexable pages are included.
- Reference the XML Sitemap from your robots.txt file.
- Don’t list more than 50,000 URLs in a single XML Sitemap.
- Make sure the (uncompressed) file size doesn’t exceed 50MB.
- Don’t obsess about the lastmod, priority and changefreq properties.
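To make the size limits above concrete, here is a minimal Python sketch (the example.com URLs and output file names are placeholders) that splits a list of indexable URLs into sitemap files of at most 50,000 entries each:

```python
# Minimal sketch: build XML sitemaps that respect the 50,000-URL limit by
# splitting the URL list into multiple files. URLs are placeholders.
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-sitemap limit noted above

def build_sitemaps(urls):
    """Yield one sitemap XML string per chunk of at most MAX_URLS URLs."""
    for start in range(0, len(urls), MAX_URLS):
        chunk = urls[start:start + MAX_URLS]
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in chunk
        )
        yield (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )

if __name__ == "__main__":
    pages = [f"https://www.example.com/page-{i}" for i in range(1, 4)]
    for i, xml in enumerate(build_sitemaps(pages), start=1):
        with open(f"sitemap-{i}.xml", "w", encoding="utf-8") as f:
            f.write(xml)
```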
Maximize your crawl budget
The pages and resources on your website that search bots will crawl make up your crawl budget. Make sure you’re spending that budget wisely by giving your most important pages top priority.
The following advice can help you make the most of your crawl budget:
- Canonicalize or delete duplicate pages.
- Redirect or fix any broken links.
- Verify that your JavaScript and CSS files can be crawled.
- Keep a close eye on your crawl metrics and take note of any unexpected rises or decreases.
- Verify that any page or bot you’ve blocked from crawling is actually meant to be blocked.
- Maintain an up-to-date sitemap and upload it to the relevant webmaster tools.
- Eliminate pointless or out-of-date content from your website.
- Be cautious when using dynamically created URLs, as they may cause your website’s page count to explode.
Optimize your site architecture
Your website has multiple pages. Those pages need to be organized in a way that allows search engines to easily find and crawl them. That’s where your site structure — often referred to as your website’s information architecture — comes in. In the same way that a building is based on architectural design, your site architecture is how you organize the pages on your site.
Here are some of the website structure best practices for SEO:
- Use an SEO-friendly URL structure
- Plan your navigation menus
- Use category pages
- Plan the depth of your site’s key pages
- Use internal linking strategically
Set a URL structure
URL structure refers to how your URLs are formatted and organized, and it is often determined by your site architecture. Once you have your URL structure buttoned up, you’ll submit a list of URLs of your important pages to search engines in the form of an XML sitemap. Doing so gives search bots additional context about your site so they don’t have to figure it out as they crawl.
Here are a few more tips about how to write your URLs:
- Use lowercase characters
- Use dashes to separate words
- Make them short and descriptive
- Avoid using unnecessary characters or words (including prepositions)
- Include your target keywords
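As an illustration of those rules, here is a rough Python sketch of a slug generator; the stop-word list is an illustrative assumption rather than a definitive set:

```python
# Rough sketch of an SEO-friendly slug generator following the rules above:
# lowercase, dashes between words, no unnecessary characters or filler words.
import re

STOP_WORDS = {"a", "an", "the", "of", "for", "to", "in", "and"}  # illustrative

def slugify(title: str) -> str:
    text = title.lower().replace("'", "")              # lowercase, drop apostrophes
    words = re.findall(r"[a-z0-9]+", text)             # strip other punctuation
    words = [w for w in words if w not in STOP_WORDS]  # drop filler words
    return "-".join(words)                             # dashes between words

print(slugify("The Beginner's Guide to Technical SEO in 2024"))
# -> "beginners-guide-technical-seo-2024"
```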
Utilize robots.txt
When a web robot crawls your site, it will first check your /robots.txt file, which implements the Robots Exclusion Protocol. This protocol can allow or disallow specific web robots from crawling your site, including specific sections or even individual pages. Note that robots.txt controls crawling, not indexing; if you’d like to prevent a page from being indexed, use a noindex robots meta tag.
When implementing robots.txt, keep the following best practices in mind:
- Be careful when making changes to your robots.txt, as it can make big parts of your website inaccessible for search engines.
- The robots.txt file should reside in the root of your website (e.g. http://www.example.com/robots.txt).
- The robots.txt file is only valid for the full domain it resides on, including the protocol (http or https).
- Different search engines interpret directives differently. By default, the first matching directive always wins. But, with Google and Bing, specificity wins.
- Avoid using the crawl-delay directive for search engines as much as possible.
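If you want to sanity-check your rules, Python’s standard-library robots.txt parser can tell you whether a given crawler is allowed to fetch a URL. A minimal sketch, where example.com and the test paths are placeholders:

```python
# Sketch: check whether specific URLs are crawlable under your robots.txt.
# Replace example.com and the test paths with your own.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for path in ["/", "/blog/technical-seo/", "/wp-admin/", "/assets/app.js"]:
    url = "https://www.example.com" + path
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")
```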
Add breadcrumb menus
Breadcrumbs are exactly what they sound like — a trail that guides users back to the start of their journey on your website. It’s a menu of pages that tells users how their current page relates to the rest of the site. And they aren’t just for website visitors; search bots use them, too.
When designing breadcrumb navigation for your website, follow the best practices below.
- Use Breadcrumbs to Support Primary Navigation
- Use Separators Between Individual Levels
- Don’t Include a Link to the Current Page
- Keep the Design Simple
- Regularly Check Your Links Work Properly
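Breadcrumbs can also be marked up with schema.org’s BreadcrumbList structured data so search engines can display them in results. A minimal sketch that builds the JSON-LD in Python, with placeholder names and URLs:

```python
# Sketch: generate BreadcrumbList structured data (JSON-LD) for a page
# three levels deep. The names and URLs are placeholders.
import json

crumbs = [
    ("Home", "https://www.example.com/"),
    ("Blog", "https://www.example.com/blog/"),
    ("Technical SEO", "https://www.example.com/blog/technical-seo/"),
]

breadcrumb_ld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(crumbs, start=1)
    ],
}

# Embed the output in a <script type="application/ld+json"> tag on the page.
print(json.dumps(breadcrumb_ld, indent=2))
```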
Check your SEO log files
You can think of log files like a journal entry. Web servers (the journaler) record and store log data about every action they take on your site in log files (the journal). The data recorded includes the time and date of the request, the content requested, and the requesting IP address. You can also identify the user agent, which is a uniquely identifiable software (like a search bot, for example) that fulfills the request for a user.
SEO log file analysers can help you understand:
- How many pages of your site Googlebot is able to crawl within its crawl budget
- Whether you’re maximizing your crawl budget or there’s room for improvement
- How often Googlebot visits your site
- Whether your site’s links look healthy for search engine bots and users
- Whether you have too many redirects on your site
- Which pages on your site are slower and harder for search engines to crawl
- Whether there are pages on your site that search engine crawlers are unable to find
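As a starting point for that kind of analysis, here is a rough Python sketch that counts Googlebot requests per URL in a standard combined-format access log. The log path is a placeholder, and a production version should verify Googlebot via reverse DNS rather than trusting the user-agent string:

```python
# Sketch: count Googlebot requests per URL from a "combined" format access log
# (Apache/Nginx default). The file path is a placeholder.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*".*?"(?P<agent>[^"]*)"$')

hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1

# Most-crawled paths first: a quick view of where your crawl budget goes.
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```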
2. Indexability
As search bots crawl your website, they begin indexing pages based on their topic and relevance to that topic. Once indexed, your page is eligible to rank on the SERPs. Here are a few factors that can help your pages get indexed.
Unblock search bots from accessing pages
You want to ensure that bots are sent to your preferred pages and that they can access them freely. You have a few tools at your disposal to do this. Google’s robots.txt tester will give you a list of pages that are disallowed, and you can use Google Search Console’s URL Inspection tool to determine the cause of blocked pages.
If you’ve done everything above and Google still isn’t indexing some or all of the pages you’d expect it to, there’s probably a bigger issue, so run through these checks:
- Check for rogue noindex tags
- Check for manual actions and security issues
- Check that your content is actually valuable for searchers
- Check for indexable pages not in your sitemap
- Check for crawl blocks in your robots.txt file
- Check for rogue canonical tags
- Check for nofollow internal links
- Check for internal link opportunities
- Check for crawl budget issues
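For the rogue-noindex check in particular, a quick script can flag pages that send a noindex directive. A simplified sketch using the third-party requests library, with placeholder URLs and a deliberately naive meta-tag check:

```python
# Sketch: flag pages that send "noindex" via the X-Robots-Tag header or a
# robots meta tag. Uses the third-party "requests" library; URLs are examples.
import re
import requests

META_ROBOTS = re.compile(
    r"""<meta[^>]+name=["']robots["'][^>]+content=["']([^"']*)["']""", re.I
)

def indexability(url: str) -> str:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    meta = META_ROBOTS.search(resp.text)
    directives = header + " " + (meta.group(1) if meta else "")
    return "noindex" if "noindex" in directives.lower() else "indexable"

for url in ["https://www.example.com/", "https://www.example.com/thank-you/"]:
    print(indexability(url), url)
```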
Remove duplicate content
Duplicate content confuses search bots and negatively impacts your indexability. Ideally, you want to tell Google which version of the page to prioritize and then send all of the SEO juice from the duplicate pages to that canonical page.
There are a few different ways to do that:
- The rel=canonical tag: this is an HTML tag that you can add to a certain page, which then tells search engines that this page is the one that you want Google to index. Then, when Google finds any duplicates of the page, it will attribute all of the SEO juice from those duplicates to the canonical page.
- 301 redirect: These puppies allow you to tell search engines that whenever someone tries to visit page A, you want them to send those people to page B instead. However, a 301 redirect still doesn’t delete page A. It simply redirects any visitors to page B instead.
- Set passive URL parameters in Google Search Console: this used to be a helpful short-term strategy, because marking certain parameters as passive told Google’s crawler to essentially ignore those URLs. Note, however, that Google has since retired the URL Parameters tool, so canonical tags and redirects are now the more reliable options.
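To spot pages whose canonical tag points somewhere unexpected, you can compare each page’s declared canonical URL with its own address. A rough sketch using the third-party requests library, with placeholder URLs and a simplified regex that assumes the rel attribute appears before href:

```python
# Sketch: compare each page's rel=canonical target to its own URL to spot
# pages that canonicalize elsewhere (i.e., declare themselves duplicates).
import re
import requests

CANONICAL = re.compile(
    r"""<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']+)["']""", re.I
)

for url in ["https://www.example.com/shoes/", "https://www.example.com/shoes/?sort=price"]:
    html = requests.get(url, timeout=10).text
    match = CANONICAL.search(html)
    canonical = match.group(1) if match else "(none)"
    status = "self-canonical" if canonical.rstrip("/") == url.rstrip("/") else "points elsewhere"
    print(f"{url} -> {canonical} [{status}]")
```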
Audit your redirects
Verify that all of your redirects are set up properly. Redirect loops, broken URLs, or — worse — improper redirects can cause issues when your site is being indexed. To avoid this, audit all of your redirects regularly.
Follow these tips to encounter fewer, if any, problems with redirects:
- Redirect to the most relevant & similar pages
- Avoid redirect chains
- Update internal links
- Avoid meta refresh tags
- Avoid soft 404s
- Use 404s sparingly and get creative
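One quick way to surface chains and loops is to count how many hops an HTTP client follows for each URL. A sketch using the third-party requests library, with placeholder URLs:

```python
# Sketch: surface redirect chains by counting the hops followed for each URL.
# More than one hop means a chain worth flattening; the library raises
# TooManyRedirects when it detects a loop. URLs are placeholders.
import requests

def audit_redirects(url: str) -> None:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
    except requests.TooManyRedirects:
        print(f"LOOP      {url}")
        return
    hops = resp.history  # one entry per intermediate 3xx response
    if len(hops) > 1:
        chain = " -> ".join([h.url for h in hops] + [resp.url])
        print(f"CHAIN({len(hops)} hops)  {chain}")
    elif len(hops) == 1:
        print(f"REDIRECT  {url} -> {resp.url}")
    else:
        print(f"DIRECT    {url}")

for url in ["https://example.com/old-page", "https://example.com/"]:
    audit_redirects(url)
```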
Check the mobile-responsiveness of your site
If your website is not mobile-friendly by now, then you’re far behind where you need to be. Google announced mobile-first indexing back in 2016, prioritizing the mobile experience over desktop, and today it is enabled by default. To keep up with this important trend, you can use Google’s mobile testing tools, such as Lighthouse, to check where your website needs to improve.
Here’s a handy responsive design checklist to help you test your website:
- Test across devices & browsers
- Revise the device/browser mix every few months
- Test for Content Order Based on Importance
- Performance Test
- Test the Website Visually
- Test Website Typography
- Test Device Fonts
- Test for Smooth Navigation
- Test Website Popups
- Test for Interactivity
Fix HTTP errors
HTTP errors can impede the work of search bots by blocking them from important content on your site. It is, therefore, incredibly important to address these errors quickly and thoroughly.
Since every HTTP error is unique and requires a specific resolution, here is a brief explanation of each of them:
- 301 Permanent Redirects are used to permanently send traffic from one URL to another
- 302 Temporary Redirect is a way to temporarily redirect traffic from a URL to a different webpage
- 403 Forbidden Messages mean that the content a user has requested is restricted based on access permissions or due to a server misconfiguration
- 404 Error Pages tell users that the page they have requested doesn’t exist, either because it’s been removed or they typed the wrong URL
- 405 Method Not Allowed means that your web server recognized the request method but blocked it, resulting in an error message
- 500 Internal Server Error is a general error message that means your web server is experiencing issues delivering your site to the requesting party
- 502 Bad Gateway Error is related to miscommunication, or invalid response, between website servers
- 503 Service Unavailable tells you that while your server is functioning properly, it is unable to fulfill the request
- 504 Gateway Timeout means a server did not receive a timely response from your web server to access the requested information
3. Renderability
Rendering is the process where Googlebot retrieves your pages, runs your code, and assesses your content to understand the layout or structure of your site.
All the information Google collects during the rendering process is then used to rank the quality and value of your site content against other sites and what people are searching for with Google Search.
Below are the website elements to review for your renderability SEO audit.
Server performance
Server-side performance can affect technical SEO heavily. Server timeouts and errors will cause HTTP errors that hinder users and bots from accessing your site. If you notice that your server is experiencing issues, use the resources provided above to troubleshoot and resolve them. Failure to do so in a timely manner can result in search engines removing your web page from their index, as showing a broken page to a user is a poor experience.
Some of the best practices include:
- Avoid N + 1 Queries
- Follow framework and language best practices
- Server cache for frequent requests
- Remove old / duplicate code (Clean code)
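To illustrate the first point, here is a self-contained Python example using an in-memory SQLite database; the table and column names are made up for the demonstration:

```python
# Illustrative sketch of the N+1 query problem: one query per page (N+1)
# versus a single JOIN. Uses an in-memory SQLite database with made-up tables.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE authors (page_id INTEGER, name TEXT);
    INSERT INTO pages VALUES (1, 'Technical SEO Guide'), (2, 'Crawl Budget 101');
    INSERT INTO authors VALUES (1, 'Ana'), (2, 'Ben');
""")

# N+1 pattern: one query for the list of pages, then one extra query per page.
for page_id, title in db.execute("SELECT id, title FROM pages"):
    author = db.execute(
        "SELECT name FROM authors WHERE page_id = ?", (page_id,)
    ).fetchone()
    print(title, "by", author[0])

# Better: fetch everything in a single JOIN, avoiding the per-row round trips.
for title, name in db.execute(
    "SELECT p.title, a.name FROM pages p JOIN authors a ON a.page_id = p.id"
):
    print(title, "by", name)
```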
HTTP Status
Similar to server performance, HTTP errors will prevent access to your webpages. You can use a web crawler, like Screaming Frog or DeepCrawl to perform a comprehensive error audit of your site.
The most common status codes you are likely to encounter when a site cannot be crawled, and the steps to troubleshoot them, can be found below:
- 0 – Blocked By Robots.txt
- 0 – DNS Lookup Failed
- 0 – Connection Timeout
- 0 – Connection Refused
- 0 – Connection Error / 0 – No Response
- 200 – OK
- 301 – Moved Permanently / 302 – Moved Temporarily
- 400 – Bad Request / 403 – Forbidden / 406 – Not Acceptable
- 404 – Page Not Found / 410 – Removed
- 429 – Too Many Requests
- 500 – Internal Server Error / 502 – Bad Gateway / 503 – Service Unavailable
Load Time and Page Size
If your page takes too long to load, the bounce rate is not the only problem you have to worry about. A delay in page load time can result in a server error that blocks bots from your webpages, or leaves them crawling partially loaded versions that are missing important sections of content. Bots will only spend as much of their crawl budget loading, rendering, and indexing a page as its crawl demand warrants, so you should do everything in your control to decrease your page load time.
Here are some quick tips aimed at optimizing your website’s loading time:
- Optimize Image Size and Format
- Clean and Compress Your Code (HTML, CSS, JavaScript, and any other code)
- Implement a CDN
- Activate Browser Caching
- Reduce Cookie Size
- Upgrade Hosting
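As a rough first measurement, you can record server response time and HTML size for key URLs; note that this captures only the initial HTML response, not full render time in a browser. A sketch using the third-party requests library with placeholder URLs:

```python
# Rough sketch: record server response time and HTML body size for a handful
# of URLs. This is not a full page-speed audit, just a quick baseline.
import requests

for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
    resp = requests.get(url, timeout=15)
    seconds = resp.elapsed.total_seconds()   # time until the response arrived
    kilobytes = len(resp.content) / 1024     # decoded body size
    print(f"{seconds:5.2f}s  {kilobytes:8.1f} KB  {url}")
```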
JavaScript Rendering
Google admittedly has a difficult time processing JavaScript (JS) and, therefore, recommends employing pre-rendered content to improve accessibility. Google also has a host of resources to help you understand how search bots access JS on your site and how to improve search-related issues.
Here are several techniques to enhance JavaScript rendering:
- Avoid search engines having to render your pages
- Include essential content in initial HTML response
- All pages should have unique URLs
- Include navigational elements in your initial HTML response
- Send clear, unambiguous indexing signals
- Let search engines access your JavaScript files
- Remove render-blocking JavaScript
- Leverage code splitting and lazy loading
- Implement image lazy loading with loading attribute
- Leverage JavaScript caching and use content-hashes
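A quick way to check the second point is to confirm that a key phrase is present in the raw HTML before any JavaScript runs. A sketch with the third-party requests library; the URL and phrase are placeholders:

```python
# Sketch: check whether a key phrase appears in the raw (pre-JavaScript) HTML
# response. If it only appears after rendering in a browser, search engines
# must queue the page for rendering before they can see that content.
import requests

url = "https://www.example.com/product/blue-widget"
phrase = "Free shipping on the Blue Widget"

raw_html = requests.get(url, timeout=10).text
if phrase in raw_html:
    print("OK: essential content is present in the initial HTML response")
else:
    print("WARNING: content appears to be injected by JavaScript only")
```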
Orphan Pages
Every page on your site should be linked to at least one other page — preferably more, depending on how important the page is. When a page has no internal links, it’s called an orphan page. Like an article with no introduction, these pages lack the context that bots need to understand how they should be indexed.
Some common reasons for orphan pages include:
- Poor or incomplete internal linking structure
- Poor housekeeping
- Trouble tracking
- Regular updates and site migrations
- Lack of updating
- Keeping outdated campaign or landing pages after they are no longer needed
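One way to find orphan candidates is to compare the URLs in your sitemap against the set of internal link targets discovered during a crawl. A minimal Python sketch with hard-coded placeholder data standing in for a real sitemap and crawler export:

```python
# Sketch: pages that appear in the sitemap but are never linked to internally
# are orphan candidates. Both sets here are placeholders; in practice they
# would come from your sitemap file and a crawler export.
sitemap_urls = {
    "https://www.example.com/",
    "https://www.example.com/blog/technical-seo/",
    "https://www.example.com/old-campaign-2019/",
}

internally_linked = {
    "https://www.example.com/",
    "https://www.example.com/blog/technical-seo/",
}

orphans = sitemap_urls - internally_linked  # in the sitemap, but never linked to
for url in sorted(orphans):
    print("Orphan candidate:", url)
```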
Page Depth
Page depth refers to how many layers down a page exists in your site structure, i.e. how many clicks away from your homepage it is. It’s best to keep your site architecture as shallow as possible while still maintaining an intuitive hierarchy. Sometimes a multi-layered site is inevitable; in that case, you’ll want to prioritize a well-organized site over shallowness.
Here are step-by-step guidelines to make your site easier for search engine crawlers to navigate, with an emphasis on page depth:
- Ensure that all links on your website are functional and free from errors
- Remove the NoFollow attribute from internal links to facilitate the movement of Google bots
- Prioritize pages that generate significant traffic by placing them on the home page or making them accessible within a few clicks
- Adopt a horizontal structure with no more than four levels (home page, main categories, subcategories, and specific content)
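Click depth itself is easy to compute with a breadth-first search over your internal link graph. A minimal Python sketch; the link graph here is a hard-coded placeholder that would normally come from a crawl:

```python
# Sketch: compute click depth (distance from the homepage) with a
# breadth-first search over an internal link graph. The graph is a placeholder.
from collections import deque

links = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/technical-seo/"],
    "/blog/technical-seo/": ["/blog/crawl-budget/"],
    "/products/": [],
    "/blog/crawl-budget/": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:           # first time we reach this page
            depth[target] = depth[page] + 1
            queue.append(target)

for page, d in sorted(depth.items(), key=lambda item: item[1]):
    flag = "  <- deeper than 3 clicks" if d > 3 else ""
    print(f"{d}  {page}{flag}")
```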
Redirect Chains
When you decide to redirect traffic from one page to another, you’re paying a price. That price is crawl efficiency. Redirects can slow down crawling, reduce page load time, and render your site inaccessible if those redirects aren’t set up properly. For all of these reasons, try to keep redirects to a minimum.
Fixing redirect chains is crucial in optimizing your website’s SEO and ensuring a smooth user experience. Here’s a step-by-step guide on how to fix redirect chains effectively:
- Assess whether each redirect in the chain is necessary
- Simplify redirect paths
- Implement direct redirects
- Test thoroughly
- Monitor and adjust as necessary
4. Rankability
Now we move to the more topical elements that you’re probably already aware of — how to improve ranking from a technical SEO standpoint. Getting your pages to rank involves some of the on-page and off-page elements that we mentioned before but from a technical lens.
Remember that all of these elements work together to create an SEO-friendly site. So, we’d be remiss to leave out all the contributing factors. Let’s dive into it.
Internal Linking
Internal links help search bots understand where a page fits in the grand scheme of a query and give context for how to rank that page. Links guide search bots (and users) to related content and transfer page importance. Overall, linking improves crawling, indexing, and your ability to rank.
To get the most out of internal linking, it’s important to create high-quality content and find ways to include internal links where they’re natural:
- Link to and from content-heavy pages
- Create Text Links Using Anchor Text
- Add an Appropriate Number of Links Per Page
- Update Old Articles With New Internal Links
- Add Links Where It Makes Sense
- Only Add Dofollow Links
- Take Site Navigation and Information Architecture Into Consideration
- Regularly Audit Internal Links
External Linking
An external link, also known as an outbound link, is a hyperlink on a website that points to a different domain. They help search engines and users understand the topic and niche of a site, and provide additional value and resources. External links can also boost the ranking and reputation of a site if they come from authoritative and relevant sources.
Here are few tips to enhance external linking:
- Link to Relevant Sources
- Link to Authoritative Sources
- Optimize Anchor Text
- Avoid Link Schemes
Backlink Quality
Backlinks — links from other sites back to your own — provide a vote of confidence for your site. They tell search bots that External Website A believes your page is high-quality and worth crawling. As these votes add up, search bots notice and treat your site as more credible. Sounds like a great deal, right? However, as with most great things, there’s a caveat: the quality of those backlinks matters a lot. Links from low-quality sites can actually hurt your rankings.
There are many ways to get quality backlinks to your site, like:
- outreach to relevant publications
- finding broken links you can replace with your own content
- claiming unlinked mentions
- providing helpful content that other sites want to link to
Content Clusters
Content clusters link related content so search bots can easily find, crawl, and index all of the pages you own on a particular topic. They act as a self-promotion tool to show search engines how much you know about a topic, so they are more likely to rank your site as an authority for any related search query.
Here are a few suggestions to help you organize and create topic clusters:
- Map out five to ten core problems that your buyer persona has
- Do some secondary research to gather the data
- Group each of the problems into broad topic areas
- Build out each of the core topics with subtopics using keyword research
- Map out content ideas that align with each of the core topics and corresponding subtopics
- Validate each idea with industry and competitive research
- Create content, measure the impact, and refine
5. Clickability
While click-through rate (CTR) has everything to do with searcher behavior, there are things you can do to improve your clickability on the SERPs. While meta descriptions and page titles with keywords do impact CTR, we’re going to focus on the technical elements because that’s why you’re here.
Ranking and click-through rate go hand-in-hand because searchers want immediate answers. The more your result stands out on the SERP, the more likely you’ll get the click. Let’s go over a few ways to improve your clickability.
Use Structured Data
Structured data employs a specific vocabulary called schema to categorize and label elements on your webpage for search bots. The schema makes it crystal clear what each element is, how it relates to your site, and how to interpret it. Basically, structured data tells bots, “This is a video,” “This is a product,” or “This is an article,” leaving no room for interpretation.
A few of the most important markups in the structured data repository are listed below:
- Organization
- Local Business
- Event
- Article
- Product
- AggregateRating
- Breadcrumb
- FAQ, How-to, Q&A
- Jobs
- Recipe
- Video
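As with the breadcrumb example earlier, this markup is usually emitted as JSON-LD. A minimal Article sketch in Python with placeholder values; validate real markup with Google’s Rich Results Test before publishing:

```python
# Sketch: Article structured data as JSON-LD, built as a Python dict.
# The property values are placeholders for illustration only.
import json

article_ld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Complete Guide to Technical SEO",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-05-01",
    "image": ["https://www.example.com/images/technical-seo-cover.jpg"],
}

# Embed this inside a <script type="application/ld+json"> tag in the page <head>.
print(json.dumps(article_ld, indent=2))
```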
Win SERP features
SERP features, otherwise known as rich results, are a double-edged sword. If you win them and get the click-through, you’re golden. If not, your organic results are pushed down the page beneath sponsored ads, text answer boxes, video carousels, and the like.
While you can still get clicks from appearing in the top organic results, your chances are greatly improved with rich results. Here are a few tips to get your content to the top of the Google SERP:
- Optimize Your Content Around Rich Keywords
- Search Intent is the Focus
- Create Content People Can Link to
- Reduce Your Content Bounce Rate
- Build Backlinks
- Track Your Results
Optimize for Featured Snippets
One unicorn SERP feature that has nothing to do with schema markup is Featured Snippets, those boxes above the search results that provide concise answers to search queries.
Featured Snippets are intended to get searchers the answers to their queries as quickly as possible. Here is how to optimize for featured snippets:
- Add a ‘What is’ heading
- Use the ‘is’ sentence structure
- Fully define the topic in 2-3 sentences
- Match the featured snippet format
- Never use your brand name in featured snippet text
- Don’t use first-person language
- Scale featured snippets when possible
- Prioritize opportunities where you rank in the top 5
- Iterate your optimizations
Consider Google Discover
Google Discover is a relatively new algorithmic listing of content by category specifically for mobile users. It’s no secret that Google has been doubling down on the mobile experience; with over 50% of searches coming from mobile, it’s no surprise either. Discover allows users to build a library of content by selecting categories of interest (think: gardening, music, or politics). You should optimize for Google Discover to increase visibility, drive traffic, and get more return visits.
To be eligible to appear in Google Discover, content needs to be indexed by Google and meet Google Discover’s content policies. Follow these five tips to optimize your content for Discover:
- Create High-Quality Content
- Optimize Your Titles
- Use Compelling, High-Quality Images
- Improve E-E-A-T
- Optimize Your Content for Mobile
A Complete Technical SEO Review | Index Management | Site Structure | Internal Linking |
---|---|---|---|
I will go through your site and check hundreds of on-page factors to identify all of your SEO problems. Depending on the size of your site, it takes around 20-30 hours of manual labor to complete a review. | I will make sure that you are maximising your crawl efficiency by reviewing every page in the index to highlight opportunities and to make sure we are not suffering from any issues like duplication and cannibalization. | Your site structure plays a vital role in your search visibility, and it’s something we pay close attention to because a poor site structure seriously limits your site’s ability to rank. | One of the most underused strategies in SEO is internal link building. I will review your internal link structure to find weaknesses and look at how I can leverage authority to increase search traffic to key pages across your site. |

Content Audit | Backlink Audit | Site Speed Optimisation | Your Complete SEO Audit Report |
---|---|---|---|
Written content is the face of your digital business, and it either helps you or hinders you. I will review all of your site’s content from a quality & relevancy perspective to highlight any problems and areas you need to improve on. | I will review your backlink profile and provide recommendations based on what we find. I will take a look at referring domains, topical relevance, anchor text distribution, and other metrics to uncover any link-based issues. | Website speed is a confirmed ranking factor, so it is no secret that Google loves fast websites. I will review your site’s current standing and make recommendations on the easiest ways to increase your website’s speed. | Finally, you will receive a report that guides you through every SEO issue, along with a custom checklist that any developer or in-house staff member can begin working through, with integration support for 30 days. |