Gone are the days when you only needed great content, backlinks, and on-site search engine optimization (SEO) to get your site to rank well in the search engine results pages (SERPs). Now, you need to get technical SEO right as well.
While you can get by without dealing with every metric and best practice in technical SEO, remember that some aspects are non-negotiable: forget about appearing in even the first ten pages of results if you neglect them. Here are some of those aspects:
1. Optimized robots.txt
Remember that search engines allot only a limited amount of resources when indexing websites. Depending on your site’s size and rank, crawlers will check only a limited number of your pages, a limit often called the crawl budget. Because of it, you must make sure crawlers spend their time on the pages that matter to you, your visitors, and the algorithm.
To make that happen, you need to optimize your robots.txt. There, you can direct crawlers to check the crucial pages and ignore the ones you disallow. Fortunately, the syntax of robots.txt files is as simple as prefixing ‘Disallow:’ or ‘Allow:’ to a target path. For example, ‘Disallow: /wp/wp-admin/’ prevents crawlers from accessing the login area of a WordPress site. (1)
Essentially, you should optimize your robots.txt to primarily disallow or prevent crawlers from checking these types of pages:
- Admin dashboards and pages
- Temporary files and pages
- Cart, checkout, and user settings pages
- Search-related and some unimportant GET pages
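Putting those rules together, a minimal robots.txt covering the page types above might look like this (the paths and sitemap URL are illustrative, so adjust them to your own site’s structure):

```
User-agent: *
Disallow: /wp/wp-admin/
Disallow: /tmp/
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Disallow: /*?s=
Allow: /wp/wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

The file lives at the root of your domain (e.g., example.com/robots.txt), and the ‘Allow:’ line shows how to carve out an exception inside an otherwise disallowed directory.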
If you want to get this done quickly while you’re still learning how to do it, you may want to use a free robots.txt generator online.
2. WCAG compliance
Before anything else, you must become familiar with the Web Content Accessibility Guidelines (WCAG). It’s been a while since the guidelines were first drafted and published on the web, but everything written there still rings true and remains crucial in SEO. It’s non-negotiable to always check the latest version of WCAG; as of 2021, version 2.1 is the current recommendation, with version 2.2 available as a working draft.
Now that that’s out of the way, the WCAG exists to help developers and site owners alike ensure that the websites they build are highly accessible to everyone, including people with disabilities.
For example, the WCAG outlines how you can use color to convey visual information, how much contrast you should apply, and even the thickness of lines and spacing. Thankfully, many useful online apps, like color converter tools, are readily available to help you comply with the guidelines.
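The contrast requirement is fully mechanical, so you can check it yourself. The sketch below implements the relative-luminance and contrast-ratio formulas from WCAG 2.x for 8-bit sRGB colors (WCAG level AA asks for at least 4.5:1 for normal text):

```python
def relative_luminance(r, g, b):
    """Relative luminance per WCAG 2.x, from 0-255 sRGB channels."""
    def linearize(c):
        c /= 255
        # Piecewise sRGB linearization defined by the guidelines
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in (r, g, b))
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two RGB colors, from 1:1 up to 21:1."""
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background is the maximum possible contrast.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Online contrast checkers apply exactly this computation, so the function is handy when you want to validate a whole palette in bulk.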
3. Site speed, HTTPS, and mobile-friendliness
Thankfully, fixing site speed issues isn’t an impossible task anymore for regular web owners. Some free tools and services online can make it easy to detect speed problems and learn how to alleviate them.
You should also make sure that you migrate to Hypertext Transfer Protocol Secure (HTTPS) as soon as you can. Being in HTTPS is now part of the list of things search engines check when indexing and ranking sites. Note that this is part of the recent movement of search engines to incorporate page experience in their algorithms. (2)
Keep in mind that switching or migrating to HTTPS can be tedious or straightforward, depending on your hosting provider and the size and build of your website. For example, if your site runs on WordPress (WP), you can often enable HTTPS without buying an SSL certificate, since many hosts and plugins will provision a free certificate (e.g., via Let’s Encrypt) automatically.
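Once the certificate is in place, you also want every HTTP request redirected to HTTPS. If your site happens to run on Apache with mod_rewrite enabled, a commonly used .htaccess rule set for this looks like the following (other servers have their own equivalents):

```
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status tells search engines the move is permanent, so ranking signals consolidate on the HTTPS URLs.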
On a different note, a website without a mobile version won’t cut it nowadays. Most search engine algorithms now treat mobile-friendliness, HTTPS security, and safe browsing as critical ranking signals.
4. Crawl errors
The last thing you want is to keep search engine crawlers from indexing your site, and crawl errors can cause exactly that. The common ones you must watch for fall into three groups: 3xx (redirects), 4xx (client errors), and 5xx (server errors).
Fixing them is usually straightforward. Clean up broken or chained 3xx redirects so each one points to a live page, use 301 redirects where a page has permanently moved, and repair or remove links that trigger 4xx errors, serving a helpful custom 404 page for content that no longer exists. As for 5xx errors, make sure your web host or server stays up; if it doesn’t, check with your service provider or your backend developer or engineer.
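When auditing a crawl report, it helps to bucket status codes the same way the groups above do. This is a small sketch of that triage logic (the wording of each bucket is my own, but the numeric ranges follow the HTTP specification):

```python
def classify_status(code):
    """Map an HTTP status code to a crawl-report bucket."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect (3xx): verify it points to a live page"
    if 400 <= code < 500:
        return "client error (4xx): fix the link or serve a custom 404"
    if 500 <= code < 600:
        return "server error (5xx): check your host or backend"
    return "other"

# Triage a handful of codes the way a crawl report would list them
for code in (200, 301, 404, 503):
    print(code, "->", classify_status(code))
```

Feeding the exported URL/status pairs from a crawler through a function like this makes it easy to route each group to the right fix.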
5. Optimized XML sitemap
Together with making sure your site doesn’t have crawl errors, you should also optimize your XML sitemap. Depending on your setup, you may not need to do this manually. For instance, some content management systems (CMSs) automatically update and generate optimized versions of your XML sitemap whenever you make any changes.
However, if you build your site from the ground up, you may want to manually update your sitemap or create a script to generate new ones for you. The important thing here is that your sitemap must constantly contain the following:
- Links to all newly published content (product pages, blog posts, product categories, and so on)
- Accessible pages (those returning a 200 status code)
On the other hand, it mustn’t contain the following:
- Duplicate content
- GET/POST pages or URLs with parameters
- Pages that return 3xx, 4xx, and 5xx errors
Aside from properly including and excluding URLs in your sitemap, you should also limit each sitemap file to 50,000 entries. If you have more URLs, create an additional XML sitemap file and list both in a sitemap index. Also, be sure to get your sitemaps verified and validated by search engine tools so you won’t have trouble with them once your site is crawled. (3)
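If you do script your own sitemap generation, the rules above are easy to encode. This is a minimal sketch using only the Python standard library; the page URLs are hypothetical, and it applies two of the rules directly: skip parameterized URLs and cap each file at 50,000 entries.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

MAX_URLS = 50000  # per-file limit from the sitemap protocol

def build_sitemap(urls):
    """Return sitemap XML strings: parameterized URLs are excluded,
    and output is chunked into files of at most MAX_URLS entries."""
    clean = [u for u in urls if not urlparse(u).query]
    sitemaps = []
    for i in range(0, len(clean), MAX_URLS):
        urlset = ET.Element(
            "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for url in clean[i:i + MAX_URLS]:
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        sitemaps.append(ET.tostring(urlset, encoding="unicode"))
    return sitemaps

pages = [
    "https://example.com/",               # hypothetical URLs
    "https://example.com/blog/post-1",
    "https://example.com/search?q=seo",   # excluded: has GET parameters
]
files = build_sitemap(pages)
print(len(files))            # 1
print("search" in files[0])  # False
```

A fuller version would also filter out duplicate and non-200 URLs before writing, which needs a crawl pass rather than string checks, so it is left out of the sketch.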
There’s no reason to fear working on the technical side of SEO. Thankfully, there are informative articles on the web that can guide you on what to do. Just be sure to invest some of your time learning, and you can say that technical SEO’s a piece of cake.
- (1) “What Is Robots.txt?”, Source: https://www.cloudflare.com/learning/bots/what-is-robots.txt/
- (2) “How Google’s Page Experience Update Announcement Can Benefit Search Engine Optimization”, Source: https://www.forbes.com/sites/theyec/2021/04/12/how-googles-page-experience-update-announcement-can-benefit-search-engine-optimization/
- (3) “How To Add A Sitemap For More Than 50,000 URLs – SEO Snippets”, Source: https://www.youtube.com/watch?v=y0TPINzAVf0