Technical SEO

An important but often overlooked ranking factor is the technical SEO process. Every Search Engine Optimisation (SEO) project should include tasks that are part of the Technical SEO process.

When you look at your website from Google's point of view, or from any search engine's, their clients are the searchers, and searchers want fast, optimised websites.

How often do you come to a website that takes more than 10 seconds to load? Do you stay and wait those 10 seconds?

With today's bandwidth and computing power there is no reason to wait that long. A waiting time of two seconds is already almost too long, so optimising your website is a must, and page speed is indeed part of the search engines' ranking algorithms.

The main problem for business owners like you is that you often have no clue about the technical aspects of your website. Most likely the website was built by a third party, or you used one of the free website builders, but is your website really optimised?

 

Website Builders

Over the years it has become apparent that developers who build websites often do not look at the technical aspects, and as a result your website may not perform well. Often images are too big, and CSS and JavaScript files are not optimised. The technical part of the Search Engine Optimisation process is often overlooked, and as a business owner it is not your job to optimise your website. Your job is to run your business, not to optimise your website.

The foundation of your website

Having a website is a great way to market your business to the online community. Whether you built your website yourself or had it built by a dedicated development team, it took a lot of effort and a big chunk of your budget, but is the foundation of your website good enough to handle traffic?

Will I be penalised because of a non-performing site? This is where we come in: we can audit your site and optimise it so that any search engine accepts it without errors.

When it comes to the technical side, many webmasters are basically no longer webmasters but should rather be called "content" masters. Content is indeed very important, but it is just like building a house.

Would you buy furniture when the foundation of your house is showing cracks? No, you would not; first you need to fix those cracks. Fixing these cracks, or rather errors, on your website is called Technical Search Engine Optimisation.

Do Search Engines really care?

There was a time when search engines just "read" any website and updated their index. Although they still do that, there is a difference between just a website and a website without errors.

  • Just imagine that your website loads in 10 seconds while your competitor's website loads in 2 seconds. Clearly your competitor's website has an advantage over yours, and that is how the search engines see it too.
  • Just imagine that you have written 100 pages but the search engines only index 50, while that same competitor has also written 100 pages and all 100 are indexed. Again, your competitor has an advantage over you.
  • Just imagine that you have removed 20 of those 100 pages, but these 20 pages are still in the search engines' index, resulting in a 404 error for the visitor, while your competitor has removed those pages properly and their visitors do not get any 404 errors.

Search engines care about the searchers; they want them to have a good experience and land on a relevant page. Here at Keenclick we do Technical Search Engine Optimisation and are happy to help you.

Website structure

The structure of your website is important, and we will help you make sure it works like a dream. There is no need for you to worry about it: we will take care of all the technical issues so you can concentrate on what you do best, taking care of your business.

We will check each page of your website for the following:

  • For each page we will check the HTTP status code
  • We will find broken links
  • If there are errors, we will find the HTML code errors
  • The size of each page should be small enough to load quickly
  • Every page contains HTML; we will review the size of the code used, and update and minify it where needed.
  • The title of each page should be unique, but is that indeed the case?
  • The meta descriptions should be unique, because every page is unique, isn't it?
  • We remove the meta "keywords" from the pages, because these are no longer needed and you do not want your competitors to know which keywords you are targeting.
  • Do you know how many internal and external links your website has?
  • Are there many nofollow links, the links that do not really help you rank in the search engines?
  • And there is a lot more we will focus on, but it is better to contact us and discuss the options.
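Several of the checks above can be scripted. The sketch below is a minimal, hypothetical example using only Python's standard library: it parses one page's HTML and reports the title, meta description, and the counts of internal, external and nofollow links. The page URL and HTML are made-up sample data, not a real audit.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class PageChecker(HTMLParser):
    """Collect title, meta description and link statistics from one page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.title = ""
        self.meta_description = ""
        self.internal, self.external, self.nofollow = 0, 0, 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content") or ""
        elif tag == "a" and attrs.get("href"):
            target = urljoin(self.base_url, attrs["href"])
            # Same host as the page itself -> internal link.
            if urlparse(target).netloc == urlparse(self.base_url).netloc:
                self.internal += 1
            else:
                self.external += 1
            if "nofollow" in (attrs.get("rel") or ""):
                self.nofollow += 1

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

# Hypothetical sample page, for illustration only.
html = """<html><head><title>Blue Widgets</title>
<meta name="description" content="Hand-made blue widgets.">
</head><body>
<a href="/shop">Shop</a>
<a href="https://partner.example.org" rel="nofollow">Partner</a>
</body></html>"""

checker = PageChecker("https://www.example.com/")
checker.feed(html)
print(checker.title, checker.internal, checker.external, checker.nofollow)
```

In a real audit the same parser would be fed every crawled page, so duplicate titles and empty descriptions show up immediately.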

At Keenclick we will work with you to make your website better and beat your competitors technically. We understand that this is all something you do not see: the groundwork and foundation of a house are no longer visible once it is built. Luckily, with Technical Search Engine Optimisation you can still solve problems afterwards.

Do It Yourself

Often we hear from prospects that they will do it themselves, which is absolutely fine, but especially with smaller businesses we have found that they simply do not have the skills for Technical SEO, and therefore the DIY approach fails.

There is plenty of information available on the internet, and following it will give results, but the problem remains the same: a lack of knowledge and skills may result in poor performance and therefore a negative result in the search engines.

Site Audit

Every Technical SEO project starts with a site audit. This audit is necessary to understand the state of the website and its infrastructure. The audit provides the information upon which a plan can be made to get rid of the errors and warnings.

Indexing and Crawlability

The indexing and crawlability checks give us information about errors such as 404 errors (usually pages that no longer exist), whether an XML sitemap file has been created, and which resources are restricted from indexing.

It can be deliberate that you do not want certain resources indexed by the search engines, but you need to understand why that is.

Just assume you have an eCommerce site selling products online and some of these product pages are not indexed; then you need to know why, so it can be resolved.

Have you removed pages? That is natural when the content is no longer interesting: your business has moved on and the content has been removed. However, because these pages are still indexed, visitors get a 404 page ("page does not exist"), and it is good practice to explain to the visitor that the page does not exist by creating your own 404 page.

The robots.txt file is a small file that tells the search engine robots crawling your site which pages should not be indexed.
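You can test robots.txt rules yourself: Python's standard library reads the same rules the search engine robots follow. The rules and URLs below are made up for the example.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block the /private/ section for all robots.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://www.example.com/about"))      # allowed
print(parser.can_fetch("*", "https://www.example.com/private/x"))  # blocked
```

This is exactly the kind of check an audit runs for every URL: a page you want ranked that the robots are not allowed to fetch is a problem worth knowing about.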

Does your site have an XML sitemap file? It is basically a list of all the URLs within your website, and it makes it easier for search engine robots to understand your site.
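An XML sitemap is simple enough to generate yourself. Here is a minimal sketch with Python's standard library, using made-up URLs; real sitemaps often also carry optional fields like lastmod per URL.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return an XML sitemap (as a string) for the given list of URLs."""
    root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(root, encoding="unicode")

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
print(sitemap)
```

The resulting string is saved as sitemap.xml in the site root and referenced from robots.txt, so the robots find it on their own.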

You will also see that we check, for example, http versus https. The search engines consider security a ranking factor, and since it is possible to get a free SSL certificate to secure the communication between your visitors' browsers and the website, there is no reason not to secure it.

You can see in the example above that there are some issues with www and non-www versions. This is important to fix, because search engines see the www and non-www versions as two different websites, and you only have one.

A consequence of not solving this is that these same search engines can penalise you for duplicate content.
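The usual fix is a server-side 301 redirect so every variant ends up on one canonical address. The sketch below only illustrates the canonicalisation rule itself, assuming (purely as an example) that you chose the https, non-www version.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url):
    """Map any variant of a URL onto one canonical https, non-www form."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if netloc.startswith("www."):
        netloc = netloc[len("www."):]          # drop the www. prefix
    return urlunsplit(("https", netloc, path, query, fragment))

print(canonical("http://www.example.com/page"))  # -> https://example.com/page
print(canonical("https://example.com/page"))     # already canonical
```

Whether you prefer www or non-www does not matter to the search engines; what matters is that you pick one and redirect the other to it.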

Titles

A common problem is duplicate titles. That might be fine for you, but how is a search engine supposed to tell what a page is about, which is what the title should suggest, when more than one page has the same title? Again, are you duplicating content?
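Once the pages have been crawled, finding duplicate titles is a matter of counting. The page/title pairs below are invented for the example.

```python
from collections import Counter

# Hypothetical crawl result: URL -> page title.
titles = {
    "/": "Example Shop",
    "/shoes": "Products",
    "/hats": "Products",      # same title as /shoes: a duplicate
    "/contact": "Contact us",
}

counts = Counter(titles.values())
duplicates = sorted(url for url, title in titles.items() if counts[title] > 1)
print(duplicates)  # pages sharing a title with another page
```

Every URL that ends up in that list needs its own, unique title written for it.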

Meta Description

Often the meta description is left empty, and as a consequence the search engine takes a snippet from the page itself, for example the first 300 characters of the page to which the meta description belongs. The meta description is the part in the search results just under the link to the website.

This text gives the reader a chance to read about the page before clicking. You might want as many people as possible to come to your website, but do you really want everyone at your open house, including people who are not interested at all and only came over because the description suggested your house was something they would be interested in?

A meta description that does not summarise the page misleads visitors. You can see from the uploaded images that it is easy to spot errors, warnings and other issues. We at Keenclick will address these and make sure that Technical Search Engine Optimisation does its part to increase the ranking of your website in the search engines.
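If you write your own meta descriptions, keep them short: search engines typically cut the snippet off after roughly 160 characters (that limit is an assumption here, not a fixed rule, and it changes over time). A small helper to trim a summary at a word boundary:

```python
def trim_description(text, limit=160):
    """Trim a description to at most `limit` characters, cutting at a word boundary."""
    if len(text) <= limit:
        return text
    cut = text[:limit].rsplit(" ", 1)[0]  # drop the last, partial word
    return cut + "..."

summary = "We sell hand-made blue widgets, shipped worldwide. " * 5
short = trim_description(summary)
print(len(short), short)
```

Trimming your own text beats letting the search engine pick an arbitrary snippet, because you control which sentence the searcher sees.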

In the case of Keenclick there are no language versions, but you might have a multilingual website, and then hreflang becomes an important factor in increasing the satisfaction of your visitors.
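hreflang is declared per page, with one link element per language version (plus an x-default fallback). Generating the tags is straightforward; the domain and language codes below are made up for the example.

```python
def hreflang_tags(variants):
    """Build <link rel="alternate"> tags from a {lang_code: url} mapping."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    )

tags = hreflang_tags({
    "en": "https://www.example.com/en/",
    "nl": "https://www.example.com/nl/",
    "x-default": "https://www.example.com/",  # fallback for other languages
})
print(tags)
```

Each language version must list all the others (including itself), which is why generating the tags beats maintaining them by hand.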

Use Case Technical Search Engine Optimisation

There are many websites you can use to see how your own website performs, and one of these is GTmetrix.

GTmetrix gives a lot of the information you need to know before you start optimising any website.

On the 24th of November 2018 we ran the following report: https://gtmetrix.com/reports/keenclick.com/lJHqxBdg

It is clear that we do quite well, but at the same time we can still optimise the Keenclick.com website, especially when we look at the images.

[Screenshot: GTmetrix report for keenclick.com, 24 November 2018]

When we look in more detail, especially at the images, we see the following:

[Screenshot: GTmetrix image details for keenclick.com, 24 November 2018]

There are some images that we simply cannot optimise: the Sumo images. These come from a third party and are only shown because of the engagement we would like to have with our visitors. However, you also see images that we certainly can optimise, and these are stored locally on the server. We can even see the possible reduction, in percent, for every image.

Now, we have to be aware that optimising images is not always the best thing to do. We had a client who did not want their images optimised, for the very simple reason that optimisation made them blurry and therefore reduced the quality for visitors.

Very good images were important to them because of the watches they were selling, and the details of the watches no longer showed after optimisation. It was also clear that the client had a very good monitor, because on a "normal" laptop the blurry images were not that blurry. The customer is king, so we kept the larger images, making the site a little slower, but so be it.

Here, however, we can optimise images, and that is what we have done. The load is still heavy, but the images have been improved; the only remaining issues are with Sumo, and as mentioned before we cannot do anything about that, unless you want to download those images and update the JavaScript. It might be better to just remove Sumo, or to leave it as is, because there is still more to do than just the images.

[Screenshot: GTmetrix image details after optimisation, showing only remaining Sumo image issues]

 

It takes time to optimise a website, and it basically never stops. This article, written right now, contains some images captured with the Microsoft Snipping Tool, and those are certainly not optimised. On top of that, the laptop used is not an HD laptop, so the quality of the images is already lower, although we should still optimise them.

On the other hand, for a Technical Search Engine Optimisation use case this is just great. We just need to remember to also optimise pages other than the home page, because that is what webmasters often forget.

This article will be updated once we do more search engine optimisation work on our own website.

 

 

 
