Technical SEO is becoming an essential part of SEO (search engine optimization). Google uses more than 200 signals in its algorithm to index and rank a page for a certain keyword. This article explains some of the well-known factors.
Since 2015, a responsive website is more or less mandatory to be included in Google's mobile ranking index. Making a website responsive is not a big deal and most of the work is done in the CSS stylesheet. For WordPress users it is even simpler: just get yourself a new theme.
Technical SEO = Links (off-page SEO)
When Google started, the company was called BackRub, as its PageRank algorithm ranked a webpage based on how many backlinks it had. With this kind of history you can imagine that links are seen as important.
Links are important for the robots, as this is the way they find new pages (next to the sitemap, of course). Links are important for Google, as they are used to tell the indexing program how popular a webpage is. Due to spam behavior, simply counting links is at least questionable.
Still, Google uses links as a ranking factor, and they are thus a huge part of technical SEO.
Google uses at least two programs: one to index your page for a keyword or a long-tail keyword, and a second one to identify spam behavior. As soon as a webpage is flagged by the spam program, it tells the indexing program that the page's ranking should be reset.
There are at least three factors which are seen as spam:
- Too many keywords
- Keywords that are not readable by the user
- Paid links.
This link building was, and still is, the foundation of technical SEO for bloggers. The posts in your blog are used to earn trust and links. Using a blog in combination with social media like Google+, Facebook, Twitter and LinkedIn is nowadays common practice for marketing purposes. If you explore large online webshops, you will find that they also use a blog. Most of the time they invite customers to write content for them. For example, bol.com has implemented this, but of course a platform like LinkedIn also has its own blog full of articles.
As far as I can see, these social links will be the next links that count most, simply because a normal company cannot earn links directly to its own page. At least, I have never seen a company that points to its competition. And Google only sees links from the same branch as useful links.
Technical SEO = Bounce Rate
Bounce rate is the percentage of users who visit only one page of your website and then leave. From your homepage it is better not to link to the outside world, but as much as possible to your own pages. Your homepage is the most important page for search engines and gathers the most points. You want to keep those points inside your own website to make it stronger. Linking out to other sources is more for your posts.
A homepage normally doesn't have many words; it is just a portal that presents the most important parts of your business. It is normal that you will find more information about the services on the pages you are linking to, and of course it is normal to link to your own business pages.
There is only one problem with bounce rate: the customer may find everything he or she wants on this one page. So if the customer stays two minutes on one page and then exits to find something else on the web, a high bounce rate is not a real problem. For your business page, bounce rate is therefore normally not a real SEO issue, but for your posts, where you give more information, a high bounce rate is an indication that the information is not as good as promised in your title.
Your bounce rate can be found in Google Analytics.
Technical SEO = Speed
One of the factors a robot can measure is the loading speed of your page. Nowadays nobody likes to wait a few seconds before they see any reaction on their screen. Loading speed is therefore becoming one of the factors used for ranking your page on a certain keyword. Google has a tool for this (PageSpeed Insights), so you can easily find out what the speed of your webpage is. The speed depends on several things:
- How far your computer is located from the host server
- What kind of server host you are using (SSD can speed up the performance a lot)
- For the user experience, the visitor's connection is important, and on a lot of mobile devices we often have a slow connection. Loading minimized sites for the mobile version will help to speed things up.
- At the beginning of 2016 the term AMP (Accelerated Mobile Pages) became popular in SEO land. It is made specifically for mobile devices: many of your scripts are simply not taken into consideration for an AMP page. In WordPress you can already find plugins which handle this part. These plugins are meant for posts and not for widgeted areas; my homepage, for example, is built up with widgets and will not show if I use an AMP plugin. You can work around this by creating a special homepage for mobile.
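For reference, an AMP version of a page is advertised with a link tag in the head of the normal page, and the AMP page points back to the original with a canonical tag. A minimal sketch, with placeholder URLs:

```html
<!-- on the normal (canonical) page -->
<link rel="amphtml" href="http://www.example.com/my-post/amp/" />

<!-- on the AMP version of the page -->
<link rel="canonical" href="http://www.example.com/my-post/" />
```

A WordPress AMP plugin typically adds these tags for you.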
Index priority for news and governments
If you look for an answer, you want content which is up to date and comes from a trustworthy source. A government website has a high trust status and will therefore rank high for almost anything on it.
News from newspapers is normally fresh and can jump to the first page within a few hours. The newspapers simply inform the engine that they will have news on an hourly basis, so the crawling robot visits their sites more often.
Robots are informed by the sitemap about new pages and how often each page is refreshed. The content of such a news page automatically receives a high priority, but this priority doesn't last long. This is logical, as news in a normal paper doesn't last long either.
As robots just bring the information you typed into the sitemap to one of Google's computers, where it is indexed by some program, I assume that the time input in a sitemap is used, and it should be used wisely.
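As a sketch, a sitemap entry for such a news page could signal an hourly refresh and a high priority like this (the URL and values are just an illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/news/latest-article/</loc>
    <lastmod>2016-03-01</lastmod>
    <changefreq>hourly</changefreq>
    <priority>0.9</priority>
  </url>
</urlset>
```

Keep in mind that changefreq and priority are hints to the robot, not commands.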
Helping the search robots
You can inform search engines in several ways:
- The robot can find your site via a link from another site, so a link to a new page will help.
- You can inform Google and Bing via their webmaster tools that you own a site, and you can tell the search robot about a new sitemap there.
- Uploading a sitemap to your server. With the sitemap the robot is informed about all the pages in your website, what priority the pages have and how often you refresh them.
- Uploading a robots file (robots.txt) to your server. You can tell a robot that some parts cannot be accessed or that some webpages should not be indexed.
- Using .htaccess on your server. You can redirect webpages to other webpages, or a complete website to another website. I am using a plugin for 301 redirects.
- Using a canonical tag in your webpage. In this case you want to inform the engine that it should use the www version as much as possible. This is important for analytical reasons, as robots apparently see http://www.maritime-mea.com and http://maritime-mea.com as two different websites. Example: <link rel="canonical" href="http://www.maritime-mea.com" />
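To illustrate the robots file mentioned above: a minimal robots.txt that blocks one folder from crawling and points the robot to the sitemap could look like this (the paths are just examples):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: http://www.example.com/sitemap.xml
```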
Google started in 2015 to promote HTTPS. At this point in 2016, Google prefers the same content served over https above http. You can argue that the same content only exists within the same blog and that you should redirect it with a 301, but in my opinion lots of business homepages have almost similar content. So starting with https would probably give you a benefit. And if that doesn't happen in 2016, then 2017 will be the year you have to change.
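If your server runs Apache with mod_rewrite, a common sketch for such a site-wide 301 from http to https looks like this in .htaccess (adapt it to your own host before using it):

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```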
Duplicate content is not smart because you will compete with yourself. A redirect should be made to reduce your duplicate content.
You will find all 404s in your Google Search Console. Redirect them with a 301 to the correct page. This will help your customer to stay on your website.
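If you prefer to do this without a plugin, a single 301 for one old URL can be a one-liner in .htaccess (the paths here are placeholders):

```apache
Redirect 301 /old-page/ http://www.example.com/new-page/
```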
Technical SEO: the conclusion
Technical SEO is becoming a big issue and it is impossible to describe it all on one page. In 2016, AMP, HTTPS and not blocking scripts seem to be the upcoming issues. Speed is normally determined by your host server and the technical code of your website.
Link building is still the way to get a page indexed higher, and nowadays it is more social marketing.
To measure this technical SEO for bloggers, Google provides several tools. Google Analytics and Google Search Console are nowadays important to check.
If you also read about on-page SEO (= content SEO), then your SEO is more or less covered.