Search Engine Tools and Software

Free Search Engine Tools You Can Use

Welcome to chapter 8 of the beginner's guide to SEO. In this chapter, you'll learn about search engine tools and software.

In this guide to search engine optimization we cover the following:

Introduction to SEO

How Search Engines Work

How People Interact With Search Engines

Why Search Engine Marketing Is Necessary

How to Create a Search Engine Friendly Website

How to Conduct Keyword Research

Why You Should Focus on User Experience

How To Get Backlinks

Search Engine Tools and Software

Fake News Around Search Engines

How To Track the Success of your SEO Campaign

Now, let's get into it.

Free tools for SEO

Everyone in the marketing world, including SEOs, uses a range of tools and software to help them out.

You didn't think we did all this manually, did you?

Some of the most useful tools in an SEO's arsenal are actually the ones provided by the search engines themselves.

Search engines provide these tools because they want webmasters to create content that is easily accessible to both users and search engines.

These tools give us insight into what the search engines can see, in terms of visitors, content, and user experience.

In the next section, I've laid out some of the elements the major search engines support.

Standard search engine protocols

Sitemaps

A sitemap contains a list of a website's URLs, along with information that gives search engines a better understanding of how to crawl the site.

This dramatically helps your web pages get indexed, as sitemaps help search engines find pages they might never have discovered on their own.

Sitemaps aren't a one-trick pony, either: they also let webmasters point search engines towards different types of content, like videos and images.

Sitemaps come in three versions:

XML: XML stands for Extensible Markup Language. This is the recommended format for the vast majority of webmasters.

Most people use XML because search engines can parse it easily, and webmasters can generate XML sitemaps with hundreds of different sitemap generators. The only real drawback of XML is file size: every URL needs its own opening and closing tags, so sitemap files can get big.

So be prepared to store some large sitemap files.
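To make that concrete, here's a minimal sketch of an XML sitemap with a single entry (the URL and values are hypothetical), following the sitemaps.org format:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://serpjump.com/example-page/</loc>
        <lastmod>2017-06-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Every page you want crawled gets its own <url> block, and you can see how those paired tags add up quickly on a site with thousands of URLs.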

RSS: RSS stands for Rich Site Summary or Really Simple Syndication.

The great thing about having an RSS feed on your blog is that it's easy to configure to update automatically whenever new content is added to your site.

Even though I just complimented RSS feeds, here I go talking sh*t about them. No, seriously: RSS can be hard to manage because of the way its updating properties are configured.
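For reference, here's a minimal sketch of an RSS 2.0 feed with one item (all the titles, URLs, and dates are made up); most blogging platforms generate and update a file like this automatically:

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Serpjump Blog</title>
        <link>https://serpjump.com/</link>
        <description>SEO tips and guides</description>
        <item>
          <title>Search Engine Tools and Software</title>
          <link>https://serpjump.com/search-engine-tools/</link>
          <pubDate>Thu, 01 Jun 2017 09:00:00 GMT</pubDate>
        </item>
      </channel>
    </rss>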

Txt: Txt stands for text file. Simple enough.

It's very easy to use: the format is one URL per line, with up to 50,000 URLs per file. Yes, 50,000. The trade-off is that you can't include any metadata in a text sitemap.
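A text sitemap really is as simple as it sounds; just a plain list of URLs (hypothetical ones here), one per line:

    https://serpjump.com/page-one/
    https://serpjump.com/page-two/
    https://serpjump.com/page-three/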

Robots.txt

A robots.txt file tells search engines and crawlers what to do when they visit your website.

The file lives in the root directory of your website, e.g. https://serpjump.com/robots.txt. For example, you can edit your robots.txt file to let bots know which web pages to crawl and which to leave alone.

The robots.txt file can also tell bots where the website's sitemap is located.

You can do the following in a robots.txt file (there's a worked example after this list):

  • Sitemap: Specify where the sitemap is located.
  • Disallow: Tell the bots not to crawl a particular web page or file.
  • Crawl-delay: Tell the bots how quickly they may crawl the server.
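Putting those directives together, a minimal robots.txt might look like this (the disallowed path and sitemap URL are placeholders):

    User-agent: *
    Disallow: /private/
    Crawl-delay: 10
    Sitemap: https://serpjump.com/sitemap.xml

One thing worth knowing: support varies between search engines. Bing respects Crawl-delay, for example, but Google ignores it and uses the crawl rate setting in its webmaster tools instead, which we'll cover later in this chapter.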

NOTE: Not every bot has good intentions.

What do I mean?

Some people set up bots to collect private information. For example, they use scrapers to find contact info or email addresses.

And bad bots simply ignore robots.txt, so don't rely on it to hide anything sensitive. I'd recommend using the meta robots tag (read below) to keep such pages out of the search results, and keeping genuinely private information behind a login.

Meta robots

The meta robots tag lets us give page-level instructions to the search bots.

The meta robots tag belongs in the head section of the web page.

As mentioned above, it's a good way to keep sensitive pages out of the search results, so they're harder for prying eyes to stumble across.
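For example, a sketch of a meta robots tag that keeps a page out of the index and tells bots not to follow its links would sit in the page's head section like this:

    <head>
      <meta name="robots" content="noindex, nofollow">
    </head>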

Rel=Nofollow

Remember in the chapter on backlinks, where I mentioned how links act as votes?

Well, the rel="nofollow" attribute allows you to link to a site without giving it your "vote".

Search engines should not follow the link, meaning very little link juice, if any, is passed to the site receiving it.
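In practice, it's just an attribute on a normal anchor tag (the URL here is a placeholder):

    <a href="https://example.com/some-page/" rel="nofollow">Anchor text</a>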

Rel=Canonical


Sometimes the same content appears on a website under different URLs.

In the eyes of the search engines, each duplicate is a separate page, when in fact it's the same content under different URLs. This confuses the search engines, as they don't know which URL to present to the user.

This, in turn, devalues the content spread across those separate web pages and can lower your rankings.

You can solve this problem with the rel="canonical" tag. It tells the search bots which web page should be considered the original (the most authoritative) and the one that should appear in the SERPs.
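For example, if the same article lives at several URLs, each duplicate would carry a tag like this in its head section, pointing at the original (the URL is hypothetical):

    <link rel="canonical" href="https://serpjump.com/original-article/">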

Google Webmaster Tools

Main features of Google Webmaster Tools:

Geo targeting: If a website targets a specific location, the webmaster can tell Google which country it's aimed at, and how they want the site's information to appear for that country.

URL Parameters: You can tell Google how to handle the parameters that appear in your URLs (sorting or session parameters, for instance), so it doesn't waste time crawling duplicate versions of the same page. This helps Google crawl your site more efficiently.

Crawl Rate: The crawl rate affects the speed at which Googlebot crawls your site during each visit. It does not, however, affect how often Googlebot visits your site.

Preferred Domain: The preferred domain, as the name suggests, is the version of your domain under which you'd prefer all of your pages to be indexed. For example, my preferred domain is https://serpjump.com.

So if Google finds a link pointing to https://www.serpjump.com, it will act as if the link was pointing to https://serpjump.com.
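One common way webmasters enforce a preferred domain themselves is with a 301 redirect. Here's a sketch for an Apache .htaccess file (assuming mod_rewrite is enabled) that sends www traffic to the non-www version:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.serpjump\.com$ [NC]
    RewriteRule ^(.*)$ https://serpjump.com/$1 [R=301,L]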

Crawl errors: When Googlebot crawls your site and stumbles across an error, like a 404 page, it will report the issue to you.

Malware/Virus: Google can inform webmasters if their website is infected with malware or viruses. Both have a negative impact on user experience, so it's best to sort the problem out ASAP.

HTML suggestions: Google can point out search-engine-unfriendly HTML elements, such as missing or duplicate title tags and meta descriptions, and suggest ways for webmasters to improve them.

How is your site doing on the web?


Luckily for us, the search engines give us valuable information about how our websites are doing in the search results.

Search engines give us data like how many impressions we've had, how many of those impressions came from specific keywords, our click-through rates, which pages appear most often in the SERPs, and so on.

Site Configuration: This section of Webmaster Tools lets us do many things, like update sitemaps and test our robots.txt files. It also helps us make sure our site is giving Googlebot accurate information about our website.

The +1 factor: When people share your content on Google+, it has an effect on how that content appears in the SERPs. The more +1s you get, the more visibly your content can appear in the search results.

Labs: The Labs section contains features that are still being tested, but it can give webmasters useful data on site performance, like site speed.

Bing Webmaster Center

Tools provided by Bing Webmaster Center.

Main features of Bing Webmaster Center:

Site overview: The dashboard gives you an at-a-glance look at how your website is performing. All of the metrics reflect how your website is doing in Bing-powered search results.

You can track metrics like the site's click-through rate, impressions, and number of pages indexed, just to name a few.

Index: This section tells webmasters how many web pages have been indexed, and lets us tell Bingbot how to crawl new pages.

We can also submit URLs for crawling, as well as remove URLs that appear in the search results. Just to be clear, this is a brief overview; there are many more things you can do in this section.

Crawl statistics: You can view how many web pages have been crawled and find any errors that have appeared. And, like Google Webmaster Tools, you can also submit sitemaps.

Traffic summary: In this section you can see how much traffic you have received, which pages appear the most, and how many impressions you have for a particular keyword. You can also discover the average position of a keyword, as well as a rough estimate of what it would cost to buy ads for each of those keywords.

Overall, the webmaster tools provided by the search engines are essential if we want to improve our site's performance. However, these search engine tools and software can only get us so far. The best way to stay ahead of the game is to take it upon ourselves to learn SEO.

Now that you know which search engine tools and software to use, read the next chapter in the beginner's guide to SEO: myths about search engines.

If you liked this post, share it on social media and subscribe to our newsletter for more exclusive content straight to your inbox.

Tahir Miah
 

Tahir Miah is an entrepreneur, digital marketer and SEO enthusiast. He is the founder of Serpjump, a Bedford link building agency. Tahir currently advises companies on the best way to increase revenue and brand awareness through SEO, content marketing and link building. He can be found on @serpjump
