The Fundamentals of a Search Engine Friendly Website
How to Create a Search Engine Friendly Website
Welcome to chapter 4 of the beginner's guide to SEO. In this chapter you'll learn how to create a search engine friendly website.
Now, let's get into it.
The basics of making a search engine friendly website
Search engines crawl the web and interpret content in a specific, and quite limited, way.
They will never be able to look at a web page the way we can, which makes our job harder: we have to design pages that work for users AND for search engines.
That's why it's important to know how to build websites that are both search engine friendly and user friendly.
How to make content indexable
If you want a better chance of getting indexed by Google or Bing, one of the most important things to do is to have the vast majority of your content in HTML text format.
Although search bots are very advanced, they still struggle to index, or give value to, content held in Java applets, Flash files, and other non-HTML formats.
The best thing to do if you want your content indexed is to format it as HTML text.
However, if you want a more visual website, consider techniques like the following:
- Make life easier for bots by using navigation drop-downs instead of relying on search boxes.
- Provide a transcript for videos and audio, so the spoken content can get indexed.
- Include ALT text for every image you use; it helps image bots identify what the image is about.
- Avoid using Flash files and Java applets.
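For example, here is a minimal HTML sketch of an indexable image and video (the file names and text are placeholders):

```html
<!-- ALT text gives image bots a description of the image -->
<img src="seo-guide-cover.jpg" alt="Cover of the beginners guide to SEO">

<!-- A text transcript below the video makes the spoken content indexable -->
<video src="seo-basics.mp4" controls></video>
<p class="transcript">Transcript: In this video we cover the basics of SEO...</p>
```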
How to check if the search engines have indexed your web page
If you have ever created a website or blog, you'll know that getting your content indexed can be a real pain.
This is a problem every new site owner has, and one that can be made easier by actually seeing what the search engines see.
You can view Google's text-only cache of a page to get an insight into what search engines see, or use tools like SEO-browser.com.
Creating a link structure
Search engine don’t just need to see content to add to their keyword-based database, but they also need to crawl links to index the content in the first place.
Having a website that has a good link structure enables search engines to crawl the entire site, and index it efficiently.
Many webmasters struggle doing this, which is why most websites don’t get indexed properly.
Reasons why pages can’t be reached by search engines:
Content behind submission forms: Any unique content that sits behind an online form will not get indexed. Search engines can't complete a CAPTCHA or enter a password, so whatever content is back there might as well not exist.
Robots don't conduct searches: Similarly, search engines can't perform a search on your site, so don't hide important content behind an internal search box.
Links embedded in plugins and Flash: Links in these formats make it hard for search engines to understand your site structure, and any information behind plugins is invisible to them.
Don’t worry only 3 more reasons left!!!
Crawling directive error: You may have unintentionally blocked search engine bots from crawling your site. If you have, edit your robots.txt file to fix it. A little side note: make sure you don't have links pointing to pages that have been blocked by their webmaster, as it could stop the bots from crawling the link.
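As a sketch, here is a robots.txt that accidentally blocks the whole site, next to one that blocks only a single folder (the folder name is a placeholder):

```text
# Blocks ALL crawlers from the ENTIRE site – a common accident
User-agent: *
Disallow: /

# Blocks only one private folder, leaving the rest crawlable
User-agent: *
Disallow: /private/
```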
Web pages with a large number of links: Search engines may not crawl every link on a page that has "too many" of them. Bots will only crawl a certain number of links per page, to limit spam and conserve crawl resources.
Frames and iframes: Links embedded in frames and iframes can technically be crawled, but they contribute little to a good link structure and make the site harder for search engines to crawl.
If you want your site to get crawled quickly and easily, avoid these common mistakes.
The nofollow tag
Adding the rel="nofollow" attribute to a link tells search engines not to follow that link.
If a webmaster gives you a nofollow link, it means that the webmaster isn’t endorsing the page, and doesn’t want to pass any link juice.
An example of this is blog comments.
WordPress automatically adds nofollow to links left in comments. This stops the comments from passing any link juice or devaluing the site's authority, and, more importantly, it discourages blog comment spam.
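A nofollow link looks like this in HTML (the URL is a placeholder):

```html
<!-- A normal link passes link juice to the target page -->
<a href="https://example.com/">A followed link</a>

<!-- rel="nofollow" tells search engines not to follow or endorse it -->
<a href="https://example.com/" rel="nofollow">A nofollow link</a>
```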
Should your website have nofollow links?
Every website, especially a popular one, will inevitably have a large number of nofollow links.
That's because the more inbound links a site attracts, the more nofollow links it will accumulate among them.
Nofollow links are also very important in creating a diverse link profile.
The importance of keywords

Keywords are central to the way people search.
They're the building blocks of how people string words and phrases together to form a search query.
When search engines index content, they organize it in their keyword-based databases.
They do this so they can quickly retrieve the relevant information when a search query is made.
So if you want to rank for the keyword "beginners guide to SEO", you need to include the words "beginners guide to SEO" on the page.
See what I did there?
Keywords rule the way we search and communicate with search engines.
We enter keywords into search bars so that search engines can retrieve the relevant web pages as quickly as possible.
Minor adjustments to keywords, like re-phrasing or reordering words, all affect what we see in the SERPs.
Search engines are smart enough to assess whether a page contains keywords that might be relevant to a query.
The best thing you can do to improve rankings is to include your keyword in the title tag, meta description, URL, and within the text itself.
A general rule of thumb: the more specific a keyword is, the more likely you are to rank for it.
Ever since search engines have been around, SEOs have abused keywords, cramming them into web pages in order to trick the search engines into ranking the site higher.
This is called keyword stuffing.
My advice is to use keywords naturally in the text, and to get into the mindset of 'less is more'.
I personally use keywords strategically, putting them in the places that actually help with rankings.
If you want to read more about keyword research, read my ultimate guide to keyword research.
On-Page Optimization
One of the most important aspects of SEO is on-page optimization.
Having a well optimized page will help your website rank better in the SERPs.
If you own a website, make sure to follow these steps.
Use the keyword:
- In the title tag.
- At least once at the beginning of the text.
- Use the keyword at least 3-5 times in the body of the text. To give the page an extra boost, use LSI (latent semantic indexing) keywords, which help search engines get a better understanding of what the content is about.
- Include the keyword at least once in the ALT tag of an image; this can help with ranking in the image search result.
- Add the keyword in the URL.
- Add the keyword in the meta description. Including the keyword in the meta description isn’t a ranking factor, but it helps improve click-through rate.
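The placements above can be sketched in HTML like this, for a page targeting "beginners guide to SEO" (the domain, file names, and text are placeholders):

```html
<!-- URL: https://example.com/beginners-guide-to-seo/ -->
<head>
  <!-- Keyword at the start of the title tag -->
  <title>Beginners Guide to SEO: Rank Higher in Search</title>
  <!-- Keyword in the meta description to improve click-through rate -->
  <meta name="description" content="A beginners guide to SEO that walks you through ranking your first website.">
</head>
<body>
  <h1>Beginners Guide to SEO</h1>
  <!-- Keyword once near the beginning of the text -->
  <p>This beginners guide to SEO covers everything you need to start ranking...</p>
  <!-- Keyword in the image ALT text -->
  <img src="seo-guide.jpg" alt="Beginners guide to SEO cover image">
</body>
```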
How to optimize your title tags

The purpose of a title tag is to describe, for both search engines and users, what the content is about.
Follow these steps to optimize your title tags:
Watch the length of your title tags: Search engines only show the first 65-75 characters of a title tag in the SERPs; anything after that is cut off. Make sure your keyword is at the beginning of the title and that the title is the right length. If you are targeting multiple keywords, however, a longer title tag can be worth considering.
Placement of keywords: The closer a keyword is to the beginning of the title, the quicker a search engine will recognize what the page is about. The same goes for the meta description.
Consider click bait: It's all the rage and everyone does it, because it works. Try to trigger an emotional response that gets visitors to click on your result. SEO is not only about optimization but also about user experience.
Meta tags

Meta tags describe what the web page is about, and some of them give instructions to search engines.
Below are some meta tags that you should become aware of:
Index/noindex: Tells search engines whether or not to add the page to their index.
Follow/nofollow: Tells search engines whether or not to follow the links on the page.
Noarchive: Tells search engines not to store a cached copy of the page.
Nosnippet: Tells search engines not to display a snippet of the page in the SERPs.
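In HTML, these directives go in a robots meta tag in the page's head; for example:

```html
<!-- Placed in the page's <head>. This page won't be indexed, but its links will be followed: -->
<meta name="robots" content="noindex, follow">

<!-- Alternatively: index the page, but don't cache it or show a snippet -->
<meta name="robots" content="index, noarchive, nosnippet">
```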
The meta description tag gives a brief explanation of what the web page is about.
It’s located underneath the title tag, and is basically an advertisement for your web page.
The purpose is to get people to read the meta description, and click on your webpage.
Writing an accurate, descriptive meta description can drastically increase your click through rate, so mastering this is crucial.
Similar to the title tag, meta descriptions are truncated in the SERPs after a certain number of characters, around 150-160.
Don’t worry about meta keywords
Meta keywords had some value back in the day, but they carry no weight at all now, so don't waste time on them.
URLs

Think of a URL as the location of your web page, like an address.
URLs are very important to SEO. They can help with your rankings, especially if they contain your keyword.
They can be found in many locations:
- In the SERPs, under the title. Because they are displayed there, they can influence click-through rates.
- On an actual web page, in the address bar.
- URLs can also appear in the body of text, when referencing something. For example, why not check out this article by Tahir Miah.
How to create a proper URL structure
Be accurate: Put yourself in your readers' shoes and imagine what they would expect to see in a URL. Before writing the content, think about how the URL should look. Make it descriptive, but don't go overboard.
Keep it short: URLs affect user experience too, so keep them short and sweet. A short URL is easy to read in the address bar and easy to copy and paste wherever you need it, like in emails.
Make it simple: Avoid extra parameters, symbols, and numbers; they confuse visitors. Keep URLs simple and static, and use hyphens to separate words.
Be smart, slip in a keyword: If a page targets a specific keyword, try to include it in the URL. But don't stuff multiple keywords in there, as Google will treat that as spam.
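Putting those rules together (the domain and paths are placeholders):

```text
Hard to read, full of parameters:
  https://example.com/index.php?id=492&cat=7&ref=xyz

Short, descriptive, keyword included, hyphen-separated:
  https://example.com/beginners-guide-to-seo/
```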
Canonicalization and duplicate content: Canonicalization is the process of dealing with two or more duplicate web pages that live on different URLs. For example, search engines may be reaching your website via different domain names: Google may be crawling both https://serpjump.com/ and https://www.serpjump.com
Leaving this unresolved is terrible for SEO, as it sends the search engines mixed messages about which version of the content to show users.
Duplicate content: Duplicate content is bad for SEO in general, as search engines can devalue your website when they find it.
Stop confusing the search engine
The search engines ultimately want to give the users the best experience, by showing them results that will answer a query.
If you have duplicate content, search engines don't know which page to show the user, and they will guess which one is the original.
Used correctly, canonicalization makes sure each piece of content has one unique URL.
The simplest remedy is to 301-redirect the duplicate pages so they all point to a single web page.
You then have one strong page with consolidated popularity and relevancy, which is search engine friendly and able to rank higher in the search results.
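For the www/non-www example above, a 301 redirect can be sketched like this in an Apache .htaccess file (assuming mod_rewrite is enabled; swap in your own domain):

```text
RewriteEngine On
# Permanently redirect www.serpjump.com to serpjump.com
RewriteCond %{HTTP_HOST} ^www\.serpjump\.com$ [NC]
RewriteRule ^(.*)$ https://serpjump.com/$1 [R=301,L]
```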
How to use the canonical tag
Canonical tags are another way in which you can reduce duplicate content.
They allow you to point duplicate web pages at a single preferred URL.
The great thing about canonical tags is that you can even canonicalize a URL on one website and point it to a page on a completely different domain.
When using canonical tags, place the tag on the page that contains the duplicate content, and point it at the URL you want to rank.
If you hadn't noticed already, a canonical tag is very similar to a 301-redirect in terms of reducing duplicate content.
But canonical tags also differ from 301s: they don't redirect your visitors to another URL.
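On the duplicate page, the canonical tag goes in the head and points at the preferred URL; for example (the path is a placeholder):

```html
<head>
  <!-- This page is a duplicate; tell search engines which URL should rank -->
  <link rel="canonical" href="https://serpjump.com/beginners-guide-to-seo/">
</head>
```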
Rich snippets and structured data

Have you ever searched for the ingredients to a recipe?
I bet you saw the list of ingredients appear in the SERPs, didn’t you?
Well, that can be easily explained.
They're created with something called rich snippets, powered by structured data embedded in the web page.
Structured data lets website owners mark up their content in a format search engines can read; the engines can then present that data in the search results as rich snippets.
Using structured data isn't mandatory, but it can give you an extra edge in the SERPs.
You can use structured markup for things like:
- People.
- Products and reviews.
- Events.
- Recipes.
If you want an extra advantage over your competition, add structured data to your content wherever possible.
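As a sketch, recipe markup can be added with a JSON-LD script in the page's HTML (the recipe name and ingredients are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Pancakes",
  "recipeIngredient": [
    "2 cups flour",
    "2 eggs",
    "1 cup milk"
  ]
}
</script>
```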
Well, that's chapter 4 done.
Now that you know how to create a search engine friendly website, read the next chapter in the beginner's guide to SEO: how to conduct keyword research.
If you liked this post, share it on social media and subscribe to our newsletter for more exclusive content straight to your inbox.