What is SEO? [infographic]
What is SEO?
Search engine optimization (SEO) is the process of improving a website’s ranking on the search engine results page (SERP) and its visibility on search engines such as Google and Bing.
As opposed to search engine marketing (SEM), which uses paid services (text ads and banners) to improve visibility, SEO improves it through organic search, without paying any fees to search engines.
If you’re not sure about some terminology, use this page.
SEO services started in the mid-’90s, when the first search engines began to gain popularity. All you could do back then was submit your website to a bunch of those search engines and wait for results to appear. The process is now far more thorough and complicated than it was more than 10 years ago; it involves many different steps and requires a lot of patience, depending on your product, audience, and competition.
Why do I need SEO?
10 years ago, when we built a website, we used to say, “if you build it, they will come”. Now, with more than 180 million websites on the internet, who’s to say that this is still the case?
No matter what your website is about, what your product is, or who your audience is, you will need to make your website stand out from the huge amount of competition online. You will therefore need to perform SEO to make sure that when someone searches for your product, you appear on the first few pages of the SERP, if not the very first.
SEO is not necessarily the best option for marketing a website, as it takes time to see actual results, and search engine algorithms can change at any time; but it’s definitely a step that every website needs to undergo before being launched, and even after that, in the long run.
How is SEO performed?
There are various methods that different industry professionals use. While some are very effective in the long run, other methods promise fast results in a short amount of time. These methods (sometimes referred to as black hat SEO) will eventually be detected by search engines, and you’ll end up losing most of your work because those search engines will ban the website. It’s very important to keep in mind that you can’t easily fool search engines; even if you could, it wouldn’t be for long, and the results wouldn’t be satisfying.
To understand how to perform effective SEO, you’ll need to understand how search engines actually index pages, what information these engines look for, and how they relate it to your website’s content.
Search engines send out “crawlers” (also called bots or spiders), which are small applications that browse the web, collecting information about each and every website, image, video, article, or file there is.
These bots collect metadata, which is basically data about data. For example, the information a web crawler collects from a standard HTML page ranges from basic details like the URL or the title tag of the page, to more complicated data like keywords and their frequency, how relevant they are to the contents of your page, and the links coming in and out of the page.
It might seem overwhelming, but these are not the only factors that contribute to your page rank; there are many others as well, like the number of pages that link to yours and (more importantly) the rank of those pages and how relevant they are to your page’s contents.
Here’s an infographic that I prepared to help explain the process of SEO:
For good, long-term, white hat SEO, you will need to follow a few steps according to the main guidelines of major search engines such as Google, Yahoo!, and Bing.
To summarize those steps:
- Spend enough time researching keywords:
This is the vital first step of the SEO process. Make sure you are using the right keywords to represent your product, service, region, and any other information that relates to your website; using the wrong keywords would waste the whole SEO effort. Try searching for those keywords on your target search engine and take a look at the SERP: if this is where you want to get listed, then you’re on the right track.
- Use proper HTML coding that follows SEO guidelines:
- Make use of the <title> tag and meta description: make sure they have distinct, relevant values for each page on your website.
- Use heading tags properly: make sure your heading tags (H1, H2, etc.) contain the relevant keywords and are properly placed on your pages according to the content.
- Use better site structure:
Make sure you use file and folder names that contain the right keywords for your website. For example, a page with the URL websitename.com/my_product_category/my_product_name.html is much better than a page with the URL websitename.com/353za.html
- Make use of the Alt description for the <img> tag:
Again, you need to use the proper keywords and phrases in this property to make sure images are indexed properly.
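The markup guidelines above can be sketched in a single hypothetical product page (the site name, product, keywords, and file names here are illustrative, not a prescription):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Unique, keyword-relevant title and meta description on every page -->
  <title>Handmade Leather Wallets | Example Leather Goods</title>
  <meta name="description" content="Handmade full-grain leather wallets, crafted in small batches.">
</head>
<body>
  <!-- One H1 matching the page's main topic; H2s for subsections -->
  <h1>Handmade Leather Wallets</h1>
  <h2>Why choose full-grain leather?</h2>
  <!-- Descriptive alt text so the image is indexed properly -->
  <img src="/products/brown-leather-wallet.jpg"
       alt="Brown full-grain leather bifold wallet">
</body>
</html>
```

Note how the keyword phrase appears consistently in the title, description, heading, file path, and alt text, all describing the same content.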
Each web server contains a file called robots.txt; this is essentially a note that your website leaves for web crawlers to read when they first arrive. You can block bots from crawling specific folders on your server that contain files irrelevant to your website’s contents. For example, if you’re using a folder on your site to share files with others, you should disallow crawlers from indexing that folder; some people also disallow image folders or downloadable-file folders. It really depends on the contents of your website and whether or not you think they are relevant.
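As a sketch, a minimal robots.txt that hides a file-sharing folder and a downloads folder (the folder names and domain are examples) would look like this:

```
User-agent: *
Disallow: /shared-files/
Disallow: /downloads/

Sitemap: http://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers; you could instead name a specific bot to give it its own rules.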
One of the (often overlooked) attributes of the anchor <a> tag is rel="nofollow". This tells search engines not to follow the link or count it when collecting metadata about your site, because its content is unrelated to yours. For example, if you run a web design company, you will be linking to a whole bunch of other websites with various types of content; you can tell crawlers not to follow those links by using the nofollow attribute.
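In markup, the attribute looks like this (the linked site is a placeholder):

```html
<!-- Crawlers will not follow this link or associate its content with your site -->
<a href="http://client-example.com" rel="nofollow">A client’s website</a>
```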
Have an XML sitemap ready in the root folder of your website with the name sitemap.xml. This helps crawlers determine your website’s structure and the linkage between different pages, and ensures maximum, proper indexing of all pages. You can also submit that sitemap to the webmaster tools services provided by search engines, such as Google Webmaster Tools and Bing Webmaster Tools.
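A minimal sitemap.xml, following the sitemaps.org protocol, lists each URL with optional hints about how often it changes and how important it is (the URLs and dates below are examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/my_product_category/my_product_name.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

The `changefreq` and `priority` elements are hints only; crawlers may ignore them, but `loc` entries are what get your pages discovered.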
Links on other websites with similar content may improve your rank with search engines, as long as those links are permanent. Networking is always a good path to success.
Make sure that you have a script or a server-side setting to handle your 404 (page not found) errors.
When web crawlers try to find pages listed in the sitemap or robots.txt and can’t find them, the result can hurt your ranking and even get your pages banned.
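As one sketch of such a server-side setting, on an Apache server a single line in the site’s configuration or .htaccess file points 404 errors at a custom page (the page path here is an example):

```apache
# Serve a friendly custom page whenever a URL is not found
ErrorDocument 404 /404.html
```

Other servers have equivalents; the point is that missing URLs return a helpful page rather than a bare error.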
While SEO might seem like a lengthy, time-consuming process, rest assured it’s definitely worth the while. By performing good SEO before you launch your website, you’ll have a better shot at making it to the first page of search results. Keep in mind that this will not happen overnight; many people will offer you quick results, but that would eventually end with your site being banned or with unsatisfying results. Take it slowly, be patient, and seek help when needed.
If you still need help with your SEO feel free to contact me at any time!