What is SEO?
Search Engine Optimization is the process of increasing the number of visitors to a site by achieving a high position in the search results when relevant keywords are searched for. It is common knowledge that people rarely look past the second or third page of search results, so optimally you'll want a first page position, or even the very first result. To accomplish this, you'll need to optimize and code accordingly.
Choose the Right Keywords
Choosing the right keywords can be painless or extremely tricky depending upon the scenario. You'll want to avoid generic ones, since it is going to be extremely difficult to optimize for them. Try to pick keywords that are just as specific as they need to be. For example, if you are a freelancer based in Melbourne, your optimal keywords would be freelancer Melbourne or web development services Melbourne. Going for the generic freelancer or web development keywords isn't going to do you any good.
Research your keywords. Know which ones are probably going to be searched for most and go from there.
Focus on the Content
Content always comes first. It doesn't matter if you perform some dark voodoo to get your site the top place in the results. You'll still need solid content to back that up since the visitors are going to be leaving pretty quickly if they don't find what they are looking for.
Having good, relevant content is the most important aspect of SEO. Your content needs to be suitably useful for the people who you'd want to find your site. You need the content to make sense to the reader. The content needs to appeal to people and make them come back for more.
Having original content is very important. Don't expect to just copy-paste some text from another site, throw in some keywords and call it a day. You need lots and lots of original content with the keywords worked into the content itself. If people searching for jQuery come to your page, they expect to find something related to jQuery there. Throw in relevant keywords within the content of the page, but don't just spam them sporadically. Your keywords need to be in the appropriate positions and at an appropriate density. Throw too many keywords into the content and you are going to be flagged for spam.
Just as important as having original content is having regularly updated content. Fresh content will bring in people and bots alike, which in turn will get your site indexed more frequently, which will in turn keep your pages fresh in the search results. But don't update just for the sake of updating: bots have little incentive to come back if all the updates you perform are only marginally incremental. Just try to have something fresh for the visitors and you should be alright.
Get a Proper Domain Name
This is a hard part to get right. If at all possible, get a domain name with the keywords in the domain name itself. www.webdevelopmentaustralia.com is going to carry a lot more weight with search engines than www.somecompany.com. Obtaining a domain with the proper keywords can be difficult, though.
Domains with keywords as part of the name do look ugly, but keep in mind that keywords in the domain name carry a great deal of weight.
Create Pretty URLs
Using a URL scheme where parameters are passed as a query string through the URL makes it difficult for search engine spiders to look through your site. More importantly, when you pass the session ID as part of the URL you are essentially creating a separate URL for each session with almost the same content. This is probably going to get you penalized for duplicate content; we'll talk about that later on.
Human readable, bot parsable URLs are generally preferred over traditional parameter filled URLs.
www.somecompany.com/games/2009/callofduty6 is generally preferred to
www.somecompany.com/index.php?cat=game&year=2009&name=callofduty6. When crafting pretty URLs, try to avoid days, months and years:
www.somecompany.com/blog/seobasics is preferred to
www.somecompany.com/blog/2009/09/09/seobasics
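On Apache, for example, a rewrite rule can expose the pretty URL to visitors and bots while the server still receives the query string internally. A minimal sketch, where the file name and parameters are illustrative:

```apache
# .htaccess: map /games/2009/callofduty6 to the real script behind it
RewriteEngine On
RewriteRule ^games/([0-9]{4})/([a-z0-9]+)/?$ index.php?cat=game&year=$1&name=$2 [L]
```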
Dish out Relevant Page Titles
The text within the title tag (the text displayed in your browser's title bar) is amongst the most important elements of a page. Actually, I'd venture so far as to say it's the most important part of your page after the actual content itself.
Make sure the title is unique for each page and contains relevant keywords. With regards to the title's structure itself, Page Title -> Site Name is vastly preferred to Site Name -> Page Title. There is no reason to make the title as terse as possible, but on the other hand don't make it too long: 60 characters is the accepted limit.
Also whilst using keywords in your title text, please don't try to spin it too much. If the search bot sees too many keywords, you are going to be flagged for spam. Remember, the title text is what appears on the search engine result page. You need to convey as much information as possible without sounding too spammy.
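Put together, a title following these guidelines might look something like this (the page and site names are made up):

```html
<head>
  <!-- Page title first, site name last, under 60 characters -->
  <title>SEO Basics for Web Developers | Some Company</title>
</head>
```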
Tweak the Meta Elements
The meta elements used to matter eons ago, when search engine bots were less sophisticated and relied on the meta description and keywords elements to help them. Once spammers took advantage of this, search bots started giving less importance to meta elements.
Having said that, it doesn't hurt to include the meta description element. This is the text used in the description of your site. Try to limit yourself to 200 characters; keep it simple, grammatically correct, and include relevant keywords. Keep the descriptions unique for each page.
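As a rough example (the wording is purely illustrative), the per-page description sits in the head like so:

```html
<!-- Unique per page, under 200 characters, relevant keywords included -->
<meta name="description" content="A beginner's guide to SEO: choosing keywords, writing content, crafting URLs and structuring pages.">
```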
Optimize the Page Structure
Layouts vary, and the positions of your sidebar or navigation vary too. But with respect to the core markup itself, it's best to put your main content as close to the body tag as possible. If other elements have to appear before the content visually, use CSS to position them there instead of moving the markup itself before the content.
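One common way to do this, sketched below with made-up ids, is to let the content come first in the markup and float the sidebar into place with CSS:

```html
<style>
  /* The sidebar appears on the left visually, but the content
     stays first in the markup for the bots */
  #content { float: right; width: 70%; }
  #sidebar { float: left;  width: 25%; }
</style>

<div id="content">Main content, as close to the body tag as possible.</div>
<div id="sidebar">Navigation and other extras, after the content in the markup.</div>
```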
Footers are wonderful places to link to other content on your site. Don't just ignore your footer. Place links to recent posts or popular posts in the footer. Having said that, try to not make it look like a link farm.
Use Appropriate Tags
Use the appropriate tags when developing a site. The heading tags are widely underused; people instead use generic div tags to encapsulate important information. This is wrong. Looking strictly at the markup alone, the heading tags let us see the informational hierarchy of the page, and this applies to the bots too. Use h1 for the title of the post, h2 for each section's heading and so on.
If you are including some code, use the pre tag. If you think some information is important, feel free to make it bold. Bots tend to place value on bolded text, just as our eyes are immediately drawn to it. As always, use it sparingly: you don't want to be flagged for spam.
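A skeleton of such a page, using this article's own sections as filler text, could look like:

```html
<h1>SEO Basics</h1>

<h2>Choose the Right Keywords</h2>
<p>Pick keywords that are <strong>as specific as they need to be</strong>.</p>

<h2>Use Appropriate Tags</h2>
<pre>
&lt;h1&gt; for the post title, &lt;h2&gt; for each section's heading
</pre>
```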
Craft Proper Links
When creating links, stay away from generic read more text. It's not very SEO friendly. Try to include part of the child page's title in the anchor text itself. This is not as hard as it sounds. For example, instead of using read more, use read more about SEO. It doesn't take much time to change but yields a lot of SEO benefits.
While linking to pages on your site, try to make the anchor text as focused as possible. Portfolio is a better candidate than What I've Done. The latter sounds catchier, but the former makes for better SEO.
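For instance (the URLs are made up):

```html
<!-- Generic anchor text tells the bot nothing about the target -->
<a href="/blog/seobasics">read more</a>

<!-- Focused anchor text carries the child page's keywords -->
<a href="/blog/seobasics">read more about SEO basics</a>
```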
Link Internally
Don't be scared to interlink the pages in your site. If the number of pages is small, putting it all up on the navigation bar is the way to go. If yours is a big site with a ton of pages, just put all the main category pages on the navigation bar. One way or the other, make sure your pages can be found through links on your site.
Thinking outside the box, you could just as easily include a popular post section on each page. This way you get the interlinking SEO needs and at the same time your visitors can get to see some of the popular posts on your site. It's a win-win situation.
Make your site Accessible
Remember, search engines are meant to bring people to your site, which means your site is primarily for human consumption. Design with humans in mind.
Include alt attributes for all images on your site. This is not only good practice but also a necessity if you want valid markup. If appropriate, include relevant keywords in the alt text. Remember, search bots can't really look at a picture and decide whether it's relevant or not; appropriate keywords let them make that decision. As always, don't go overboard on the text. Keep it simple and to the point.
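For example (the file name and wording are illustrative):

```html
<!-- Without alt text the bot sees nothing, and the markup won't validate -->
<img src="melbourne-office.jpg">

<!-- Short, relevant, to the point -->
<img src="melbourne-office.jpg" alt="Our web development office in Melbourne">
```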
Please don't hide your content behind obnoxious JavaScript or Flash. Spiders can't go through those to get to your content. And without content, the entire point of your site fails. Miserably. Avoid this unless you absolutely have to.
Avoid Duplicate Content
Google is very strict about duplicate content and severely penalizes sites that carry it, regardless of whether the content appears on different domains. If the exact same content appears on different pages, the page indexed last is going to be penalized.
This is mostly common sense: don't have the same content on each page. The footer text can be repeated with no penalties but not if your footer text is big enough to qualify as an article.
Also, your site may dish out alternative print-capable pages, which might be seen by the search engine as duplicate content. In this case, use robots.txt to disallow indexing of these pages.
Use robots.txt
Create a robots.txt file to allow or disallow spiders from certain parts of your site. Just create a file named robots.txt, place it at the root of your web site, and all co-operating spiders will respect the rules you've mentioned in the file. You can do everything from disallowing all bots from accessing a specific folder to disallowing the bots of a specific search engine.
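A minimal robots.txt covering both cases might look like this (the folder and bot names are illustrative):

```
# Keep all bots out of the print-friendly pages
User-agent: *
Disallow: /print/

# Shut one specific crawler out of the whole site
User-agent: BadBot
Disallow: /
```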
Create a Site Map
A site map lets the search engine know about pages it might not otherwise discover by spidering through your site normally. Ideally, you should create a normal HTML site map for your users and an XML site map for the search bots. If at all possible, provide both.
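An XML site map for the bots is just a list of URLs in the sitemaps.org format. A one-entry sketch (the URL and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.somecompany.com/blog/seobasics</loc>
    <lastmod>2009-09-09</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```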
Avoid Frames
I can't say this enough: frames are bad, both from a web developer's perspective and an SEO perspective. Content inside frames is virtually invisible to search engines.
More disturbingly, even if one frame of the page gets indexed and returned as a result, that result will take the visitor to just that frame, without the supporting frames of the parent document around it. Frames cause undue confusion for people and virtually stop spiders from crawling through your site. Unless you absolutely have to, don't use frames.
Reduce Code Bloat
And by this I mean two things:
Move your JavaScript and CSS to their own separate files. Spiders have no business with them and it is best practice to remove them from the core markup. Create separate files and include them later.
No presentational markup. This is not only SEO friendly but also best practice. Your HTML markup is no place to define how the content should look and similarly the bots have no reason to know how your site is programmed to look. Format the document to your heart's content in your CSS and leave the markup pristine and clean.
Avoid using a Flash Only Navigation
This is common sense, but a lot of designers and developers tend to overlook it. Bots can't crawl through Flash-based content, and if the only navigation is Flash-based, the bot has nothing to crawl through.
If your entire site is Flash-based, it makes sense to create a text-only version for spiders and bots to crawl through and find your content. It'll take extra time to create, but without a text version to fall back on, your site will be virtually invisible to search engines.
Use a Common Domain Naming Scheme
Decide on a common naming scheme and stick to it. Personally I prefer www.somename.com, but others may like plain somename.com. Whichever format you choose, use URLs of that format while linking to other pages on your site.
Also decide whether trailing slashes are required or not. Search engines consider www.somename.com/seo and www.somename.com/seo/ to be different URLs, and there is a possibility you are going to be penalized for duplicate content. To get around this, modify your .htaccess file to redirect to the format you like with a 301 redirect. This tells the bot that the page has been moved permanently.
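On Apache, the two redirects could be sketched like this in .htaccess (assuming you settled on www and no trailing slash; the domain is illustrative):

```apache
RewriteEngine On

# somename.com -> www.somename.com, permanently
RewriteCond %{HTTP_HOST} ^somename\.com$ [NC]
RewriteRule ^(.*)$ http://www.somename.com/$1 [R=301,L]

# Strip the trailing slash, except for real directories
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```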
Submit your Site
If your site is newly hatched and hasn't been indexed yet, it's a good idea to get the ball rolling by submitting it to search engines and inspiration galleries. This not only lets the search engines get to your site early but also brings in a ton of new traffic and back links.
Do not resort to link submitters unless you absolutely trust them. A lot of these submit your links to a number of link farms, an activity which might get you penalized. Just stick to the big search engines and galleries.
Check for Broken Links
Nothing stops spiders dead in their tracks quicker than broken links, especially on the home page. Check thoroughly for broken links to ensure the bots have something to start crawling through your site.
Create a proper 404 page in case the search engine leads the visitor to an old URL. Include appropriate links in the error page.
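On Apache, pointing requests for missing pages at your custom error page is a one-liner (the path is illustrative):

```apache
# Serve /404.html for any page that no longer exists
ErrorDocument 404 /404.html
```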
Get Linked by Peer Sites
This is the massive step that is going to take a lot of time to get right. Ideally, you'd want a lot of sites linking to your site and your posts. Each link to your site is considered a vote for your site by the linking site. Getting inbound links from sites catering to the same user base is extremely important, since the current way of ranking assumes that if a lot of sites link back to you, your site must contain relevant information.
Unfortunately, this is a long, arduous and never-ending task, and only one thing can assure it: good content. Provide good content and sites will automatically start linking to it. The more sites link to you, the higher your rank is going to be.
Do not resort to illicit means of getting back links. This includes link farms and the like; doing anything along these lines is going to get you kicked out pretty quickly. Accepted means of getting back links include reciprocal linking, where a site places a link to another site in exchange for that site linking back to the original site.
The way I prefer is to write for Net Tuts. Each article I write nets me a back link and Net Tuts being as large as it is, these contribute heavily to my rankings. Plus it brings in a ton of interested new visitors. :)
Use Appropriate Tools
Tools like Google Analytics help you analyze and track a range of data, including where your traffic comes from, which pages visitors look at, how much time they spend on each page, how many pages they visit and so on. Use this data to fine tune your site.
Don't forget Google Webmaster Tools. It lets you look at the search queries which bring visitors to your page, whether the spider encountered any errors while trying to crawl through your site, which sites link to you and more. It's invaluable when you are trying to optimize.
Avoid Black Hat Techniques
I can't say this enough: don't try to cheat. Sooner or later, most probably sooner than you think, you are going to be caught and kicked out with no chance of getting listed again. This covers everything from legit-sounding techniques like link farms and cross linking to keyword stuffing and keyword dilution.
Just don't do it.
Wait for the Results
At this point, you've hopefully done everything right. The only thing left to do is sit back, generate some quality content and wait for the rankings to increase. Be patient; this doesn't happen overnight, but it definitely happens once you have the basics nailed down.