Sunday, September 9, 2007

A Day in the Life of a Search Engine Friendly Web Page

In a previous article (which I cannot link to here) I wrote that a search engine friendly website is not the same as a search engine optimized website. I'm using that as a jumping-off point for this article on building a search engine friendly website. The purpose is simply to highlight a few of the aspects that make up a search engine friendly (not necessarily optimized) website.

The Domain Name

For this series I figured we'd start at the top: the domain name. Many argue about the value of using keywords in the domain name and whether that will make any difference at all in the ranking algorithms. My opinion is if it does make a difference, it's not much. But I also believe that the full process of optimization is largely about doing a whole lot of "not much".

The smartest thing to do is make sure your business name uses your keywords. And no, I don't mean you should name your business No Doc Home Mortgage Loans Company. But if you sell mortgages it makes sense to use mortgages in your name, being sure you can secure the domain name as well.

Since there is really not a lot more you can do to "optimize" your domain or even make it more search engine friendly, I'll leave you with links to a series of articles I've written previously on Securing a Marketing Rich Domain Name. That'll give you some food for thought.

Title and Meta Tags

To reiterate: I'm making a distinction here between "search engine optimized" and "search engine friendly." They are two very different things. Making your website search engine friendly is largely a one-time task, while optimization of that website is an ongoing process. But in order to be effective with the optimization, your site must first be search engine friendly.

In this installment I'll focus on the title tag and the meta tags, most specifically the meta description. This can also include the meta keywords tag, though that one is largely irrelevant.

I can pretty much sum up the search friendliness of the title tag and meta description as two things:

1. They are present

2. They are unique

You might be surprised to find how often number one is not done, even by experienced web developers. About a year ago we signed a client for SEO only to find that their programmers had not created a way to generate a unique title for each page. But it gets worse. They hadn't even programmed a way to add any text whatsoever into the title tag. Their code looked like this:
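(The original snippet didn't survive here, but from the description it amounted to a hard-coded, empty title on every page; a reconstruction along these lines:)

```html
<head>
  <!-- The same empty, hard-coded title on every page, with no way to edit it -->
  <title></title>
</head>
```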

One of the first changes we requested was to add a title to their pages, to which they told us that such functionality was not available and it would take a few months before the programmers could add it. Can we say, "You're fired"?

They chose not to fire their programmers so we fired them!

Many ecommerce systems we see use a global title tag across all pages. Well, step one is complete, the title is present. But now each of those titles needs to be unique for each page in order to accurately represent the content of the individual page each is on.

When working with a database system, the smartest (read: most search engine friendly) thing to do is not to just make the title and meta tags editable for each page, but to allow for unique default verbiage to be automatically generated for the pages until keyword optimized text is created.
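As a sketch of that idea, a template can assemble a passable default title from fields already in the database, then hand the reins to a human later (the page and store names here are made up for illustration):

```html
<!-- Auto-generated default, built from the product and category fields -->
<title>Garden Hose Reel | Lawn &amp; Garden | ExampleStore.com</title>

<!-- Later replaced with keyword-optimized text written for this page -->
<title>Heavy Duty Garden Hose Reels | Free Shipping | ExampleStore.com</title>
```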

When looking at a potential client's website the other day, we were concerned to see that the title tags of each page looked to be typical default text. We contacted the developers to find out if these were editable and, sure enough, they gave us the answer we were looking for: the tags are editable for each page, but default text is in place until those fields are edited by the client. Perfect!

If you don't use an e-commerce system for your website, then you simply want to go add unique title tags, description tags, and possibly keyword tags to each page. Don't worry so much about using keywords; that'll be your SEO's job. For now, just make these elements search engine friendly by getting them in place.
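A minimal sketch of a search engine friendly head section, with each element present and unique to its page (the business and wording are hypothetical):

```html
<head>
  <!-- Unique per page: describes this page, not the whole site -->
  <title>Refinance Rates and Options | Example Mortgage Co.</title>
  <meta name="description" content="Compare current refinance rates and options from Example Mortgage Co.">
  <!-- Optional; largely ignored by the engines -->
  <meta name="keywords" content="refinance, mortgage rates, home loans">
</head>
```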

Code Bloat

Code bloat is one of my minor amusements when I'm evaluating websites. I enjoy looking at the source code of a web page and then scrolling down to see how long the code for the page is and mentally compare it to how long it should be. What I really enjoy most is when I see a really well designed page with very little code. That makes me happy, but then I'm pretty easily amused anyway!

I'm not a coder myself, so if you ask me to develop a web page I'm going to use a program such as Dreamweaver to build it. But just because I can't develop code on my own, doesn't mean I don't know how to strip down the unnecessary junk from HTML in order to produce a cleaner coded page.

The problem with programs such as Dreamweaver is that they don't always create the best or most streamlined coding structure. Unfortunately, many professional designers don't even know enough about code to go in and fix what these development programs create. To be fair to Dreamweaver, it isn't even close to the biggest offender of code bloat. That title, from my experience, goes to any product from Microsoft. Especially those programs with a "turn this into a web page" feature. (If you ever want to see some of the worst code imaginable, create a "web page" using Microsoft Word.)

What does code bloat have to do with search engine friendliness?

Reducing code bloat not only cuts down on page download time, but it also makes things easier on the search engines. When spidering a page, the search engines pull the entire code (or the first 100kb) of the page. Only later is that information parsed. By reducing download time, the spiders, which are already fast, can burn through many more pages more quickly, quite possibly indexing more pages than they would otherwise.

Once the pages have been spidered, the reduced code then makes it easier for the engine to parse the data. While engines have gotten much better about getting through junk code, reducing the amount of code they have to sort through will only streamline their processes and potentially give you an additional, albeit insignificant, advantage. Of course, we can argue about this all day, but the powers behind the search engines have stated numerous times over the years that anything site owners can do to make the spider's job easier, the better. Take that however you want.

There are a number of things you can do to reduce the code bloat of your website.


CSS

CSS has many benefits for developers. For this article the most relevant one is that CSS greatly reduces the amount of code on your page. This is especially true if you use external CSS files, but we'll get to that in a bit. First and foremost, however, is that CSS can be used to eliminate duplicate on-page styles from the code. I'll provide links to some how-to CSS references, but for now, just know that the amount of code that can be replaced by using CSS instead of inline presentational tags is pretty significant.

One of the other great things about CSS is that none of it actually has to be in the page code itself, but can be called from an external CSS file. The web browser simply has to download the file once and then that CSS document will apply to every page on the website (assuming only one CSS document is used.)
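A before-and-after sketch of what that replacement looks like (the class name and file name are made up for illustration):

```html
<!-- Before: presentational tags repeated around every heading on every page -->
<font face="Arial" size="4" color="#333333"><b>Our Services</b></font>

<!-- After: one rule in an external stylesheet, downloaded once and cached -->
<link rel="stylesheet" type="text/css" href="/styles.css">
<h2 class="subhead">Our Services</h2>

<!-- In styles.css: .subhead { font: bold 18px Arial, sans-serif; color: #333333; } -->
```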


JavaScript

JavaScript itself is not a code saver, but, just as with your CSS, you can move the JavaScript into an external file. That single JavaScript file will then be used on every page that requires it, without any additional download. Moving the JavaScript off the page reduces code length and page download time significantly.
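A sketch of the same move for JavaScript (the function and file names are hypothetical):

```html
<!-- Before: the whole script repeated inline on every page -->
<script type="text/javascript">
  function toggleMenu(id) {
    var el = document.getElementById(id);
    el.style.display = (el.style.display == 'none') ? 'block' : 'none';
  }
</script>

<!-- After: one external file, cached by the browser across all pages -->
<script type="text/javascript" src="/scripts/menu.js"></script>
```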

There are a lot of other ways to reduce code bloat and improve page download time. Some of these include compressing images, using text instead of images, removing nested tables, etc. I don't need to go into all of these, just so long as you understand that bloated code simply isn't necessary and the benefits of reducing and eliminating garbage code are worthwhile.


Heading Tags

Some SEOs will argue about whether using keywords in Hx tags actually helps your search engine rankings. I'm going to bypass that argument because there is an altogether different reason for using proper heading tags. Simply put, they help the search engines understand the relative importance of different textual areas of the page.

Many sites are content with just using bold or a slightly larger font for their paragraph headings. But that tells the search engine very little other than that a particular line is of slightly more relevance. After all, any text can be bolded, colored differently, or made bigger in order to create various inflections of tone, improve page scannability, or call out certain helpful points. But ultimately, these things carry little weight in the overall scheme of things.
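The difference, in markup (borrowing the mortgage example from earlier; the wording is illustrative):

```html
<!-- Bold text: the engine sees only slight emphasis -->
<b>No Doc Home Loans</b>

<!-- Proper headings: the engine sees a hierarchy of topics on the page -->
<h1>Home Mortgage Loans</h1>
<h2>No Doc Home Loans</h2>
<p>Details about this loan type go here.</p>
```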
