
Onsite SEO Checklist – Basic (Draft 1)

November 10, 2011 · Finn

An onsite SEO checklist is an evolving document because search engines are constantly updating their algorithms. Trying to keep all the factors in one place can be daunting, but it’s better than trying to keep them all in your head.

I use this checklist myself as much as I leave it out there for others to learn from. That’s why I keep it on the website. Some folks get worried about giving away all they know. I don’t. It’s one thing to know the stuff; it’s another to know how to use it.

I’ve loosely prioritized them in order of what I consider important. Each site, market, industry & niche has different priorities, so adjust the order accordingly.

Please look it over. If you think I missed something, let me know. If you think the order is jacked, let me know.

Finn’s Onsite SEO Checklist

  • SEO Design – in industry terms, this can be a taboo subject, because some snake-oil SEO salesmen will tell you you need to redesign your website in an attempt at quick cash. In the end, though, SEO Design is still the most important aspect. Why? Because if the search engines can’t read your content, almost all other onsite SEO efforts will be for naught. As I pointed out in the Content portion of the Anatomy of a Search Engine section, search engines can only read static content: HTML text, image names and tags, .swf tags.

    Typical Website Languages search engines can read:

    • HTML
    • PHP
    • .aspx
    • .pdf (they can read the documents)
    • .js (JavaScript – depending on the format)

    Programming Languages search engines can Not read:

    Quick tricks to see if a Search Engine can see your website:

    1. View Source: with your mouse, right-click just off to the side of the website content to pull up the mouse menu. Towards the bottom (in IE9) there’s a selection, “View source” (“View page source” in Firefox). Highlight and left-click that selection. This will show you the source code of the web page. If you can see your website copy and image names in the source, so can the search engine bots. This typically means you’re off to a good start (unless there is the rare “noindex” on the page or a block in your robots.txt file, but more on that later).
    2. Copy and search for a snippet of your content – highlight and copy a section of the website text (32 characters or less) and paste it into a Google search. If your web page appears, obviously the search engines found it. Also, check whether that exact phrase comes up elsewhere. If you wrote that bit of content originally and other pages show the exact phrase, then either A) great minds think alike or, more likely, B) someone is scraping (copying and re-using) your content. Get ’em!

HTML Design – Most sites and CMSes display pages in HTML format. HTML stands for Hyper Text Markup Language. To put it simply, it’s the code that tells your web browsers (and search engine bots) what goes where. This picture goes here, this content goes there…

HTML Design is a whole different animal. W3Schools is a great online tool that provides not only free training but also templates where you can put in your code, see how it looks, and then copy it to your website. Great cheater tricks I still use. Start on the side columns on the left and do each lesson one at a time. A little dedication and sweat equity and you’ll be pretty damned good at HTML coding and design. Even if all you want to learn is how to put Anchor Text in a blog post or something simple, go there! It’s what I use.

HTML Design Sections for Onsite SEO:
For onsite SEO purposes, you’ll want to make sure you know the following:

  • Metadata – Metadata sits in your website page’s source, underneath the visible content. Metadata gives web browsers many different directions and commands. For SEO, the three most important pieces are the page Title tag, the Meta Description and the Meta Keywords. Filling out these sections helps clarify page themes for the search engines and, 9 times out of 10, they are the sections the search engines use to display your page’s listing in their results.
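    As a sketch, here’s what those three tags look like in a page’s source – the titles and wording below are made-up placeholders, not a real page:

    ```html
    <head>
      <!-- All values here are hypothetical examples – swap in your own page's theme -->
      <title>Onsite SEO Checklist | Example SEO Blog</title>
      <meta name="description" content="A prioritized onsite SEO checklist covering metadata, headers, content and URL structure.">
      <meta name="keywords" content="onsite seo, seo checklist, html seo">
    </head>
    ```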
  • HTML Headers – Ever outlined notes from a college textbook? You have a title, subpoints, and little points underneath. Standard website content is formatted in much the same way. HTML uses headers to section pieces of content; search engines assume the H headers are there to section the content. W3Schools shows some great stuff on HTML Headers. Here are a couple of things to note:
    • Only use 1 H1 Header per page – it signifies the primary intent of the content.
    • Only use a couple H2 Headers per page
    • Don’t just use H Headers for styling – they have a purpose. Search Engine bots assume this purpose. You’ll make the bots go dumber than they already are if you start randomly styling bits of content with H Headers because you like the way they look.
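    To sketch the textbook-outline idea in code (the headings are made-up examples):

    ```html
    <h1>Onsite SEO Checklist</h1>        <!-- one H1: the primary intent of the page -->
    <h2>Metadata</h2>                    <!-- a couple of H2s to section the content -->
    <p>Copy about metadata...</p>
    <h2>HTML Headers</h2>
    <h3>Use One H1 Per Page</h3>         <!-- smaller points nest under H3s -->
    <p>Copy about headers...</p>
    ```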
  • Create Unique Content – Content must be original! If you are not the originator of the content, write new content using the keyword themes for which you want to rank. If you are the originator of the content, consult the Include canonical & source tags section (below).
  • Unique content on the home page: Adding a little description to the home page will help the search engines know what the website is about. Too many people miss it.
  • Privacy Policy / Terms of Use: Especially if you are taking contact information, running 3rd-party ads and/or e-commerce, having a Privacy Policy / Terms of Use on your site is essential. It complies with Google’s Quality Guidelines.
  • Postal Address / Phone Number: Once again, especially if commercial or data is transferred between the site and visitors, having verifiable contact information helps assure Search Engines that you are an official website and are not just spamming search engines. This becomes more important as the site gains more traffic.
  • Add keyword-rich content to existing pages – Rule of thumb: a minimum of 200 non-template words on a page (template: header / sidebar / footer content). Adding keyword-rich content increases a page’s potency. Best practice: get a minimum of 400 words. That seems to be most effective post-Panda.
  • What does keyword-rich mean? Marketing has shifted around keywords. Marketers know that if their products and services match up and rank well for the keywords and phrases users type into searches, then they get more relevant traffic and, thus, sales. Being keyword-rich means your website/pages are themed with these relevant keywords and phrases.
  • Where should the keyword go? Not only should the keyword be in your copy, but also in an image, in an H Header, in the metadata and in the URL.
  • “How many keywords?” I hate this question, especially in a post-Panda world. In the copy, assuming 400 words, you don’t need them as much as one might think. It’s more about the quality of the content. If your keyword is a theme, placed in the proper locations, then write good content. You should be okay. If that doesn’t help: we used to do 10 for every 30. Nowadays, you can get away with 4-5. But REMEMBER: use derivative keywords as well.
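    Pulling the last few points together, here’s a hypothetical page themed on the made-up keyword “blue widgets” – keyword in the URL, metadata, H1, image and copy:

    ```html
    <!-- Hypothetical page at http://www.example.com/blue-widgets/ -->
    <title>Blue Widgets | Example Store</title>
    <h1>Blue Widgets</h1>
    <img src="blue-widgets-display.jpg" alt="blue widgets on a display shelf">
    <p>Our blue widgets... (400+ words of quality copy, using the keyword and
       derivatives like "widget sets" a handful of times, not stuffed)</p>
    ```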
  • Page Speed – Page speed became huge for Google in 2009 and continues to be a major factor today. In short, supposedly you don’t get penalized for a slow page so much as you get rewarded for a fast one. Google Webmaster Tools will show you when Google last checked your site. Page-speed testing tools will break down the elements of your website and tell you which parts load slowly and why. When in doubt, believe Google – but note that the test can take close to 4 weeks to run again.
  • Blog – The most important SEO / Social Tool is the use of a blog on a website. Adding one on the subpages of the site, updating it with news, stories, features…and making sure it corresponds with your email marketing and social strategies will significantly help SEO, traffic and brand awareness.
  • URL structure – website URLs should be written with keywords separated by dashes or underscores.
  • URL Taxonomy – URL, Blog, CMS… – Search engines consider how high up the subfolder structure a page sits. What is a subfolder? It’s a term that describes where on a webserver specific content is held. In the old days, content would sit in a folder that resided in a folder, and that path would be represented in the URL – much like folder structures on a computer (unless you just use the search feature to find stuff). For checklist purposes, say you were using a blog / CMS and wanted to know where to put pages / blog posts. I would recommend the following:
  • Best – In my opinion, this is the optimal way to run the posts. SEOMoz has a “rule of 100s” which states that, according to their data, search engines only pay attention to the first 100 pages per subfolder level. Using this structure gives you 9,901 pages from which to play. This is my favorite strategy long term.
  • Best of what’s – If you are posting at a rate of 1-2 a week, this structure is becoming all too popular. I won’t call it the optimal structure because of Rule #1: When in Doubt, Plan Broad. If everyone starts putting all their pages up here, Google will switch the priority.
  • Better than nothin’ – .com/blog/postname – all the posts sit in the blog subfolder, but at least they’re in the top two levels.
  • Worst to
  • Internal Linking – This should get props higher up the priority list. Internally linking the pages of your website – to other pages with full-path URLs (including the http://www….) and with keyword-optimized anchor text – helps show search engines which pages are prioritized.
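    A sketch of the difference (example.com is a placeholder domain):

    ```html
    <!-- Full-path URL with keyword-optimized anchor text -->
    <a href="http://www.example.com/onsite-seo-checklist/">onsite SEO checklist</a>

    <!-- Weaker: relative link with generic anchor text -->
    <a href="/page2.html">click here</a>
    ```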
  • Social Media Integration – Use of sites like Twitter, Facebook, Digg, Tumblr & Reddit will increase viewership, interaction and provide additional links to your site.
  • Change the home page link to .com/ – One page, one URL. Having a home page linked as .com/index or .com/default.asp (yes, even W3Schools does it wrong) means your home page has more than one URL indexed. That means duplicate content. You want to avoid duplicate content.
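    In other words (with example.com standing in for your domain):

    ```html
    <!-- Point home links at the root URL -->
    <a href="http://www.example.com/">Home</a>

    <!-- Not at the index file – that gives the home page a second URL -->
    <a href="http://www.example.com/index.html">Home</a>
    ```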
  • Add bread crumbs – For ease of user navigation to and from subsections, add bread crumbs. Bread crumbs also improve the site’s internal linking.
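    A minimal breadcrumb sketch, using made-up section names and a placeholder domain:

    ```html
    <p class="breadcrumbs">
      <a href="http://www.example.com/">Home</a> &gt;
      <a href="http://www.example.com/blog/">Blog</a> &gt;
      <a href="http://www.example.com/blog/seo/">SEO</a>
    </p>
    ```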
  • Alt tags on page links – Originally a way for visually-impaired users to better read a web page, title & alt tags tell search engines more about website links as well as images. Filling these out provides great little places for additional keywords.
  • Include canonical & source tags – Put the canonical tag in the page’s header to show search engines that the page is the original source of the content. There are two ways to install it:
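    The common way is a link tag in the page’s head – here with a placeholder URL:

    ```html
    <head>
      <!-- Tells search engines this URL is the original source of the content -->
      <link rel="canonical" href="http://www.example.com/onsite-seo-checklist/">
    </head>
    ```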
  • Redirect .com to www – Redirecting the non-www version of the domain to the www version (or vice versa) consolidates the SEO potency of the pages.
  • Image naming – the proper way to name images is with keywords separated by hyphens/dashes. For example, name-it-like-this.jpg. This structure will increase keyword richness and improve the image’s chances of being properly indexed in Google Image search. Also, make sure the alt tags are filled out on the images (as mentioned in the Alt tags on page links section).
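    For example (a hypothetical image):

    ```html
    <img src="onsite-seo-checklist.jpg"
         alt="onsite SEO checklist worksheet"
         title="Onsite SEO Checklist">
    ```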
  • Social Integration: Google is paying more attention to social sites. Adding Facebook Like, Twitter RT & Google +1 buttons to the site and blog posts will help increase visibility, engagement and viewership.
  • Fat Footer: adding a fat footer to the bottom of the site that links to the site’s sections will help increase internal linking and follow Google’s guideline of ensuring that every page gets at least one static link.
  • Add XML sitemap – Adding an XML sitemap will help the search engines understand the architecture of your site and quickly inform them of changes.
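    A minimal sitemap sketch with placeholder URLs (the file usually sits at yourdomain.com/sitemap.xml):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-11-10</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/blog/onsite-seo-checklist/</loc>
      </url>
    </urlset>
    ```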
  • Utilize robots.txt – if there are sections of the site that are no longer current or are outdated, update the robots.txt file to keep the search engines out of those files. Once the files have been deleted from the site and removed from the index, you can update the file again. This method helps expedite the removal of outdated content from the system, thereby increasing the potency of your site.
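    A minimal robots.txt sketch (the section name and domain are made-up examples):

    ```
    User-agent: *
    Disallow: /old-section/
    Sitemap: http://www.example.com/sitemap.xml
    ```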
I Hope This Helps

Let me know what more I can do to make this easier for you to read.

What Did I Miss?

Guys, lemme know!
