- Words or categories that identify what the page is about
- As time has progressed, the misuse of keywords has encouraged search engines to rely less and less upon them. There is still debate about how much of an impact keywords have on SEO.
- Some SEO experts have recommended putting your important keywords first. This can’t hurt.
- Google, like many other search engines, stores location information for all hits, and so makes extensive use of keyword proximity in search.
- Include common plural forms of your keywords
- Include common misspellings of your keywords
- A good rule of thumb for the keyword tag is 1000 characters or less
- Repeating your keywords too many times can do more damage than good. Once or twice is fine, with different spellings or with misspellings.
- <meta name="keywords" content="HTML meta tags metatags tag search engines internet directory web searching index catalog catalogue serch seach search engine optimization techniques optimisation ranking positioning promotion marketing">
SEO Tip – Meta Tags Part 2 – Description
- The Description Meta Tag should be a brief summary of the contents of the page
- Keep it concise; if it gets too long, it may be truncated in search results.
- A good rule of thumb for the description tag is 200 characters or less
- Here is a good example:
- <META name="description" content="Search Engine Optimization Best Practices">
SEO Tip – Meta Tags – Part 1 – Overview
- Meta tags are page elements that help a search engine to categorize your page properly. They are inserted into the HEAD tag of the page, but a user cannot directly see them (other than by viewing the HTML source of the page).
- Meta tags should be applied to each page, should be unique to the page, and should match the page’s contents
- Any keyword phrases you use that do not appear in your other tags or page copy are unlikely to have enough prominence to help your listings for that phrase
- Meta tags are not the be all and end all of SEO, and are not a magic bullet. However, they are one tool in an entire toolbox that you can use together to optimize your pages
- Overuse or misuse of Meta tags can do more damage than good. Keep meta tags simple, relevant, and concise
Web Analytics Life Cycle – Phases
1. (Re)Define
While reading Avinash Kaushik’s blog, I found this article on defining the business purposes of your web site – http://www.kaushik.net/avinash/2007/02/getting-started-with-web-analytics-step-one-glean-macro-insights.html.
Defining the business goals of your web site boils down to answering one simple question – what do you want visitors to do on the web site? Here are some questions to help you answer that question…
- Why does your web site exist?
- E-commerce
- Promotional material
- Contests
- Etc.
- What are your top three web strategies that you are working on?
- paid campaigns
- registered users
- affiliates
- updating content on the site
- trying to get digg’ed
- effective merchandising
- etc.
- What do you think should be happening on your web site?
- This is where you define your key performance indicators. Your KPIs need to correlate directly to the web strategies that you have defined.
- I will spend more time talking about KPIs in another post, but here are three basic questions you should be answering with your key performance indicators:
- How many visitors are coming to your web site?
- Where are they coming from?
- What are they actually doing?
- Your key performance indicators also are an indication of how mature your web analytics process is. I will take the time in another post to discuss the Web Analytics Maturity Model.
When you go through this phase after the first iteration, take this opportunity to re-evaluate and re-define your business goals, your KPIs, and their definitions.
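The three basic KPI questions above can be sketched in code. Here is a minimal, hypothetical Python example – the record fields (visitor_id, referrer, action) are made up for illustration and are not the schema of any real analytics tool:

```python
# Hypothetical sketch: answering the three basic KPI questions from a
# list of visit records. Field names and sample data are assumptions.
from collections import Counter

visits = [
    {"visitor_id": "v1", "referrer": "google.com", "action": "purchase"},
    {"visitor_id": "v2", "referrer": "digg.com",   "action": "browse"},
    {"visitor_id": "v1", "referrer": "google.com", "action": "browse"},
    {"visitor_id": "v3", "referrer": "(direct)",   "action": "register"},
]

# How many visitors are coming to your web site?
unique_visitors = len({v["visitor_id"] for v in visits})

# Where are they coming from?
referrers = Counter(v["referrer"] for v in visits)

# What are they actually doing?
actions = Counter(v["action"] for v in visits)

print(unique_visitors)           # 3
print(referrers.most_common(1))  # [('google.com', 2)]
```

Real KPIs will of course be richer than this, but each one should reduce to a concrete calculation like these against your collected data.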
2. Collect
There are lots of tools to help you collect your Web Analytics data. There will be many decisions that you will have to make regarding the collection of your data. The KPIs should be at the heart of your decision on how to collect data. You will also need to keep in mind who your users are.
Tools are split into two major categories – web logs and site tagging. Web logs measure activity on the server, based on the requests for your site’s pages, images, PDFs, etc. Common tools for web log data analysis are WebTrends and ClickTracks. Site tagging measures actual user activity in the visitor’s browser as pages are viewed. Common tools for site tagging are Google Analytics and CoreMetrics. I will take a deeper dive into the differences between the collection methods and a review of the different tools at another time.
When you enter this phase of the process beyond the first iteration, take the time to re-evaluate whether your tools are satisfying your needs, and how the tool collects your data.
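To make the web log category concrete, here is a small sketch of parsing a Common Log Format line with Python’s standard library. The regex and the sample line are illustrative only – tools like WebTrends and ClickTracks do far more than this:

```python
# Sketch: extracting fields from a Common Log Format web server line.
# The sample line and pattern are illustrative assumptions.
import re

LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

line = '192.0.2.1 - - [10/Jul/2007:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
m = LOG_PATTERN.match(line)

# Which page was requested, and was the request successful?
print(m.group("path"), m.group("status"))  # /index.html 200
```

Note what a log line cannot tell you: it records the request, not what the user actually did in the browser – which is exactly the gap site tagging fills.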
3. Analyze
Now that you have collected your data, you will need to analyze it. You should define reports that correlate back to your key performance indicators, and to your business goals. The first time you go through this cycle will be your benchmark. Future iterations should be geared towards optimizing and improving your results.
During your analysis phase, you should review:
- your business goals
- which KPIs you collect
- the definitions of the KPIs
- whether the tools are right for your needs
- how the tool collects your data
- whether the results are better or worse than expected
- how your data is presented
- who sees your results.
Once all the analysis is complete, you should develop a list of recommended changes to each of these areas. These recommendations should be both technical and business in nature.
4. Adjust
In this phase, you should take each of the areas that were reviewed in the Analyze phase, and the recommendations that were made, and start to make adjustments as necessary. This could be redefining your business goals, adjusting your KPIs, making changes to your tool set, or rebuilding your reports.
Each iteration through the Adjust phase will be different. As you iterate through the lifecycle, the changes that are made in this phase will typically decrease in size and complexity.
Web Analytics Life Cycle
I got the idea one day in the car as I was driving home that Web Analytics is a continuous improvement process. This is not a profound idea, but struck me at the time as being very important. It is not a process that you go through once. The value of Web Analytics is to cycle through the process more than once. This is what makes your web sites better at achieving their goals. Going through the cycle just once and getting the results has almost no value.
I have looked online, and I have not found anyone who has defined a lifecycle for applying Web Analytics. So I have put one together here, very briefly. It is based on lots of other methodologies, such as the software development lifecycle and iterative development methodologies. The standard Deming Continuous Improvement Cycle phases are Plan, Do, Check, Act. I have mirrored these steps in my idea of a Web Analytics Lifecycle.
- (Re)Define business goals
- Collect data to measure those goals
- Analyze the results of the metrics
- Adjust your strategy depending on your results
My next few posts will be discussing each of the phases in more detail.
Please leave feedback with your ideas about this fairly new concept. It is still in its infancy, and your constructive ideas are very important.
New Book – Web Analytics: An Hour A Day by Avinash Kaushik
Avinash Kaushik is a leading Web Analytics expert and practitioner. His first book has been highly anticipated and well received. You can go to the book’s web site at http://www.webanalyticshour.com/ , or read reviews and buy the book on Amazon at http://www.amazon.com/Web-Analytics-Hour-Avinash-Kaushik/dp/0470130652.
He is also the author of a famous Web Analytics blog called Occam’s Razor at http://www.kaushik.net/avinash. I plan on both getting the book and subscribing to the blog.
SEO Tip – The Title Tag
- The title tag is one of the most important SEO tools in the toolbox.
- Changing the title tag is one of the easiest changes you can make to improve page rankings
- The title of your page is stored in the HEAD tag of your HTML page
- It should describe the specific contents of the page, and be as unique as possible
- This will be the title of the page that is shown by the Search Engines to the users
- Important things for the title tag to contain are Company names or Brand names
- Other important things to include are keywords from the keywords meta tag that are relevant to the page that fit naturally in the title
- Here is a good example:
- <title>SEO Article – Make a title tag that search engines will like</title>
SEO Tip – Use robots.txt file
- Robots.txt files tell Search Engines what should and should not be crawled
- NOTE – This is very different from the Robots Meta Tag. The crawler will see this file before it tries to request the page, so this file will override the Robots Meta tags on the pages.
- Robots.txt files should be stored in the root directory
- Remember, the point of the robots.txt file is to exclude pages from being crawled. If a page or directory is excluded, the crawler never requests it, so it never sees the code on those pages, and nothing on the pages themselves (including Robots Meta tags) can change the bot’s behavior. This is why robots.txt overrides the Meta and Robots tags on the page.
- More information regarding robots.txt can be found at http://www.robotstxt.org
- Sample robots.txt file to allow all pages to be crawled:
- User-agent: *
Disallow:
- With one minor adjustment, you can prevent all robots from indexing your site:
- User-agent: *
Disallow: /
- Here is a sample that will not index a specific directory for the Googlebot crawler:
- User-agent: googlebot
Disallow: /seo/
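You can verify how a well-behaved crawler will interpret your rules before deploying them, using Python’s standard urllib.robotparser module. The rules below mirror the googlebot example above:

```python
# Sketch: checking robots.txt rules with the standard library parser.
# The rules string mirrors the googlebot example from this post.
import urllib.robotparser

rules = """\
User-agent: googlebot
Disallow: /seo/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("googlebot", "/seo/article.html"))  # False
print(rp.can_fetch("googlebot", "/index.html"))        # True
```

A quick check like this can save you from accidentally blocking your whole site with a stray `Disallow: /`.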
All Grown Up…
Well, it was bound to happen. I have had many hobbies over the years, and since I have gotten married my hobbies have not gotten the attention that they used to. I used to collect comic books, baseball cards, hockey cards, play Warhammer 40K, Magic: The Gathering, and read massive amounts of Sci-Fi and Fantasy novels.
This weekend, while at my parents’ house, my brother found an old box of Transformers. The ones from the mid 1980’s. The movie is coming out very soon, if it is not out already, and we both thought it would be a good idea to put them up on eBay. A few months ago, I sold all my old camera equipment from college, and it sold very well. So I spent this past weekend posting photos of all of our old Transformers hoping that it will do the same.
This has inspired me to start to clean house. I still collect coins (a very light collector, compared to most other numismatists), but I figured it was time to get rid of most of my other collections. So, I have been photographing my comics, and posting them on eBay in lots of 5 or so. We will see how well they sell. You can check out my online store here. I will be posting more comics as the week goes on. There are a couple I will keep, frame, and hang in my office real nice (McFarlane Spider-Man #1, Spawn #1, the Donatello one-shot from Mirage Studios, things like that). At some point real soon, I will be going to my parents’ house again and cleaning out all my stuff from their attic. I will be selling my last bit of photo equipment from college, and all of my baseball and hockey card sets.
SEO Tip – Use Sitemap.xml files
- Sitemap files are a new standard for search engines. You create an XML file as part of your site; search engine crawlers can find it, and the file helps define your pages and their relation to each other
- More information is available at http://www.sitemaps.org
- The Sitemap file should typically go in the root directory of your site
- Each Sitemap file that you provide must have no more than 50,000 URLs and must be no larger than 10MB
- If you need to provide more than 50,000 URLs or the file exceeds 10MB, you must provide more than one Sitemap file and list each one in a Sitemap index file. Sitemap index files may not list more than 1,000 Sitemaps and must be no larger than 10MB.
- All URLs listed in the Sitemap must use the same protocol (http, in this example) and reside on the same host as the Sitemap
- Some ideas on automating the creation of the sitemap.xml file
- These pages can be generated dynamically as part of the build process
- These files can also be submitted dynamically to the major search engines for indexing
- These files are a huge benefit to Search Engine optimization
- This may be something that can be built into the continuous integration process
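As a starting point for automating the build step above, here is a minimal sketch of generating a sitemap.xml with Python’s standard library, following the protocol at http://www.sitemaps.org. The URL list is a placeholder – in a real build you would pull it from your CMS or routing table:

```python
# Sketch: generating a minimal sitemap.xml. The URLs are placeholders;
# a real build process would enumerate the site's actual pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
print(sitemap_xml)
```

Running a script like this as a build step, then writing the result to the site root, keeps the Sitemap in sync with the site automatically.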