SEO eBook: Understanding just enough to be more than dangerous...

Jenn Mathews, CEO & Founder, Jenn Mathews Consulting
JennMathewsConsulting.com




Jenn Mathews, a.k.a. SEOGoddess

A Note From the Author

Before we get started, I want to take a moment to explain that this eBook is not the one source you should rely on if you want to become a search engine optimization professional. What this eBook does is help people who are professionals in other areas of business gain a basic understanding of search engine optimization so they can make educated decisions. These strategies are not the one true way to optimize, to the exclusion of every other approach. There are many ways to approach optimization, from link building to quality website content and everything in between. What this SEO eBook includes are the strategies that have worked for me since 1997 and that I continue to build on as search technologies evolve. You may find someone saying "That's not right, you need to do…" or "I find that doing X works better," which is very common in the search industry and perfectly acceptable. It's up to you to take what you learn from this eBook and apply it where it fits.

Perhaps you are the CEO of a company with an SEO team making decisions, a marketing professional tasked with SEO or managing someone (or several people) tasked with it, or a small business owner who wants to try some simple SEO strategies on your own website to grow your business. In any of these roles, this eBook will give you the tools to make carefully considered decisions and to understand what your SEO is talking about. By combining these basic SEO strategies with an understanding of what others might attempt in order to trick the search engines, you will find that not only will you start to see results, but your site will survive the many updates Google performs regularly to filter out spamming attempts.

Get a Free Basic SEO Audit: JennM.co/Basic-SEO-Audit | JennMathewsConsulting.com


Table of Contents

A Note From the Author
Table of Contents
How Search Engines Work
    Looking at Pages
    Looking at Content
    Storing All That Data
How People Use Search Engines
    ReSEARCHing
    The Results
    What The User Looks At
Keyword Research
    The Keyword Analysis
    Using Google's Keyword Planner
    Choosing Keywords
    Downloading Your Results
Keyword Placement
    Broad Terms
    Categorical Terms
    Specific (Longtail) Terms
    Additionally
It All Comes Together
There is Still So Much More…
SEO Checklist
    Keywords
        Keywords in <title> tag
        Keywords in URL
        Keyword density in document text
        Keywords in anchor text
        Keywords in <alt> tags
        Keywords in metatags


    Links - Internal
    Content
        Unique content
        Frequency of content change
        Age of document
        File size
        Poor coding and design
        Duplicating Content = NO
        Invisible text = NO
    Domains & URLs
        URLs and filenames
        Sitemap
        Website size
        Domains vs. subdomains
        Hyphens in URLs
        URL length
    Other
        robots.txt
        Server Codes
            200 - OK
            301 - Permanent
            302 - Temporary
            404 - Not Found
            500 - Server Error
            503 - Unavailable
    Links - External
        Quality of source of inbound links
        Links from similar sites
        Links from .edu and .gov sites
        Age of inbound links
        Links from directories = no
        Links from Social Media
    What Not To Do
        Hidden text
        Hidden links
        Keyword stuffing
        Artificial traffic
        Cloaking scripts
        Buying links
        Link exchanges
        Spam blogs with comments


How Search Engines Work

To fully comprehend how best to optimize your website, you must have a firm grasp on how search engines work. It's good to understand how they find your website, crawl it, and grab the data, and how they then decide what to display when someone searches the database of information they hold about your website and all the similar websites out there. This understanding helps you make better decisions when a question arises, or when you are developing a portion of your website and need to ensure it will show up appropriately.

There are billions of websites on the internet and billions of users looking for information, products, news, pictures, videos, and so much other content that it's almost unfathomable to know where to begin finding it. This is where search engines come into play. Users simply know to go to Google, Bing, Yahoo!, or one of the thousands of other search engines out there providing the best and closest match to what they are looking for. Google's market share dominance (especially in North America and Europe) is primarily due to its ability to provide users with the best and most relevant match for their query, based on its ability to find websites, parse data, and decipher content with its ever-evolving algorithms.

When thinking about search engines, always remember that they are computer programs written by humans; clever humans, but humans nonetheless. We begin with the basics of programming: a system that finds a website's homepage and then goes from one page to the next, looking at the content on each page.


Looking at Pages

Search engines treat the domain of a website as the website's home page. Most search engines are programmed to prioritize the content of the home page and the links pointing to and from it. In addition, just as the domain is recognized as the home page, each URL under the domain is recognized as a set of directories (with a "/" marking the end of each directory) and files (ending in .html, .aspx, .php, etc.) within those directories. The best way to envision this is to think of your computer and how you organize files into folders: each folder has a subject, and the files within are all representative of that subject.

To understand how content is processed, remember that computers don't read. We program them to recognize data, whether it be letters, numbers, punctuation, etc. So for a search engine to truly recognize what a word is, the program is developed to recognize letters separated by spaces.
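The directory-and-file view of a URL can be reproduced with Python's standard library. The URL here is hypothetical, chosen to match the folder analogy:

```python
from urllib.parse import urlparse

# A hypothetical page URL: two directory levels, then a file.
url = "https://example.com/computer-accessories/keyboards/wireless.html"

path = urlparse(url).path            # "/computer-accessories/keyboards/wireless.html"
parts = path.strip("/").split("/")   # split on "/" to get the folder chain

directories, filename = parts[:-1], parts[-1]
print(directories)  # ['computer-accessories', 'keyboards']
print(filename)     # wireless.html
```

Each directory is the "folder" and the final segment is the "file" a search engine files the page under.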

Looking at Content

When search engine optimizers mention content, they don't just mean the words on the page. Content can be anything that a search engine bot is programmed to recognize as something of importance when a user is searching. Users search with words, but while they could be looking for information, they could also be looking for images or perhaps videos. Search engines are programmed to decipher the word a user types and work out whether they want information in the form of an article, because they are researching, or perhaps a product they could purchase. When you look at a website you see a nice layout with colors, images, buttons, and links to other pages. What a search engine sees are the words on the pages and a limited list of HTML code that defines the important pieces of the page (such as links: <a href>, images: <img src>, video: <video>, etc.).
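That stripped-down view of a page can be imitated with Python's built-in HTML parser. This is a minimal sketch, and the sample HTML is invented for illustration:

```python
from html.parser import HTMLParser

class PageView(HTMLParser):
    """Collect visible text plus the tags a bot treats as important."""
    def __init__(self):
        super().__init__()
        self.words, self.links, self.images = [], [], []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])      # links: <a href>
        elif tag == "img" and "src" in attrs:
            self.images.append(attrs["src"])      # images: <img src>

    def handle_data(self, data):
        self.words.extend(data.split())           # the words on the page

html = """<html><body>
<h1>Laptop Computers</h1>
<a href="/desktop-computers/">Desktops</a>
<img src="/img/laptop.jpg">
</body></html>"""

view = PageView()
view.feed(html)
print(view.words)   # ['Laptop', 'Computers', 'Desktops']
print(view.links)   # ['/desktop-computers/']
print(view.images)  # ['/img/laptop.jpg']
```

All the layout and color is gone; what remains is words plus a short list of meaningful tags, which is roughly the bot's-eye view.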

[Figure: a side-by-side comparison of what we see (the rendered page) and what search engines see (the raw text and HTML tags)]


Storing All That Data

One common misconception is the belief that when someone types a word into a search engine, the search engine quickly looks at all of the websites on the internet and presents them in a list, in order of a predetermined priority. While looking at websites in real time would be impressive, it is not reality. Search engines store all of that data after encoding and compressing it to save space.

To envision this in its simplest form, think of an Excel file with a website domain in each row, with the cell next to it filled with encoded words representing the text on the page. The next row under the domain is the first page found linked from the homepage, then each directory as an index under the domain, followed by the pages within that directory linked from the directory's main page, and so on. Each page has a score assigned to it based on the words on the page, where it is placed on the server, how it is linked to from the homepage, and how other websites link to it. When a user searches with a word, the search engine digs into its database and pulls up all the websites, directories, and pages that include that word, then displays them in the results in an order based on that search engine's scoring system.
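The spreadsheet picture above is, in spirit, what is known as an inverted index: each word points at the pages that contain it. A tiny sketch with made-up pages (real engines also compress, encode, and score what they store):

```python
from collections import defaultdict

# Made-up pages standing in for crawled, stored content.
pages = {
    "example.com/": "computers for home and office",
    "example.com/laptop-computers/": "laptop computers for travel",
    "example.com/desktop-computers/": "desktop computers for the office",
}

# Build the index: each word maps to the set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# A query becomes a lookup, which a real engine would then rank by score.
print(sorted(index["laptop"]))  # ['example.com/laptop-computers/']
```

The lookup is instant precisely because the work of reading every page happened earlier, at crawl time, not at query time.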


How People Use Search Engines

We’re going to flip our brains from thinking analytically and technically to a more creative way of thinking. At this point, when I’m delivering this content to a live workshop, I usually recommend we take a break, so if you can put this eBook down for a moment, get a glass of water, or find a reason to turn your mind off from it for a few minutes before coming back, I believe it will help you as we continue forward. Understanding how people use search engines to find what they are looking for will help you as you continue to organize and develop your content for your website.

ReSEARCHing

No matter what someone is looking for on the internet, they always begin by researching, whether it be a flash drive for their computer, someone to mow their lawn regularly, a company to help them move house, or more information after a diagnosis from their doctor. The research phase helps them identify and focus on a specific product, or on a person or company to provide the service they need. A perfect example of how someone begins their search is an individual trying to find a new computer.

They will most often start with a search engine they are familiar with and type "computer". The results they see are for all types of computers, including desktops and laptops, complete with specs and accessories. Through this research the user learns that they need a computer that fits their lifestyle of traveling and frequently working from home, so their search becomes more specific and they research the two-word phrase "laptop computer". As the user continues to research the types and sizes of laptop computers, they eventually realize that a 1TB laptop computer will best fit their needs. Armed with this realization, they go back to their search engine of choice and type "1tb laptop computer", which further filters their results to show only those that meet their needs.

It is up to you to understand what phase your user is actively searching in, what keywords match that phase, and how you want your website to show up in response to that type of query. It is absolutely beneficial for your business to show up for more specific searches, since those users are in a buying phase rather than the research and information-gathering phase. It is also in your


best interest to show up across all search phases, as users will come to recognize your brand, your website, and any information you might have. People tend to purchase from brands they trust and know as a knowledgeable resource. By appearing at all stages of research and providing helpful information that allows the user to make the best decision, they will most likely choose your business to purchase from rather than some other site they happen to come across in the final stages of their buying process.

The Results

Just as it is important to show up for all phases of research, it is equally important to understand both what your results look like and how your users will be viewing those results. Let's use Google as an example. Google interprets the user's intent when searching and makes its best attempt to provide the user with the best experience based on that search. If a user looks up "computers" at the beginning of their research phase, Google provides a simple results page including a list of shopping suggestions (from Google's shopping engine), some ads on the right, and natural search results that are more of a broad match and in some cases provide research information. You will also notice further links to help the user on to their next phase (notice the "Laptops" and "Desktops" links below the first "Best Buy" result; those are what we call "Site Links").

As users get further along in their purchasing phase and search for "1tb laptop computer", they see Google's shopping results focused more specifically on the exact type of computer they are looking for;


in addition, there are more ads in the main results as well as on the right, as more advertisers want to appear for those specific terms. The natural search results, or "snippets" as SEOs tend to call them, provide more information, including visual star ratings, number of reviews, pricing information, and in-stock or out-of-stock status just before the description.

Before beginning any optimization of your website, it is good to search on what you believe your first-, second-, and third-level search terms might be and get an idea of what your users might be seeing. When doing this, make sure you are not logged into your Google account: Google will tailor results based on your behavior and may not show you the same results your users would see. Also check your location when performing your search by clicking "Search Tools" and then "Location". You can set it to "United States" (or whatever country you are in), or to your state, or even your city (try different locations to see if the results change).


What The User Looks At

When a user is searching and sees the results a search engine presents, it's not always clear what their eyes are drawn to first. More sophisticated marketers and companies have performed their own analysis to understand what their core audience focuses on first. While I was SEO Manager at Classmates.com in 2006, I worked with the in-house market research team to understand what would drive a user to search for and click on our results. We found 12 people who had never heard of Classmates.com and presented them with the scenario: "You are curious to find someone you attended high school with and want to get in touch with them again. How would you go about it using the internet?" All 12 research participants said they would go to a search engine and type that person's name along with the school they had attended. We then showed them mocked-up results matching what the major search engines were showing at the time, with our result in the first position for half of them and in the third position for the other half. A majority of the people we interviewed clicked on the Classmates.com result regardless of its position. One man immediately clicked the first result without thinking it through, and one woman clicked one of the ads on the right. We determined that the Classmates.com brand (this was when Classmates.com was a well-known social network) contributed to the clicks. The exercise itself was very beneficial: we learned that some of our audience will simply click the first result without thinking it through, and some will click the advertisement. This led us to strive for that first result, to always mention "Classmates.com" in the title and description, and to spend money on advertising in addition to our SEO efforts.

Performing the same or a similar study is fairly easy, even for smaller businesses with a limited budget. Ask a friend or family member to help you out. Have them open a browser on their own computer and present them with a scenario that would match that of your audience, then see what they do from there without guiding them in any way. I have even been known to ask people I see in coffee shops who appear to fit my client's audience if they have a minute to help me out, and then buy them a coffee for taking the time.


If you do some searching, there are many white papers and other primary research data sources available from companies such as Forrester Research. The information within can give you some insight into your target audience and help you optimize for them better.


The eye-tracking screenshot below is the most recent one I could find on the web, from February 2013 by Robert Stevens; it shows that users generally look at the first through third results, with the occasional glance at the advertisements on the right.

With new technologies and more users searching on multiple devices, it is even more important to understand what your users will be searching for, on what device, and what the results look like, so you can optimize your site for the best position and the best snippet.

Keyword Research

Understanding what your users are searching for is the most important part of the search engine optimization process. Not only does it help you determine what words and phrases to optimize for, it can also help you structure the navigation of your website, set your URL hierarchy, and make decisions about additional marketing efforts. The key to your initial Keyword Analysis is to find as many key terms as you possibly can, no matter what their focus is, and even if a term is not 100% relevant to your business. The more terms you begin with, the better you can plan out your website and your marketing.

The Keyword Analysis

A Keyword Analysis is simply a means of gathering up as many words and phrases as possible that users are typing into search engines to find what they need. There are numerous tools for this, including:

Google Keyword Planner - free to account holders
Bing Keyword Research - free to account holders
Wordtracker - $27-$99 per month (free trial available)
Keyword Discovery - $29-$495 per month (free basic search)
Wordstream - $199-$399 per month (free trial available)


I personally recommend setting up a Google Adwords account and using the Keyword Planner.

Using Google’s Keyword Planner

Google often changes the methods by which you can perform a Keyword Analysis. In the past you could use their tool for free without an Adwords advertiser account; recently they have required that you sign up for an account, but the tool within is essentially still free. Remember that in performing this search for your Keyword Analysis, you are accessing Google's database of the keywords users have typed into the search bar to find what they are looking for, and the tool provides you with data about those words and phrases. It is also important to remember that the tool is built for advertisers, so the data within, and the way it works, is catered to them. After you complete the process of setting up your new Google Adwords account, log in, click the "Tools" link in the top navigation, and select "Keyword Planner". Once you open the Keyword Planner you will see a few options that allow you to get started in your search.

Since you are starting with a completely fresh Keyword Analysis, select the “Search for new keywords using a phrase, website or category” under “Find new keywords”.


From there you will be presented with several options:

The only selections you should worry about the first time you perform your Keyword Analysis are the first few settings: location and language. Anything additional will simply narrow your search and possibly leave out terms that might be valuable to you. Remember that the key is to find as many words and phrases as possible.

1. Enter a word that best describes your product or business. In this case I am going to use "computers" as our example.

2. While performing your initial Keyword Analysis, enter your homepage as your "landing page". In the future, as you build out categories or optimize a single page, you will want to use a more specific directory or individual page.

3. (Optional) If you can find your product or business category, it helps Google show you more relevant terms. On this first pass, though, keep things as broad as possible so that you see more words as options.


4. You can leave your location set to the United States, set it to the whole world, or get more specific with your state or city. When performing a Keyword Analysis for clients with a brick-and-mortar location, I will often start with a statewide search and then narrow to the surrounding area and then the specific city, to compare search trends across the different locations.

5. Next select the language you would like to focus on. I almost always select “English” to avoid any common words in other languages. If you are building a website in Spanish for the Hispanic population, then of course you would select “Spanish”. In most cases, however, a website should be in English and optimized for English.

6. You can either select to see words from Google Search only, or broaden your results by adding "search partners" as well. I usually leave it at Google only, to avoid any odd searches that might happen on other sites.


7. Skipping past the other settings and looking at the Keyword Options: if you already have Adwords advertising running, be sure to have "Show keywords in my account" and "Show keywords in my plan" selected so that you get as many keyword suggestions as possible.

Choosing Keywords

After you have clicked the "Get Ideas" button you'll see a set of results categorized for you into what Google calls "Ad Groups". Remember that Google's Keyword Planner is developed for advertisers, so Google automates the process of categorizing the terms however the system decides makes the most sense. Since we are using the tool for SEO, you will want to click the "Keyword Ideas" tab at the top to see all of the keywords that Google suggests. After you click the "Keyword Ideas" tab you will see your terms broken up into two sections. The first block contains the words and phrases that you typed in; in this case we see just the word "computers", since we are performing a very broad search.

The second set, just below, contains words and phrases Google thinks match your search. Here you will see a good selection of words that make sense; however, since Google is a program driven by algorithms, there will also be plenty of terms that don't make sense or that you would not want your website to show up for.


Carefully go through and add the terms that make the most sense. While you are adding your terms you can begin to organize them right away: simply click the Ad Group you are adding the terms to and change its name immediately. In this case I am using "Desktop Computers" and adding any terms I feel fit that category.

As you add terms that don't fit that category, you can create a new Ad Group and give it a name that makes the most sense for your new category.

In this case I am adding “Laptop Computers”.

Now keep adding key terms to your Ad Groups.


Be sure to double-check that the Ad Group you want is highlighted. More than once I have failed to get the Ad Group I selected properly highlighted and ended up adding terms to the wrong Ad Group. Your next step will be to download your list of key terms to Excel. Note that Google only allows you to download up to 2,000 terms at a time; when you reach 2,000, download what you have, delete your Ad Groups, and then continue.

Downloading Your Results

Once you have added all of the key terms you can find, download the results as an Excel CSV document. This allows you to categorize your keywords more efficiently and play with the data to help you make the best decisions while optimizing. Click the little arrow located at the bottom of your list of Ad Groups in the right-hand column.

Next, select "Include average monthly searches, competition, and other statistics".


Then select "Excel CSV" and click "Download". Now you have all your words, and they should be categorized in ways that make sense to your users. From here you will be able to optimize your website in a way that makes sense and follows the way your users research.

When I prepare a Keyword Analysis for clients or my own websites, I also pull the referring terms that have driven traffic to the website in the past 12 months. I combine that with the terms the website has received impressions for (but not necessarily clicks), as well as any search terms the website has shown up for if the client has been running Adwords or Bing advertising campaigns. Together these reports show me what the website currently shows up for and what additional terms could demonstrate growth potential. They also reveal the potential increase the site could see on current terms that aren't generating clicks, by improving ranking position and testing the titles and descriptions (as well as the ratings, reviews, pricing, etc.) that show up in the snippet for those terms.
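Once the CSV is downloaded, sorting it by search volume is a quick way to surface your broad and categorical terms. A small sketch in Python; note that the column headers ("Keyword", "Avg. Monthly Searches") and the numbers are assumptions for illustration, so adjust them to match your actual export:

```python
import csv
import io

# Stand-in for the downloaded Keyword Planner export. The column names
# and volumes here are made up; replace with open("your-export.csv").
data = """Keyword,Avg. Monthly Searches
laptop computers,40500
1tb laptop computer,720
desktop computers,27100
"""

rows = list(csv.DictReader(io.StringIO(data)))
for row in rows:
    # The export stores numbers as text, so convert before sorting.
    row["Avg. Monthly Searches"] = int(row["Avg. Monthly Searches"])

# Sort by volume so broad and categorical terms surface first,
# leaving the longtail phrases at the bottom.
rows.sort(key=lambda r: r["Avg. Monthly Searches"], reverse=True)
print([r["Keyword"] for r in rows])
```

The same few lines work equally well in Excel, but scripting it pays off once you are merging several 2,000-term downloads.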

Keyword Placement

Keyword placement is very important to optimizing your website properly. During the process of categorizing our terms we placed them into buckets that make sense. Your terms should flow in the following hierarchy:

Broad Terms - a few broad terms that represent the whole website
Categorical Terms - 2-3 word phrases describing each section of the website (directory or microsite)
Specific Terms (or long tail) - 4-5 word phrases for specific pages (or files)

Using the "Computers" example I broke out my categories into four main ones, with a couple of sub-categories for the "Accessories" category:

Computers (broad)
    Desktop Computers (category)
    Laptop Computers (category)
    Computer Accessories (category)
        Monitors (sub-category)
        Keyboards (sub-category)
        Mice (sub-category)

In a robust Keyword Analysis there would be far more categories than I am showing here; however, to keep things simple while we are learning I chose just a few to get us started.

The hierarchy determines our website, directory, and file structure. The word "Computers" is the focus of the homepage and is placed on every page of the website. Each category term determines the name of a directory, with the individual keywords inside that directory as file names. Sub-categories become directories nested within their parent category's directory, with files inside those subdirectories. It should look something like this:
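Using the example categories above (the domain and file names are mine, purely illustrative), the structure maps out roughly like this:

```
www.mysite-computer.com/
    desktop-computers/
        best.html
        gaming.html
    laptop-computers/
        ...
    computer-accessories/
        monitors/
        keyboards/
        mice/
```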

For each level of your keywords and where they go on your site there are a set of core placements that every site should follow.

Broad Terms

Those 1-2 word terms you chose that describe your website (product or service). You shouldn't have more than 2 (maybe 3) terms, and no more than 2 words each. Placement:

1. In the homepage title, and within the title of every page of the website.
2. In the meta description of the homepage and, in some form, on every page of the entire website.
3. In the body of the homepage (preferably within a 200-300 word description of your business, website, product, etc. that lets the user quickly understand what your website is about).
4. In the body of every page of the entire website at least once, preferably once at the beginning of the content on each individual page and again toward the bottom. (Do not just drop the terms in with commas, with periods, or standalone in the same spot on every page. They MUST be within the content and flow naturally with the rest of the copy.)

5. In image alt attributes (in a way that still describes the image if possible; if it isn't possible, don't do it).
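Pulling the five placements together, a homepage skeleton for the broad term "computers" might look something like this (the copy and file names are only illustrative):

```html
<head>
  <!-- 1. Broad term in the homepage title -->
  <title>Computers - Desktop &amp; Laptop Computers and Accessories</title>
  <!-- 2. Broad term in the meta description -->
  <meta name="description" content="Shop desktop computers, laptop computers, and accessories." />
</head>
<body>
  <!-- 3. and 4. Broad term early in a short description of the business... -->
  <h1>Computers for Home and Office</h1>
  <p>We carry desktop and laptop computers for every budget ...</p>
  <!-- ...and again toward the bottom of the page, flowing naturally -->
  <p>... visit us for all of your computer needs.</p>
  <!-- 5. Broad term in the alt text, only because it still describes the image -->
  <img src="showroom.jpg" alt="our computers showroom" />
</body>
```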

Categorical Terms

Those 2-3 word terms you chose for each category to focus on. Each category shouldn't target more than 1 term, of no more than 3 words. Placement:

1. In the title of the main page and the supporting pages in that section.
2. In the meta description of the main page and the supporting pages in that section.
3. In the body content of the main page and the supporting pages in that section (preferably at least once and up to 3 times: beginning, middle, and end of the unique content).

Specific (Long Tail) Terms

Those 3-4 word terms you chose that fit within the category you placed them in. They shouldn't include more than 4 words. Placement:

1. In the title of the individual page.
2. In the meta description of the individual page.
3. In the body content of the individual page.
4. In the URL of the file.
5. In the heading of the file.
6. In or surrounding the anchor text of links pointing to the page (internally and/or externally).

In general, the page should be focused on these terms.

Additionally

You will want to support your hierarchy not only with the URL directory and file structure, but with internal linking as well. Adding simple breadcrumbs to each page, and linking pages that correlate with one another, allows the search engines to crawl more efficiently.
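A breadcrumb is just a small trail of links near the top of the page that repeats your hierarchy; for the gaming desktop page from our example, it might look like this (labels illustrative):

```html
<!-- Breadcrumb trail mirroring the URL hierarchy -->
<p>
  <a href="/">Computers</a> &gt;
  <a href="/desktop-computers/">Desktop Computers</a> &gt;
  Gaming Desktop Computers
</p>
```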

It All Comes Together

Your website is now organized in a way that can be presented to your users at each stage of their buying decision. By telling the search engines the story of how your site is structured, what your website is about, and what your individual products and services are, you should show up at each level of search.

There is Still So Much More…

Adding features to your website such as a blog, user-generated content (reviews, questions and answers, memberships with profiles, etc.), data-driven content (lists of what users also viewed, suggested content, etc.), or additional content and features will benefit your SEO and give your site an advantage over your competition. When optimizing a website for clients I will most often perform a full competitive analysis and present a strategy to outperform the websites showing up in the first few positions. What I have given you in this eBook is the basic foundation that will give your website a great start to build on as you continue to optimize. I do recommend that you take the time to understand your competition as well.

SEO Checklist

I originally planned to stop this eBook at keyword placement; however, to get you to the level of being more than dangerous in SEO it's extremely important to understand all the aspects of website and search optimization. In addition, you will need to know which strategies are considered okay and which could get a site banned. There are, of course, countless signals programmed into a search engine, but the list below comprises the core essentials that every SEO should look at when performing an SEO Audit, and that everyone should understand when optimizing.

Keywords

Placement of keywords is the core of optimization, so it's imperative that you check and double-check that the keyword appears in all (or most) of the following.

Keywords in <title> tag

When it comes to the broad keyword and the long tail, it is imperative that the target keyword is located in the title tag. The best placement is first; but if you can't get it listed first and still make sense, or doing so affects your click-through rate in a negative way, try to at least get it in there somewhere. The biggest reason: the keyword the user searches for is highlighted in the snippet, and studies have shown that the bolded keyword increases your click-through rate. Your category term might be a bit tough to get into the title tag of all of the files (or pages) within a section, so don't stress if it doesn't make it in, as long as the long tail term the page is focused on is in there.

Keywords in URL

Not a high priority, but it is extremely beneficial if you can work the keyword into the URL. This helps the search engines understand just a bit better what the page is about. In the file structure example I provided previously for our computers website, you'll notice the file names within the "desktop-computers" directory didn't repeat the word "computer" again. Instead I just called them "best" and "gaming" (e.g. http://www.mysite-computer.com/desktop-computers/gaming.html). I mainly did this because the word "computer" was already mentioned twice in the URL string, and a third time would be overkill. A mention of your broad term just once in the string works just fine if that's how your URL works out, so don't stress too much about trying to get the keyword mentioned over and over again. Too much repetition could be considered keyword stuffing in the URL and get your site knocked down in scoring.

Keyword density in document text

This one is extremely important. You want to ensure that your keyword isn’t mentioned too often within the words on each page.

The idea behind density is to break up mentions of the word evenly throughout the page. A page that has 300 words should mention the word a couple of times, once at the beginning and once toward the end. A page that has 900 words in 3 paragraphs of 300 words each should spread the mentions throughout the page rather than have 6-9 mentions of the word in the first paragraph. A quick way to check density is to eyeball the page with a find-all search within your browser. Microsoft Edge has a "Find on Page" feature that highlights all the mentions of the word you are searching, so you can see very quickly how often the word appears and where.

Google's Chrome browser has a "Find" option in the settings drop-down, or you can type CTRL+F on your keyboard, and the words will highlight just as they do in Microsoft's Edge. Another way to check your densities is by copying all of the content into Microsoft Word and performing a find with "highlight all". Many search engine optimization experts say you shouldn't worry about keyword densities. My take is that densities were once important and are still looked at. They won't benefit you if you focus on trying to beat your competition with them, but they are important for ensuring your own website isn't inadvertently stuffing keywords into the content.

Keywords in anchor text

A common practice in optimization in the past was to include your keyword in the text that links to your website. That has changed quite a bit: search engines now look at the words surrounding the link as much as the anchor text itself. It's still good to check that you aren't using the words "click here" or "more" as the link too often, with your keyword mentioned just before the link. The link should be natural and make sense; if the keyword is in the anchor text, that's perfectly okay too. While working through your checklist, get in the practice of checking over the links, how and where keywords are mentioned, and avoid anything that could appear to be spam.

Keywords in <alt> tags

Images are often overlooked and forgotten when developing, and even when optimizing, a website. The <img src="file.jpg" /> tag is as simple as can be, and taking the time to add alt="cheap laptop computer" to it is one more mention of your keyword on the page. It also allows Google Image search to rank your images for users who might be using image search to find what they are looking for. ECommerce sites benefit even more from mentioning the product name specifically in the image alt text.
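For example, a complete tag for a product image might read (the file and product names here are invented for illustration):

```html
<img src="acme-laptop-15.jpg" alt="Acme 15-inch cheap laptop computer" />
```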

Keywords in metatags

Your meta tags are the title and description located between the <head> and </head> section of your page's code. I mentioned the title tag previously, but it is equally important to include your description tag. Search engines won't use it to increase your rankings, but just as the title tag displays your keywords in the title of your snippet, the description tag shows up as the description in the snippet. A mention of your keyword in your description will be highlighted in bold when a user searches and your page or website appears in the results, improving your click-through rate. If the description doesn't make sense with the keyword in it, don't worry too much about it. It's better to have a description that reads well than one that doesn't make sense just so the keyword can be in it.

Links - Internal

Your internal linking allows the search engines to crawl from one page to the next. Your linking structure and URL hierarchy are what tell the search engines the story of your website. Your homepage is the most important page of your site, and pages linked from it are considered the second most important. Links in your main navigation can also be considered important; however, if you use a mega menu or otherwise link to every page of your website (or the majority of them), you will find that the search engines won't take those links as seriously and will focus more on the links within your pages that point to one another.

Content

It might seem obvious that the content on your website is important, but sadly you will find websites created with almost no content and an abundance of images. As visually appealing as these sites may be, search engines have a difficult time understanding what the website is about. Remember, they are computer programs developed to pull the words on a page and match those words up with similar words. If there are no words on a site, the program will simply move on.

Unique content

Sure, it's easy to copy content from a site that is ranking, paste it into your website, and get rankings; but with copyright laws, and the constant battle against spammers gaining rankings and flooding the search results with poor experiences for users, search engines don't like it. Countless algorithm updates have been developed to push websites containing content that appears on other websites to the bottom of the results. In some cases syndicated content is acceptable if properly cited; however, it is not counted as unique content and therefore will not count toward rankings. If a site is using syndicated content, I recommend adding data-driven content that remains fairly stable to provide enough unique content to support the page. Sites that duplicate content on their own website from one page to the next will simply disappear from the results, or the block of similar content will just not be counted. This is why keywords listed in website footers, or keywords in the site navigation, won't be counted. I have even seen websites inadvertently run into trouble with duplicate content because similar products were listed on pages with different URLs, or because parameter issues created multiple URLs with exactly the same content in each of them. Check that your site isn't inadvertently causing a duplicate content issue.

Frequency of content change

How often your content updates is important to understand for your specific website. If your website is an online publication, you will want to "train" the search engines to visit more often so your new content is crawled quickly and pages are ranked immediately. If your website is a static brochureware site that doesn't update often, it will still be crawled regularly, but your pages will sit tight and hold their rankings with the content as is. Check how often Google and Bing are visiting and crawling the website through their webmaster tools.

Age of document

Google has a tendency to get excited about new pages and websites immediately and rank them quickly (just in case they are a hot topic or a new site that might become popular fast). As time goes on the website and/or pages that were new fade into the distance. Rankings will rise and fall for a few weeks to a few months and eventually find their place. After they find their place they tend to stick fairly well (if optimized properly). The older the website and pages are, the better the scoring they get.

File size

Large files that take a long time to load are not only bad for user experience, but also bad for search engine optimization. If a search engine comes across a page that times out because it takes too long to load, the bot will just move on to the next page; if too many pages on a website load slowly, the engine will simply move on from the site. No content will be looked at, and the URLs are essentially non-existent as far as the engine is concerned. A great way to check a page's load time is Tools.Pingdom.com. Simply enter your website or the URL of the page you want to look at and a report will be generated for you. You can even see, in detail, what elements are on the page and how long each takes to load.

Poor coding and design

Search engines want to show the best results to users that are searching on their sites. If a website has a poor design causing users to bail quickly, then they wouldn’t want to show those pages in their results. Make sure your site is easy to navigate, that content is easy to read and access, and all important information is at your user’s fingertips.

Duplicating Content = NO

As I mentioned previously, content that is taken from other websites is considered bad. Check that you aren’t duplicating content from other sites, or on your own site. You can use a tool like CopyScape to help you see if there is other content like yours on the web.

Invisible text = NO

It almost seems silly that I should need to mention it, but you would be surprised at how many websites attempt to hide content on pages of a website with white text on white background, hidden divs, or in the alt text of a blank image. Check through your website that you don’t have any hidden text that a former SEO might have placed, or a developer that thought they could get you quick rankings might have snuck in.

Domains & URLs

From all that you have learned up to this point, you should know that your domain and the URLs of your pages are extremely important. Clean, easy-to-understand URL structures, with a hierarchy that makes sense and follows your category flow, will help ensure that your site gets results.

URLs and filenames

Check through your website one more time and make sure that your URLs and filenames represent the keyword focus of the website, categories, and keywords.

Sitemap

Internal linking should be properly in place so that search engines can efficiently crawl your website. In addition, a standard site map page with links to all of the pages of your site, in an outline that reflects your hierarchy one more time, will help ensure all of your pages get visited. It is also good for some users, as they like to see the pages of your site all in one place so they can quickly get where they need to go without having to navigate from one page to the next.
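A simple HTML site map page can be nothing more than a nested list repeating the hierarchy from our computers example (the links are illustrative):

```html
<ul>
  <li><a href="/">Computers</a>
    <ul>
      <li><a href="/desktop-computers/">Desktop Computers</a>
        <ul>
          <li><a href="/desktop-computers/best.html">Best Desktop Computers</a></li>
          <li><a href="/desktop-computers/gaming.html">Gaming Desktop Computers</a></li>
        </ul>
      </li>
      <li><a href="/laptop-computers/">Laptop Computers</a></li>
      <li><a href="/computer-accessories/">Computer Accessories</a></li>
    </ul>
  </li>
</ul>
```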

Website size

The larger the website, the more difficult it is for a search engine to efficiently crawl every page. I have worked on websites that had more than 80x the intended number of pages indexed in Google; issues stemming from parameters, discontinued products, and even test pages that staff had forgotten (or just didn't know) were there were all contributing factors. I have also seen many websites with pages that should be indexed but are not. When looking over a website for SEO, compare how many pages the search engines have indexed against how many pages actually exist.

Domains vs. subdomains

A subdomain is the letters (or word) that appear as a prefix to your domain; the letters "www" are actually a subdomain. When managing your domain in Google Webmaster Tools, and when linking to the full domain within your website, you will want to choose whether Google ranks your domain with the "www" or without. I personally prefer it without in most cases, unless it's an older domain or the branding is known with the "www". Some websites use subdomains to organize the site the way you would use directories. Unfortunately this causes search engines to look at each subdomain as its own domain, rather than looking at the entire domain and all of the directories that support it. If a website has been using subdomains for years and has decent rankings for each one, leave them alone. If not, and it makes sense to change them over to directories, then a clear plan to 301 redirect them to the new directories under the domain should be made and properly executed.

Hyphens in URLs

Quite often I see websites created with spaces in file names, which causes a "%20" between the words of a URL. Some website developers also use underscores to separate the words in a filename, which the URL then renders. Search engines have a tough time parsing the "%20", and underscores are considered bad practice because search spammers used them so often in the past that Google added filters against them. The best practice for separating words in a URL is a hyphen. Most website tools like Drupal, Joomla, and WordPress will automatically add hyphens to the URL based on the spaces in the page title. Check through all of the pages of the site to ensure that hyphens are being used and no "%20" or "_" is showing up.
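To make the difference concrete (these URLs are invented for illustration):

```
Avoid:  http://www.mysite-computer.com/desktop%20computers/best_gaming.html
Better: http://www.mysite-computer.com/desktop-computers/gaming.html
```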

URL length

Your URL should never exceed 2,000 characters, but more importantly, the shorter and more concise a URL is, the better. Check your website for extra parameters or lengthy file names that could be making your URLs just a little too long.

Other

Some additional items to check on your website that are important for search engine optimization.

robots.txt

While we are all trying so hard to get our websites indexed, why would we ever want to block directories or pages from search engines? There are occasions when blocking whole directories, or even single pages, will actually benefit your website. You might have content developed just for marketing, with thin copy and not a lot of value to the website as a whole. Perhaps you have parameters used when a search is performed that could cause duplicate content issues, and canonical tags are too difficult to implement properly. In some cases Ajax calls can cause issues if not implemented properly, and the URL may need to be added to the robots.txt to avoid crawling. Whatever your reason may be, your robots.txt file should be treated with the utmost care. When performing an audit, look through the URLs added to the file and make sure there are valid reasons for each one being there. The robots.txt file is a simple text file located on the server in the root folder your website is hosted from and pointed to. Every website should have one, even if you are just adding directories that include files not relevant to your website. If you need to understand the robots.txt file in more depth, there is a whole website about it at robotstxt.org.
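A minimal robots.txt sketch, assuming you wanted to block a thin marketing directory and internal search-result URLs (the paths are placeholders, not a recommendation for your site):

```
User-agent: *
Disallow: /landing-pages/
Disallow: /search/
```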

Server Codes

When a user or search engine requests a page, the server returns a code as the domain, directory, or file is pulled into the browser. It is important to understand how many URLs are returning each code, why some of them are firing, and what you should do if issues arise. One way to check which code your website is returning: open Google's Chrome browser or Firefox, right-click somewhere on a page of your site and select "Inspect Element", click the "Network" tab in the window that appears, and refresh the page. You will see the status code displayed next to each URL.

Another way is to run a crawler like Xenu's Link Sleuth, which checks for broken links on your website and also looks at metadata.

200 - OK

The 200 code tells the browser that the file is okay and the page is able to load. There isn't anything you need to do when a 200 code fires, unless it's a page you meant to remove or one that has moved to another URL.

301 - Permanent

A 301 code should fire only when a URL that existed at one point has moved to another URL. 301s need to be addressed when a website goes through a redesign and the new site launches with different URLs than before. If, after reading this eBook, you want to restructure your URL hierarchy from top-level files to a directory, category, and page-name structure, you will want to set a 301 from each old URL to its new one. Do take note that too many URLs firing 301 codes to new URLs can send a red flag to search engines that the website might be attempting to trick its way to rankings. SEO spammers have used 301 codes to their advantage by taking websites or pages that score highly with search engines and redirecting them (sometimes many at once) to the homepage or to a page they want to rank higher, because the score is passed directly to the new page. Google has since developed an algorithm that cuts down the score of the page being redirected, and in some cases waits a few weeks to a few months before allowing those pages to see results again. I generally say no more than 15% of your website's URLs should be redirecting to new ones, and in some cases even 15% is too many. You should also only redirect a page that has the same content as the new one you are directing to. A redesign of the page is okay as long as the words are exactly the same; if the page changes completely and the content is different but has the same subject, you should still be fine. The key is to tread with caution, not get too carried away with redirecting, and not use the strategy as a way to get rankings quickly.
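On an Apache server, for example, a single line in the site's .htaccess file can handle one moved page (this assumes mod_alias is enabled, and the paths are placeholders from our computers example):

```
Redirect 301 /gaming.html /desktop-computers/gaming.html
```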

302 - Temporary

The 302 code is rarely used, but is available if necessary. Its purpose is to let the search engines know that a URL has changed for a short period of time and will be back at a later point. If, during an SEO Audit, you find several URLs firing a 302, look into why and whether it makes more sense to set them to a 301 or a 404 (not found).

404 - Not Found

This is the code most people know, even the least technical of users visiting your website. It means that the URL the browser (or search engine) is trying to access no longer exists. First, check that the 404 actually fires by typing a bogus URL on your site and seeing if it returns a 404 code; if it doesn't, talk with your webmaster to find out why and get it fixed. You will also want to look, through the Webmaster Tools logins, at how many URLs Google and Bing are seeing that return a 404. If more than 10% of the pages on your website are 404ing, you will need to devise a strategy to find out why the engines are crawling URLs that don't exist (old links on the website to URLs that no longer exist, products that are out of stock, etc.) and then correct all of the 404 URLs they are seeing.

500 - Server Error

A 500 code fires when there is an issue accessing the server as a URL is requested in the browser or when a search engine visits the site. Google and Bing Webmaster Tools will usually tell you when they see a server error and other issues. If the search engines are seeing server errors, it is up to your webmaster to figure out why those URLs are having problems. Too many URLs linked within the site that fire server errors (I say no more than 5%) will send red flags to the search engines and can be a detriment to your rankings.

503 - Unavailable

A 503 code is another server error; it means the server is unavailable. I have seen plenty of dynamic websites run into server issues and fire 503 codes. It usually happens on a day the server crashes for any number of reasons, and Google's and Bing's Webmaster Tools send emails and warnings all over. Analytics will show a drop in traffic for a while, and it can hurt businesses that rely on the website for revenue. If you ever find that your server is down, get it addressed immediately and do all that you can to ensure it doesn't happen again. There are many more codes that your server can fire when a browser or search engine requests a page, but these are the most important ones everyone should know. For a more complete list visit our blog post on server codes.

Links - External

Links that point to your website from other websites are what made Google the most popular search engine available. The company was built on its unique algorithm of treating each link pointing to a website as a "vote" and ranking the websites with the most votes. Agencies began to rely more and more on linking strategies, since asking clients to make changes to their websites was more work than simply going out to other sites and adding links. As more agencies spammed or used trickery with low-quality links, Google and Bing readjusted their algorithms to combat the poor-quality sites that were getting rankings. The algorithm has grown and been added to, and linking has become so complex that search engine optimizers constantly debate what works and what doesn't. The core of what started it all is still there, with some basic rules and understandings added to it. As long as you understand what a bad link is and what a good link is, for the most part you should be safe. There are a couple of ways you can look at links pointing to your website:

1) Google and Bing's Webmaster Tools both have a report on the links they see pointing to your website.

2) Moz.com's Open Site Explorer tool lets you plug in your domain and then analyzes the links that point to your website.
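The "vote" idea above can be illustrated with a toy version of the original PageRank iteration. The three-page link graph and the damping factor below are invented for illustration only; they are not from this eBook, and real search engines use far more signals:

```python
def pagerank(links, iterations=20, d=0.85):
    """Toy PageRank: `links` maps each page to the pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores
    for _ in range(iterations):
        # every page keeps a small base score, then receives "votes"
        new = {p: (1 - d) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = d * rank[page] / len(outgoing)  # split the vote evenly
            for target in outgoing:
                new[target] += share
        rank = new
    return rank

# Hypothetical three-page web: A and C both "vote" for B.
graph = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(graph)
```

Run it and page B, which earns the most votes, comes out with the highest score; that is the core intuition behind links as votes.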

Page 30 of 35 Get a Free Basic SEO Audit

JennM.co/Basic-SEO-Audit JennMathewsConsulting.com


Quality of source of inbound links

A website that links to your website, directory, or page is carefully scored in the same way your own website is scored. Both Google and Bing consider all the content on all of its pages, how those pages link to one another internally, what the website is about, the links pointing to that website, and all of the other factors that go into ranking a site, and then pass that value on to your website. A poor-quality website that links to yours can be detrimental to your rankings.

Links from similar sites

When looking at links that point to your website, look at the domain and homepage and ask yourself whether it seems like a website similar to yours. Next look at the pages within, and do a quick at-a-glance audit of the content on the site, how the URLs are structured and placed, and so on. If the site appears to be tricking search engines, or you immediately see a lot of issues from inadvertent bad SEO, you will want to consider asking the site to remove your link.

Links from .edu and .gov sites

Websites from trusted domains like educational resources or government sites can add quite a lot of value to your website.

Age of inbound links

Links that have been pointing to your website for years hold far more value than ones added a day or even a month ago. Older links to URLs that have been around for a long time are best left alone. If you are considering a redesign or a reorganization of your URLs, leave in place any older URLs that have long-standing links to them from valuable sites.
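If you do end up moving an old URL, the usual approach is a permanent (301) redirect from the old address to the new one so the aged links keep pointing somewhere useful. A minimal sketch of the idea, with made-up paths (in practice this lives in your server or CMS configuration, not in application code):

```python
# Hypothetical old-to-new URL map for a site redesign.
REDIRECTS = {
    "/old-services.html": "/services/",
    "/about-us.html": "/about/",
}

def redirect_for(path: str):
    """Return (status, new location) for a request path, or None if the
    path does not need a redirect."""
    target = REDIRECTS.get(path)
    return (301, target) if target else None

print(redirect_for("/about-us.html"))  # (301, '/about/')
```

The 301 status matters: it tells search engines the move is permanent, which is what lets the value of those older links carry over to the new URL.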

Links from directories = no

Directories were once recommended by Google as a great resource for links. DMOZ.org was the highest-rated directory of them all until around 2006, when submitting your website to directories started to become bad practice. Unfortunately, what was once recommended is now considered bad linking; low-quality directory and bookmark site links are mentioned specifically in Google's Webmaster Guidelines. Because of this, many sites still have legacy links from these low-quality directories, which drive down the site's score. If you see these, you can either 404 the URL and request that the page be removed from the index, or ask the directory to remove the link.

Links from Social Media

Social media links are more valuable now than they have ever been. Social networking and the value of sharing and discussing content are highly regarded by both Google and Bing. Microsoft has a partnership with Facebook that lets the search engine connect with the social media giant and see a logged-in user's content and connections, which allows Bing to provide insight into URLs that are shared on Facebook. Google used its very own Google+ for social signals in rankings, but with the removal of the +1 button, and now the announcement that Google is phasing out Google+ altogether, the search engine is turning to its newly solidified relationship with Twitter to rank individual tweets and gather more information to help determine rankings. Of course, this all could change tomorrow or next week, so the best route when it comes to using social media to aid your site's rankings is to encourage your visitors to share your content. If they show an interest in your product or service and are part of your industry, give them easy-to-access tools that let them share your link on Facebook or tweet it with the post already pre-generated for them, complete with hashtags and links back to your page and site. Don't stress too much about the latest and greatest social media site that has popped up unless it makes sense to your core audience and they find value in sharing your content. If you stay on that path, the changes will pass you by and your strategy will hold true.
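Those pre-generated share links are straightforward to build with the public Twitter web-intent and Facebook sharer URL formats. A small sketch (the page URL, text, and hashtags below are invented examples):

```python
from urllib.parse import urlencode

def tweet_intent(url: str, text: str, hashtags: list) -> str:
    """Build a Twitter web-intent link that pre-fills the post, hashtags,
    and link back to your page for the visitor."""
    params = {"url": url, "text": text, "hashtags": ",".join(hashtags)}
    return "https://twitter.com/intent/tweet?" + urlencode(params)

def facebook_share(url: str) -> str:
    """Build a Facebook sharer link for the page."""
    return "https://www.facebook.com/sharer/sharer.php?" + urlencode({"u": url})

link = tweet_intent("https://example.com/post", "Worth a read", ["seo"])
```

Drop links like these behind your share buttons and the visitor only has to click; everything is already filled in for them.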

What Not To Do While it is important to know what to do when optimizing your website, it is just as important (if not more so) to understand what NOT to do. Most of the strategies that were once popular, or even encouraged, have at some point been abused by bad search engine optimizers. What we SEOs call "black hat techniques," or spamming, has shaped the face of our industry and kept those of us who focus on "white hat techniques," or honest SEO, on our toes.

Hidden text

Somewhere around the early 2000s, search engine spammers found ways to beat the system and gain rankings by hiding text on pages, making the font color the same color as the background. Beautiful photo- and graphic-heavy websites were being built with little to no visible text, and a scroll bar that seemed to take you down to a large blank space at the bottom; highlight that area and there you would see the copy on the page. Later, fancier techniques appeared, such as hidden divs and divs one pixel high that hid the overflow content with cascading style sheets. As search engines caught up to these tricks, the penalties followed. Don't do it!

Hidden links

As linking strategies became more and more popular, webmasters would sell link space and hide links on websites. Website design companies would drive up their own rankings by adding links back to themselves, hidden in the footers or somewhere on the homepages of their clients' sites. The practice is out of date, but I still get asked, "What if I just hide a link to my company's website on my friend's blog?" Don't do it!

Keyword stuffing

Remember what I mentioned earlier about keyword density. One of the reasons that balance is so important is that spammers would stuff keywords into website content in order to drive up rankings. The mere mention of a keyword on a page used to be good enough, and the more often it appeared on your page and across your website, the higher your site would rank. Google and Bing have added algorithms that now carefully look at the ratio of keywords to content and the location of those keywords within the page. I still have clients asking me to mention their broad term more often in their copy and to add it to every link in their navigation in order to drive up rankings for that word. It may have worked for a little while, but it doesn't work now, and it can even cause you to lose your position. Don't do it!
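If you want a quick sanity check on your own copy, keyword density is simple to approximate: occurrences of the phrase divided by total word count. A rough sketch (the sample sentence is invented, and this is a crude measure, not how the engines actually score a page):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Rough keyword density: occurrences of the phrase per total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    # count the phrase in the normalized text so multi-word terms work too
    hits = len(re.findall(re.escape(keyword.lower()), " ".join(words)))
    return hits / len(words)

density = keyword_density("Balloons are fun. Buy balloons here.", "balloons")
```

That sample comes out to 2 hits in 6 words, or about 33 percent, which is far too high for real copy; the point of a check like this is to catch yourself before the repetition starts to read like stuffing.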

Artificial traffic

Search engines count how often a website is visited and, more importantly, how many times a website in their results is clicked. Spammers have been known to create bots, or to pay human beings, to search a keyword and then click the link over and over again. Bots will even rotate IP addresses so that the clicks appear to come from different sources. Both Google and Bing have become very sophisticated at distinguishing real traffic from artificial traffic. I have even received emails from companies promising what appears to be real traffic to drive up my site's rankings. If it isn't natural, it won't work. Don't do it!

Cloaking scripts

Cloaking is a technique used to show users a different page or different content than what the search engines see. Some examples of cloaking methods:

IP Collection – Using a database of users' IP addresses. This can be as simple as launching a campaign, pausing it, recording any IPs that visit the page, and then displaying unique content based on IP.

Referral URLs – If someone comes from a specific domain or URL, show them a different set of content.

GeoIP – Displaying content to users based on their location. In most cases this is perfectly acceptable to search engines, as long as the location-specific content makes sense (maps, city/state names, etc.).

If you are using any sort of cloaking script on your website, tread carefully. Google and Bing don't like seeing different content than your users would see, and will most likely penalize your website for the technique. When in doubt: don't do it!
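You can audit your own site for accidental cloaking by fetching a page twice, once with a normal browser User-Agent and once with a crawler User-Agent such as Googlebot's, and comparing the two responses. The fetch step is omitted here so the sketch stays self-contained; the comparison itself, with whitespace normalized so cosmetic differences don't trigger it, might look like this (function names are my own):

```python
import re

def normalize(html: str) -> str:
    """Collapse whitespace so cosmetic differences don't count as cloaking."""
    return re.sub(r"\s+", " ", html).strip()

def looks_cloaked(html_for_crawler: str, html_for_visitor: str) -> bool:
    """Flag a page that serves crawlers materially different markup
    than it serves human visitors."""
    return normalize(html_for_crawler) != normalize(html_for_visitor)
```

If a page of yours trips a check like this and you didn't intend it, fix it before the search engines notice it for you.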

Buying links

Buying links on websites was a very common practice in the early 2000s. I myself was a part of it while working for one of Seattle's most prominent SEO agencies. Our strategy for gaining first-position rankings for clients on words like "balloons" and "trophies" was to simply call up the webmasters of websites with similar content and ask them to add a link. If they were reluctant, money would change hands. I was not happy with the strategy, even though it was getting our clients results, and eventually moved away from agency life as a result. In 2007 Google announced that it would start cracking down on SEOs who were buying links, and I happily posted a big "I told you so" on my blog. To this day I receive several emails a day with propositions to buy links to obtain rankings, and I have even seen them come through my blog as comments. It's a strategy that is still being used and has long been on the black hat list. Don't do it!


Link exchanges

If someone has a website very similar to my client's, and I have a website similar to that person's, why not exchange links? The problem is that Google and Bing will always find those reciprocal links. Webrings and other outdated strategies of linking back and forth between websites will only drive a site's score down and negatively affect rankings. You could do all the best optimization mentioned in this document and never see a decent result, all because of your link exchanging. Don't do it!

Spam blogs with comments

If you have a website built on WordPress, Drupal, Joomla, or another open-source, blog-based content management system, you will find that you receive a high number of comments with random links and the same irrelevant words repeated over and over. This is spammers leaving links in comments on your blog in order to drive up their own sites' rankings. Comment links have been abused so heavily that the practice has gotten completely out of hand. In 2013 Matt Cutts, the head of Google's spam team, posted a video on YouTube about when commenting is appropriate and when it is too much. I say do what feels natural to you. If you follow a few blogs (or more) that are relevant to your industry and relate to your website, and one of them posts a topic your site relates to where a link back makes sense, then by all means, comment away. If you make it part of your strategy to post comments on blogs you hardly ever visit, or to post comments every day, it will trigger a red flag and get your site penalized. Just don't do it!

Final Thoughts Now that you are a bit more than dangerous when it comes to SEO, go forth and optimize. There will, of course, always be questions and doubts as you make decisions about your website. I recommend understanding Google's Webmaster Guidelines and keeping the site bookmarked. If there is ever a debate with a developer or designer, I will more often than not refer to that very same location and copy and paste it verbatim, with a link for them to look at. In addition, I stay up to date on the patents Google files; this helps me gain insight into the algorithms they are working on and how Google works. For most business owners, or people responsible for other aspects of the business who just want to know enough about SEO to be more than dangerous, going through patents is a bit of overkill. The one statement I repeat often during my workshops when asked "Can I…?" is: "If it is something you want to do in order to get rankings, then you probably shouldn't do it." The idea behind optimizing with white hat techniques, and the process I have outlined here, is to tell the search engines a story, become the authority on your subject, and avoid inadvertently damaging your website. If you are ever in doubt, you can always contact me through my website: SEOGoddess.me/contact.html
