I know I've written in the past that I primarily do "on-site SEO": making sure my clients' sites are friendly to both human visitors and search engine spiders. Once that aspect is done, all that is left to do is build up the number of relevant backlinks to your site, and of course continue to add quality content to it.
So, how do you do that? Well, there are numerous ways. One is to write witty articles and cross your fingers that some of your readers will link to them. Great for organic (as in non-automated) backlinks, but not so good if you need to build up backlinks quickly.
You can also make your articles available for other webmasters to publish on their sites via RSS, or use one of the numerous article syndication sites out there.
Another way is to give away a freebie web widget, like a small but useful javascript widget that other webmasters can include on their own sites, of course including a plain text link in the copy/paste code you provide them. The drawback here is that you have no control over the subject or quality of the linking site, but typically the majority of these links will still be beneficial to your site.
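To make the idea concrete, the copy/paste snippet you hand out might look something like this (the widget URL and anchor text here are entirely made up, just to show where the plain text link fits):

<!-- hypothetical widget embed code, with a plain text link included -->
<script type="text/javascript" src="http://www.example.com/widgets/weather.js"></script>
<a href="http://www.example.com/">Free weather widget by Example.com</a>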
And yet another way is to visit other sites related to your topic, posting in forums or commenting on blog posts, including an occasional non-spammy link to your own site. Emphasis on "non-spammy", because if those webmasters are anything like me, any comment spam will be removed within minutes of you posting it.
There are likely other ways, but the one remaining I can think of at the moment is to pay for them. Granted, search engines like Google have some pretty intelligent ways of detecting paid links, and if you set off the "paid link" alarm, you are basically throwing your money away. Worse, you might trigger a penalty and wind up worse off than you are now.
I won't go into detail about how to avoid setting off that alarm, because I truly believe that in order to rank well in search engines you need to actually provide something that is useful, entertaining or informative (hopefully I do all three here).
One service that I know of, which works pretty well, is called Text Link Ads. Says what it does, does what it says. I've been both a publisher and an advertiser using their services, and it does work, if you are willing to shell out some cash and be picky about which sites your link will appear on.
Another method, although I've had mixed results with it, is to find a freelancer who specializes in backlink building. Scriptlance is a good place to look, but be careful.
Hiring whoever submits the lowest bid is likely to be a waste of time and money. It could also damage your site's search engine rankings if done incorrectly.
Whomever you hire, ask to see previous examples of their work. Are the links they've created grammatically correct? Is the syntax correct? Are they in context with the pages they appear on? Do those pages appear in Google's index? Does the page have noindex or nofollow tags? Do the links look spammy? Automated?
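If you're not sure what to look for when viewing a page's source, these are the telltale bits that ask search engines to ignore a page or a link (the URL is just a placeholder):

<!-- a page-level directive in the HEAD that tells spiders to skip the page and its links -->
<meta name="robots" content="noindex,nofollow">
<!-- a link-level nofollow attribute, which passes no ranking value -->
<a href="http://www.example.com/" rel="nofollow">example anchor text</a>

A backlink sitting on a noindexed page, or carrying rel="nofollow", is doing very little for you.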
There are a lot of things to consider when going this route. It can be done, but results can vary widely. The main thing to keep in mind is that Google has a legion of some pretty bright people whose sole purpose in life is to detect and devalue non-organic backlinks. Actually, they teach computers to do it, but you get my point.
This concludes today's witty, useful, entertaining and informative article.
Until next time. Cheers !
Wednesday, July 21, 2010
Top on site SEO blunders
You may not know it from looking at this blog, but I've been delving into on-site SEO since about 2003. In that time, I've discovered that SEO is not so much about knowing what to do; it's primarily about knowing what not to do.
In fact, roughly 50% of my SEO-related work has been undoing what some other SEO "expert" did. Nothing ruins a web-based project like getting penalized or banned by Google. And Google has a fleet of some pretty smart folks whose sole reason for being is finding attempts to artificially influence their search results and then penalizing or banning the offending sites.
I only do "on-site" SEO work because I think I'd rather wash dishes or dig ditches for a living than link build, never the less, link building is a crucial step, but wasted if the on-site elements squander it away. So before
you start with off-site SEO, probably best to work on the on-site stuff first.
Of course you'll want to ensure your pages display correctly and are readable, informative, or at least entertaining, and that they use valid HTML, CSS, etc. After all, what's the point in getting people to your pages if they don't look right? Instead, I'll try to focus on the things you just might not think of.
So, here we go. Ask yourself these questions ...
- Do all your pages have unique title, meta keyword and meta description tags?
Your site likely has more than one page, and while you'll probably want your site name in the title, I'd suggest that each page have something unique in the title as well as in the meta tags.
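For example, a pricing page's HEAD might contain something like this (the page name and wording are placeholders, and the domain is the same made-up one used in the examples further down):

<title>Blue Widgets - Pricing | Superdupersite.com</title>
<meta name="description" content="Current pricing and bulk discounts for blue widgets, updated monthly.">
<meta name="keywords" content="blue widgets, widget pricing, bulk widget discounts">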
- Navigating your site, do you ever reach "dead ends", AKA orphaned pages?
Just like people, search engine spiders like Googlebot need to be able to get from one page to another, so ensure every page includes at least a link back to your home page. Creating an HTML site map page and including a link to it on every page is also a good idea.
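A small footer included on every page is usually enough; something along these lines (the /sitemap.html and /contact URLs are assumptions about your site's structure):

<div id="footer">
<a href="/">Home</a> | <a href="/sitemap.html">Site map</a> | <a href="/contact">Contact</a>
</div>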
- Viewing the source of your pages, do you have to scroll down through dozens of lines of javascript and inline styling to get to the legible content?
With all the nifty pointy-clicky WYSIWYG web editors out there, it is far too easy to create a page where the actual content is so far down in the HTML code that search engines don't give it the value it deserves. Using external stylesheets, and combining multiple javascript snippets into one or two external files, will certainly help. Basically you want your textual content to appear as close to the opening BODY tag as possible. Using CSS positioning, you could create a div right after the opening BODY tag, filled with your unique and quality content, and positioned to display after your site header, navigation and other non-unique page elements.
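Here's a bare-bones sketch of that idea. The class names and pixel values are arbitrary and would be tuned to your own layout; the point is simply that the content div comes first in the source while the header is still displayed above it:

<body>
<!-- unique content appears first in the HTML source -->
<div class="main-content">Your unique, relevant text goes here.</div>
<!-- the header comes later in the source but is displayed at the top via CSS -->
<div class="site-header">Logo, navigation and other repeated elements.</div>
</body>

And in your external style.css:

.site-header { position: absolute; top: 0; left: 0; width: 100%; height: 120px; }
.main-content { margin-top: 120px; }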
- Are your keywords "sprinkled" about?
I know I've harped on this one before, but keyword overstuffing is likely the biggest cause of sites being penalized by Google that I've encountered. There is no magic number, but there certainly is a limit to how often your keywords should appear. If you just focus on writing informative content, sprinkling your keywords so that the text reads as if you are speaking to your visitors, you should do well in avoiding overstuffing penalties. Still, it doesn't hurt to check your keyword density, and that of your competitors, to ensure you are in the ballpark.
- Is the content relevant to your site's topic actually text?
Try this little test. Open your website in a browser, use the edit menu to "select all", then "copy". Now open a text editor like Notepad and paste your content in. Reading this as if you've never read it before, is it clear what the site is about? Are your keywords near the top of the page? If much of your site's content is conveyed with images, javascript and Flash widgets, which aren't seen by search engine spiders, you are shooting yourself in the foot. This test is actually the first thing I do when looking to improve a site's search engine performance. It is also how I usually discover sites using the decade-old black hat technique called text cloaking, like white text on a white background. This is just plain bad. Google and likely most other search engines will see it for what it is, and if you are lucky, ignore it. Not so lucky? Banned. Next?
- Are you honest?
Ok, I admit, that's a loaded question. What I mean is: do you have overly self-important elements in your pages? Like the Revisit meta tag, meta name="Revisit-After" content="1 Days", or perhaps an XML sitemap with "changefreq" set to "daily". Do you actually update your site every day? Another common blunder in sitemap.xml files is having every URL's "priority" set to "1.0".
Simply put, not all pages are created equal. Be honest and realistic. Incidentally, the Revisit-After meta tag is completely ignored by Google, but may be used by other search engines, so a realistic value there certainly won't hurt.
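As a rough illustration (the domain is the same made-up one used in the .htaccess examples later, and the values are only examples), an honest sitemap.xml looks more like this than a wall of daily/1.0 entries:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.superdupersite.com/blog</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.superdupersite.com/contact</loc>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>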
Some of the points above I touched on long ago here and here which might be worth reading.. maybe if you are so inclined.
There are also of course off site blunders which are either a waste of time & resources, or potentially damaging to your site, but I've rambled long enough. I'll try to point some of those out next time.
Friday, February 12, 2010
Search engines hate my site!
Ok, not actually my site, but some clients I have worked with in the past seemed convinced the cigar-smoking bigwigs at certain search engine companies had a personal vendetta against them. I can appreciate the desire to increase traffic to your website; I have my own that I'd like to see improved (and it is, gradually), but a word of caution: in the case of SEO, or more specifically on-site / on-page SEO, too much really is too much.
In other words, sometimes in an effort to tweak your site to be more search engine friendly you wind up over-optimizing it, or worse, triggering penalties for perceived black hat SEO tactics.
There is no silver bullet or point-and-click method that results in instant search engine success. Anyone trying to sell you one should probably be slapped, or at least ignored. Effective SEO is much more about knowing what not to do than it is about stuffing the latest whizzbang method into your pages.
To me, that is as it should be. Websites that are created poorly and look like thousands of other sites should not outshine sites where the webmaster has worked hard to provide quality content that is unique.
If you've been at this a while, you are probably tired of reading "content is king!" so that is my only mention of it, but how about "legible text is paramount!" or "the importance of predominantly unique and grammatically correct authoring should not be underestimated". Ok, that last one was a bit long, but you get my drift.
Something I suggest: open a copy of Notepad or whatever text editor you use. Browse to your home page, hit Ctrl+A to select all the text, then Ctrl+C to copy it. Now paste that into Notepad. Reading it, if it isn't obvious what the site is about within the first few sentences, you should probably consider changing your layout. This is essentially what Googlebot and other search engines see, and what they use to attempt to categorize your site.
Note, when I say "consider changing your layout" I don't necessarily mean changing the look or placement of your textual content. The goal is to get the unique and relevant text closer to the top when viewing the HTML source of your pages. Using CSS positioning, that text can be displayed wherever you like.
Another shoot-yourself-in-the-foot tactic I often find is large amounts of inline styles and javascript within the HEAD tags of HTML pages. Whenever possible these should be moved to, and referenced from, external files like style.css or functions.js. Search engine bots really don't care about how the page is styled and typically don't run javascript, so why make them wade through it? Again, this gets your unique and relevant text closer to the opening BODY tag.
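Assuming the styling has been moved into style.css and the scripts into functions.js (the file names match the examples above, and the domain and title are placeholders), the HEAD of each page can shrink down to something like:

<head>
<title>Superdupersite.com - Home</title>
<link rel="stylesheet" type="text/css" href="/style.css">
<script type="text/javascript" src="/functions.js"></script>
</head>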
I've just realized I probably have a hundred other SEO blunders to cover, but the last not-so-good idea I'll mention here is the overstuffed, deceptively flattering sitemap.xml approach...
Overstuffed.. It is not necessary to mention every URL of your site in sitemap.xml. If your site has a dozen or so pages, then listing them all is likely fine, but hundreds? Probably too many. Thousands? Definitely too many.
Just ensure your sitemap.xml file lists the most important sections of your website, possibly /blog, /forums, /contact, /about, etc. You do not need every URL. If your site has reasonable navigation, the search engine spiders can follow the links to the remainder of your site.
Deceptively flattering.. I may have mentioned this before, but all too often I run across sitemap.xml files where every URL has a priority of 1.0 and/or is updated daily. Really!? C'mon, be honest. If /contact is updated daily then you must be an undercover agent, or maybe a paid assassin.
The point is, be realistic. /forum likely does get new content every day; /blog might be daily, weekly.. or, if you are lazy like me, monthly. So yes, be honest and realistic when assigning importance and update frequency in your sitemaps.
Friday, October 9, 2009
Some helpful mod rewrite rules to improve SEO.
If you are new to creating websites, or just want to add some new tricks to your existing skill set, getting a grasp on mod rewrite is a pretty good way to improve your website, both in terms of usability (for people) and search engine rankings (for bots).
Getting started, the .htaccess file
.htaccess is a feature of the Apache web server. If set up correctly, it gives the website owner a way to pass configuration changes and tweaks to Apache, affecting how their website behaves.
Apache is typically installed with modules; one of those modules is called mod_rewrite, and with it you can essentially tell Apache what to do and how to respond to nearly any type of website request.
Check your website for any .htaccess files you might have. Keep in mind, Apache directives inside a .htaccess file affect that directory and all directories beneath it, so if you have existing .htaccess files, you can usually just add to them.
So either edit or create a .htaccess file with any text editor, such as Notepad.
Handy .htaccess entries
Customized error pages
Let's face it, sometimes things just don't go as planned, but your website can recover gracefully with custom error pages.
Write a single error page in HTML that includes links to the site's sitemap and other sections, a search form if applicable, and perhaps a javascript redirect to the site's main page. That way, if a person encounters an error, they will be steered in the right direction. You can use a single .html file to respond to numerous error types.
To use an error page named 404.html to respond to both 404 (not found) and 500 (internal server) errors, add this to your .htaccess file:
ErrorDocument 500 http://www.superdupersite.com/404.html
ErrorDocument 404 http://www.superdupersite.com/404.html
Blocking website access by IP address.
Deny from 127.0.0.0
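The 127.0.0.0 address above is only a placeholder; substitute the address you actually want to keep out. If the bare Deny line doesn't behave as expected on your host, the fuller form with an explicit Order directive looks like this (192.0.2.10 is a documentation example address):

# allow everyone, then deny the one troublesome address
Order Allow,Deny
Allow from all
Deny from 192.0.2.10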
Firing up mod rewrite . .
Before Apache will act on any mod rewrite directives, you need to switch the rewrite engine on (this assumes the mod_rewrite module is loaded on your server, which it is on most hosts).
Adding a line that reads . . .
RewriteEngine on
. . . does exactly that.
Next, you need to tell mod rewrite which directory to act upon. If this .htaccess file resides in your main web directory, you probably want . .
RewriteBase /
If the .htaccess file is in some sub directory of your website, you probably want ..
RewriteBase /directoryname/
Handy mod rewrite rules
After enabling the engine and setting the base directory, you can start adding mod rewrite rules. One I always include is what I call the non-WWW to WWW rule, which basically forces the browser or search engine spider to use www in the URLs of the site. If they browse to any page without www, they are 301 redirected.
RewriteCond %{HTTP_HOST} ^superdupersite\.com(.*)
RewriteRule (.*) http://www.superdupersite.com/$1 [R=301,L]
Prettier URLs
If you are using PHP, or really any other web development language, you can take parameters from ugly URLs and make them pretty.
The example below will serve index.php?author=Smith when /author-Smith.html is requested.
RewriteRule ^author-(.*)\.html$ index.php?author=$1 [L]
Likewise this rule serves index.php?book=Diary when /book-Diary.html is requested.
RewriteRule ^book-(.*)\.html$ index.php?book=$1 [L]
Passing multiple parameters
Finally, you could pass both author and book in one request, serving index.php?author=Smith&book=Diary when /author-Smith-book-Diary.html is requested.
RewriteRule ^author-(.*)-book-(.*)\.html$ index.php?author=$1&book=$2 [L]
The $1 represents the characters captured by the first set of parentheses (.*), the $2 represents the second set, and so on.
Regular expressions
If you have created something like the above mod rewrite rules, congratulations, you have already used regular expressions; patterns like ^book-(.*)\.html$ are exactly that.
Regular expressions can get tricky, but the basics are ..
^ [caret] Means the beginning of a line, so ^G would match any line with a capital G as its first character.
$ [dollar] Means the end of a line, so G$ would match any line ending with a capital G.
. [period] Means any single character, including a space.
* [asterisk] Means repeat the previous match zero or more times.. more on this later.
Ranges. You can specify a range of characters to match with square brackets []
Match any single digit number ..
[0-9]
Match any single lowercase letter
[a-z]
Match any single uppercase letter
[A-Z]
Match any letter, regardless of case
[A-Za-z]
Match any alphanumeric character
[A-Za-z0-9]
Remember the period and asterisk earlier? Combining the examples above, you could do..
Any 2 digit number
[0-9][0-9]
Any number, regardless of length
[0-9]*
And same for letters
Any 2 letters
[A-Za-z][A-Za-z]
Any letter combination, regardless of length.
[A-Za-z]*
Using just the period and asterisk means ALL characters, so
.* would match an entire line.
A.*Z would match a capital A, followed by any run of characters, followed by a capital Z.
Remember, when part of a regular expression is wrapped in parentheses, like the (.*) in ^book-(.*)\.html$, whatever it captures becomes available to the destination URL as $1, $2, etc.
There are far more complex ways to use mod rewrite, as well as .htaccess but hopefully this has gotten you off to a good start.
Tuesday, August 11, 2009
The magical mythical world of SEO
Anybody with a website has no doubt heard the term SEO: Search Engine Optimization. In fact, it is a billion dollar industry. Everyone seems to think there is some magic bullet that will miraculously rocket their website to the top of the SERPs (that's short for Search Engine Result Pages, by the way).
Well, the truth is, there simply isn't such a thing, and anyone who tells you otherwise is either trying to deceive you or is simply not very bright. I often do SEO work for clients, but the bulk of my work is undoing the damage that occurs when overzealous webmasters cross the line and are penalized for attempting to manipulate search engines. They are usually oblivious to the fact that they have shot themselves in the foot, so much of my time is spent explaining what I'm explaining here.
In short, SEO is not about knowing what to do, it's about knowing what not to do. Tactics like creating mirror sites to funnel PageRank to your primary site are quickly discovered for what they are: attempts to manipulate search engines. If you are lucky, those mirror sites will simply be disregarded. If you are not so lucky, your primary site will be penalized, or possibly removed from the search engine's index entirely.
Keyword density is another one of the many apples "search engine experts" dangle in front of website owners to entice them out of their money, suggesting there is some magic number that will take their site to the top... that number is, of course, 2.83881%. Joking, there is no such thing.
Instead, here is what I suggest. Open up Notepad. Enter your site's title, your meta keyword and description tags, and finally the text copied from your main index page.
Now read it as if you've never read it before. Read it out loud. If it reads like poorly translated Japanese, then you have a problem and should rewrite it. If your dog or cat's head tilts in that "what the hell are you saying" way, you should rewrite it. While doing so keep the following in mind.
- Your website is for people not search engine bots.
- Googlebot will not laugh at your jokes or buy your products.
- If you discuss a topic in depth, your keywords and their variations will occur naturally.
That is not to say you should not be aware of your keyword density, just don't obsess over it. To make sure you haven't gone overboard (which is prone to get your site penalized), check out this tool: http://www.keyworddensity.com
Entering your top competitor's URL, your own, and your primary keyword phrase, you can see if you have gone overboard. Your goal here is not to one-up the competition, just to ensure you are in the ballpark, within a few percent of the competition. Again, there is no magic number, but there is a number that will get your site penalized, so staying a bit lower than your competition is probably a good idea.
Once your on site work is done, it is time to get relevant and quality back links to your site.. something I dread because it is time consuming and does not provide immediately noticeable results.
Again the same rules apply. Your website is for people. So provide them with content and resources they can actually benefit from, whether it's a funny video or an article on potty training gerbils, write for people, not search engines and you'll do well.
Sunday, August 2, 2009
To all SEO professionals. INCOMING ! ! ! !
Maybe you've been using Google for more than a search engine for these past several years. If you are a website owner, or an SEO consultant, it's pretty likely you also use Google to measure a website's prominence, to act as a benchmark on the pecking order against the competition.
Well, the rules and standards that determine that pecking order will be changing... Actually, they are always changing, but for the first time in several years, a change is coming that could have more than a subtle effect on the mighty Google SERPs (search engine result pages). Whether it will affect your website(s) or not remains to be seen.
Google has been working on something they call Caffeine, and unlike Microsoft's caterpillar-to-butterfly transformation that resulted in Bing, Caffeine won't likely result in an obvious change to the Google interface we all know and love; instead, the changes are under the hood, so to speak.
In short, it's faster. In some of my tests, up to 50% faster.. WHOA!!! 50% faster? Yes, 50% faster, which raises the question: how did they make such a dramatic performance increase without simplifying the pecking-order rules? Has the mighty Google algorithm been whittled down to bare essentials? Since I'm not a Google engineer, I can't say, but on the other hand, if I were a Google engineer and told you, I'd have to kill you.
So what does this mean to you? Well, it may have absolutely no impact on your website's position in SERPs, but some may be affected, and I'm guessing, that change would be more than the occasional small 1-2 spot variation we all see from time to time. So, I can only suggest, we all duck and cover. That is to say, avoid any drastic SEO related tweaks until you know how your website is affected.
You can get a sneak peek, but please be aware, Caffeine is still under the knife, and could be further adjusted, so I suggest simply monitoring, not reacting, until Caffeine has been finalized.