Wednesday, December 15, 2010

It's all about the backlinks

I know I've written in the past that I primarily do "on-site SEO", making sure my clients' sites are friendly to both human visitors and search engine spiders. But once that aspect is done, all that's left is to build up the number of relevant backlinks to your site, and of course to continue adding quality content to it.

So, how do you do that? Well, there are numerous ways. One is to write witty articles and cross your fingers that some of your readers will link to them. Great for organic (as in non-automated) backlinks, but not so good if you need to build up backlinks quickly.

You can also make your articles available for other webmasters to publish on their sites via RSS, or use one of the numerous article syndication sites out there.

Another way is to give away a freebie web widget, like a small but useful JavaScript widget that other webmasters can include on their own sites, of course including a plain text link in the copy/paste code you provide them. The drawback here is that you have no control over the subject or quality of the linking site, but typically the majority of these types of links will still be beneficial to your site.
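For instance, the copy/paste snippet you hand out might look something like this (the domain and file names here are purely hypothetical):

```html
<!-- hypothetical embed code a widget provider might distribute -->
<script type="text/javascript" src="http://www.example.com/widget.js"></script>
<!-- the plain text backlink that comes along for the ride -->
<a href="http://www.example.com/">Widget by Example.com</a>
```

The webmaster pastes the whole block, and the plain text link rides along with the widget.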

And yet another way is to visit other sites related to your topic, posting in forums or commenting on blog posts, including an occasional non-spammy link to your own site. Emphasis on "non-spammy", because if those webmasters are anything like me, any comment spam will be removed within minutes of you posting it.

There are likely other ways, but the one remaining I can think of at the moment is to pay for them. Granted, search engines like Google have some pretty intelligent ways of detecting paid links, and if you set off the "paid link" alarm, you are basically throwing your money away. Worse, you might trigger a penalty and wind up worse off than you are now.

I won't go into detail about how to avoid setting off that alarm, because I truly believe that in order to rank well in search engines you need to actually provide something that is useful, entertaining or informative (hopefully I do all 3 here).

One service that I know of, which works pretty well, is called Text Link Ads. Says what it does, does what it says. I've been both a publisher and an advertiser using their services, and it does work, if you are willing to shell out some cash and be picky about which sites your link will appear on.

Another method, although I've had mixed results with it, is to find a freelancer who specializes in backlink building. Scriptlance is a good place to look, but be careful.

Hiring whoever has the lowest bid is likely to be a waste of time and money. It could also damage your site's search engine rankings if done incorrectly.

Whomever you hire, ask to see previous examples of their work. Are the links they've created grammatically correct? Is the syntax correct? Are they in context with the pages they appear on? Do those pages appear in Google's index? Do the pages have noindex/nofollow tags? Do the links look spammy? Automated?

There are a lot of things to consider when going this route. It can be done, but results can vary, widely. The main thing to consider is the fact that Google has a legion of some pretty bright people whose sole purpose in life is to detect and devalue non-organic backlinks. Actually, they teach computers to do it, but you get my point.

This concludes today's witty, useful, entertaining and informative article.

Until next time. Cheers !

Friday, December 3, 2010

Save the binvironment !!!

I came up with an expression about 10 years ago regarding 2 major players in the realm of technology: Intel makes 'em faster, Microsoft makes 'em slower...
Taken at face value you might not get my meaning. What I mean is that software developers will continue pushing the limits of the hardware of the times.

The same is true today, but now a broadband internet connection is a required hardware component for my system; without it I can't work or play. And of course software developers, or rather web developers, are working hard to reach its limits. VoIP and streaming media are good examples of this trend.

But I'd like to draw your attention to the unwanted/unintentional bloat that is getting into the mix, particularly the bloat that creeps into HTML code when using WYSIWYG editors, or into image files that have not been optimized. BLOAT BAD! For numerous reasons, some of which you might not realize.

Another contributor to the total saturation of the information highway is websites not responding to "if-modified" requests correctly, which is especially true of many PHP driven web applications. This prevents the content from being cached properly, so it must be re-served for every request.
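As a rough sketch of what correct behavior looks like in a PHP script (the file name and structure here are illustrative, not from any particular application):

```php
<?php
// Minimal sketch: honor If-Modified-Since so unchanged content
// can be answered with a 304 instead of being re-sent.
$file = 'content.html';            // hypothetical file backing this page
$last = filemtime($file);

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $last) . ' GMT');

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $last) {
    header('HTTP/1.1 304 Not Modified'); // nothing changed, send no body
    exit;
}

readfile($file); // changed (or first visit): send the full content
```

The browser caches the page on the first visit, and on later visits the server can answer with a tiny 304 instead of the whole page.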

And yet another contributor: most web servers are capable of compressing content before sending it to the viewer, but often this isn't enabled. In most situations enabling compression is a good trade of spare CPU ticks in exchange for smaller amounts of data sent down the wire.
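If your site runs on Apache, compression can often be switched on with a few lines of configuration; a sketch, assuming mod_deflate is available on your server:

```
# Compress common text formats before sending (assumes mod_deflate is loaded)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>
```

Dropping that into an .htaccess file (where AllowOverride permits it) is usually all it takes.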

So yes, save the binvironment.

If you are a web developer..
  • Optimize / crop images
  • Inspect your HTML, removing bloat code.
  • Avoid inline CSS styling; use an external CSS file
  • Avoid inline JavaScript; use an external .js file
  • Check that your web site responds reasonably to "if-modified" requests.
  • Check that your web server is compressing content before sending

By the way a pretty handy place to check server responses is http://redbot.org
After submitting your domain's URL, click the "check assets" link. They even
have a handy bookmarklet you can use. SWEET!

Monday, November 22, 2010

Apparent JavaScript problem with IE8

<a href="http://www.somesite.com"
onclick="location.href='./out.php?ID=XXX';return false;">
SITE NAME</a>


The above works in every browser I tested except IE8. I've read that many others have the same issue, but most forums were related to ASP, VB and .NET (whose developers have an inclination to use Microsoft products for all things from browsing to buttering toast) and the suggested workarounds were simply not applicable to PHP.

Some suggestions relying on other Microsoft products looked incredibly convoluted, but then the simplest ASP script looks like spaghetti code to me. Still, they must be great platforms, after all, you have to pay through the nose for the pleasure to work with them.

Anyways.. possibly, adding this to the meta tag section of your pages will fix it.. if you are lucky.. Do you feel lucky ? Well, do ya ?

<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />

This reportedly tells IE8 to act like IE7 which is slightly less broken, a sure sign of Microsoft's version of progress.

YO MICROSOFT! There is a JavaScript issue with IE8! You know, JavaScript? That standardized, browser-agnostic language invented by your arch nemesis Netscape© that, like so many others, you've attempted to bastardize into a proprietary McLanguage?

I expect a 134MB downloadable patch to address this issue within the month.

And by the way, quit trying to create your own version of web standards in order to broaden your attempted monopoly on all things computer related. Just be happy with your domination of the operating system, office software and game console markets.

Sunday, September 26, 2010

Gonna party like it's 1989 !

Sticking with my stolen song title methodology.. lame, I know.

Anyways, I thought I'd touch on one of the little nuances of web development that can have a pretty large impact on a website's usability. Link colors and decoration.

Webmasters have always had control over the color of their textual links, originally via the BODY tag. Then along came CSS and it became the new rage to make textual links look "cool". Of course "coolness" is in the eye of the beholder, and unfortunately many webmasters wound up making their textual links look nearly identical to their regular text, in turn making it difficult for people to spot links they might be interested in.

Even worse, often there was no way to tell which links the viewer had already visited, causing some visitors to unintentionally view the same page twice, discovering it isn't what they wanted (again), getting frustrated and going elsewhere.


Usability is just as crucial as the content. After all, what use is great content if it isn't accessible? Sort of like a museum full of masterpieces that's never open.

Along with slow page loads, a poorly thought out navigation scheme, which includes link color and decoration, is one of the reasons people will abandon your site and go elsewhere.

When HTML first evolved into common use, the web browsers of the day came up with a common scheme, and surprisingly all the companies making browsers agreed, and displayed them the same way. Something almost unheard of today.. but I digress.

  • There is a reason the default link color is blue and underlined.
  • There is a reason the default visited link color is purple.
  • There is/was a reason the default active link color is red.

Even the behemoth Google sticks to this standard. Actually, they enforce it via CSS, just in case you've twiddled with the browser defaults.

Granted, active link color isn't really an issue today. Hopefully the viewer's computer and your web server react nearly instantaneously, but back in the day it was necessary to tell the viewer "we're working on it, keep your shirt on, and don't click the link again!". If your site has an international audience, it's probably still a good idea to make active links stand out, since there is a chance people will repeatedly click a link, which just bogs down your server and probably frustrates all your viewers.

Of course you needn't use these exact colors, but do make certain your textual links don't blend in too well with your regular text. Also be sure to differentiate between links to pages that the viewer has already seen. I'm also a staunch supporter of leaving links underlined, but that is likely just personal preference.
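If you do want to pin down the traditional scheme rather than rely on browser defaults, here is a small CSS sketch (the hex values approximate the classic defaults, so treat them as ballpark figures):

```css
/* approximate the classic browser defaults for textual links */
a:link    { color: #0000EE; text-decoration: underline; }
a:visited { color: #551A8B; }
a:active  { color: #EE0000; }
```

Drop that in your external stylesheet and your links behave the way visitors have expected since roughly 1989.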


Until next time HAPPY WEBMASTERING!

Sunday, August 15, 2010

It's a wild world (wide web)

Maybe I'm showing my age here by using a bit of Cat Stevens' lyrics, but it's true, the world wide web is a wild and dangerous place. Just as a "user" you are subjected to emails, the least harmful of which want you to buy viagra; others try to trick you into divulging sensitive information, or worse, infect your computer with a trojan virus to later do the villains' dirty work.

If you are reading this, you are a user and have no doubt encountered some or all of these types of dangers.

It's even worse if you run a website because not only is your own computer a potential victim, but now your web server will be preyed upon, continually poked and prodded in an effort to find a weak link in your security chain.

Over the past 2 years I've cleaned up about 20 infected web servers. In almost every case I find the weak link that allowed the server to be compromised in the first place. All too often, the weak link was the site owner, who for whatever reason didn't take their web server's "security" seriously.

So why, oh why does one go to all the trouble to create a spiffy website only to leave it unprotected ?

In my experience, the website owners consider themselves "little fish" and assume that an actual hacker won't target them, the classic "what could they possibly want with my little corner of the internet?"

What they are after differs, but it all starts the same way: by compromising your web server. Then they may simply use it to attempt to hack bigger fish, except now those attacks come from your web server.

Maybe they are after the email addresses in your site's database, or if you run an e-commerce site, they may be looking for retrievable credit card info. Basically, what they are after is irrelevant. If they get in, they'll find some way of leveraging that access to their advantage and likely to your demise.

If you are a website owner, there are of course concerns about the code that makes your website go. Is it susceptible to SQL injection or XSS attacks? But... again, and granted only in my experience... the weakest link is you.

Do you use FTP ?
FTP is just plain bad in many respects, most notably because user names and passwords are sent unencrypted. There are trojan viruses that look for the telltale signs of an FTP login and convey that info to the bad guys; since it's not encrypted, they can be in your web server running amok within seconds after you log in.

Look into disabling FTP and enabling SSH. With SSH enabled you can use the SCP protocol to upload files. Since SCP rides on an SSH connection, everything is encrypted. I use a program called WinSCP, which is very intuitive if you've used an FTP program before. It also has some pretty nifty features I've never seen in an FTP program.

Is your password good ?
This likely seems obvious, but I'm continually amazed at how many website owners have their passwords set to something that a brute force attack would easily plow through. Be sure to use a mixture of upper and lowercase letters and numbers.

  • mysite: BAD
  • MySite: Better
  • MyS1t3: PDG (pretty damn good)
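If thinking up passwords like that is a chore, here is a quick sketch for generating one at a shell prompt (assumes a Unix-like system where /dev/urandom, tr and head are available):

```shell
# Generate a random 16-character password mixing upper/lowercase letters
# and digits. 200 bytes of randomness is plenty after filtering to
# alphanumerics; head then trims the result to 16 characters.
head -c 200 /dev/urandom | tr -dc 'A-Za-z0-9' | head -c 16; echo
```

Run it a few times and pick one you can live with; a password manager is even better, but anything beats "mysite".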

Do you use a commonly exploited email program? (Outlook, anyone?)
I try to keep my Micro$oft bashing under wraps, but in this case, for your safety... if you must use Outlook (or Outlook Express), be aware that it is the primary program used to convey viruses to your computer. Even with up to date anti-virus software running you are still vulnerable. Consider using a web based email client, which can greatly reduce the likelihood of a virus reaching your own computer.

Do you use up to date anti-virus software? ... is it reputable, effective?
If I had a nickel for every time I heard "I can't be infected, I'm running XYZ anti-virus!", I'd be a rich man. Where XYZ is either reputable but known to be easily circumvented (sorry, Norton), or written by some fly-by-night coder in a basement somewhere who has no intention of updating virus definitions.

One word.. AVAST
Reputable, effective, and free, although it is certainly worth the nominal price they ask for their commercial version.

So there you have it. The weakest link in your web server's security chain is likely you, and the computer you are using right now.

Wednesday, July 21, 2010

Top on site SEO blunders

You may not know it from looking at this blog, but I've been delving into on-site SEO since about 2003. In that time, I've discovered SEO is not so much about knowing what to do, it's primarily about knowing what not to do.

In fact, roughly 50% of my SEO related work has been un-doing what some other SEO "expert" did. Nothing ruins a web based project like getting penalties or banned from Google. And Google has a fleet of some pretty smart folks whose sole reason for being is finding attempts to artificially influence their search results and then penalizing or banning the offending sites.

I only do "on-site" SEO work because I think I'd rather wash dishes or dig ditches for a living than link build. Nevertheless, link building is a crucial step, but it's wasted if the on-site elements squander it away. So before you start with off-site SEO, it's probably best to work on the on-site stuff first.

Of course you'll want to ensure your pages display correctly, are readable, and are informative or at least entertaining. That they use valid HTML, CSS etc. After all, what's the point in getting people to your pages if they don't look right? Instead, I'll try to focus on the things you just might not think of.

So, here we go. Ask yourself these questions ...
  • Do all your pages have unique title, meta keyword and meta description tags?
    Likely your site has more than one page and while you'll probably want your site name in the title, I'd suggest that each page have something unique in the title as well as the meta tags.
  • Navigating your site, do you ever reach "dead-ends" AKA orphaned pages ?
    Just like people, search engine spiders like Googlebot need to be able to get from one page to another, so ensure every page includes at least a link back to your home page. Creating an HTML site map page and including a link to it on every page is also a good idea.
  • Viewing the source of your pages, do you have to scroll down through dozens of lines of javascript and inline styling to get to the legible content?
    With all the nifty pointy clicky WYSIWYG web editors out there it is far too easy to create a page where the actual content is so far down in the HTML code that search engines don't give it the value it deserves. Using external stylesheets, and combining multiple javascript snippets into one or two external files will certainly help. Basically you want your textual content to appear as close to the opening BODY tag as possible. Using CSS positioning, you could create a div right after the opening body tag filled with your unique and quality content, and positioned to display after your site header, navigation and other non-unique page elements.
  • Are your keywords "sprinkled" about ?
    I know I've harped about this one before, but keyword overstuffing is likely the biggest cause of sites being penalized by Google that I've encountered. There is no magic number, but there certainly is a limit to how often your keywords should appear. If you just focus on writing informative content, sprinkling your keywords so that the text reads as if you are speaking to your visitors, you should do well in avoiding overstuffing penalties. Still, it doesn't hurt to check your keyword density, and that of your competitors, to ensure you are in the ballpark.
  • Is relevant content to your site's topic actually text?
    Try this little test. Open your website in a browser, use the edit menu to "select all", then "copy". Now open a text editor like Notepad and paste your content in. Reading this as if you've never read it before, is it clear what the site is about? Are your keywords near the top of the page? If much of your site's content is conveyed with images, JavaScript and Flash widgets which aren't seen by search engine spiders, you are shooting yourself in the foot. This test is actually the first thing I do when looking to improve a site's search engine performance. It is also how I usually discover sites using the decade-old black hat technique called text cloaking, like white text on a white background. This is just plain bad. Google and likely most other search engines will see it for what it is, and if you are lucky, ignore it. Not so lucky? Banned... next?
  • Are you honest?
    Ok, I admit, that's a loaded question. What I mean is: do you have overly self-important elements in your pages? Like the Revisit meta tag, meta name="Revisit-After" content="1 Days", or perhaps an XML sitemap with "changefreq" set to "daily". Do you actually update your site every day? Another common blunder in sitemap.xml files is having every URL's "priority" set to "1.0".
    Simply put, not all pages are created equal. Be honest and realistic. Incidentally, the Revisit-After meta tag is completely ignored by Google, but may be used by other search engines, so a realistic value there certainly won't hurt.
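To make that concrete, an honest sitemap entry might look something like this (the URL and dates are hypothetical):

```xml
<url>
  <loc>http://www.example.com/articles/some-article.html</loc>
  <lastmod>2010-07-21</lastmod>
  <changefreq>monthly</changefreq> <!-- this article rarely changes -->
  <priority>0.5</priority>         <!-- not every page is a 1.0 -->
</url>
```

A home page that really does change daily can honestly claim "daily" and a higher priority; an old article can't.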

Some of the points above I touched on long ago here and here which might be worth reading.. maybe if you are so inclined.

There are also of course off site blunders which are either a waste of time & resources, or potentially damaging to your site, but I've rambled long enough. I'll try to point some of those out next time.

Sunday, June 13, 2010

The flexibility of Wordpress

I am continually amazed at all the ways Wordpress can be used. Granted, it was initially intended to act as a blog, and still does of course, but it and its thousands of plug-ins and themes have evolved, making Wordpress suitable for just about any web based endeavor you can think of.

Speaking of add-ons, what Wordpress article would be complete without touting the can't-live-without add-ons? So here it is... Contact Form 7 and the All in One SEO Pack. A very short list, yes?

I've done a few dozen Wordpress installs by now, and without fail the very first thing I add is Contact Form 7. No matter what you intend to do with Wordpress, you will almost certainly need a contact form, and while Wordpress includes one out of the box, it isn't nearly as feature rich.

The All in One SEO Pack is another plug-in I almost always install. It allows you to tailor title and meta tags, and acts as a gentle reminder that unique pages ideally have unique meta tags and such.

The remaining must have Wordpress add-ons really depend on what you intend to do. For example, if you intend to make photos and graphics an integral part of your website, you'll want to install the NextGEN Gallery plugin. If you want to sell stuff, you'll want the WP e-Commerce add on.

If your site becomes popular, and possibly sluggish, then I'd suggest you look into adding WP Super Cache. Since it caches the pages, this is typically the last thing I add; it can make tweaking Wordpress themes tedious, having to clear the cache after every edit.

Then there are the various forum applications you can add. Granted some integrate with Wordpress better than others, but typically the forum doesn't really need to be integrated at all, just skinned to match the look of your Wordpress installation. Without a doubt my favorite forum software is SMF, and I usually use it, but sometimes BuddyPress may be better suited, depending on your needs. There is also something called WP-Forum, which I haven't tried yet, but it does look promising.

So what else do you want your website to do? Chances are there is a Wordpress add on that does it, and if not, plenty of documentation and help out there. You could also of course hire a Freelancer like me :p

Sunday, June 6, 2010

Hiring a freelancer

As a freelance web developer myself, I typically write from that perspective,
but on occasion I need to outsource. So today I thought I'd share what I know about finding and hiring a freelancer.

Where to look for freelancers.

Well, if you are reading this you found one (Shameless resume link) but possibly I'm just not the right guy. You can alert thousands of freelancers, myself included, of your project using the same websites I do.


ScriptLance is likely my favorite freelance website, both as a developer and a buyer, but before you rush off, you should skip down to Defining your project below.

I also use RentACoder.com. RAC, as it's known, offers some interesting features to the buyer, such as targeting developers based on the economy of their countries.

Freelancer.com, formerly Getafreelancer.com, is also a good place to find qualified people.

Elance is another alternative. Featuring in depth information about a potential freelancer for your project, it may be one of the best choices for high end mission critical projects.

Defining your project

It is crucial that whoever you hire has a complete understanding of what is expected. All too often I've seen project descriptions like "Some website touch ups" where it turns out the work required is much more than the buyer realizes. The reverse is also true, where the buyer has the perception that their project is very difficult and time consuming, when it isn't.

To avoid catastrophe and unnecessary expense it is crucial the job be done right the first time, so if you have difficulty explaining the project with words, use pictures (gif, jpg or png) to act as the electronic equivalent of scribbled crayon on a napkin. Starting with a screen shot of a website, or any program you can run, you can then add notes to it. A great free program for that is called IrfanView... but I digress.


Many of these freelance sites get what I call robo-bidders, automated bids on your project. To weed those out I suggest adding a code word to your project, like "Please include the word cucumber in your response, so I'll know you have actually read this." Any response without your code word should be disregarded; it's most likely from a robo-bidder, or at least someone who didn't read or can't follow directions.

The freelance websites mentioned above also provide help to ensure your project is successful.


Selecting a freelancer.

Ok, you have typed up a detailed description of your project, including the URL if applicable, provided an image with notes if needed. You of course want to find someone with the right skill set to complete your project, but there are other factors you should consider when choosing someone.
  • Language and Location
  • Security
  • Payment method and Economic ramifications,

Language, of course, because you don't want the details of your project to be misunderstood. For example, if you are a native English speaker it is likely best to hire another native English speaker; however, developers from non-English speaking countries can often perform tasks much cheaper. Ensure the developer's language skills don't create a risk to your project.

Location can be a huge factor in communication. Be aware of what time zone they are in. Are they likely to respond to you quickly ?


While it's not always feasible for the developer to work "off site", the best way to ensure your security is to find a freelancer that has their own development area, their own web server or workstation where the work can be completed. Still, often it is necessary to provide sensitive information. Determine what passwords are needed and change them, providing the new passwords to the developer. After the project is completed, change the passwords again. Read that last sentence again. Thank you.


Payment method can also be a deciding factor. Most freelance sites like ScriptLance provide an escrow system, which shows whoever you hire that you do indeed have the funds to hire them and provides a way to protect your money. Freelancers who insist on payments which bypass this protection should not be considered.


Finally, economic ramifications. In the big picture, where your money goes does have an impact. Of course that impact increases as the dollar amounts do, but just like your moral obligation to throw your candy wrapper in the garbage can, you might consider an obligation to your own region's economy. This may mean the lowest bid, regardless of qualifications, is not the one. Keep that in mind if keeping your local economy churning is a concern to you.


By the way, if you also freelance yourself you can monitor projects posted to the sites above here > Freelance Jobs

Thursday, June 3, 2010

We needed another Apollo 13 moment.

Granted, I usually talk about web related stuff here, but the problem in the Gulf of Mexico has been on my mind quite a bit lately. Nearing the 50th day that oil and gas have been spewing into the gulf, I can't help but think how poorly the collective parties involved have handled the problem.

I don't want to try and blame any one entity, the cause isn't my issue. It's the reaction, or rather lack of it that I have qualms with.

The think tank thinks, the doers do. Seems simple enough, but there is an apparent flaw, judging by how long it takes for something to actually be done, and its effectiveness. I'm wondering if these folks have an accurate take on what is at risk here.

Sure, millions, even billions of dollars, but the real cost will be endured by the wildlife and the region's inhabitants, possibly for decades. Clearly some out of the box thinking was needed, and it still is.

So in short, fix the damn leak and clean up the mess. There will be plenty of time to point fingers afterward.

Friday, May 28, 2010

What is your website actually worth ?

If you are curious what your own website is worth, you have likely used one of the numerous online tools that calculate its value. The problem is, these simply cannot be accurate, and often give the website owner a false impression that their site is worth much more than it actually is, or occasionally much less.

Using one of my own sites, I've tested numerous services that calculate a website's value, and the numbers have been all over the map, one as low as $67 and one as high as $16,000.

The truth is, things, websites included, are worth what someone is willing to pay for them. So how can you get a reasonable idea of what your website is worth? Do the same thing other businesses do: figure out how much profit the site makes in a year and multiply that number by 3 to 5.

For example, I have a website that earns roughly $1000 a year (not much, I know, but it's a start). So if I were to sell it I'd start with an asking price of $5000, but might consider an offer as low as $3000.

This is not to say the website worth calculators are meaningless; they can be used as a general gauge to evaluate a site's performance month to month, but for that I prefer a different set of tools...

Website Grader http://websitegrader.com
Offers a free and in-depth report of how well a site is doing in terms of on-site SEO optimization and off-site factors. What I like most about it is that the advice is sound when it detects an area of your site that could use improvement, and the overall "score" is easy enough for me to remember from month to month.

There are of course gentle nudges to register for their free trial, which I haven't yet, but I suspect that would be time, and potentially money well spent.


Push 2 check http://push2check.com
I recently discovered this handy site and have made it part of my daily ritual, right after my morning coffee and email check. I could probably cut back to weekly checks. Maybe I have a slight case of OCD? Anyways, the cool thing about this site is that it keeps track of your previous stats, so you can easily see whether your site's overall performance is improving over time. It's also a pretty handy way to access other performance indicators like the almighty Google PageRank, Alexa score and counts of social bookmarks.

So in the end, like I said, your website is worth what someone is willing to pay for it, but with the tools above, a bit of hard work, and persistence, you can increase its value over time.

Monday, May 17, 2010

The lost art of writing HTML

I remember back in grade school, while I got pretty good grades for spelling and English grammar, I continually got bad marks for penmanship. Yes, it's true, my handwriting was terrible. It seemed I was destined to print all my life, or maybe become a doctor, a profession where illegible scribbling is typical.

But eventually, with practice my handwriting improved. Then along came the computer age, and almost everything I've "written" was typed. Now if I actually need to write something, it's nearly as bad as it was back in my grade school days.

This is what I see happening now with HTML. The typical "webmaster" has become so dependent on WYSIWYG HTML editors that when tasked with actually writing HTML themselves, they fail miserably. But who cares, right? As long as it looks ok, who cares how terrible the underlying code is?

In fact, malformed HTML code became the accepted norm. Web browser developers even started compensating for it. Oh, you forgot to close your paragraph tag before opening another... no problem, the browser just fixed it for you, and unless you actually scrutinized the source code, you'd never know of the mistake.

Many website owners are just like me when I became dependent on the keyboard, and as a result today the typical website is full of what I call bloat code and HTML errors... but that's OK, because today's web browsers just compensate for it. I'm not convinced this is actual progress.

Thursday, March 18, 2010

Just because it has data in it does not make it a database.

I am continually surprised when I find companies using makeshift ways to manage data. Sure, Betty who sells pottery on Ebay is probably fine using her email client as a CRM, but when I see Fortune 500 companies rely on outdated copies of Outlook Express to act as a customer database, I just sort of sigh and die a little bit inside.

Outlook and Outlook Express are not databases, they are email clients with contact management features.

Excel is not a database, it is a spreadsheet application.

If your company is serious about making your data accessible, secure and reliable, use a real database application. I of course have a propensity to suggest MySQL, but even Microsoft's Access is a better choice than Outlook Express.
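To make that concrete, here is roughly what a bare-bones customer table looks like in MySQL. This is just a sketch; the table and column names are hypothetical, and a real CRM schema would have quite a few more fields.

```sql
-- Hypothetical minimal customer table in MySQL
CREATE TABLE customers (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    email VARCHAR(255) NOT NULL,
    created_at DATETIME NOT NULL
);
```

Even something this simple gives you indexed lookups, multi-user access and proper backups, which is more than any mail client's address book can claim.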

Saturday, February 27, 2010

Search and replace where you need it.

Being able to search and replace text, regardless of how it is stored, is a humongous time saver, so I thought I'd share a couple of methods I use.

SSH / Bash / Perl one-line search and replace
I usually do my file editing & programming work via SSH with PuTTY, so a command line search and replace is really handy, especially when a large number of files are involved. Here is an example of altering one file . . .

perl -pi -e "s/old text/new text/gi;" filename.txt

Remove the g to replace only the first occurrence of "old text". Remove the i to make it case sensitive.

Of course, replace the example filename "filename.txt" with the files you actually need to alter. For example, to change all mentions of oldsite to newsite in all .php files in the current directory:


perl -pi -e "s/oldsite/newsite/gi;" *.php

Or, to do every .php file, even ones in subdirectories (note the quotes around *.php, so the shell passes the pattern to find instead of expanding it itself)...

perl -p -i -e 's/oldsite/newsite/g' `find ./ -name '*.php'`

Of course, Perl and Bash being what they are, there are likely 30 different ways to do the same thing, but thus far this one has worked just fine. It is also incredibly fast.

What I recommend is copying the files you plan to run search and replaces on to a new temporary directory, and testing your commands there to ensure they have no undesired consequences.


MySQL search and replace query.

Often I need to quickly and accurately change text in a MySQL database.  Here is a MySQL query that does exactly that.

update [table_name] set [field_name] = replace([field_name],'[string_to_find]','[string_to_replace]'); 

Using phpMyAdmin it is pretty easy to make a copy of the table and use it for test runs of search and replace queries. Depending on the application, a search and replace done incorrectly could spell disaster, so never ever do search and replaces on a live, in-use MySQL table.
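In plain SQL, that copy-and-test setup looks roughly like this; the table name, field name and search strings are hypothetical placeholders, so adapt them to your own schema.

```sql
-- Make a structural copy of the table, then fill it with the same rows
CREATE TABLE my_table_test LIKE my_table;
INSERT INTO my_table_test SELECT * FROM my_table;

-- Trial-run the replace against the copy first
UPDATE my_table_test
   SET my_field = REPLACE(my_field, 'string_to_find', 'string_to_replace');
```

Once the test table looks correct, run the same UPDATE against the real table (ideally after taking a backup anyway).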

As an example, I wind up moving a lot of WordPress websites in my freelance work, typically from my own development area on my web server to the new account my client has created for their live WordPress site. This MySQL query can be modified for your use when moving a WordPress site.

update wp_options set `option_value` = replace(`option_value`, 'URL TO MY WORDPRESS DEV', 'URL TO THE LIVE WORDPRESS SITE');

Other methods.

While I live in an SSH shell most of my computing day, I am on a Windows workstation and occasionally do need to search and replace text files on my own hard drive. Thus far I don't have an elegant solution, but using Windows file search with the content search option, I can get a nice tidy list of files that need the edit. I can then select as many as I need (or as many as my computer can handle) and right-click to open them with Notepad++. From there I can do a search and replace on all open documents and save them in a couple of clicks.

Software mentioned in the article
PuTTY :  http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Notepad++ : http://notepad-plus.sourceforge.net/

Thursday, February 18, 2010

Microsoft Yahoo merger

After Microsoft's failed attempt to simply buy Yahoo, one would think they'd realize, or at least learn to be content with, their place in the search engine pecking order, which is 2nd or 3rd depending on where you look, but no. In classic more-money-than-brains corporate fashion, they'll be paying Yahoo to set aside its own search engine and run a co-branded version of Microsoft's Bing.

For myself, or rather my websites, this may be a good thing. Currently, while I get much less traffic from Bing than from Google, Bing users are roughly 20% more likely to convert, so I'm hoping that trend holds true with Yahoo users... or will they be Ya-Bing users?

Still, in the big scheme of things, this stinks of desperation: a stark example of corporate mentality, profit at any cost, and if profit is unobtainable, pay to chip away at the top competitor, incurring a loss to do so. Maybe it's just time to throw in the towel and stick to your primary niche products?

Monday, February 15, 2010

ISS almost finished

No tech talk today, well, not internet related anyway. I wanted to talk about the ISS, you know, the almost 400 metric tons of materials and technology that have been shot into orbit and assembled: the International Space Station. Well, it is pretty close to completed.

I remember, 12 years ago when the first parts were launched, thinking that if all the countries involved could set aside petty bickering, focus on the science and work together, the world as a whole might just be able to pull it off, but I had some serious doubts. 12 years is a long time, and it seemed inevitable that something, somewhere, somehow would mess it up. Glad to say I was wrong.

So now that the last major components are together, with only the white picket fence to paint and the curtains to hang, the world has a piece of orbital real estate to make use of, and in many ways that is a tremendous accomplishment.

Of course it's a victorious time for NASA, the astronauts, flight crews, advisers, manufacturers and everyone else who had a direct hand in the assembly of the ISS, but to me, it is also a victory for humankind.

Proof that together we can achieve common goals as a unified global workforce, and that gives me something rarely found: hope.

Friday, February 12, 2010

Search engines hate my site!

Ok, not actually my site, but some clients I have worked with in the past seemed convinced the cigar-smoking bigwigs at certain search engine companies had a personal vendetta against them. I can appreciate the desire to increase traffic to your website; I have my own as well that I'd like to see improved (and it is, gradually), but a word of caution: in the case of SEO, or more specifically on-site / on-page SEO, too much really is too much.

In other words, sometimes in an effort to tweak your site to be more search engine friendly, you wind up over-optimizing it, or worse, triggering penalties for perceived black hat SEO tactics.

There is no silver bullet or point-and-click method that results in instant search engine success. Anyone trying to sell you one should probably be slapped, or at least ignored. Effective SEO is much more about knowing what not to do than it is about stuffing the latest whizzbang method into your pages.

To me, that is as it should be. Websites that are created poorly and look like thousands of other sites should not outshine sites where the webmaster has worked hard to provide quality content that is unique.

If you've been at this a while, you are probably tired of reading "content is king!", so that is my only mention of it. But how about "legible text is paramount!" or "the importance of predominantly unique and grammatically correct authoring should not be underestimated"? Ok, that last one was a bit long, but you get my drift.

Something I suggest: open a copy of Notepad or whatever text editor you use. Browse to your home page, hit Ctrl+A to select all text, then Ctrl+C to copy it. Now paste that into Notepad. Reading it, if it isn't obvious what the site is about within the first few sentences, you should probably consider changing your layout. This is essentially what Googlebot and other search engine spiders see, and what they use to attempt to categorize your site.

Note, when I say "consider changing your layout" I don't necessarily mean changing the look or placement of your textual content. The goal is to get the unique and relevant text closer to the top when viewing the HTML source of your pages. Using CSS positioning, that text can be displayed wherever you like.
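If you'd rather script that check, a crude approximation of the same text-only view can be pulled from the shell. The sed expression below is a rough tag-stripper, not a real HTML parser, and example.com simply stands in for your own site.

```shell
# Fetch the page, strip the markup, drop blank lines,
# and show the first 20 lines of text a crawler would encounter
curl -s http://www.example.com/ \
  | sed -e 's/<[^>]*>//g' \
  | sed -e '/^[[:space:]]*$/d' \
  | head -n 20
```

If that first screenful is all navigation boilerplate and script residue rather than your actual subject matter, that's a hint the layout needs work.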

Another shoot-yourself-in-the-foot tactic I often find is large amounts of inline styles and JavaScript within the HEAD tags of HTML pages. Whenever possible these should be moved to, and referenced from, external files like style.css or functions.js. Search engine bots really don't care about how the page is styled and typically don't run JavaScript, so why make them wade through it? Again, this gets your unique and relevant text closer to the opening BODY tag.

I've just realized I probably have a hundred other SEO blunders to cover, but the last not-so-good idea I'll mention here is the overstuffed, deceptively flattering sitemap.xml approach...

Overstuffed... It is not necessary to mention every URL of your site in the sitemap.xml. If your site has a dozen or so pages, then it's likely fine, but hundreds? Probably too many. Thousands? Definitely too many.

Just ensure your sitemap.xml file has the most important sections of your website: possibly /blog, /forums, /contact, /about etc. You do not need every URL. If your site has reasonable navigation, the search engine spiders can follow the links to the remainder of your site.

Deceptively flattering... I may have mentioned this before, but all too often I run across sitemap.xml files where every URL has an importance of 1.0 and/or is updated daily. Really!? C'mon, be honest. If /contact is updated daily then you must be an undercover agent, or maybe a paid assassin.

The point is, be realistic. /forum likely does get new content every day; /blog might be daily, weekly... or, if you are lazy like me, monthly. So yes, be honest and realistic when assigning importance and update frequency in your sitemaps.
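Putting that together, a small and honest sitemap.xml might look something like this; the URLs, frequencies and priorities are placeholders, to be swapped for values that reflect how your site actually changes.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/forum</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog</loc>
    <changefreq>monthly</changefreq>
    <priority>0.6</priority>
  </url>
  <url>
    <loc>http://www.example.com/contact</loc>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

Note that only the site's main sections are listed, and /contact honestly admits it almost never changes.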

Friday, January 22, 2010

Freelancers need to occasionally say no.

As a freelance web developer I often find myself wrestling with morals.

Should I work for a client with obviously or even questionably bad morals?

Should I help a client promote a product or service I wouldn't recommend to a friend?

Should I provide what the client is asking for even though I know it will detract from their web site's search engine performance?

While it's been hard at times, I am getting better at firmly yet respectfully declining jobs. Of course I still need to eat, but luckily the majority of clients I've worked for in the past call on me again when the need arises.

It wasn't always like this. When I first started freelancing I'd take work where I could get it and learned the lessons the hard way. Sometimes you just have to say no thank you.

Think about it. If a potential client has you write something that essentially helps them commit fraud or theft, you are just as guilty as they are. It really is worth it in the long run to avoid those types of jobs.

I've also had to reluctantly pass on work promoting products or services I simply see no market for. I'll have no part in assisting a would be entrepreneur on their way to bankruptcy.

Finally, there is the inflexible client insisting they have feature XYZ added, rejecting advice that feature XYZ won't help them, or might even ruin their business (like getting them removed from the Google index). When a potential client is this inflexible, I take that as my cue to gracefully decline. Several times they've looked me up again, essentially saying I was right, this didn't help, and can I help them remove it.

Of course it's hard to be picky as a freelancer just starting out, but over time I hope everyone will learn from my experiences.

When considering a project, ask yourself whether it actually benefits the client as well as their respective clients. If the answer is no suggest alternatives. If the client won't budge on their specifications, it's probably best to move on.