By Mike Banks Valentine (c) Nov. 17, 2004
Link building has become an integral part of search engine marketing and positioning. The topic is inevitably the subject of many web conference programs, and the WebMaster World of Search conference was no exception. In a session entitled “Proactive Linking”, three search marketing experts, Bruce Clay of BruceClay.com, Jim Banks of WebDiversity and Greg Boser of WebGuerrilla, shared their strategies and opinions on linking campaigns.
Bruce Clay suggested that what he called “aggressive linking campaigns” be run with some basic guidelines. He noted that many inexperienced link builders link carelessly across networks of domains all hosted on the same server, or simply buy links from a series of sites hosted within a single C block of IP addresses. He recommended gaining links from a wide range of IP addresses hosted in different places, so that the pattern replicates “natural” links gained over a period of time. This guideline was given as “IP address – Different is Good”.
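Clay's C-block guideline can be checked mechanically: the C block of an IPv4 address is its first three octets, so grouping prospective link partners by that prefix exposes clusters hosted in the same block. A minimal sketch follows; the domain names and addresses are invented examples (RFC 5737 documentation ranges), not sites from the article.

```python
from collections import defaultdict

def c_block(ip):
    """Return the C block (first three octets) of a dotted-quad IPv4 address."""
    return ".".join(ip.split(".")[:3])

def cluster_by_c_block(partners):
    """Group (domain, ip) pairs by C block. A large cluster is the
    'un-natural' pattern Clay warns against: many links from one block."""
    clusters = defaultdict(list)
    for domain, ip in partners:
        clusters[c_block(ip)].append(domain)
    return dict(clusters)

# Hypothetical prospective link partners
partners = [
    ("example-a.com", "203.0.113.10"),
    ("example-b.com", "203.0.113.55"),   # same C block as example-a
    ("example-c.com", "198.51.100.7"),
]
print(cluster_by_c_block(partners))
# {'203.0.113': ['example-a.com', 'example-b.com'], '198.51.100': ['example-c.com']}
```

Any block contributing more than a handful of partners would be a candidate to thin out in favor of hosts elsewhere.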
Another aspect of aggressive linking is to vary the PageRank quality of the sites linking to you (or to client sites) so that it, too, appears more natural to the search engines. Too many linking campaigns target only PR8 and PR9 sites. Clay warns that this is an “un-natural” pattern, especially for new sites or those with low PageRank themselves.
He suggests that link partners should not be chosen purely on the basis of PageRank, and that if PageRank is a factor, you should seek link partners across the full range of PR levels to replicate “natural” links. This guideline was stated simply as “PageRank – Natural is Good”.
Jim Banks of WebDiversity recommends approaching linking campaigns with a highly scientific, regimented and tested method involving segmented groups. He contends that there are different types of visitors to web sites, including “shoppers, buyers and visitors”, and that each carries a different return on investment, or ROI, for the site owner. Clearly, once you’ve determined the value of a visitor based on their actions and paths through a site, which search phrases the different visitors use to find your site, and which keyword phrases convert to sales, you can target the link text in your campaign at the highest value customers.
This approach involves extensive keyword testing and conversion analysis through pay-per-click advertising campaigns. Once you’ve determined the highest value keyword phrases, you apply those phrases to the anchor text you request from link partners. This allows very specific requests for link anchor text, based on the highest ROI found in the PPC testing. Banks contends that this method provides a clear guide to organic search optimization, targeting the highest value search phrases and the highest ROI for individual clients.
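The core of Banks's method is an ROI calculation per keyword phrase: spend on each phrase via PPC, record the revenue it converts to, then rank the phrases. A minimal sketch, with invented spend and revenue figures purely for illustration:

```python
def rank_keywords_by_roi(ppc_stats):
    """Rank keyword phrases by ROI from PPC test data.

    ppc_stats maps phrase -> (ad_spend, revenue).
    ROI here is (revenue - spend) / spend.
    """
    roi = {
        phrase: (revenue - spend) / spend
        for phrase, (spend, revenue) in ppc_stats.items()
        if spend > 0
    }
    # Highest ROI first: these phrases become anchor-text requests.
    return sorted(roi.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical PPC test results: (spend, revenue) per phrase
stats = {
    "buy red widgets": (100.0, 450.0),
    "widgets": (100.0, 120.0),
    "cheap widgets online": (100.0, 300.0),
}
for phrase, roi in rank_keywords_by_roi(stats):
    print(f"{phrase}: ROI {roi:.2f}")
```

The top-ranked phrases are the ones Banks would then request as anchor text from link partners.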
Greg Boser of WebGuerrilla offered commentary on both Clay’s and Banks’s methods, calling the PPC testing method advocated above “Checkbook SEO” and largely agreeing with Clay in recommending that linking campaigns “aggressively replicate natural links”. He warned against buying links without knowing they will be posted to sites spread across a wide spectrum of pages, with varied link text and geographically varied hosts and IP addresses. Boser joked that sites that go from zero links to 20,000 overnight may be red-flagged by the search engines.
During the Q&A session following the three presenters, audience members offered up a long list of questions about linking. One question asked how to determine the IP address of link partners; Clay responded that he simply writes a Perl script to retrieve IP addresses from domain names. Most webmasters know that there are many services online which return the IP address if you enter a domain name into a text box and click the button.
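Clay described his lookup as a Perl script; the same thing is a few lines in any language with a resolver library. A Python equivalent sketch (the standard library's `socket.gethostbyname` does the DNS lookup):

```python
import socket

def ip_for(domain):
    """Resolve a domain name to its IPv4 address, or None if it
    does not resolve. A Python stand-in for the Perl one-liner
    Clay describes."""
    try:
        return socket.gethostbyname(domain)
    except socket.gaierror:
        return None

# Resolve a list of prospective link partners
for domain in ["localhost"]:
    print(domain, "->", ip_for(domain))
```

Run over a list of prospective partners, the resulting addresses feed directly into the C-block diversity check Clay recommends.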
Additional questions included how to find out whether domains or IP address ranges were considered “bad neighborhoods”, as warned against in several search engines’ webmaster guidelines. Bruce Clay responded that he does research to determine “is the IP block dirty” as a way to avoid those so-called bad neighborhoods, and that there are several published lists available that can be found with a simple search for “IP address resources”.
Audience members asked whether certain words should be avoided in page filenames, such as links.html. These so-called “poison words” are used by search engines to identify likely reciprocal links pages and downgrade their value. Panel members all agreed that the word “link” should be avoided in the filenames and title tags of reciprocal links pages. Audience members also asked about top-ranking pages they are seeing for competitive search terms that contain nothing but links and what appear to be search results.
Clay responded that he had a problem with sites employing “search scrapers”, which gather the top-ranking results pages from the search engines for competitive and expensive search phrases. Those scraped pages link to him through the copied results, serving purely as fodder for lazy webmasters running AdSense ads, and the pages often become highly ranked themselves. These “search scraper” results appear most often for terms fetching PPC bids higher than $5 on AdWords and Overture. “Search engine optimization” as a PPC term ranges from $3 to $6 and is targeted by search scrapers for those AdSense ads.
Stay tuned for more reports from WebMasterWorld of Search #7 in Las Vegas.
About The Author
Mike Banks Valentine practices ethical SEO. Contact Mike at: SEOptimism.com. This article is available online at RealitySEO.com with links to resources. You may use it on your site, blog or newsletter if you maintain this resource box and make links live hyperlinks.
This article is from the SEO News newsletter.