How I Got 70,000 Useless Visitors To My Site In One Day!

Recently, a page on one of my websites was bookmarked or listed on Digg, a popular social bookmark site. It gave me the perfect opportunity to study and analyze the traffic coming from these social media sites. Read on to discover the advantages and disadvantages of social bookmark traffic and how the lessons can be applied to your own online marketing or site.

Is Social Bookmark Traffic Useless?

First, we must make the distinction that no traffic is useless. Any visitor to your site is a good thing and should be welcomed. However, not all traffic is created equal; there are great differences between the sources of your traffic. This article takes a close analytical look at social bookmark traffic from an internet marketing perspective.

In case you haven't noticed, social bookmark and media sites are all the rage on the web right now. Social bookmark traffic comes from such popular sites as Slashdot, Digg and StumbleUpon. These sites are driven by their users; that is, users or members pick and bookmark the content they want to view and discuss.

These social bookmark sites are extremely popular; they command the high traffic numbers most ordinary sites can only dream of obtaining. But is this social bookmark traffic useful?

Is it worth your time? Should you be actively promoting to these social media sites? Should you concentrate your online marketing efforts on these types of sites? More importantly, what are the benefits and disadvantages of getting a front page listing on a site like Digg or StumbleUpon?

As a full-time online marketer I wanted to know the answers to those questions. Moreover, I wanted to discover how, or if, I could use these sites to an online marketer's advantage; i.e. how they could help me create more online income.

Recently, the Digg listing gave me a first-hand opportunity to really study these sites.

Of course, nothing happens without a reason… I did actually court these social bookmark sites by placing the free AddThis.com bookmark button on all my pages. You can do the same: just use this simple bookmark button to attract these sites.

But be careful; getting your site featured on the front page of these sites can drive hundreds of thousands of visitors to your site almost immediately – so much traffic that it may overtax your server and crash it.

So be warned: if you're actively promoting to these social bookmark sites, make sure your server or web hosting is up to the demanding task of handling all these sudden visitors.

In my case it didn't crash my server, but unfortunately the page in question featured an old, poorly written article I did on the history of the Internet. Why it was even featured on Digg is a puzzle and beyond me.

Still, I am not one to waste an opportunity, so I put Google Analytics into overdrive and started analyzing these visitors and this social bookmark traffic. The data revealed some very interesting things about bookmark traffic.

Most of this traffic will:

simply bounce straight back
spend very little time on your site
rarely venture deeper into your site
rarely sign up to your newsletter
rarely enter your marketing follow-ups/funnels

(The unknown variable here is the content on your site: how good is it? How well does it perform?)

Regardless, one common problem with traffic from these sites is that it's very temporary. The high volume will only last a few days – until your item drops off the front page.

These visitors will not stay on your site long, and most are gone within seconds, never to be seen again. A few may sign up to your newsletter or venture into other areas of your site, but not many.

Social bookmark traffic is very fleeting, like customers in the drive-thru lane of a fast food restaurant: they grab the content, surf back to the linking site, and move on to the next item.

This traffic behaves very differently from organic traffic from the search engines, from your newsletter traffic, or from traffic in your marketing funnels. Much different indeed.

It was unlike getting one of my articles featured in Addme or SiteProNews, where I can easily get 200 or 300 new subscribers in a day. Plus, those visitors are interested in my information and have been exposed to my content (an article) before coming to my site.

So there was no comparison; I would take the traffic from these sites any day over traffic from the social bookmark sites. And I would take free organic traffic from the search engines over any other source of traffic.

So the question remains: is social bookmark traffic useless? First, as I mentioned before, you must realize no traffic is useless; any visitor to your site is a good thing. Without traffic your site is worthless – just a few files sitting on a server in the middle of nowhere. Obtaining visitors is one of your first objectives as a webmaster. You must get visitors to your site or it's game over.

The best kind of traffic is traffic coming from organic search, visitors who come from the search engines seeking exactly what you’re offering on your site. These are targeted visitors who will consider your pitch, read your information, maybe buy a product or sign-up to your newsletter or follow-up system. They often become repeat visitors to your site. These are your ideal visitors. This is the kind of traffic you want.

Social bookmark/media traffic is different but it does have some saving graces.

Mainly it can help expose your site to millions and help brand your site or business. It can get the word out about your site. Start a buzz.

If you have a site that appeals to the mass market, then these social sites could be an excellent recruiting ground for visitors and traffic.

These social sites are good for another reason: getting your links on all these high-traffic, PR7 and PR8 sites can't hurt your search engine rankings. Once featured on a site like Digg, your link will appear on many secondary sites around the web – so far 500+ and counting. Monkey see, monkey do. Although it has never been my main ambition to get featured on Fark.com, all these sites do have high PR, so from an SEO standpoint it is not necessarily a bad thing.

Since many of these visitors use the Firefox browser with the Alexa toolbar installed, your site's Alexa traffic rank will improve. Over 50% of the bookmark traffic coming to my site was using Firefox. Alexa's traffic rankings are not a true picture of the web's traffic, but they're a good measuring stick nonetheless.

Google might even consider it when ranking your site. Google basically treats its whole indexing system as a democratic voting structure: sites cast votes by linking to your content. Wouldn't it also be reasonable to assume more traffic means more votes? So wouldn't being featured on a site like Digg, where users vote to propel the best content to the front, be the ultimate vote?

One strange thing I did notice: for some reason the traffic from StumbleUpon was different. These visitors stayed longer on my site and behaved more like organic traffic. Maybe StumbleUpon is a higher-quality site, and this was reflected in the quality of the visitors coming from there. It also reminded me that traffic from these social media sites can't all be painted with the same brush.

This whole experience also pointed out another important factor: it made me realize how unsuited my content is for the general web surfer and the mainstream web. All my sites and content were planned and organized to draw in targeted (warmed-up) visitors from free organic search and from my online articles.

If I, or anyone, wanted to take advantage of this social media traffic, we would have to create sites and content that appeal to these surfers and then somehow draw them into our marketing funnels. I don't know if the majority of the users of these bookmark sites would make good prospects; my guess is not very likely – the nature of the beast. But it would largely depend on what you're offering on your site and how well it is suited to these users. So I am not drawing any conclusions yet.

Hopefully, I will have further chances to study traffic from these social sites and gauge the long-term effects, especially with regard to my keyword rankings in the search engines, before making any final judgments.

For now I will keep an open mind, but the jury is still out on whether social bookmark traffic is worth the interruption to your site's daily marketing tasks. It just seems like much ado about nothing.

About The Author
The author is a full-time online marketer who runs numerous websites, including two sites on Internet marketing. For the latest web marketing tools try: BizwareMagic.com. For the latest Internet Marketing Strategies go to: MarketingToolGuide.com. © 2007 Titus Hoskins. This article may be freely distributed if this resource box stays attached.

Where’s Your Social Responsibility Google?

Unless you've been living on a desert island with no Internet access, you've probably seen the recent blog fallout from Google's latest crackdown on alleged link brokers.

This week it seems that Google made some type of manual Toolbar PageRank reduction on a handful of major blogs and portal sites like the Washington Post, ProBlogger, CopyBlogger and Forbes.com. Some of these sites had PageRank scores of 7 which have now dropped to 5, scores of 6 which have now dropped to 4, and so on. The blog buzz is that the sites were singled out by Google for using their high PageRank scores to sell links and have been punished by the world's most popular search engine as a result. There is currently no proof of this, and no public statement from Google acknowledging or denying the situation.

A lot of bloggers have weighed in with commentary, observations and opinions. Every time I read a new post about the so-called smack-down, I imagine some Googlers at Mountain View laughing hysterically and high-fiving each other for turning the tables on the SEO industry yet again.

The situation has even got the SEOs turning on each other. One of the world's best-known SEOs, Jill Whalen, made a post in response to the situation that included a comment about one of the affected sites, Search Engine Guide. Jill's post has been interpreted in some circles as a type of attack. Here's the comment Jill made in her post:

“Even my very good friends at Search Engine Guide were smacked down. I hadn’t been to their home page in ages since I usually visit through direct article links, but when I looked at their home page today and scrolled down to the bottom, I was taken aback to see what looks more like a link farm than anything else!”

I've known Jill a long time, and I read her remark about Search Engine Guide as a quick, off-the-cuff comment, not a deliberate attack. Without putting words in her mouth, I think it sounded more shocking than she meant it, probably because she was typing a response to her first impressions of Search Engine Guide after not seeing it for so long, and because (being ridiculously busy) she was probably in a hurry. So the comment itself didn't raise an eyebrow for me. But I WAS concerned about how the general webmaster community would interpret it.

Yes, she has every right to her opinion. But being who she is, and given the industry reputation she's built up, Jill has incredible influence over a large number of webmasters and SEOs who absorb her material. People reading her article who are unfamiliar with Search Engine Guide may permanently associate the site with the term "link farm" and all the negative connotations that brings. Whatever her intent, her remark definitely has the power to hurt Search Engine Guide and its reputation. The site's publisher Robert Clough obviously thought so, as he was prompted to make an uncharacteristic post in response.

Personally, I think Jill should have considered the possible backlash from her casual comment and worded her post much more carefully. After all, with industry influence comes responsibility. Which brings me to the main point of this article. Google now has extreme influence and power over the Internet. When they make changes to their algorithm, or to the way they cache and filter web sites, it has a dramatic impact not just on web site owners, but on business and life in general. Millions of people rely on Google to survive, literally. In that respect, this attempt at link bait humor is a little too close to reality to be funny.

With such powerful social influence, I think it's about time Google started taking more responsibility by being more transparent about their activities. If too many webmasters are doing the wrong thing with regard to linking, or an algorithm change has occurred, why not issue a media release to set the facts straight? Not everyone knows about Google's Webmaster Guidelines or has a Webmaster Tools account. But a lot of people read the newspaper. If they want webmasters to co-operate, Google has to recognize it's a two-way street.

By slapping on this latest penalty (if it is indeed a penalty), Google seems to be claiming to *know* the intent of these sites. But what if they're wrong? What if, as Jennifer Laycock claims, they are merely selling advertising space without Google being a consideration? There's nothing in Search Engine Guide's advertising material relating to PageRank OR Google. To assume they are trying to use their site's high PageRank as a selling point is pretty arrogant and irresponsible of Google, in my opinion.

Without some type of public acknowledgement from them, we can only assume Google’s latest move is an attempt to control how webmasters use their own web site space. That’s a huge line in the sand they’ve crossed and I don’t know about you, but it makes me nervous.

About The Author
Article by Kalena Jordan, one of the first search engine optimization experts in Australia, who is well known and respected in the industry, particularly in the U.S. As well as running her own SEO business, Kalena is Director of Studies at Search Engine College - an online training institution offering instructor-led short courses and downloadable self-study courses in Search Engine Optimization and other Search Engine Marketing subjects.

How to Use SEO or Search Engine Optimization for High Google Listings

If you know how to use SEO to get a high listing in search engines, or are an expert in search engine optimization for high Google listings, then you need read no more of this article. Your website obviously has at least one page in the top 10 of Google, MSN and Yahoo, and you have as much traffic as you need for your success.

However, if not, then you need some advice. You need to understand the basics of search engine optimization. Incidentally, what the basics are to you may not be basics to others. Basics to some are the correct use of LSI (latent semantic indexing), of internal linking strategies and of other techniques designed to lead search engine spiders by the hand and convince them that their site is the tops. Can you do that?

If not, then here are one or two tips. Good SEO is a lot more than just having your page title in title tags and your heading in H1 tags. It is more than just having the correct keyword density – do you know what that is? The vast majority of people don't have the slightest clue about keyword density or what it means. Formulae that relate keyword density and key-phrase length to the number of times the phrase should appear in a web page are medieval in internet time.
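For readers who have never seen the metric spelled out, here is a minimal sketch of the classic calculation in Python, using a made-up page and phrase (and, as argued below, a number that is no longer worth chasing):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the page's words that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    # Count every position where the phrase occurs word-for-word.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words) if words else 0.0

page = "Cheap widgets here. Our cheap widgets beat all other widgets."
print(keyword_density(page, "cheap widgets"))  # 40.0 -- classic keyword stuffing
```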

Do you know what? Google doesn't give a toss about your calculations. Google cares about the service you are providing to Google's customers and how relevant the content of your web page is to their needs. To work that out, Google applies a mathematical model based on statistical analysis of semantics: how the specific keyword used by the searcher relates to the semantic content of your web page.

Google doesn't care if you have exactly 15 instances of your keyword every 500 words – in fact, if you do, you have no chance, because that is now excessive. Keyword stuffing or keyword spamming, they call it! Yet people still write articles packed full of keywords in the mistaken belief that it will be good for them. Who is still telling them that?

So let’s forget keyword density. It’s old SEO and no longer relates to Google’s needs. Internal linking: now there’s a new thing to most people, even though it has been relevant for the past few years. By intelligent use of internal linking you can lead your friendly neighbourhood spider down any web you can weave for it. And you will benefit greatly by doing so, if you know where you should be leading it.

An internal linking strategy is an entirely different concept from an external linking strategy, which involves one-way or two-way reciprocal links back to your web page from another website. Most people are involved in the latter, but most don't know how to do it properly, and therefore don't benefit. Let me give you a simple example.

Say you have a website with a PageRank of 4 for your home page. Note that it is not your whole site that gets a Google PageRank (as the term is properly written); each individual page in your site is ranked individually. When you come across a website with a PR of 4 or even 8, it is the page you are looking at that has that PageRank. That will generally be the home page, and when you agree to a reciprocal link, guess what! Your link will be placed on a 'links page' on that site with a PR of zero. That's right, a Google PageRank of zilch: and that's the benefit you will get. Zilch!

If you place their link on your home page, or on any other page with a PR greater than zero, you lose out. Even if your page has a Google PageRank of only 1, You Still Lose Out! They get a share of your PR of 1, and you get a share of their PR of zero!
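The arithmetic behind this follows from the PageRank formula published in Brin and Page's original paper (the toolbar PR is roughly a logarithmic snapshot of this score). What a page A receives depends on the PageRank of each page linking to it, divided by that page's number of outbound links:

$$\mathrm{PR}(A) = (1 - d) + d \sum_{i=1}^{n} \frac{\mathrm{PR}(T_i)}{C(T_i)}$$

where the $T_i$ are the pages linking to A, $C(T_i)$ is the count of outbound links on $T_i$, and $d \approx 0.85$ is the damping factor. A link from a PR-0 links page therefore passes you next to nothing, while even a PR-1 home page passes a genuine, if small, share of its score to every page it links to.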

Stick that in your pipe and smoke it, and then tell me I don't know what I am talking about, as many have. Some of these many are so-called internet gurus and SEO experts who fleece you by promising you a Google Page 1 position for your site when they can't even get one for themselves.

If a site offers you a top 10 position, check out their own position by searching for their URL. If they were so good, wouldn't they be in the top 10 themselves? I would have thought so! But NO.

So, do what you can to understand what search engine optimization is – what it really is, not what some would have you believe it is. Check out the source code of successful sites and compare it to that of unsuccessful sites and try to spot the differences. If you cannot, then it is the linking strategies that make the difference. Whatever strategy you use, however, make sure that you fully understand it and that you are using it as it should be used. There are links and links – some better than others. Some can give you positive results, and some of your links can be very bad for you.

Do you know how to tell the difference? Most can't, and so are led by what they read online. The problem is that since 'most can't', most of what is written online is garbage. It is difficult to separate truth from opinion, yet it is truth that gets you a good Google or Yahoo listing, not opinion. The problem is that more people believe opinion than truth, since they don't know what the truth is, and most of what they read is false opinion.

The best advice you can have is to check out the websites that have succeeded and copy what they do. However, that is not as easy as you might think, since the off-site linking strategy that you cannot see is as important as the on-site SEO that you can see.

About The Author
If you want screenshots of a website that succeeds, then check out Pete's site Article-Services, which varies between Number 1 and Number 4 on Google for the keyword 'article services', and then find the screenshots and explanation of how he does it on Improved Search Engine Rank. That is how to learn from successful sites.

Deciphering Web Analytics

Want to optimize your online sales? Improve your understanding of your target market demographics? Need to improve your marketing ROI? What right-minded webmaster or online entrepreneur doesn't, right?

Your web analytics are your gateway to measurable success and provide a lot more information than most people give them credit for. Yes, they track the number of visitors you receive and indicate your most and least popular pages. But they also guide you towards your best-performing keywords and the countries that provide your most active visitors, and essentially give you a blueprint of the exact steps each visitor takes on your website.

Armed with this kind of information, you should be able to improve the overall performance of your website and your online business. You can also improve your marketing efforts, concentrating on the most effective and ignoring the least effective.

Keywords, Search Engines, And Popular Landing Pages

For many sites, the search engine is the leading producer of traffic. An SEO campaign can produce excellent levels of highly qualified leads with comparatively little spend. The key to a good SEO campaign, though, is to continue the optimization process.

Good analytics packages provide detailed information that is vital to your SEO campaign. You can view a list of the keywords that visitors have used in order to find your site. This information can be used to identify those keywords that are providing the most traffic and any that can be improved upon.

By reading the referrer of each visitor, most analytics programs can also determine the search engine that directed visitors to your site. Again, with a little online research, it is possible to use this information to improve your optimization efforts.
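Under the hood this is simple string work. Below is a minimal Python sketch of how a log-analysis tool might pull search terms out of a referrer URL; the hostnames and parameter names reflect what the major engines used at the time, and the function itself is a hypothetical illustration rather than any particular package's API:

```python
from urllib.parse import urlparse, parse_qs

# Query-string parameter each engine used for the search terms (circa 2007).
ENGINE_PARAMS = {
    "www.google.com": "q",
    "search.yahoo.com": "p",
    "search.msn.com": "q",
}

def search_terms(referrer: str):
    """Return (engine, keywords) if the referrer is a known search engine."""
    url = urlparse(referrer)
    param = ENGINE_PARAMS.get(url.netloc)
    if not param:
        return None  # not a recognized search engine referrer
    terms = parse_qs(url.query).get(param)
    return (url.netloc, terms[0]) if terms else None

print(search_terms("http://www.google.com/search?q=web+analytics+guide"))
# ('www.google.com', 'web analytics guide')
```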

Landing Pages And Referrer Pages

A good avenue of pertinent information is the list of landing pages and referrer pages. The landing page is simply the page that a visitor first lands on when they reach your site, while the referrer is the page that directed them to your site.

Don't be fooled into thinking that all of your traffic arrives on your home page – for most websites, at least, this shouldn't be the case. Each page on your site is a potential source of search engine traffic, and if you have well-categorized pages, then your PPC campaigns should also be page-dependent.

Alternatively, if you use any kind of advertising, it will pay to keep track of how each campaign performs. The referrer statistics will help you determine exactly this. If you have links all over the Internet, these statistics can point you to the most beneficial of those links so that you can attempt to gain more similar ones.

Visitor Experience

How your visitors reach your site shouldn't be your sole fascination. Once a person arrives within the fold of your domain, you should attempt to learn whether they had a positive experience and, if not, why not. Fortunately, web analytics typically provide some very good statistics to help you with this.

Visitor and page load statistics. Whenever a page is loaded in a browser, it is logged as a page load. However, any single individual can open numerous pages, or may even open the same page numerous times. The unique visitor figure, by contrast, is the number of individual people who have accessed your site.

Visitor paths. You can track the actions of a visitor from the landing page to the exit page. This includes every page they visit in between, the amount of time they spend on each page, whether they make a purchase or click any links while on those pages, and more. This information is crucial to identifying any problem areas on your site: if a particular page is causing a lot of people to exit, address it immediately. These statistics can also reveal hot spots you weren't previously aware of.
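The page-load versus unique-visitor distinction above is easy to see in code. Here is a toy Python sketch with made-up log entries; real visitor IDs would come from cookies or IP addresses:

```python
from collections import defaultdict

# Hypothetical access-log entries: (visitor_id, page_requested).
hits = [
    ("alice", "/"), ("alice", "/pricing"), ("bob", "/"),
    ("alice", "/"), ("carol", "/blog/post-1"),
]

page_loads = len(hits)                                   # every request counts
unique_visitors = len({visitor for visitor, _ in hits})  # each person counted once

loads_per_page = defaultdict(int)
for _, page in hits:
    loads_per_page[page] += 1

print(page_loads, unique_visitors, dict(loads_per_page))
# 5 3 {'/': 3, '/pricing': 1, '/blog/post-1': 1}
```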

Translating The Results

Translating the results need not be any more complicated than actually reading them. Doing so, though, can seriously improve your profits. Here are a few guidelines that can be used when next viewing your analytics.

Lots of Visitors But No Conversions

A lot of people place too much emphasis on driving traffic to their site, and not enough emphasis on actually converting those visitors into customers. If you find that the pages of your site are frequently being visited but surfers are leaving without becoming customers, then you need to take action quickly. Typically, either your site content needs improvement or the traffic you are gaining is not targeted to the topic of your website. Look at visit lengths and paths to determine which is the case for you.

Visitors are Leaving From a Specific Page

Again, this can usually be combated with improved content on that page. If the content of an individual page is poor, but the rest of your site is good, then you will usually see that your visitors are navigating happily around your site until they reach this one page. Look for broken links, inappropriate content, or just poorly written content.

Traffic From a Specific Source is Particularly Inactive

If you look at your referrer statistics and note that one source of traffic is sending a lot of inactive visitors to your site, there may be one or more explanations. Review where the visitors are being directed and ensure that this page is well optimized for conversions. Also do some digging on the referrer's end: a banner or link placed on an irrelevant page is unlikely to yield the positive results you are looking for.

These are just some of the ways that analytics can help you and your website. Experiment and look for trends. Question anything that you notice until you find the most reasonable answer, and then take action accordingly.

About The Author
Article by Matt Jackson. WebWiseWords, website content that sells.

The Big Convergence Of Online Advertising

It's hard to imagine a Web without onramps like search engines these days. They've become an essential part of the online experience. It may be equally hard to imagine what the next phase of online advertising will be, as used to sponsored links as we are.

That may be part of the problem. Ad blindness generally becomes the necessity that drives ad invention. Even as Google slays the competition in the search ad sector and takes in 40 percent higher profits quarter after quarter, there must be something to the company's (and its competitors') willingness to dole out heaps of cash for ad display companies and prime real estate like YouTube.

Google did, after all, start weaving video into AdSense, and overlaying ads on YouTube. (How are you dealing with ad-blindness? Comment.)

So there may be (okay, there is) some credibility to Microsoft’s belief that the bulk of online ad spending in the near future will steer away from keyword advertisements. But again it is still hard to imagine too much of that bulk being redirected.

My teenager at home, Internet-savvy as he is, still uses the Google search bar to find YouTube and Wikipedia – even though I’ve fussed at him repeatedly to just type in the URL. He’s a good representative of the average surfer – the journey begins with a search engine.

Even so, Google has prided itself, in its rise from launch to the national consciousness, on its clean, unobtrusive interface, with text ads tucked neatly to the sides or differentiated up top in the prime real estate area.

But don't count on it staying that way. It's not clear what Sergey Brin meant by "the convergent optimizer" that will be launched from beta-testing, but here is the context in which it was delivered, courtesy of Tiernan Ray:

Co-founder Sergey Brin talked about a new initiative rolling out this quarter, called the “convergent optimizer,” currently in beta-testing. (Sounds spooky.) It will allow ad clients to bid on how much a Web user is worth to them, and then Google will figure the math of how much that client should spend to buy a keyword. Brin said the company was seeing higher click-through rates than expected for ads it has been placing at the bottom of YouTube videos.

The mention of YouTube in that report is telling. Surfers are again blind to online advertising — just as we ignore banners, we’ve learned to ignore the right column as well as the ads above the search listings, even on webpages where we think ads are supposed to be.

Call it Google-operant-conditioning. And, as my teenager shows, it’s difficult to unlearn bad habits. In order to be noticed, online advertising will have to converge. Microsoft knows it, Google knows it. (Have you heard of this “convergent optimizer?” Fill us in with a comment.)

But it won’t replace search. Search is the starting line. Online ads will be adapting and changing, though, to catch the searcher in the SERPs, in the social network, the video site, the blog, wherever the searcher ends up after they’ve searched.

There is an imbalance on the advertiser side, though. Part of the necessity that is driving invention is that smaller advertisers are being squeezed out of the starting point – the SERPs – by big-brand advertisers, who not only drive up the cost of PPC but also buy out the top-ranking websites in order to appear in the organic results.

Small advertisers, then, are left to hang on to the long tail for dear life. Very soon, if not already, it will be trickle-down economics applied to search advertising: the little guy gets the keywords left over after the bigger competitors buy up everything else.

So there is a necessity to provide more advertising channels, especially if Google and the rest want to maximize revenue. Maybe you can’t get a position you want on the SERPs, but if targeted well enough, your ad might be aptly placed in a YouTube or AdSense video, within a Facebook widget, delivered via geo-targeted text message, or maybe alongside a Twitter conversation.

Are big brands squeezing you out of the SERPs and good AdWords positioning? Tell how that affects your strategy in the comments.

That’s just the beginning. When television and the Internet converge, there’ll be whole new models to discuss.

About the Author:
Jason Lee Miller is a WebProNews editor and writer covering business and technology.

20 MORE Must-Have Search Engine Marketing Tools

My recent article 20 Must-Have Search Engine Marketing Tools listed 20 of the most popular time-saving tools you can use to help you with your search engine marketing efforts.

The article proved quite popular with both search engine marketers and webmasters, some of whom decided to send me their favorites that weren’t included in the list. I also discovered a few more of my own since I wrote the original article, so I decided to add to the list by reviewing another 20 tools.

So here are 20 MORE must-have search engine marketing tools:

1. SEO Toolbox
The SEO Toolbox is a collection of 11 free SEO tools developed by the team at SEOmoz, including a backlink checker, a URL inclusion checker, an outbound link checker, a domain age detector and a PageRank checker.

Price: $0

2. EditPlus
EditPlus is a 32-bit text editor, HTML editor and programmers’ editor for Windows. While it can serve as a good replacement for Notepad, it also offers many powerful features for Web page authors and programmers.

Price: Shareware (Registration fee encouraged)

3. WordPress
Like Blogger, WordPress offers hosted blogging and blog templates. Unlike Blogger, WordPress also offers a stand-alone publishing platform to enable you to host and fully manage your own blogs.

Price: $0

4. Marketing Experiments
MarketingExperiments is an online laboratory engaged in research publishing and education. Their mission is to test and document every conceivable marketing method on the Internet.

Price: $0

5. Web Page Analyzer
Web Page Analyzer is a free web page analysis tool and web page speed tester to help you improve your web site’s performance. Enter a URL and the tool will calculate page size, composition, and download time.

Price: $0

6. Web Accessibility Toolbar
The Web Accessibility Toolbar has been developed by the Web Accessibility Tools Consortium to aid manual examination of web pages for a variety of aspects of accessibility. It’s particularly helpful for site usability testing and there are versions for both Opera and Internet Explorer users.

Price: $0

7. Search Engine Friendly Layouts
SearchEngineFriendlyLayouts offers CSS-based layouts that are known to be search engine friendly (easier for search engine robots to index). All of the XHTML, CSS and JavaScript code used in the layouts is provided for use free of charge.

Price: $0

8. The Interactive HTML Tutorial
Dave’s Interactive HTML Tutorial is a tutorial for anyone who is serious about learning HTML code or who just wants to brush up on some of the basics. It includes code descriptions and integration examples.

Price: $0

9. Indextools
Indextools is another popular web site analytics program that also offers built-in PPC bid management tools.

Price: From USD 49.95 per month

10. WordTracker
WordTracker was one of the very first keyword research tools available on the Internet. It helps you pinpoint the most popular keywords for your product and services, generate thousands of relevant keywords to improve your organic and PPC search campaigns, research your online markets and find niche opportunities to exploit.

Price: From USD 30.00 per week

11. CSS Layout Techniques
CSS Layout Techniques catalogs search engine friendly web site templates based on Cascading Style Sheets (CSS). All code is made freely available for download. The site also includes links to various online CSS resources and tutorials, appropriate for both the novice and the seasoned CSS veteran.

Price: $0

12. RSS Feeds Submit
RSS Feeds Submit is automatic RSS and blog submission software that submits your feed to over 80 search engines and directories automatically. The creators claim it’s the quickest way to submit your feeds to the most popular RSS directories and blog search engines. You can also choose to submit your site manually to directories that require more detailed information about your feed.

Price: USD 29.95

13. iBusinessPromoter (IBP)
iBusiness Promoter (IBP) is a suite of professional web promotion tools created by Axandra.com that helps you with all aspects of website promotion and search engine optimization. It includes tools for optimizing your pages and links, researching keywords, submitting your site to search engines and directories and search position querying to determine how your site pages are ranking for particular keywords.

Disclaimer: Some of the functions performed by this tool (e.g. automatic submissions and search rank querying) are discouraged by Google in their Webmaster Guidelines.

Price: From USD 249.95

14. Bid Rank
BidRank is a desktop application that you run on your PC to help you manage your PPC campaigns and automate the keyword bidding process. There are two versions of the product available: BidRank for Yahoo!, a Yahoo!-approved third-party bid management tool for Yahoo! Search Marketing campaigns, and BidRank Plus, which works with multiple pay-per-click search engines, including Google AdWords, to help you manage multiple PPC keyword accounts.

Price: From USD 14.90 per month

15. Hot Banana Web CMS
Hot Banana is an easy-to-use Web Content Management System (Web CMS) that helps marketers build and manage SEO-friendly Web sites that can be automated and optimized for maximum lead generation and conversion performance. Content management systems are notorious for being SEO-unfriendly, but this one is purposely built to avoid such problems.

Price: From USD 329.00 per month

16. WebPosition
WebPosition is a powerful suite of tools aimed at improving your web site’s search engine positioning and monitoring performance. WebPosition allows you to review your search engine rankings, target your keywords, optimize pages using built-in expertise, submit URLs to search engines and analyze conversions using WebTrends site metrics.

Disclaimer: Some of the functions performed by this tool (e.g. automatic submissions and search rank querying) are discouraged by Google in their Webmaster Guidelines.

Price: From USD 149.00

17. Competitive Intelligence
Trellian's Competitive Intelligence provides the means to monitor your competitors' web sites to identify their major traffic sources. You can find out which sites are responsible for sending traffic to their pages, including search engines and the search keywords used.

Price: From USD 99.95 per month

18. HTML Toolbox
The HTML Toolbox from NetMechanic is an online tool that helps you discover HTML errors and syntax problems that prevent browsers from processing your HTML and prevent visitors (both humans and spiders) from reading your site. HTML Toolbox automatically fixes HTML problems upon request with one quick click. The Toolbox includes several tools in one: an HTML Checker and Repairer, a Spell Checker, an HTML Validator, a Browser Compatibility Checker and a Load Time Checker.

Price: Free for up to 5 pages

19. Web CEO
Web CEO claims to be the most complete SEO software package on the planet. The latest version of this SEO/SEM software provides the ability to research keywords and keyphrases that will bring the most targeted visitors to your site; optimize your Web pages for better search engine visibility; submit your site to search engines; research, analyze and build links; manage pay-per-click campaigns; track your positions in search engines; review site traffic statistics; get rid of errors on your sites; find bad links before your visitors do; edit your Web pages; upload any file or folder to your site; and monitor the availability of your web site.

Disclaimer: Some of the functions performed by this tool (e.g. automatic submissions and search rank querying) are discouraged by Google in their Webmaster Guidelines.

Price: From USD 199

20. AdWatcher
AdWatcher is a suite of tools designed to help you get the maximum ROI on every advertising dollar you spend on online marketing campaigns, be it Google AdWords, banners, text links, or email marketing. It detects and combats click fraud and allows you to manage all of your ad campaigns from one easy-to-use interface. Essentially, it provides click fraud monitoring and ad tracking.

Price: From USD 29.95 per month

So there you have ANOTHER 20 time-saving tools to help you with your search engine marketing efforts. Now there’s no excuse for avoiding SEM. Happy site marketing!

About The Author
Article by Kalena Jordan, one of the first search engine optimization experts in Australia, who is well known and respected in the industry, particularly in the U.S. As well as running her own SEO business, Kalena is Director of Studies at Search Engine College - an online training institution offering instructor-led short courses and downloadable self-study courses in Search Engine Optimization and other Search Engine Marketing subjects.

XQuery: The Search Language For A Multi-Platform Future

The advent of wireless internet access has made web design a very complicated matter. Previously, all web browsers were created equal. HTML was the only language used to create web sites, and it was only possible to go online with a desktop PC.

Since the turn of the century, cyberspace has changed. It is now possible to surf the world wide web using a wide variety of wireless gadgets, such as cell phones, palmtops, laptops, computer screens in automobiles, etc. As a result, new programming languages and specifications that are more versatile than HTML have evolved to create websites that can be displayed on the web browsers utilized by these various devices.

Languages such as XML, XHTML, XSL, and a host of other innovations were developed because web sites coded in basic HTML were not being displayed properly on the browsers installed on all these neat gadgets. XML enables data to be handled across all platforms because an XML document is a simple text file that merely defines data; it does not tell the web browser how to display the data. XSL and XHTML were created so that XML could be transformed into a web page.

Now that you have a basic understanding of how and why programming has changed, you are ready for a brief introduction to the main topic of this article, XQuery. XQuery was invented so that there was a way to query data stored in an XML document, much the same way SQL is used to query a database.

XQuery uses simple functions to query a document. An XQuery function looks a little like a JavaScript function in that it uses parentheses containing an element that is to be the object of the function. With XQuery, the element in parentheses is typically the name of the document or file to be queried.

To find what it is looking for within that file, XQuery narrows its search by using path expressions that look a lot like the path of an ordinary file stored on your computer, with the various subsets of data within the XML file separated by forward slashes. The predicate is the final component of an XQuery function. The predicate tells the function exactly what information, data, or range of data within a particular subset is to be extracted and returned to the user.

For example, the XML file for a dating website would contain a list of men and women who have posted their profiles on the website. Some of the people in the XML file might be classified as single, while others might be classified as divorced. The XML file would also contain the age of each man and woman.

If a woman were to visit that dating website and search for profiles of single men over the age of 30, that search request would be converted into an XQuery expression: the path would point the query at the list of men classified as single, and the predicate would instruct the function to return only the profiles of those single men who are older than 30.
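As a minimal sketch, that request might look like the following in XQuery; the file name, element names and structure are hypothetical, since the article doesn't define an actual schema:

```xquery
(: single men over 30, from a hypothetical profiles.xml :)
for $m in doc("profiles.xml")/profiles/men/person
where $m/status = "single" and $m/age > 30
return $m
```

The path expression walks from the document root down to the individual person elements, and the where clause is the predicate that filters them.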

Learning how to use XML, XHTML, and XQuery is of critical importance to every web designer and programmer. There are now so many ways to connect to the internet, on platforms and devices that are no longer compatible with many elements of HTML. Web designers need to be conscious of this and start designing web sites that utilize XML and XQuery.

About the Author: Jim Pretin is the owner of http://www.forms4free.com, a service that helps programmers make an HTML form.

The Case Against Outsourced SEO

About a week ago I got a telephone call from a college buddy of mine named Paul who runs a soon-to-launch online business. Here’s how the conversation went:

Paul: “We want to hire your company to do the SEO for us. Whatever the price is, we can afford it.”

Me: “Tell me a little more about your company and exactly what you expect to achieve from search.”

Paul: “We want to rank Number 1 in Google for EVERYTHING in our industry, and I know you can do it for us.”

Me: “I’d be happy to consult with your team to make sure you understand the principles of SEO and get off on the right foot, but I think you’re better off doing the work yourself.”

He was perplexed. Why wouldn’t we want to take on his SEO work? It has nothing to do with him or his company. It has everything to do with the misunderstood nature of what it takes to consistently rank high in natural search. The absolute best companies I’ve worked with make every decision with SEO in mind. Everyone in their organization – from management to programmers to marketing – is thinking about the search impact of their decisions. For that reason it makes sense to hire a consultant or to learn it yourself, but not to hire an outside firm to outsource your entire SEO campaign to.

Most of the time when companies outsource SEO they do it with the mentality of “here you go, you handle it, we expect results.” They view it as an entirely separate entity and not as a core value that needs to be instilled in their organization to be successful. That’s why outsourced SEO just doesn’t work: your organization still makes decisions the old way.

How will this programming change impact our search results? Can we build link-building into our marketing campaign? What adjustments can we make so that both are working in harmony to achieve our objectives as a company and rank as high as we can? There is no incentive to learn about search if someone else is handling it for you, and consequently you probably won't be asking these important questions when making a critical business decision.

Paul was still a bit confused with that answer. So let’s take a closer look at some of the key components necessary for SEO success and what needs to take place for them to be accomplished:

Keyword Research – this entails researching how frequently phrases relevant to your site are searched. I like to use the SEO-Book tool or the free version of Wordtracker. Keyword research is important because it will impact your site structure and title tags (widely regarded as the most influential factor in how high you rank), and will help identify opportunities in your industry (if a term is searched a lot but there aren't a lot of good results, you may have just identified a great expansion opportunity for your company). This is best done by either a consultant or the internal head of your SEO campaign, which should be someone in upper management.

On-Site Optimization and Site Structure – this is what most people think of when they think of SEO: what changes should be made to your site so that search engine spiders have the best chance of crawling it, understanding the content, and ranking you accordingly? Most often, this involves changes to Title/META tags, cleaning up source code so that it's proper HTML, moving CSS and JavaScript to external files, adding sitemaps, modifying internal linking structure and anchor text, and several other standard changes that eliminate potential crawling and indexing issues. This is best done by your programmer(s) so that they understand the importance of the changes and make them part of their routine in the future. These changes can be suggested by a consultant, but will only really be successful if the programmers are on board.

Link building – this is probably the second most common task associated with SEO. By now you already know that you need one-way incoming links from relevant sites, with applicable anchor text, to rank high. Many outsourced SEO firms will either engage in elaborate link exchanges or purchase paid links for you – both of which are obsolete in terms of having any positive impact on your rankings, and can now potentially get you penalized. The best one-way link building techniques – press releases, content syndication, blogging, product syndication, viral videos, etc. – all require a LOT of input from you to be successful. Most of the time they should be integrated into your existing marketing plan to have the highest chance to thrive. For example, most companies already issue press releases when they have newsworthy announcements, so it's a natural extension to email the release to online news sites and blogs, and to use an online distribution service. I think successful link building is best done by your marketing department as part of your overall marketing strategy. It's fine to have a consultant help put the plan together, but the actual implementation should be done by you.

Analytics – this involves the measurement and tracking of your site's SEO and marketing campaign. Previously, this could be tedious for small sites and I might have recommended outsourcing. But with the new version of Google Analytics, a properly configured account will tell you everything you need to know about where every single sale on your site came from. Your programmer or consultant should be able to set it up for you and configure the reports to track only the most important metrics for your organization. I also like to track incoming links and search engine rankings for a site (two things that Analytics does not track), but those can easily be tracked with the Marketleap Link Checker and the Digital Point Keyword Tracker.

In the end, whether you decide to hire a consultant or tackle SEO internally with the vast information available online, you still need to make SEO part of your organization's objectives for it to be a success – something that outsourcing usually doesn't achieve.

About The Author
Adam McFarland is the co-founder of Faceup-Sites and the author of the Faceup Web Marketing Book: The Perfect Combination of SEO, SEM, and other tactics to maximize results without breaking the bank. Faceup-Sites specializes in helping businesses develop highly customizable sites that are easy to update, visually pleasing, and search engine friendly at a fraction of the cost of what most developers charge.

The Death of Paid Content Has Been Exaggerated!

There is a debate raging on the internet at the moment about whether the move by some of the major national newspapers in the US, away from subscription to a free, advertising-driven business model, is a signal that the days of paid content are over.

This debate shows a lack of understanding of content publishing on the web. The national newspapers are failing in the subscription market because most of their content is available elsewhere for free. If there is a free alternative, guess what: people will always take the free option.

In the early days of the web, brand was enough to sustain many of the national newspapers' online sites, but now brand is not enough. There are many credible sources creating content, and the internet community is getting pretty good at ensuring quality floats to the surface and the dross is trampled underfoot.

So does the fact that huge national newspaper sites are being forced to go free mean paid content is dead? The figures suggest not. In fact, they suggest that the market has blood surging through its veins. According to the Online Publishers Association, paid-for content billed over $2bn in 2005 and is expected to reach over $5bn in 2007. In Europe, according to a study for the EU, revenues will jump from €849m in 2005 to €2bn by 2010. So if the large national newspapers with their huge audiences are not generating subscriptions, who is? The answer is highly focused niche websites. As Gary Hoover said at the recent SIPA (Specialist Information Publishers Association) Conference, "In the information business all the money is in the niches".

At the specialist information end of the market, knowledge and expertise are still a limited resource, and there are many reasons why people pay to get access to them. These include:

• When knowledge is restricted to one individual or a small group of individuals e.g. share tipping and investment information – www.t1ps.com and Bull Market Report www.bullmarketreport.com

• When knowledge is inextricably linked to one personality or celebrity e.g. Jancis Robinson's expertise in wine, www.jancisrobinson.com

• When the editor has privileged access to source material e.g. insider industry information like www.beernet.com

• The timeliness of information. If one website gets access to information quicker than other sites, people will pay for that time advantage e.g. the fashion trend prediction site www.wgsn.com

• A specialist website aggregates information which saves the reader time and hassle e.g. www.lvtbulletin.com provides analysis of court judgements that are relevant to landlords.

• The website hosts a specialist community. Charging for access acts as a quality filter to ensure all members have a reason and interest in participating e.g. the many collectors clubs and niche industry groups such as www.restaurantowner.com

• People pay for exclusivity. Many paid-for websites are driven by people wishing to be a member of a small elite group. It’s much the same as private members clubs or exclusive golf clubs in the real world e.g. www.smallworld.com

• People who are passionate about a subject often want to submerge themselves in it and are prepared to pay to mix with like-minded people e.g. fans of the T-Bird car www.tbirdfans.com

• Training sites that give people access to information that will improve their skills or knowledge e.g. the photography site www.photographytips.com and the Writers Bureau's writing course www.writersbureau.com

• Help sites that enable people to improve themselves or their health e.g. the South Beach Diet, www.southbeachdiet.com and the What to Expect Pregnancy Club, www.whattoexpect.com

• Sites that save people time e.g. business book summary sites such as www.bookbytes.com and www.redbooks.com, and the site that provides preachers with downloadable sermons, www.preachingtoday.com

What is driving this revolution is the combination of cheap and simple publishing tools, zero cost distribution via the web and the access to a global audience via the search engines. Suddenly individual experts can easily share their knowledge and become global celebrities in their specialist areas of interest.

Chris Anderson has researched this phenomenon in his book “The Long Tail: How Endless Choice is Creating Unlimited Demand”. He observed that:

“When you can dramatically lower the costs of connecting supply and demand, it changes not just the numbers, but the entire nature of the market. This is not just a quantitative change, but also a qualitative one, too. Bringing niches within reach reveals a latent demand for specialist content. Then, as demand shifts towards niches, the economics of providing them improve further, and, so on, creating a positive feedback loop that will transform entire industries – and the culture – for decades to come”

Historically, the distribution of knowledge and expertise was restricted by the cost of distributing it via magazines, books and newspapers. Editors, literary agents and publishers were the gatekeepers who decided and controlled what was worth printing. Chris Anderson compares this to islands visible above an ocean, where the waterline is the economic threshold for what is worth printing. The islands represent the publications that are popular enough to be above that line, and thus profitable enough to be offered through the publishers' distribution channels. However, islands are just the tips of vast undersea mountains. When the cost of distribution falls, it's like the water level falling in the ocean: all of a sudden, things are revealed that were previously hidden. And there is much, much more under the waterline than above it. What we are now starting to see, as online production and distribution costs fall, is the shape of massive mountains of choice where before there was just a peak.

This can be illustrated by the fact that there are approximately 75,000 print magazines, newsletters, journals and newspapers in the UK and US, yet there are over 15m active blogs and millions of niche content websites.

Conclusion

The future of internet publishing is in the niches. Subscription and advertising revenues will continue to migrate down the long tail to the niche sites. Specialist publishers who are focused on creating the best site in their subject area in the world are set to prosper. The mass market publications will continue to see their audiences and revenues squeezed.

About The Author
SubHub provides an all-in-one solution to enable you to rapidly design, build and run your own content website. Publish for profit on the web. Website: SubHub.com. SubHub Articles Feed

Pay Per Click Party Over?

First the good news. Pay per click, as it has been perfected by Google, is unarguably the Web’s highest business achievement to date. Google has become an international corporate icon worth more than some of the most famous name brands of our generation like Disney, McDonalds and Hertz.

Even more impressive is that pay per click has empowered literally hundreds of thousands of entrepreneurs in their web businesses. Quite a few sites which, prior to pay per click, would have had trouble producing income are earning more than $10,000 per month. Large sites such as the NY Times, CNN, BusinessWeek and ESPN are also using pay per click to supplement their ad revenue.

Pay per click seems to be booming … but is the party soon to be over?

Blogger Steve Rubel came up with five reasons why pay per click is in trouble. I'll take a look at each one of his points below.

1) Clutter
Google didn't invent text ads, it just popularized them. Its AdSense program has made it simple for sites to add them, which has caused a glut of text ads everywhere you click. I see the "clutter" problem as an AdSense clutter problem that leads to a phenomenon all web publishers have dealt with … ad blindness.

When people get used to seeing a certain style of ad across the web, click rates go down… and down and down! That's why banners got click rates of up to 8 percent when they were first introduced by Wired.com, while good click rates now are .25 – .75% for top banners. Clutter contributes to ad blindness, which causes lower click rates, which could mean the glory days are over for pay per click.

As for clutter in search results, I see this as a problem of an educated user base. When text ads first appeared in search results, the average user didn't identify them as ads, so they had a great click rate. Now only the most casual users still don't realize what is and isn't an ad in search results, resulting in much lower click rates. This problem will only get worse over time.

What can search engines do to combat this "clutter effect"? The answer is to make the ads look less like ads. If 3-line text ads are obvious ads, how about 1-line ads integrated within the search results, which might look something like this (a hypothetical mock-up):
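Acme Widgets – discount widgets, free overnight shipping – www.acme-widgets.example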

Google and the other engines need to find a way to make the ads blend better with search results and content. How far they can go with this strategy without alienating the searcher or site visitor is the question.

2) Declining Relevance of Traffic/Transition to Cost Per Action
People are still clicking in big numbers, but evidence suggests they are not converting as much. My guess is that conversions of ads in search results are not as much of an issue as conversions from AdSense partner clicks. This has become a bigger problem as the AdSense program has grown.

3) Rising Costs
Click costs have gone up substantially since the good old days of GoTo.com, where you could buy clicks for as little as one cent. Marketers must justify their advertising expenditures based on their impact on sales. Will the increasing cost of text ads for most keywords cause cuts in search marketing budgets?

The rising cost has its biggest impact on small businesses, where price matters more. When small businesses are priced out of buying popular keywords, they may eventually just give up on search marketing – not good in the long run for the search engines.

4) Marketers Spread the Ball Around
The Internet has changed since search engines began selling text ads. You now have social media like MySpace, Facebook and Twitter. We have seen the rise of widgets, where call-to-action ads will likely be integrated. Video is now mainstream, with sites (like WebProNews) making video part of everyday content. The Internet is becoming accessible via many devices, making it an accessory of life!

Marketers now have the ability to market products and services that are integrated into the user experience. Content can be ads and ads can be content. Marketers are becoming smarter in how they use the Internet. It is just a matter of time before a critical mass of advertisers see pay per click text ads as a tool past its prime.

5) Search Ads Are Viewed as Untrustworthy
As with anything that gets a lot of bad press (think click fraud) people start to wonder about its trustworthiness. I don’t think this is critical yet, but Google and the others must find a better way to detect, deter and prevent click fraud.

Additionally, the engines must be more selective about their PPC sites. There are many sites that exist solely to get “pay per click” clicks. This is bad for Google, Yahoo and Microsoft because it leads to a general uneasiness among ad buyers.

About the Author:
Rich Ord is the CEO of iEntry, Inc. which includes WebProNews and 100 other sites.