Knol – Wikipedia’s Rival

There’s a new kid on the online block named Knol, and even at this early stage of development, some people are already predicting that it could bring about yet another significant change to the way we share information on the Internet.

Knol is a new Web service being developed by Google that is meant to serve as a virtual storehouse of knowledge on the Internet. With content contributed by various experts on different topics, it will behave much the same way that Wikipedia does currently. In fact, many industry experts have suggested that Knol is set to become a direct competitor to Wikipedia and other similar web sites.

Google is, of course, the go-to site as far as search engines go, being by far the most popular search engine today. If Knol is as successful in drawing a widespread following as its developers hope, it could bring about Google’s transition from a search engine into a company that creates and publishes Web content.

Some industry observers warn that one problem that could potentially arise is that Google’s objectivity in presenting search results could be compromised.

Knol – the name of which is derived from the word “knowledge” – is being developed to allow people to create Web pages on virtually any topic. When completed, it will include several features that allow users to perform a number of tasks, such as submitting comments, rating individual pages and suggesting changes.

We mentioned earlier in this article that Knol has been compared to Wikipedia by many industry analysts. While there are in fact many similarities between the two services, the main difference is that Wikipedia allows virtually anyone to edit an entry, while Knol only allows the author of each particular “knol” – which is what the individual pages in the service will be called – to do so. This means that the same topics could have many different authors with sometimes contrasting – or even competing – points of view.

Google has stated that the main thrust of the Knol project is to focus attention on authors who have genuine expertise on particular topics. As Udi Manber, vice president for engineering at Google, wrote recently on the Google corporate blog, the Internet has evolved largely without a standardized means of highlighting the author’s name on each web article. He goes on to say that the company believes that knowing who wrote a particular article will considerably help users make better use of the Internet and its content.

Manber also stated that another important goal of Knol is to cover a wide range of topics, from the sciences to health to history. Eventually, Google hopes Knol will become the first stop for research on any topic. Today it is Wikipedia that fills that role, and its pages show up at the top of the results of Google and many other search engines more often than not.

Some in the industry have suggested that this latest move by Google is driven by the unprecedented growth of knowledge-sharing web sites such as Wikipedia, and that Google feels the need to have a strong presence in that particular area.

Wikipedia is by no means the only web site that offers that type of service. Many other companies have taken slightly different approaches in functioning as knowledge repositories on various topics on the Internet. These services include Squidoo, Yahoo Answers, About.com and Mahalo.

In spite of the widespread popularity of these services – as well as the existence of many free tools that let experts and regular people share their knowledge online – Manber said that Google feels it is still not easy enough for the average user to do those things.

Interestingly, considering all the hype and excitement currently surrounding the news of Knol’s existence, Google has refrained from discussing the project beyond these initial details, and has even said that it is still an experimental project at this time. This means that, just like many other Google tests that never saw the light of day, Knol could end up never being released publicly at all.

As for Wikipedia, site founder Jimmy Wales has downplayed his site’s comparison with Knol, saying that while Wikipedia’s goal is utmost objectivity in its content, with each individual article being the sum of the collective knowledge of its various authors, Knol’s model will likely result in highly opinionated and possibly even contradictory articles on even the simplest of topics.

Another important distinction is that Wikipedia is a strictly non-profit web site that does not carry any type of advertising, while Knol is a decidedly more commercial venture, with its content authors earning revenue from any Google ads on their site.

Editor’s Note: Currently, Knol is accessible by Google invitation only. Some additional information on Knol can be found at:

Google
Mashable.com
blogoscoped.com
Google Blog

About The Author
Mikhail Tuknov offers search engine optimization marketing services – http://www.infatex.com

Mistakes Of Pay Per Click Advertising: The Terrible 10

Vigilance, micromanaging and attention to detail can help you avoid some common and costly mistakes of PPC advertising. What are those mistakes?

Here are the terrible 10 that are typical to most pay per click campaigns.

Too Many Keywords Per Ad Group

It’s important to target your ad to be as relevant as possible. Don’t group all your keywords into one or two ad groups. Break them out. Keep them tight. This gives you more control over ad variables so that you can be as relevant as possible.

Not Using Negative Keywords

Negative keywords reduce unwanted impressions and, more importantly, unwanted click-throughs. With the increasing priority given to “quality scores” and click-through rates in the PPC engines, it’s key to trim the fat from your keyword campaigns. If your company sells “widget management software”, be sure that you have terms like “-serial” or “-free” assigned as negative keywords (unless, of course, you offer it for free in some manner). You can find good negative keywords in your log files or when you build your lists.
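To make the mechanics concrete, here is a rough sketch, not tied to any particular PPC platform, of how a negative keyword list screens out unwanted search queries before an ad is shown. The negative keyword list and sample queries are invented for illustration.

```python
# Illustrative sketch: screening search queries against a negative keyword list.
# The negative keywords and sample queries below are hypothetical.

NEGATIVE_KEYWORDS = {"free", "serial", "crack", "torrent"}

def query_triggers_ad(query: str) -> bool:
    """Return True only if the query contains none of the negative keywords."""
    terms = set(query.lower().split())
    return terms.isdisjoint(NEGATIVE_KEYWORDS)

queries = [
    "widget management software",
    "free widget management software",
    "widget management software serial key",
]

for q in queries:
    status = "show ad" if query_triggers_ad(q) else "blocked by negative keyword"
    print(f"{q!r}: {status}")
```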

Weak Testing

Split-testing your ads is critical. Even the smallest of changes can boost results. In addition to testing your ad copy’s “call to action” or value statements, every ad has multiple variables to test: the title, the two lines of copy and the display URL can all be optimized. If you don’t have time for hands-on testing, a good professional pay per click management company can run daily split testing for you. You’d be surprised how well this can pay off.
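If you want to know whether one variant is genuinely beating another rather than just looking better by chance, a simple two-proportion z-test on the click-through rates is enough for a first pass. The sketch below uses hypothetical impression and click counts; the 1.96 cutoff corresponds to roughly 95% confidence.

```python
# Minimal sketch: comparing click-through rates of two ad variants in a split test.
# Impression and click counts are hypothetical.
from math import sqrt

def z_score(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test for the difference in CTR between variants A and B."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# Variant A: original headline.  Variant B: new call to action.
a_clicks, a_imps = 120, 10_000
b_clicks, b_imps = 158, 10_000

print(f"CTR A: {a_clicks / a_imps:.2%}   CTR B: {b_clicks / b_imps:.2%}")
z = z_score(b_clicks, b_imps, a_clicks, a_imps)
verdict = "difference is significant at ~95%" if abs(z) > 1.96 else "keep testing"
print(f"z = {z:.2f}  ({verdict})")
```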

Poor or Non-Existent Tracking

Of course, testing your ads and fine tuning your keyword lists only works well if you are tracking results. The search engines will tell you what your click-through rates are … but you need bottom-line results. You need to know your return on investment or what your cost per action is. It’s not enough to know that you spend $5,000 and get back $10,000. You might be able to spend only $3,000 and get that same $10,000.

Not Getting Keyword-Level Tracking

Proper and exact analytics – or using an experienced pay per click management company – is essential to get the data you need. If you have keywords that are not performing and are draining your account on a daily basis, you are throwing money away. Getting results down to the keyword level allows you to adjust bids for maximum effect. If you have one keyword with a $1.34 earnings per click and another at 37 cents, that is key information that allows you to maximize profits. Lower one bid if you are paying above your “EPC” and raise another to eke out more profit from that sweet-spot keyword. Don’t waste money on a daily basis.
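As a sketch of what keyword-level data lets you do, the snippet below computes cost per click, earnings per click and ROI for a few keywords and flags the bid adjustment each one suggests. The keyword names and figures are invented.

```python
# Sketch: keyword-level spend and revenue rolled up into CPC, EPC and ROI,
# with a simple bid-adjustment rule of thumb. All figures are hypothetical.

keywords = [
    # (keyword, clicks, cost, revenue)
    ("widget management software", 420, 310.00, 563.00),
    ("free widget software",       900, 333.00, 120.00),
    ("widget software reviews",    150,  55.50, 201.00),
]

for kw, clicks, cost, revenue in keywords:
    cpc = cost / clicks      # average cost per click paid
    epc = revenue / clicks   # earnings per click generated
    roi = (revenue - cost) / cost
    if cpc > epc:
        action = "lower bid or pause: paying more per click than it earns"
    else:
        action = "profitable: room to raise the bid and capture more volume"
    print(f"{kw:30s} CPC ${cpc:.2f}  EPC ${epc:.2f}  ROI {roi:+.0%}  -> {action}")
```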

Not Specific Enough Keywords

Some broad and generic keywords can certainly push a ton of traffic to your site. They may even be very successful. Often, however, they can also do just the opposite — drain your funds with poor results. A user searching on one of these generic phrases is often doing research in an early part of the buying process. Knowing your keyword-level results and filtering out bad variations with negative keywords can help you get a true read on these generic keywords.

Not Going After Long-Tail Keywords

This follows the above item on generic keywords. Building a list and individual ads for long-tail keywords can be a major time-sucker. It can also be profitable if the task is performed correctly. Earnings per click will likely vary widely between a generic keyword like “mp3 player” and longer phrases like “sony mp3 player” or “sony 2GB S610 walkman video mp3 player”. One consumer is doing research; the other knows what they want and is most likely looking to purchase.

Not Separating Content and Search Networks

An easy way to get scorched by poor performing traffic or even click fraud is to not separate your search network ads from your content network ads. Chances are that if you don’t know what the difference is, they are not separated in your account – and bad keywords are draining your funds daily. You are better off building separate campaigns for your keywords on the content and search networks.

Not Attracting Local Clients Through Geo Targeting

If you draw most of your business from a local area, the big three PPC engines allow you to geo-target your keywords to that area. This will bring the local market to your doorstep on non-local keyword phrases. This can be hugely profitable.

Not Frequently Monitoring Your Accounts

Not everyone has time to run split testing on a daily basis or to check EPCs frequently (even though you should, because not doing so is costing you). That said, there is still a high number of advertisers who seem to ignore their accounts for days … or even weeks … or (don’t tell me you’re doing this!) months. The big PPC search engines are increasingly cracking down on poor performing keywords, smacking advertisers with that “Inactive for Search” status for individual keywords. When this happens, you lose traffic and you lose profits. If you are investing heavily in PPC, you can’t just turn your back on your account for days at a time.

The Terrible 10 of Pay Per Click Advertising is a lot to consider, but it’s vital for healthy pay per click campaigns. Whether you can actively manage your PPC accounts at this level or you need to hire a pay per click management company to do it, vigilance and precision can make a huge impact on your bottom line.


About the Author: Josh Prizer is a Senior Account Executive and PPC expert for Zero Company Performance Marketing, a pay per click management company. Visit us now to learn more about how to improve your PPC advertising campaigns and performance.

Site Search As Key Performance Indicator

Do you know what’s happening in your own site search? Understanding site search is one of the most important KPIs (Key Performance Indicators) you can measure.

According to a Forrester study, over 50 percent of major web sites fail in search usability. When your search fails to deliver, your conversion suffers. A low-converting site will result in fewer sales and decreased revenue.

Search is not just another nice feature to have. You have to think of search as a revenue generating part of your business.

Your company works hard to drive traffic to your site. Many visitors will use your on-site search instead of browsing through your site. Online shoppers want to use site search to expedite the shopping experience. The faster and easier they can locate the product they are looking for, the more likely they’ll buy on your site. The more roadblocks you set in place to inconvenience the shopper, the more likely they’ll buy from someone else.

Do you know what they are searching for? Are you in any way measuring what search phrases are queried on your site? It is not enough to have site search as a feature. You must analyze it. You have to understand it. Then, you have to make adjustments based on your findings.

The best place to start learning about your site search is the search log files. If you don’t monitor your log files, you will fail to gain insight into what your customers are looking for on your site. Understanding site search is a KPI that should be part of your tactical operations. Learning about site search will tell you what your customers are looking for.
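As an illustration of the kind of mining you can do, here is a small sketch that reads a hypothetical site search log (a CSV with timestamp, query and result_count columns; your own log format will almost certainly differ) and reports the most frequent queries along with the share of searches that returned nothing.

```python
# Sketch: mining a site search log for top queries and zero-result searches.
# Assumes a hypothetical CSV log with columns: timestamp, query, result_count.
import csv
from collections import Counter

def analyze_search_log(path: str, top_n: int = 10):
    query_counts = Counter()
    zero_result_queries = Counter()
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            query = row["query"].strip().lower()
            query_counts[query] += 1
            total += 1
            if int(row["result_count"]) == 0:
                zero_result_queries[query] += 1

    print("Top queries:")
    for query, count in query_counts.most_common(top_n):
        print(f"  {count:6d}  {query}")

    zero_total = sum(zero_result_queries.values())
    print(f"\nZero-result searches: {zero_total} of {total} "
          f"({zero_total / max(total, 1):.1%})")
    print("Most common dead ends:", zero_result_queries.most_common(5))

# analyze_search_log("site_search_log.csv")
```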

In addition to understanding what site visitors are searching for, you have to test what results those searches yield. For example, if your customers are searching for “return policy”, what results are they shown? Are the search results relevant to the search queries? If the result you get is not the best possible result, you have to tweak your search engine.

The top few results must be relevant, because searchers are not interested in reading deep into your search results. Result number 10 gets far less attention than result number 1.

Every reasonable search phrase should produce relevant results. For example, if the site searcher types “return policy” in the search field, the search should return something useful. Every e-commerce site should have a return policy; therefore, the site search should yield the relevant page.

One of the worst possible outcomes for a search query is no result at all. If a user types any relevant key phrase, it should produce relevant results. If it doesn’t, your search is failing your customers.

Site search is a tool to enhance customer satisfaction. If it works as it is supposed to, it has done its job. If it fails, it becomes a frustrating experience instead of a positive one, resulting in lower conversion rates, lost sales opportunities, loss of revenue and unhappy site visitors.


About the Author: George Meszaros – Webene.com, web site design and online marketing.

The Secret Sauce of Google Success

What do you need to get top rankings on Google? There are many ingredients in the mix, but here are three of the most important that you need to concentrate on.

1.) Keyword Relevant Copy and Content.

Whatever keywords you want ranked in the Search Engine Results Pages (SERPs), be sure that you have enough copy and content about those specific words to give Google a reason to rank you in the first place.

If, for example, one of your priority keywords is “virtual assistant software”, create a separate page or section for this keyword (at least a few paragraphs), using the keyword in the headline, the first sentence, the last sentence and wherever else it makes logical sense, in order to achieve the keyword frequency and “density” that search engines are looking for. Ideally, each page will target only one or two keywords and will be very focused on that specific topic.

Additionally, including articles, PDF files or news items about your keyword on this specific keyword page will help improve your chances of a better ranking. Give Google a reason to rank you at the top. He with the most relevant copy wins – so make it rich and deep.
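If you want a quick way to sanity-check how often a target phrase actually appears in your copy, a rough word-count script is enough. The sketch below uses an invented snippet of page copy and a hypothetical target phrase; the density figure it prints is only as meaningful as whatever threshold you decide to hold it against.

```python
# Rough sketch: counting occurrences of a target phrase and the resulting
# keyword density. The sample copy and phrase are invented.
import re

def keyword_density(text: str, phrase: str):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    occurrences = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    density = occurrences * len(phrase_words) / max(len(words), 1)
    return occurrences, density

page_copy = (
    "Virtual assistant software helps small teams delegate routine work. "
    "Choosing virtual assistant software starts with listing the tasks you hand off."
)

count, density = keyword_density(page_copy, "virtual assistant software")
print(f"phrase appears {count} times, keyword density {density:.1%}")
```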

2.) Can the Search Engines Read and “Crawl” All the Pages and Content on Your Site?

Probably the biggest surprise to most marketers is that the search engines are often unable to navigate or read much of the content on their websites. If the engines can’t read your copy, then it’s not surprising that you’re not getting the rankings or traffic to your website that you aspire to.

The only thing a search engine can read is words. Sites that are dynamic, or created in other formats such as Flash or Java, often can’t be read by the search engines. Even when they can read the content on your site, many times they can’t navigate it properly and just bounce “off the walls”, because there are no specific links or site map to tell them the proper sequence or where to go next.

Want to see what Google is indexing on your website? Go to Google and type in: site:www.yourdomain.com . This will show you the title and description of the pages of your site that Google knows about. If they are all the same, or no title or description is listed, chances are very good that your site is invisible to your target market.

3.) Links… Why Are They So Important?

Link popularity is one of the most important factors search engines use in determining where you will rank for your keywords and phrases, as it helps them determine how important or popular your site is and what its reputation is. In essence, the search engines are saying “we’re going to give top ranking to pages that have important and relevant sites linking to them”.

Link building is the process of finding related/relevant websites and receiving a link from them to you. Natural linking occurs when a site has good content that others will link to. But to get those links, people have to know about you. It is a catch-22. Building links has gotten sophisticated in the last couple of years. Today you need a mixture of links from many sources, including articles, press releases, social bookmarks, directories and social media sites.

How many links do you need to have? It depends on the individual keyword or phrase you want to be found under and how the links are structured. The search engines look at inbound links as a popularity contest but more importantly, they are looking at the quality of the pages that are linking to you and the “anchor text” – the “clickable link” and what it says about the page that it links to. The key to linking is to have the right anchor text on a link that points to a page that has content using the same keyword phrase.

You do not want to boost the overall number of links by more than 10-15% each month for an established site with history, because this may trigger a filter from the search engines as an indicator of artificially inflated link popularity. New sites have an advantage, since they have no established history and link building can be done at a faster rate. Linking is critical not only for your search engine placement, but also because it helps stabilize your positions in the search engines and delivers traffic directly from the sites that link to you. Linking is not a once-and-done process, though; generating new links is an ongoing effort.
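As a back-of-the-envelope illustration of that guideline, here is the arithmetic for a hypothetical established site with 1,200 existing inbound links.

```python
# Tiny sketch of the 10-15% monthly guideline above. The link count is hypothetical.
existing_links = 1_200
low_rate, high_rate = 0.10, 0.15

print(f"Suggested cap on new links this month: "
      f"{int(existing_links * low_rate)} to {int(existing_links * high_rate)}")
```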

In summary, successfully implementing the above three strategies, either through your own efforts or by employing search engine promotion specialists, will deliver the “triple punch”: the knockout blow you need to get top rankings on Google and the other search engines as well.

About The Author
Article by Terry Mickelson Founder of PageViews.com, one of the foremost search engine optimization companies specializing in B2B search engine optimization and link building programs. For further information as well as a free ranking report on your website, contact Terry Mickelson at 480-556-9752 or email tmickelson@pageviews.com.

Google vs. MSN on Paid Links and Cloaking

Don’t buy paid links! Paid links are bad! Don’t cloak either. Search engines don’t allow it. You’ve all heard this before right?

Well as it turns out, not ALL search engines are as hard line on these issues as some claim to be. MSN Live specifically has now gone on the record that none of the above are necessarily taboo.

Over the holiday break, Jeremiah Andrick, product manager for MSN Live Webmaster Tools, stopped by our offices. We did a lengthy video interview where we chatted about all kinds of good searchy things. Highlights of our chat included some of the emerging differences between Live Search and engines like Google on subjects like cloaking and paid links.

Google doesn’t like cloaking because it can be used to spam and fool their crawler.

Google doesn’t like paid links because their algorithm places such a high emphasis on links as a quality indicator, and paid linking subverts their system’s integrity.

Google’s solution to these problems has basically been to disallow paid links and cloaking and punish the sites that disregard their rules by penalizing or even booting them from the index.

Much to my surprise, MSN isn’t exactly following Google in this regard. They obviously aren’t going to recommend anyone use cloaking and paid links, but they don’t discount either practice as forbidden.

MSN Live Search is becoming a lot more interesting. Initially, Microsoft’s revamped search product drew some criticism for being basically a Google spin off in appearance, with less than stellar results.

The folks in Redmond haven’t just been sitting around though. The quality of their results has shown some nice progress, but more impressive (and promising) has been the quantum leaps they’ve made in terms of communicating with the webmaster community.

Paid Links:

In our video interview, you might notice that Jeremiah says to avoid paid links. Apparently, the Live Search crew has reevaluated their stance on paid links. Live Search’s Nathan Buggia, in an email on the subject of paid linking said the following:

“Paid links are a gray area. Are they of value to the end user? Sometimes they are. Often they’re less valuable and less relevant than the organic links on a page. We reserve the right to treat them that way.”

The operative phrase here is “a gray area”. That’s not saying paid links are forbidden or evil or a bannable offense. Some paid links are crap, some… not so much so. Live Search is working on methods to evaluate and qualify links – paid or otherwise – before they ascribe authority to them.

When Jeremiah says in the video to avoid paid links, it would be more appropriate to rephrase that as ‘avoid bad (irrelevant/junk) linking’.

Indexing issues:

Webmasters have had issues with Live Search indexing their sites properly. I asked Jeremiah what he thought people were running into when Live Search wasn’t properly indexing them.

Jeremiah said most indexing issues fall into one of three categories:

1. Problems with design

2. Problems with content

3. Technological issues

According to Jeremiah, “It tends to be technological issues, or the content itself, that’s the problem,” when people are having trouble being indexed correctly. “Most people are kind of hip to using better structure in their site,” so design isn’t as often the culprit.

Jeremiah cited a mix between design and content as a specific problem. “Most publishing systems and CMSs a lot of people are using were designed 3 or 4 years ago,” he said, before the now widespread recognition of the importance of crawlability and SEO in site design. As a result, some of these systems aren’t exactly the most efficient or effective in making a site’s content crawlable.

In terms of being totally optimized, Jeremiah said, “People do what they can, but I don’t think they always do enough or that they are not necessarily doing the right things”.

But even under ideal circumstances, Jeremiah admits “you’re not going to get everybody and you’re not going to get everything out of everybody.” As proof, Jeremiah offered a poignant example: “MSDN TechNet is a tier 1 site with eight or nine million documents in 42 different languages… I can tell you that it is not 100% indexed.”

Their goal is likely common to all search engines… 100% indexability of everything. But that is just not a reality at this stage of search evolution.

Cloaking:

We’re all pretty clear on Google’s position on cloaking by now. It’s a pretty simple and straightforward ‘no’. Never. Under no circumstances. It’s evil, it’s bad, it’s spammy. Don’t do it.

While MSN Live doesn’t exactly endorse cloaking, they do seem to have a slightly softer stance on the issue. Jeremiah and I talked about companies who make heavy use of Flash and other non indexable graphics on their sites.

At one point, we talked about Nike, whose site is made completely out of Flash. Jeremiah was quite up front about it, saying, “They break some of our rules just to get to the point where they can get all of their content indexed. They do a bit of cloaking and things like that.”

Google, of course, still indexes Nike. I’m pretty sure that if MSN knows they cloak, Google can figure it out as well. I don’t look for Nike to suffer any Google penalties – much less be thrown out of the index – for it, though. However, if you aren’t Nike but just a smaller webmaster, I don’t know that you’d be afforded the same consideration from Google. In fact, I’d just about bet you wouldn’t.

On one hand you have Google, who apparently selectively enforces their strict no-cloaking policy depending on who you are. On the other hand, you have MSN saying sure, it goes on and while we don’t encourage it, we aren’t necessarily going to boot you from the index for it. Interesting, no?

Conclusions:

At the end of the day, we didn’t hear too much in our video that we haven’t heard a few times before from every other search engine. Jeremiah told us in terms of SEO, “It’s always the basics. Keep it clean, let’s try to be natural, and as Live Search grows we’re going to try to provide better results for people when we’re able to do that algorithmically.”

All pretty standard party line stuff. With the exception of the obvious difference in their stance on cloaking and paid links, it could have been a Matt Cutts interview in many respects.

It’s not so much what was said, as it was who was saying it. Keep in mind, this is Microsoft. Typically, they don’t say diddly – or didn’t used to, at least.

I think this interview may be emblematic of an interesting movement going on at Microsoft. Traditionally, MS has been perceived as a closed empire – operating secretly behind closed doors, dispensing information strictly on a ‘need to know’ (or as subpoenaed) basis. Recently though, they seem to be trying to reach out a lot more.

Take for example, some of Jeremiah’s quotes: “The thing that Nathan and I are trying to do, and that is to bring more transparency… We want to endear ourselves to you and want you to want to work with us.” From Microsoft? Really?

Then on GameSpy I see an email from their Xbox division publicly apologizing for holiday problems with the Xbox Live network. Could this be a kinder, gentler Microsoft emerging in 2008? And more importantly, will it lead to more marketshare for Live Search and more traffic for site owners?

About the Author:
Mike is a manager at iEntry. He has been with iEntry since 2000.

Future Of Social Media Sites

The current proliferation of social media sites is the most pervasive phenomenon on the Internet today. Not since the dot-com explosion has there been an Internet trend so widespread in its popularity. The comparison with the dot-com boom is in fact one that many industry observers make, and while there are a number of clear similarities, there are also some important differences.

It is expected that by early 2008, the various social media sites will have more than 230 million members combined. That number is predicted to keep growing through 2009, with the number of new members expected to level off by 2012.

The combined revenue from these sites, which in 2007 reached almost $970 million, is estimated to balloon to a whopping $2.4 billion by 2012.

Membership growth in social media sites varies greatly from region to region. The Asia Pacific region accounts for the lion’s share of users, with 35% of the total expected by the end of 2007. EMEA accounts for about 28% of all users, North America follows closely with 25%, and the Caribbean and Latin America trail behind with 12% of all users.

With the inevitable crowding of the social media site industry, many observers feel that consolidation of the market is a sure thing. This has given rise to some predictions that the smaller individual social media sites will be swallowed up by the bigger players in the field. Some experts feel however that this is not necessarily the case. In particular, social media sites with a focus on special interests are expected to survive the trend towards consolidation.

The extensive hype and excitement currently surrounding social media sites is perhaps what inspires the comparison to the dot com boom, but in the midst of all the buzz, there is a certain degree of trepidation felt by many as well. While many investors are naturally excited about the potential of social media sites, the fact that these types of web sites have not been proven for the long term is causing some hesitation. The promise of riding on the wave of the next big Internet phenomenon is a tempting prospect, but it is tempered by the uncertainty of social media sites as a long term sustainable industry. The most cautious industry observers have even gone so far as to suggest that most social media sites would do well to hold off on an IPO for the time being.

The founder and chief executive of Facebook, Mark Zuckerberg, has officially stated that despite his company’s spectacular growth, Facebook is still many years away from flotation.

While there is no doubt that social media sites are a genuinely groundbreaking innovation that is changing the way we communicate in many significant ways, past experience with similar Internet phenomena shows that the hyper charged atmosphere of excitement cannot last indefinitely. The industry is currently characterized by easy capital, plenty of media attention and widespread user curiosity – all of which directly boosts creativity – but all that will come to an end eventually.

This does not mean that there is no future for social media sites. On the contrary, the future is just as bright as ever, and at this relatively early stage of the game, it is hard to predict just how huge the whole industry can get. What companies and investors should do, however, is adapt their approach to be prepared for the changes that will inevitably come in the future.

In a report published in 2007, Ri Pierce-Grove, an analyst at U.K.-based Datamonitor, detailed a few suggestions to help companies deal with the changes. Many of these suggestions revolve around understanding market strategies and various technological developments.

One of the most important things companies can do to roll with the punches is formulate a two-pronged approach that addresses both the hothouse atmosphere the industry is currently experiencing and the eventual cooling off that is sure to follow. This strategy will involve companies becoming more heavily involved in establishing and maintaining the infrastructure needed to run these types of web sites. They would also do well to find effective ways to support social-networking services, especially in terms of scalability and availability.

As for the social media sites themselves, the most effective means of ensuring continued popularity is through social media optimization. There are a number of ways commonly used to do this but five rules have been particularly effective in attaining this goal. Formulated by Rohit Bhargava, these rules are: Increasing the linkability of your social media site, making the tagging and bookmarking process easy for your audience, rewarding inbound links, helping your content travel, and finally encouraging mashups, which are web applications that combine data from more than one source into a single integrated tool.

About The Author
Mikhail Tuknov offers ppc search engine marketing services http://www.infatex.com.

Google Indexing Sites In 1 Day Again

I created a new site on Friday, and by Saturday, exactly 24 hours later, it was in Google’s index. I posted about this just over a month ago in my post, 7 Steps to Get Your New Site Indexed in 24 Hours.

I had a lot of comments about whether or not Adwords was necessary, so I thought I’d try it again without running Adwords this time. Here’s how it all played out:

1) I created 5 pages of content (Home, FAQ, About Us, etc.).
2) I put them in a simple template with site-wide links. I also linked to it from one of my other sites (it’s very relevant so it makes sense).
3) I tagged the site on only 2 social bookmarking sites.
4) Commented in 1 forum, put the URL in one directory (niche specific), and submitted it to Digg.
5) Installed Google Analytics.
6) Created a sitemap, pinged Google, referenced the sitemap in my robots.txt, and logged into Google Webmaster Central to submit the sitemap there (a rough sketch of this step is shown below).
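Here is what step 6 might look like in practice: generating a bare-bones sitemap, appending a Sitemap line to robots.txt, and pinging Google. The domain and page list are placeholders, and the ping URL is the one Google documented at the time, so treat this as an illustration rather than a drop-in script.

```python
# Sketch of step 6: write a minimal sitemap.xml, reference it from robots.txt,
# and ping Google. Domain and pages are placeholders.
from urllib.parse import quote
from urllib.request import urlopen

DOMAIN = "http://www.example.com"
PAGES = ["/", "/about-us", "/faq", "/services", "/contact"]

def build_sitemap(pages):
    urls = "\n".join(f"  <url><loc>{DOMAIN}{p}</loc></url>" for p in pages)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>\n"
    )

with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(PAGES))

# Tell crawlers where the sitemap lives.
with open("robots.txt", "a") as f:
    f.write(f"Sitemap: {DOMAIN}/sitemap.xml\n")

# Ping Google so it knows a new or updated sitemap exists.
sitemap_url = quote(f"{DOMAIN}/sitemap.xml", safe="")
urlopen(f"http://www.google.com/ping?sitemap={sitemap_url}")
```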
When I checked exactly 24 hours later, I was sitting in the index and had already begun to get a few visitors from Google.

I had previously run Google AdWords both out of necessity (to get quick traffic) and because of the trust factor I believe it signals to Google, along with the fact that Google integrates a quality factor into its Quality Score (so they come to your site and look at it). Obviously this is just one test compared to the several others I’ve done with AdWords, but it seems very possible to get indexed without running any ads.

Anyone else seeing 24 hour indexing for new sites?

About the Author:
Michael Jensen is a co-founder of SoloSEO.com, an online service for SEO project management and do-it-yourself SEO tools. SoloSEO.com allows web marketers of any skill level to manage keywords, content tracking, link building, and competitor data.

SEO By Owner in Three Easy Steps

You have two choices to consider when trying to get your site ranked higher in search engines. You can hire a search engine optimization company that is an expert in the field, or, if you have some time, you can do it yourself.

Research your Keywords

Ask yourself what keywords you think someone might type in when searching for products you sell or services you provide. Though a keyword may be a single word, it is usually a phrase made up of several words. Phrases are more specific and are more likely to be what potential customers use when searching for products or services online.

After a quick brainstorming session, write down all the terms you were able to come up with. Make sure to consider geographical phrases if they are important to your customers, and don’t forget alternative words that could be used (an example could be “new car Littleton”). Make sure to surf the websites of your competition to get more ideas until you have a list in the neighborhood of 20 to 30.

Now take the two keywords from your list that you feel potential customers will use most frequently. Take into consideration that popular keywords are also competitive keywords and are harder to achieve a high ranking with. If your goal is to obtain a high ranking for the term “auto insurance”, the path to achieving it may be long. Try to make your keywords the ones that are most related to your business and not vague or extremely competitive. Make each phrase about two or three words in length, like the previous example.

Now that we have your keywords we will move on to the next step.

Site Text

Your site text is made up of the wording that is on your web pages. When it comes to search engines, the phrase to remember is “content is king”. Search engines love unique content, and your keywords should be placed in key locations within the content so Google understands the relevance your site has to them. Also make sure that your copy reads well around them, as it needs to make sense for your visitors, who are most important.

Keywords can be placed in headings, at the top of pages, in bold or italics, used as link text for other pages of your site and in your title tag.

Add additional content after you are finished tuning up your webpage. Give more detailed descriptions of the products and services you offer. Provide a frequently asked questions page and pages of articles that pertain to your products or services.

With your design, keep in mind that search engines cannot read images nearly as well as text. Sites that are built with excessive Flash or pictures really impede how well the engines can read the content of your site.

Link Building

A common way of thinking about links is that every link from another site that leads to yours is a vote for the popularity of your site. Every quality link you receive can improve your search rankings.

The quality of your inbound links is more vital than the quantity. The preferred and more valuable link is from sites that are relevant to your niche and with authority (highly regarded in the niche). A quality directory with relevant categories is another example. Just a few quality links with authority can have more value for your site than hundreds of lesser quality. Think of it like you do your personal business network. Both can have a strong effect on the success or failure of your business.

Take time to consider all the other relevant websites in your niche, such as organizations, industry affiliates and non-competing companies. Send them an email introducing yourself, your products and services, and explain how your website could benefit their visitors. Then politely suggest that they create a link to your website from theirs.

Record the Results

Over time you should watch and record your search engine rankings by doing a Google search for your chosen keywords to see where you rank. You can also monitor where your visitors are coming from by watching your hosting reports. Do this for each significant page of your website.

Continue to add to your website’s content and increase the links to your website over time. This needs to be an ongoing effort for as long as you want visitors to your website.

As you continue to record the results of your efforts, you should see traffic increase, and with it your sales. Know where your visitors are coming from so you can continually monitor the success of your marketing efforts. Only by measuring it do you know where and how to improve it.


About the Author: Bruce Swedal – The Authority Web Directory is a vital resource in every online-marketing campaign. Begin your link building effort here by visiting our Submit URL page.

Optimizing Your Site for Both Google and Yahoo!

Search engine optimization techniques for Google and Yahoo are quite different. Many websites rank well in one search engine but not the other. This is the direct result of each search engine having its own unique ranking algorithm. For example, the Google algorithm predominantly values the anchor text of in-bound links. Yahoo places more emphasis on keyword density and meta tags.

The primary reason for the difference in ranking algorithms is that Google owns the patent on PageRank (PR), named after Google co-founder Larry Page. As a result of owning this patent, other search engines need to place more emphasis on different optimization factors, including website URL, keyword density and so on.

What are the greatest differences in search engine algorithms?

Google places a significant amount of emphasis on inbound links to your website. The value of these inbound links is measured based on their Google PR. The more links you receive from high-PR web pages, the better your search result placements will be for a given keyword or search term.
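To make the idea of PR a little more concrete, here is a toy power-iteration version of the published PageRank formula run over an invented four-page web. It captures the basic notion that links from already-important pages pass along more value; it is not meant to reflect how Google actually computes rankings.

```python
# Toy PageRank power iteration over a hypothetical four-page web.
# Every page here has at least one outgoing link, so dangling nodes are ignored.

links = {          # page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")   # C collects the most link value, D the least
```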

Yahoo places emphasis on website URLs, meta tags, and keyword density. These factors can be analyzed on any website with a limited amount of effort, allowing Yahoo to quickly and easily rank websites properly.

How you can optimize your website for both Google and Yahoo!

The challenge all website owners face is making the most of their optimization efforts. When thinking about search engine optimization, you need to cover all the bases. To do so, pay particular attention to the following guidelines.

Keyword Targeted URL. If your website URL doesn’t contain your keywords, consider purchasing a new one or creating a new page off of your root directory (ex: marketingscoop.com/internetmarketing.htm). Having your keywords in the URL helps improve both your Yahoo and Google search results. Yahoo weighs the website address as an important ranking factor. Google values a keyword rich URL when third party websites place a link to your site using nothing but a web address.

Meta Tags. Although not as important as they once were, Yahoo still uses meta tags to help align search engine rankings and appropriate website pages. Make sure that your meta tags are complete and include your keyword phrases in the title, description, and keyword tags.

Keyword Density Between 6 – 8%. Although much has been written about the importance of keyword density remaining between 2 – 3%, Yahoo looks for sites with keyword densities as high as 8%. Don’t be afraid to include your keywords throughout your webpage content. Make sure however, that your keyword density is not more than 8%.

Link Building. This is the most important factor for increasing Google search result placements. Develop a link building campaign and give other sites a reason to link to your site. This may include free downloads, tools, or other valuable resources.

Site Map. Publish a sitemap. This makes it easy for search engines to spider your website and access all of your most important web pages. Site maps should be accessible from your home page and kept up-to-date.

Optimizing for both Google and Yahoo can be challenging. Following basic SEO principles and working to develop incoming links can help you reach the top of the largest search engines. Apply these techniques regularly to see the greatest results.


About the Author: Michael Fleischner is a marketing expert with more than 12 years of Internet marketing experience. Learn how to improve search engine rankings with his latest ebook, The Webmasters Book of Secrets at http://www.webmastersbookofsecrets.com.

Social Bookmarking and Power Linking

Just in case you missed it… the Web has changed.

I think a little history of the Internet is in order to grasp the big picture. I’m not going to give any exact dates (late eighties and early ’90s)… I’ll just give a quick rundown.

I would say nearly fifteen years ago I had a dial-up Internet connection that allowed me to log into various college computers, BBS’s (bulletin boards) and newsgroups.

The Internet was much different than the Web is today. There were no graphics… it was totally text based and everything was dial up. All you could do was basically post to a newsgroup, post messages on some BBS’s and send email. Internet Marketing as we know it today did not exist.

In time, though, a few brave souls ventured out of the shadows and began marketing within the newsgroups. This started wars between the “Purists” and new “Marketers” that I still remember to this day.

You see, the Purists considered the Internet to be their own little playground. They viewed anyone selling something as evil. After all… Marketers had the TV, the Radio, Magazines, Newspapers, etc, etc as an avenue in which to sell their crap. “The Internet is ours” was their battle cry.

They viewed the Internet as a way for them to communicate with each other without having to wade through all the BS advertising – and they could control what was being said. When the evil “Marketer” entered the picture, this all changed – and it changed quickly.

Once the evil Marketer had discovered the Internet as a new marketing medium, the “Purity” of the Internet, newsgroups and BBS’s was destroyed forever.

The Internet was now becoming just another medium for Marketers to sell their wares. It was inevitable and only a matter of time before this happened. But the Purists fought it tooth and nail.

The newsgroups and BBS’s were now inundated and overrun with advertising. There was so much spam that you could hardly follow a thread or make sense of it. A thread may have started out discussing a subject such as “Microsoft DOS” in its first post… but it was hard to make sense of it as the evil Marketers would post “off topic” spam ads trying to sell their wares throughout the thread.

As we all know things have changed a lot since those “Caveman” days… the evil Marketers persevered and the Purists lost the War (or did they?)

Jump to present day…

The Purists did not really lose – they just lost a battle ten or fifteen years ago, but they have recently won the War and staked their claim on the Internet as belonging to them with Web 2.0.

Just in case you missed this “coup d’etat” – give some serious thought to the current environment on the Web, and Social Networking specifically. The Web no longer belongs to the evil Marketer. It is back in the hands and control of the Purists, and they are once again controlling the conversations.

I know… it sounds like a bunch of BS, but for the most part it is true.

The difference is that the Purists have discovered a way to once again “control” the conversations they want to have, while at the same time making a profit from these discussions (i.e., social networking communities).

Welcome to Web 2.0 and Social Networking

I do not consider myself an Internet “Purist”… I’d fall more into the “Evil Marketer” category. This being the case, like you (if you are an Evil Marketer too) I have to adapt and change the way I do things or I will soon fall to the wayside and die a slow death.

The saying “When in Rome… do as the Romans do” was never truer than it is today.

The way you sell things and market has to change – you are now in Rome.

Marketing as we know it today on the Web is dying a slow death – but it is happening fast. Only those who adapt and change will survive.

The Web 2.0 Marketer will survive this “coup d’etat”.

Interruption Marketing has lost out to Participation Marketing and the Web 2.0 Marketer will prosper with these changes. It is no longer about forcing our messages down the throats of people – they get enough of that with all the other advertising mediums. It is all about authority, conversations and participating within discussions that other people deem important – not what you feel is important.

Seth Godin has been telling us this very same thing for a few years now – but many have not listened.

Why is this? Because it takes work and discipline (and change) to make relationships and participate in meaningful conversations. It is easier to do what I would call “method of the day” or “hit and miss” marketing – at least in the short run. But that is all it is… marketing for the short haul with no regard to the direction the Web is moving.

Like Seth Godin, Jack Humphrey has been telling Marketers for years now, through Social Power Linking, that they need to adapt and change their marketing methods. Jack saw this “coup d’etat” coming before most and began joining in with the “conversations” while most of us were still doing our marketing basically the same way as the “evil Marketers” of the past (and present) – cramming and forcing our message down our visitors’ throats.

I have posted before on Social Bookmarking and Social Networking and the success I have had with these methods. You can view other posts on this blog and see the results I have had simply by joining in and creating “conversations”.

This is not a fad or a “method of the day”… it is here to stay.

With the direction the Web is heading (we are really already there) – Social Bookmarking, Social Networking, Conversation Participation with links from and pointing to those conversations is the key to being successful for the Web 2.0 Marketer.

So don’t rebel against the “coup d’etat” – come on in, the water is fine.

About The Author
You can view additional articles, posts, podcasts and videos on Web 2.0 Marketing, Social Bookmarking and Affiliate Marketing by visiting: Riffs~Rants and the Pursuit of Happiness.