Monday 25 August 2008

10 Basic Rules for Where to Place Your Keywords

First of all, Google and most other search engines do NOT look at the META keyword tag. Many people say not to bother with it, but I use the META keyword tag and I place my keyword phrases in it. Here's why. I use this tag to help me remember what keyword phrases I am optimizing the page for. You'll find this to be a big help later when you have a lot of pages and have forgotten what keyword phrases you were trying to optimize the page for in the first place.

For the META description tag, keep your most important keyword phrase near the beginning of the sentence and make this tag a full sentence.
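As a quick illustration, the two tags might look like this in the head of a page being optimized for the phrase "peanut butter sandwiches" (the phrase and wording here are placeholders, not a recommendation):

<head>
<meta name="keywords" content="peanut butter sandwiches, peanut butter sandwich recipes">
<meta name="description" content="Peanut butter sandwiches are quick to make, and these recipes show you a dozen ways to dress them up.">
</head>

Note that the description is a full sentence and the keyword phrase sits near its beginning.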

Do NOT use bold or italic keyword phrases in the first sentence on the page. DO use your most important keyword phrase in the first sentence, just not as the first word.

By all means, use your keyword phrases in your headings (H1, H2 and H3).

Start putting keyword phrases in bold in the second paragraph.

Put your keywords or keyword phrases in italics a few times AFTER the first usage of the keyword. Never let the first usage of your keyword phrases be in italics.

Use keywords in ALT tags.
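Pulling several of these rules together, a stripped-down page body might look roughly like this; the phrase, file name and wording are made up purely for illustration, and the italics would normally appear a few more times than shown:

<h1>How to Make Peanut Butter Sandwiches</h1>
<p>Great peanut butter sandwiches start with fresh bread and good peanut butter.</p>
<h2>Peanut Butter Sandwiches for Lunch</h2>
<p>Kids love <b>peanut butter sandwiches</b> in their lunch boxes, and <i>peanut butter sandwiches</i> keep well until noon.</p>
<img src="sandwich.jpg" alt="peanut butter sandwiches on a plate">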

It's very important to get other sites to use your most important keyword phrase for your page in any inbound links. Of course, you are not in control of how other sites link to you, but work hard to get them to use your keyword phrase. Most sites will link to your home page, so give them the most important keyword phrase you are optimizing your home page for.

When you are linking from any page back to your home page, use your most important keyword phrase in the link. When your home page is linking to any other page, use the keyword phrase in that link that the other page is being optimized for.
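In HTML terms, both points come down to what you put inside the anchor text. For example (the file name is a placeholder):

<a href="index.html">peanut butter sandwiches</a>

says far more to a search engine than a generic <a href="index.html">click here</a> pointing at the same page.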

Don't plan on getting much (if any) help by putting keywords or keyword phrases in your left nav panel. Google likes keywords in full sentences. Putting the sentence in a paragraph is even better. By the way, a sentence according to Google is three or more words starting with a capital letter and ending with a period or other punctuation. Stop words such as "I," "a," "the," and "of" do NOT count as one of your three words.

Follow these rules and your Web site will make a big jump in its relevancy for your keyword phrases. Following these rules will NOT boost your PageRank.

To be #1 or even in the top 10 on the search engines, your relevance for a given keyword phrase is much more important than your PageRank.

For example, you could have a PageRank of 10 and still not show up in the top 100 sites when someone is searching for "peanut butter sandwiches" unless, of course, your page is optimized for (and has a high relevance for) the phrase "peanut butter sandwiches."

One final point: Use your keyword phrase in an H1, H2 or H3 headline followed by a keyword-rich paragraph and then repeat this with another H1, H2 or H3 headline and another keyword-rich paragraph. And of course repeat this again.

Use this format in addition to following the 10 rules above and you will have a page with a high relevance for your keyword phrases.

Don't try to optimize a page for more than two or three keyword phrases, and always optimize for keyword phrases and NOT keywords. After all, the keyword is included within the keyword phrase. Most people don't search for just one word anymore anyway.

I have seen pages rank #1 with keyword densities from 1% to 20%, but I usually try to have a keyword phrase density of between 2% and 6%. Sometimes I go up to 10%.
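To put those percentages in concrete terms: on a 500-word page, a 2% to 6% density works out to roughly 10 to 30 uses of the phrase, and 10% would be about 50.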

Sunday 24 August 2008

Viral Marketing - Sneezing Via Email

If you've never heard of viral marketing before, we won't blame you for thinking that it's the FDA's problem! This rather sinister term was coined by some very respectable people belonging to a venture capital firm called Draper Fisher Jurvetson (DFJ), who described viral marketing as a "network-enhanced word of mouth". Simply put, that means getting your existing clientele to act as brand ambassadors for your product. And the amazing bit is that it's quite involuntary! (Have you noticed what appears at the end of every email sent from your Yahoo account?)

Let's see how this works. Practitioners of viral marketing leverage their customer base to pass on a marketing message to others in their network. The recipients of such messages, in turn, pass the same on to their contacts, and so on. Before you know it, the message will have touched a multitude of people, pretty much like an epidemic. We see you're getting the picture...

This is old-hat, you say. Nothing other than network marketing, a trick that's been employed by legions of marketers! True. The only difference is that this kind of marketing has succeeded beyond imagination with some internet-based businesses. In fact, the good folks at DFJ coined the term "viral marketing" as a tribute to Hotmail's success. Not surprisingly, the term is usually associated with internet-centric business models.
Like any other trick, not all viral marketing campaigns succeed. Certainly, very few work as well as Hotmail's did. However, there is some method to this madness, and if you're considering employing this strategy anytime in the near future, you might like to consider some friendly advice.
Why did a particular program work? It has been observed that successful campaigns had one or more of the following characteristics.

Something or the other was FREE: This never fails to work. Whether it's a free email account, a larger mailbox, screen savers or that trial software for Arabic translation, the word FREE grabs eyeballs like no other.

It was fully transferable: Viruses love travel, and it's the same with viral marketing. A short and sweet marketing message as a tag at the end of each email, or an easy-to-download graphic, improves the chances of the epidemic raging.

It pressed the right buttons: As with any form of marketing, this too exploits an implicit need. If you're not part of an instant messaging group, you're out! If you don't blog thrice a week, you should be in a museum! At the heart of every successful campaign is its ability to create a feeling of community.

It networked, so it didn't perish: And that's the crux of the whole thing. Social scientists say that each person has about a dozen people in his or her close network, and perhaps hundreds in an extended one. Viral marketing campaigns that find a way of entering communication between people have a better chance of making it. Riding on the back of someone else's success is another effective way of spreading the good word about your product. Affiliate marketing programs work on the same principle, by using traffic on popular partner websites to their advantage.

Regardless of how individual programs are structured, the hook is an implied endorsement from a friend or trusted source. The power of communication technology has helped elevate this rather simplistic proposition into almost an art form. Whether your campaign makes a pretty picture or not is another story altogether!
Hi, I'm Akhil Shahani, a serial entrepreneur who wants to help you succeed. If you like to work smart, check out http://www.SmartEntrepreneur.net It's full of articles and resources to help you start and grow your business successfully. Please visit us & download our special "Freebie of The Month" at http://www.smartentrepreneur.net/freebie-of-the-month.html

What are effective marketing strategies.....

What are effective marketing strategies? On one hand, it seems that it's best to get your name and information in front of every possible customer. On the other hand, strategic contacts will keep you from wasting time and money. A possible solution to this dilemma is e-mail marketing.
E-mail marketing is the extension and evolution of direct mail marketing. The underlying concept is to keep your customer base well informed of your actions, advancements and opportunities. E-mail marketing can be highly valuable, with the right attitude and knowledge. Without the right information, you'll become roadkill on the spam highway.

E-mail marketing has recently been viewed as an ineffective tool for network marketing. This is because of a lack of personalization and an overzealous marketing community. Many e-mail marketers don't know how to communicate to people. Whether this is a lack of people skills or just deficient education doesn't matter. E-mail accounts the world over are inundated with spam, and people are sick of it. The good thing is that if you know the keys to communication, you can still launch a successful network marketing campaign using e-mail marketing.

You need to make e-mails as personal as possible. No more statements begging for random strangers to look at your website. A great e-mail network marketing campaign inspires customers to ask themselves one simple question: What's in it for me? When you can get people to want or need your services, you have begun the task of launching a successful network marketing e-mail campaign.

If potential customers get e-mails that lead to sales pages that look tacky, they won't return. Make sure that your link leads people to enticing pages full of information, not pages full of questions. Give them important information about your business instead of phrases full of hype and false promises.
Also, consider the fact that spam penalties can seriously hurt your business and your reputation. The CAN-SPAM Act requires you to clearly feature an opt-out option on all your commercial email, and you also have to include a valid physical address in every commercial e-mail you distribute.
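In practice that usually means a standard footer block in every commercial message, something along these lines (the company name, address and unsubscribe URL are placeholders you would replace with your own):

<p>You are receiving this e-mail because you signed up at example.com.</p>
<p><a href="http://www.example.com/unsubscribe">Unsubscribe</a> from future mailings at any time.</p>
<p>Example Company, 123 Main Street, Anytown, TX 75001</p>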

When you are creating your e-mail campaign, consider the fact that deceptive subject lines and false or misleading headers are not just deceptive; they are also illegal. The penalties for such e-mail campaigns include cease and desist notices from the Federal Trade Commission or fines up to $11,000 per violation. In some instances, the worst offenders can even face jail time.

That being said, e-mail marketing can still be a viable and effective sales tool if it is used properly. For instance, never send e-mail messages to a list of people you do not know that you purchased from someone else. If you use e-mail marketing to update current and previous customers on a new feature or benefit you offer, then your message is a welcome guest and your campaign stands a chance at success.

E-mail marketing is a strategy that must be fully understood before you venture into it. Once you know all the rules and regulations, you will be well on your way to informing your customers about the goods and services that you provide. A successful, informative and useful e-mail marketing campaign will certainly benefit your business if you execute it properly.

Network marketing and MLM home businesses can benefit from e-mail marketing when it is done properly. Erik Gifford, a network marketing and internet attraction marketing coach, has posted a free article at network marketing guidance.

4 Ways to Get Your Articles Noticed, Without Paying a Dime, Using My Affiliate Basic Techniques

When you first begin your article marketing campaign, do you really know what needs to be done? Starting out, who actually tells you what to do to be successful? Who tells you that in order to be an affiliate, you need to know these essential basics?

Unfortunately I did not have that support, and things were a little tough because everything was a struggle and usually was not a success. The first few articles that I wrote were failures. I'm not kidding either. I couldn't even write a children's book if I tried. I came a long way, baby. I began to learn how to not only write articles, but to optimize them for the search engines as well. Soon, every article that I wrote was golden. I found article writing becoming easy and profitable.

1. Find the niche that you want to promote and decide on the keywords that you want to use. Make sure that you write quality content into your article. Within this content, you need to make sure that your keyword density is approximately between 3-4%. If you go over, let's say, 6% with your keyword density, the search engines might regard it as a spam article and you won't even get a page ranking out of it.

2. Proofread your articles within the next day to check for errors. Also, it is a great idea to let a friend or family member proofread them as well. If your grammar and sentence structure isn't great, you will have a difficult time being recognized as an authority.

3. Now you can publish your article at your preferred article submission site. Make sure you have a good bio box and everything is just the way you want it. Publish your article and wait for approval. This is usually a painful waiting period because you are so ready to get your ideas out for the world to read.

4. Once you have an approved article, submit it to the major search engines: Google, Yahoo, and MSN. After you get them into the search engines, ping your articles and you will soon get a front page ranking within the next day.

Good luck and I wish you all the success in the world.
For more information, you might want to check out http://www.liason-marketing.com/index.html
My lens is another great resource to read and follow the links http://www.squidoo.com/wealthyaffiliatebasics

Advertising Costs on the Internet

Are you considering advertising your website more so that you can make more money? Are you concerned with what it might cost you and, if you decide to advertise, which methods are best for you to use? These are legitimate concerns, and you should have them when you are trying to figure out the advertising costs on the internet for the methods you want to use. Some methods seem very attractive, but really are not. Here are some of the costs and methods that work best for the majority of people.

First, let's concentrate on the free methods for those that are on an incredibly tight budget and need to get some advertising dollars coming in. You can use a signature on your email, forum marketing, free classified advertising, or you can even write articles. These are all very good ways to generate some good free traffic to your websites. You should stay away from free traffic exchanges unless you are trying to build a list, because the conversion rate is not worth the time you will put in, and you could have written a few articles in that time that will get you much better results.

Second, we will look at methods that are very inexpensive that work. You can use FFA (Free For All) pages, but you need to make sure you use the ones that you pay for. Also, you can use Pay Per Click Advertising, email blasts, and paid classified advertising. These are all methods of advertising that will cost you some money, but you have a good amount of control over how much you spend on your advertising and usually how much you spend per visitor. You should avoid any website that is simply selling traffic because they usually do not provide a very high quality of traffic to your website.

Last, we need to discuss those methods that are more expensive but do work very well. Again, you can use Pay Per Click advertising and expand your budget with it. You can also hire an advertising firm or a search engine optimization firm. You can pay for a listing in the Yahoo index or the Google index. These are all methods of advertising that are expensive, but they work. Using a search engine optimization service in particular is well worth the advertising costs on the internet, because it will bring you a naturally high search result on the engines and it will bring you traffic for a very long time.

Discover more about Advertising Costs On The Internet and advertising methods that you can use to get your business or website going. Get more info here:
Advertising Costs On The Internet

Bum Marketing Methods - How To Make Money By Doing Nothing

You are probably wondering what bum marketing is, and how you can make money from it yourself. It is quite simple really: all you need to do is find an affiliate program you like, and then promote it using free outlets such as Squidoo or Blogger. By creating a web of interconnected articles marketing your products as well as your other articles, you will gain organic traffic quickly and easily. By using article marketing you allow your articles to be picked up by giants such as Google and Yahoo without actually dedicating all that much time or resources to building backlinks, as you would need to if you planned on starting your own website.

Bum marketing is based on keyword research: you need to find a niche and work from there. Gurus will pick the highly competitive niches, as they will yield the highest results if they are done correctly, but for new bum marketers it is a far better idea to start small. Find a niche that isn't overly competitive but still has products to be sold. This can take time. You need to do research to determine which markets are flooded with affiliate marketers and which aren't; a good way to determine this is to do a Google search for keywords you would like your bum marketing articles to target. If there is a full page of Google AdWords results (11 sponsored links), then you most likely will not fare too well to begin with in this niche. Finding the right niche can be hard, but once you do you should stick with it and watch the profits soar. Once you have a niche, all of your articles, webpages, sales pitches, etc. will become relevant to each other, and through this relevancy you can increase traffic and sales conversions across the board as you target new keywords. The key to bum marketing is getting organic traffic. While there are millions of marketers who manage to make money without organic traffic (traffic that comes naturally from search results as opposed to paying for advertising), you will need a solid base of organic traffic.

My personal steps for using bum marketing on various products:

* Find a niche that isn't overly crowded that you can talk about.
* Find products within the niche that are the right price and quality so that they will lead to sales and therefore commissions.
* Research keywords that could be used to search for that specific product as well as general keywords that people would search for that may be willing to buy the product.
* Build a Squidoo lens or a Blogspot blog for the product giving an unbiased review (I never sell products that don't work well, I refuse to scam people for money).
* Build backlinks for the lens or blog, typically with a little bit of article marketing and by purchasing a social bookmarking package from an SEO firm.
* Promote products using social networks such as Facebook and MySpace, especially MySpace.
* Watch the commissions roll in as I hit the front page of Google and Yahoo, and find ways to expand my product line using traffic from various sources.

Bum marketing, or even affiliate marketing in general, is not for everyone. In order for it to work, you must be patient; there is no other option. I started using bum marketing a few months back and now it is already paying large dividends for me. I'm currently a 19-year-old freshman college student, yet I am making enough money online to pay for my graduate school as well as a brand new car. Hopefully this information will lead to your success as well. For additional resources on bum marketing I would highly suggest checking out some of the links below.

To increase the visibility of your own web enterprises, consider visiting the following Squidoo lenses:
Bum Marketing Your opportunity to steal some of my marketing tactics to be on the road to success yourself.
Social Bookmarking Services to gain backlinks, search engine placement, and traffic.
LazyURL to optimize your site for search engines and any other online marketing purpose.

Search Engine Spiders Lost Without Guidance - Post This Sign!

The robots.txt file is an exclusion standard that all web crawlers/robots are supposed to follow; it tells them which files and directories you want them to stay OUT of on your site. Not all crawlers/bots follow the exclusion standard, and some will continue crawling your site anyway. I like to call them "Bad Bots" or trespassers. We block them by IP exclusion, which is another story entirely.

This is a very simple overview of robots.txt basics for webmasters. For a complete and thorough lesson, visit http://www.robotstxt.org/

To see the proper format for a somewhat standard robots.txt file look directly below. That file should be at the root of the domain because that is where the crawlers expect it to be, not in some secondary directory.

Below is the proper format for a robots.txt file ----->

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /group/

User-agent: msnbot
Crawl-delay: 10

User-agent: Teoma
Crawl-delay: 10

User-agent: Slurp
Crawl-delay: 10

User-agent: aipbot
Disallow: /

User-agent: BecomeBot
Disallow: /

User-agent: psbot
Disallow: /

--------> End of robots.txt file

This tiny text file is saved as a plain text document and ALWAYS with the name "robots.txt" in the root of your domain.

A quick review of the listed information from the robots.txt file above follows. The "User Agent: MSNbot" is from MSN, Slurp is from Yahoo and Teoma is from AskJeeves. The others listed are "Bad" bots that crawl very fast and to nobody's benefit but their own, so we ask them to stay out entirely. The * asterisk is a wild card that means "All" crawlers/spiders/bots should stay out of that group of files or directories listed.

The instruction "Disallow: /" means those bots should stay out entirely, and those with "Crawl-delay: 10" are the ones that crawled our site too quickly and caused it to bog down and overuse the server resources. Google crawls more slowly than the others and doesn't require that instruction, so it is not specifically listed in the above robots.txt file. The Crawl-delay instruction is only needed on very large sites with hundreds or thousands of pages. The wildcard asterisk * applies to all crawlers, bots and spiders, including Googlebot.

Those we provided that "Crawl-delay: 10" instruction to were requesting as many as 7 pages every second and so we asked them to slow down. The number you see is seconds and you can change it to suit your server capacity, based on their crawling rate. Ten seconds between page requests is far more leisurely and stops them from asking for more pages than your server can dish up.

(You can discover how fast robots and spiders are crawling by looking at your raw server logs - which show pages requested by precise times to within a hundredth of a second - available from your web host, or ask your web or IT person. Your server logs can be found in the root directory if you have server access, and you can usually download compressed server log files by calendar day right off your server. You'll need a utility that can expand compressed files to open and read those plain text raw server log files.)

To see the contents of any robots.txt file, just type /robots.txt after any domain name. If they have that file up, you will see it displayed as a text file in your web browser. Click on the link below to see that file for Amazon.com.

http://www.Amazon.com/robots.txt

You can see the contents of any website robots.txt file that way.

The robots.txt shown above is what we currently use at Publish101 Web Content Distributor, just launched in May of 2005. We did an extensive case study and published a series of articles on crawler behavior and indexing delays known as the Google Sandbox. That Google Sandbox Case Study is highly instructive on many levels for webmasters everywhere about the importance of this often ignored little text file.

One thing we didn't expect to glean from the research involved in indexing delays (known as the Google Sandbox) was the importance of robots.txt files to quick and efficient crawling by the spiders from the major search engines. Nor did we expect the number of heavy crawls from bots that will do no earthly good to the site owner, yet crawl most sites extensively and heavily, straining servers to the breaking point with requests for pages coming as fast as 7 pages per second.

We discovered in our launch of the new site that Google and Yahoo will crawl the site whether or not you use a robots.txt file, but MSN seems to REQUIRE it before they will begin crawling at all. All of the search engine robots seem to request the file on a regular basis to verify that it hasn't changed.

Then when you DO change it, they will stop crawling for brief periods and repeatedly ask for that robots.txt file during that time without crawling any additional pages. (Perhaps they had a list of pages to visit that included the directory or files you have instructed them to stay out of and must now adjust their crawling schedule to eliminate those files from their list.)

Most webmasters instruct the bots to stay out of "image" directories and the "cgi-bin" directory as well as any directories containing private or proprietary files intended only for users of an intranet or password protected sections of your site. Clearly, you should direct the bots to stay out of any private areas that you don't want indexed by the search engines.

The importance of robots.txt is rarely discussed by average webmasters, and I've even had some of my client businesses' webmasters ask me what it is and how to implement it when I tell them how important it is to both site security and efficient crawling by the search engines. This should be standard knowledge for webmasters at substantial companies, but this illustrates how little attention is paid to the use of robots.txt.

The search engine spiders really do want your guidance and this tiny text file is the best way to provide crawlers and bots a clear signpost to warn off trespassers and protect private property - and to warmly welcome invited guests, such as the big three search engines while asking them nicely to stay out of private areas.

Copyright © August 17, 2005 by Mike Banks Valentine
Google Sandbox Case Study http://publish101.com/Sandbox2 Mike Banks Valentine operates http://Publish101.com Free Web Content Distribution for Article Marketers and Provides content aggregation, press release optimization and custom web content for Search Engine Positioning http://www.seoptimism.com/SEO_Contact.htm

RSS Feeds - a Website Owner's Friend in Disguise

We've all heard about it - it seems like all the buzz right now in the search engine marketing industry is RSS. If you're a website owner, then there are two ways your website can benefit from using RSS: you can provide an RSS feed or, for the not-so-technically-inclined folks like me, you can use an RSS feed to keep your site's content fresh.

RSS is a way to syndicate website content. According to Wikipedia, "RSS is a family of XML file formats for web syndication used by (amongst other things) news websites and weblogs...the RSS formats provide web content or summaries of web content together with links to the full versions of the content, and other meta-data." Wikipedia goes on to say that "A program known as a feed reader or aggregator can check RSS-enabled web pages on behalf of a user and display any updated articles that it finds. It is now common to find RSS feeds on major web sites, as well as many smaller ones."

If you're a website owner, you can use RSS to your advantage in two ways: use someone else's RSS feed or produce your own RSS feed.

1. Install a script on your website - whenever a web page on your website is loaded, the script automatically loads data from an RSS feed. If the RSS feed you choose to use is the latest news, then the latest news will appear on your website. This is fairly easy to set up and is good for search engine optimization purposes. I'll discuss installing an RSS feed script on your website later on in this article.

2. Provide an RSS feed of your website's content so others can use it. By providing an RSS feed of your website's content, you're essentially allowing people to use the content on their website or through their feed reader. In either case, you're also providing links back to your website, which is good for search engine optimization purposes-it will also get visitors to visit your website. Providing an RSS feed of your site's content can be tricky to set up-or it may not be appropriate if you don't have a lot of content on your website. I'll discuss your options later on in this article.

If you're a website owner, then chances are you want to keep your website's content fresh. By updating the content on a regular basis, the search engine spiders will take notice-they'll visit your website more often and index the new content and new web pages-which can ultimately bring more visitors to your website. For example, if your website is about real estate, you might consider including the latest real estate news on your website. Users typically search for topics that are related to items in the news, so if those topics and keywords are included on your website you can typically be found in the search engines for those terms. It's like having your own real estate news staff on hand, 24 hours a day, adding the latest news on your website.

Installing an RSS Feed on Your Website

Installing an RSS feed on your website is not as difficult as it sounds. You simply install a script one time - and then anywhere you want the RSS feed to appear you simply pick a feed and copy and paste some code on your page. The first thing you need to do is figure out which script to use. If your website is on a Unix server and has PHP installed, then the easiest PHP script I've found is called CaRP. You will first want to visit the CaRP download page and download the file. CaRP has a free version that you can use on your website. They request that you link back to their website if you use it. Unzip the zip file and upload the files to your website using an FTP program. Then, run the setup file in your web browser, chmod the appropriate files, and continue with the directions given to you in the web browser. Once it's installed, the script will give you code to copy and paste wherever you want the RSS feed to be displayed on your website. You can even change the font, size, and color of the feed by specifying those attributes before the code.
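As a rough idea of what that copy-and-paste code tends to look like on a PHP page, here is a minimal sketch; the include path, feed URL and exact function name depend on your CaRP version and setup, so treat this as an assumption to check against CaRP's own instructions rather than definitive syntax:

<?php
// Load the CaRP library you uploaded earlier (path is a placeholder)
require_once 'carp/carp.php';
// Fetch, cache and display the chosen RSS feed at this spot in the page
CarpCacheShow('http://www.example.com/news.rss');
?>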

There are other RSS parser scripts available, but CaRP is the one that I'm most familiar with because of its ease of use and ease of installation. To find other RSS parsers, you can search Google for "rss parser script". CaRP is typically used if you have PHP installed on your website, and other RSS parser scripts are available if you're running a website on a Windows server. If you're using the PHP version of CaRP then you'll want to use PHP pages on your website - or you will need to parse your html pages as PHP pages.

Finding an RSS Feed

Once you've installed the parser script, you'll want to find the appropriate RSS feed to use on your website. Keep in mind that a lot of RSS feeds are provided for "non-commercial use only", so if your website is a for-profit website you'll need to check the terms of using the RSS feed before you use it.

The best way to find an RSS feed is to search for it. Following my real estate example above, searching for "rss real estate" (without the quotes) finds several feeds. Topix.net provides a real estate RSS feed. By copying that URL and pasting it into the code provided by CaRP, you can add that code to any web page on your website and the latest Real Estate News from Topix will automatically appear. Another way to find a feed is to look for a blog on your site's topic. Most blog software includes an RSS feed, so searching Google for "keyword blog rss" might also help you find a feed you can use.

Adding an RSS feed to your web page won't get you high rankings in the search engines. A while back I tested this theory by making three nearly identical web pages - one static page, one with RSS feed content on it, and another with a live RSS feed on it. It turned out that after all three pages were indexed and ranked, the page with the live RSS feed actually ranks third - the static page without the RSS content on it always ranks the best. Search Google for "silly burlywood revenue" and you'll see what I mean.

Although adding an RSS feed won't get your page top rankings in Google, there are other benefits. For example, updating your web page's content on a regular basis gets the page crawled more often-and more active crawling can contribute to other benefits, such as ranking for terms that appear in the feed on your site as well as causing new web pages on your site to get indexed faster than they were before.

Providing an RSS Feed of Your Content

Depending on your website's content, providing an RSS feed of your content might be appropriate. If your website provides news or contains a blog, then publishing an RSS feed might work well. Most blog software automatically publishes an RSS feed of your blog, so you might want to find its URL and start promoting it. If you sell a lot of products on your website, you might consider making an RSS feed available-perhaps one that includes your top selling products along with their prices. Other websites might be interested in publishing that data for their users, and you would receive more visitors and links back to your website, something that will help your site's search engine rankings.

Publishing an RSS feed is a little more complicated, and perhaps too lengthy a discussion for this article. However, there are many good tutorials out there, including Danny Sullivan's Search Engine Watch article about it, as well as the RSS tutorial at mnot.net.

Whether you use RSS to publish your own feed or you use someone else's feed on your website, both approaches provide great benefits to website owners - and RSS will definitely continue to be used more and more in the future.

Bill Hartzer is a successful writer and search engine marketing expert who has personally created hundreds of websites over the years. Extended bio info:

Bill created his first website back in 1996 to help promote his former database software business. It was then that he learned about the power of the search engines and web search, which helped potential customers find his business online.

Bill Hartzer has over 15 years of professional writing experience. He has survived stints as a writer for television, as well as a technical writer for several computer software companies in Florida and in Texas. Mr. Hartzer combines his writing and online skills to create compelling and useful websites for corporations worldwide. Mr. Hartzer focuses on the optimization in the business to business arena, but applies these optimization skills to business to consumer websites, as well.

SEO Expert Guide - Paid Site Promotion (Marketing) (part 7/10)

In parts 1 - 6 you learnt how to develop your proposition, identify your key words and optimize and promote (for free) your site and pages. You were also introduced to our mythical Doug (who sells antique doors, door handles, knockers, door bells or pulls and fitting services) in Windsor in the UK.

Now we turn our attention to paid site promotion, which will be relevant to those of you trying to enter an already crowded marketplace, where your key words are saturated!

(a) Pay-per-click (PPC) Advertising

Run a search on Yahoo or Google for a popular consumer product like "MP3 players". In the results, you'll see a set labeled as 'Sponsored Links' or 'Sponsor Results'. Some results will appear in colored text boxes along the side of the page, whilst others may appear in the same format as the main search results. All these results are paid advertisements from the sites listed within the ads.

The ranking order is a product of the bid amount (CPC) and the popularity of the ad (CTR%), and placements are purchased through pay-per-click (PPC) advertising suppliers. The largest two are Google's AdWords (displayed on Google, AOL and Ask Jeeves) and Yahoo's Sponsored Links - run by acquired company Overture (appearing on Yahoo!, MSN, AltaVista, and others). MSN are planning to release their own PPC scheme soon.

If you have tried and failed with free promotion tactics, the chances are that you are operating in a highly competitive area (where a PPC campaign may well be justified). After all, if you can make more money from a converted click-through than it cost you to buy the click-through, why wouldn't you look at PPC?

Look at your A-list of keywords. Refer back to your research on Overture. How many searches are conducted per month on your keywords? How much are you - and your competitors - willing to pay for those keywords?

For your first campaign, use a large number of relevant search phrases, so that you can test and learn what works best. Build unique ads for each search phrase, as this will help to optimize your click-through rate or CTR% (defined as clicks - or unique visitors - divided by page impressions on the ad pages - expressed as a percentage). This in turn means more targeted traffic, in some cases paying less per click (due to the methods by which advertising is priced).
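For example, an ad that is shown 2,000 times and attracts 50 clicks has a CTR of 50 divided by 2,000, or 2.5%.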

Make sure you use a suitable (and perhaps even a dedicated) landing page for each campaign. Simply sending people to your homepage (from where they have to navigate your site) will not help your conversion rates (defined as sales divided by unique visitors, expressed as a percentage)! Help them to buy, as they are likely to be in a hurry!
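To illustrate, 200 unique visitors who generate 6 sales give a conversion rate of 6 divided by 200, or 3%.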

The PPC providers give you useful interfaces with which to track the effectiveness of your campaign and overall return on your investment. Pay close attention to which keywords are delivering for you and make notes for future campaign planning.

Whilst there is no hard and fast rule, a CTR% of 1.8% - 3.5% is in an acceptable range (and anything over that represents a very good performance). On Google, if your ad achieves less than 0.5% CTR, your ad may well be de-listed. The lower your CTR%, the more you will have to pay in cost-per-click (CPC) to get into the top 3-4 results in your chosen keywords (vital if you want to appear on partner sites like AOL).

(b) Paid Directory Submission

I mentioned earlier that Yahoo! Express Submission is the best way to get a listing in the Yahoo! Directory. With a node-level PR of 10, Yahoo! Directory carries much weight with Google and the $299 fee (whilst steep and not absolutely guaranteeing a listing) is probably worth the cost.

Do not submit to Yahoo! until you have really tested which site description works best on the free search engines.

(c) Express Search Engine Submission

In the past, it was only worth doing a paid submission with Ask Jeeves, as this engine continued to enjoy a small but loyal following but did not grow its index as aggressively as the big boys. This meant paid submission was the only way to guarantee a good placement.
However, Ask Jeeves withdrew this service in 2004 (in favour of a strategy that mirrors the larger players). As such, I would not recommend paid listings with any search engines now.
Next we turn to tools you can use to monitor your ongoing optimization effectiveness...

Navigate the guide

Previous : SEO Expert Guide - Free Site Promotion (PR) (part 6/10)
Next: SEO Expert Guide - Black Hat SEO - Activities to avoid (part 8/10)
About the author:
David Viney (david@viney.com) is the author of the Intranet Portal Guide; 31 pages of advice, tools and downloads covering the period before, during and after an Intranet Portal implementation.
Read the guide at http://www.viney.com/DFV/intranet_portal_guide or the Intranet Watch Blog at http://www.viney.com/intranet_watch.

Keywords are the "KEY" to a Popular and Profitable Web Site

Keyword Research will reveal answers to 3 critical questions:

1. Is there a demand for what your site offers? If not, you need to keep moving down your list until you find something that people are already looking for.

2. How are people searching for your topic? For example, if your theme is "Japanese food", how are people searching for information? Are they typing in "Japanese recipes", "low sodium miso soup", "history of Japanese food", "Japanese food in NYC", etc.? This part of the search will allow you to build up good topics for your site pages and provide keywords that you will use to optimize your pages to become a search engine magnet.

3. How many sites will you be competing with - does demand outstrip supply or vice-versa? Right now your job is to build huge lists of high-demand, low-competition keywords. Be thorough and exhaustive in looking for phrases that people might use to find your site.

You cannot cut corners at this stage of the process or your business will suffer. This can be slow, tedious work but fortunately there are some good tools to help you automate your keyword research and help find the most profitable keywords to build your content around. There are some good free tools to help you make light work of this including Search It from Site Build It! and Overture. More comprehensive tools that provide demand, supply and profitability data that help you find profitable niches include Brainstorm It from Site Build It! and Wordtracker.

Keyword Optimization

You may have heard the terms "search engine optimization (SEO)", "keyword optimization", "page optimization", etc. Basically, they all refer to the same thing - making sure your pages have the right amount of keywords, placed correctly, to effectively get your site "spidered" or found by the search engines, such as Google.

Repeating keywords throughout a site is just as important as choosing the right keywords. If you use the keywords too often the search engines will ignore them; if you don't use them enough the search engines will not find and index them properly.

The main keyword, in our example "Japanese food", should be used as part of the domain name and in the title tag of the HTML code for the page. It should also be used in the heading of the page where you tell visitors what the page is about. Also, most SEO experts agree that it is best to put your keyword in the first and last sentences of your page.
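For the "Japanese food" example, that advice might translate into HTML along these lines, on a page at a domain such as japanese-food-example.com (all names and wording here are made up for illustration):

<title>Japanese Food - Recipes, History and Restaurant Guide</title>
<h1>Japanese Food Made Easy</h1>
<p>Japanese food is easier to cook at home than most people think.</p>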

As for the body copy, there are some good rules-of-thumb that help you achieve the correct "keyword density" or keyword repetition. Many experts say you should use the 4% - 7% rule (approx. 25 words in a 500 word document). However, don't use a single keyword over and over or your copy will seem forced. Instead, weave in some variations of the keyword (e.g. plural forms, synonyms) to ensure your content flows well and makes sense! Simply filling up a site with your keywords will not fool the search engines; rather, it may be considered spamming and your site can get banned!

Many people say that content is king, but in fact, it's content that is keyword rich that is king. It is absolutely critical to find your best keywords and use them in the right way to attract targeted traffic. It takes some time and practice but if you persevere your web site will be built on a strong foundation.

Written by Gail Kaufman.
For more information and practical tips on how to build a popular and profitable web site inexpensively, please visit: http://www.websitedesigngenius.com

Design A Spider Friendly Site

To be successful in the search engines it's important to design your web site with the spiders in mind. Using the latest in web page design is not generally the best way to go. Spiders don't view web pages like humans do, they must read the HTML in the page to see what it's about. Below you will find tips on how to best design your web site with search engines in mind.

Do not use frames at all. Some search engines cannot spider web pages with frames at all. The other search engines that can may still have problems spidering them, and sometimes they too can't index the web page. Do not use only images to link out. You should always use text links to link out to important content on your web site. Spiders can follow image links, but they like text links more.

Use external JavaScript files instead of placing JavaScript code in the HTML document; putting JavaScript in the HTML document will make the page size much larger. Using an external JavaScript file to do the job will reduce page size and make it easier for both spiders and browsers to download the page. Using Cascading Style Sheets can also reduce page size and make the download time much faster in most cases. It will allow the spider to index your web page faster and can help your ranking.
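Offloading that code is just a matter of referencing external files from the head of the page, for example (the file names are placeholders):

<!-- all JavaScript and CSS now live in separate files the browser can cache -->
<script type="text/javascript" src="scripts/menu.js"></script>
<link rel="stylesheet" type="text/css" href="styles/main.css">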

Avoid using web page creators such as FrontPage, Dreamweaver or a WYSIWYG editor. Software like that will often add scripting code that is not needed, making the page larger than it needs to be and harder to crawl. It will also add code that can't be read by the search engines, causing the spider not to index the page, or not to index the whole web page. It is better to use standard HTML. Adding code that spiders can't read, or have a hard time reading, can lead to major problems with your ranking.

Try not to use Flash when possible. Flash cannot be read by the search engines to date and will cause download time to slow a bit. If you do decide to use Flash anyway, make sure you add text to the web page, so the search engines have something to read and can find out what your web page is about. It will also give your visitors something to read while the Flash file loads. Also, don't use Flash as a way of navigation; as I said before, spiders cannot read Flash.

It's important to add a site map to your web site. Not only will this make it easier for internet surfers to get around your web site, but it will also allow spiders to find your site's content easier and index your web page sooner. The site map should contain text links and not image links.

I highly suggest that you look at your web page with a Lynx browser, because this is similar to how search engines will view your web page. There are also tools on the internet that will let you see a web page just as a Lynx browser would, without installing one, so you may want to check those out as well.

Matt Colyer is the owner of Superior Webmaster. He also is a PHP, CGI, and ASP developer.

Playing in Googlebot's Sandbox with Slurp, Teoma, & MSNbot - Spiders Display Differing Personalities

There has been endless webmaster speculation and worry about the so-called "Google Sandbox" - the indexing time delay for new domain names - rumored to last for at least 45 days from the date of first "discovery" by Googlebot. This recognized listing delay came to be called the "Google Sandbox effect."

Ruminations on the algorithmic elements of this sandbox time delay have ranged widely since the indexing delay was first noticed in spring of 2004. Some believe it to be an issue of one single element of good search engine optimization such as linking campaigns. Link building has been the focus of most discussion, but others have focused on the possibility of size of a new site or internal linking structure or just specific time delays as most relevant algorithmic elements.

Rather than contribute to this speculation and further muddy the Sandbox, we'll be looking at a case study of a site on a new domain name, established May 11, 2005 and the specific site structure, submissions activity, external and internal linking. We'll see how this plays out in search engine spider activity vs. indexing dates at the top four search engines.

Ready? We'll give dates and crawler action in daily lists and see how this all plays out on this single new site over time.

* May 11, 2005 - Basic text on large site posted on newly purchased domain name and going live by day's end. Search-friendly structure implemented with text linking, making full discovery of all content possible by robots. Home page updated, with 10 new text content pages added daily. Submitted site at Google's "Add URL" submission page.

* May 12 - 14 - No visits by Slurp, MSNbot, Teoma or Google. (Slurp is Yahoo's spider and Teoma is from Ask Jeeves) Posted link on WebSite101 to new domain at Publish101.com

* May 15 - Googlebot arrives and eagerly crawls 245 pages on new domain after looking for, but not finding the robots.txt file. Oooops! Gotta add that robots.txt file!

* May 16 - Googlebot returns for 5 more pages and stops. Slurp greedily gobbles 1480 pages and 1892 bad links! Those bad links were caused by our email masking meant to keep out bad bots. How ironic that Slurp likes these.

* May 17 - Slurp finds 1409 more masking links & only 209 new content pages. MSNbot visits for the first time and asks for robots.txt 75 times during the day, but leaves when it finds that file missing! We finally get around to adding robots.txt by day's end & stop Slurp from crawling email masking links and let MSNbot know it's safe to come in!

* May 23 - Teoma spider shows up for the first time and crawls 93 pages. Site gets slammed by BecomeBot, a spider that hits a page every 5 to 7 seconds and strains our resources with 2409 rapid fire requests for pages. Added BecomeBot to robots.txt exclusion list to keep 'em out.

* May 24 - MSNbot has stopped showing up for a week since finding the robots.txt file missing. Slurp is showing up every few hours looking at robots.txt and leaving again without crawling anything now that it is excluded from the email masking links. BecomeBot appears to be honoring the robots.txt exclusion but asks for that file 109 times during the day. Teoma crawls 139 more pages.

* May 25 - We realize that we need to re-allocate server resources and database design, and this requires changes to URLs, which means all previously crawled pages are now bad links! We implement subdomains and wonder what now? Slurp shows up and finds thousands of new email masking links, as the robots.txt was not moved to the new directory structure. Spiders are getting error pages upon new visits. Scampering to put out fires after wide-ranging changes to the site, we miss this for a week. Spider action is spotty for 10 days until we fix robots.txt.

* June 4 - Teoma returns and crawls 590 pages! No others.

* June 5 - Teoma returns and crawls 1902 pages! No others.

* June 6 - Teoma returns and crawls 290 pages. No others.

* June 7 - Teoma returns and crawls 471 pages. No others.

* June 8-14 Odd spider behavior, looking at robots.txt only.

* June 15 - Slurp gets thirsty, gulps 1396 pages! No others.

* June 16 - Slurp still thirsty, gulps 1379 pages! No others.

So we'll take a break here at the five-week point and take note of the very different behavior of the top crawlers. Googlebot visits once and looks at a substantial number of pages but doesn't return for over a month. Slurp finds bad links and seems addicted to them; it stops crawling good pages until it is told to lay off the bad liquor, er, that is, links, by a robots.txt that slaps Slurp to its senses. MSNbot visits looking for that robots.txt and won't crawl any pages until told what NOT to do by the robots.txt file. Teoma just crawls like crazy, takes breaks, then comes back for more.

This behavior may imitate the differing personalities of the software engineers who designed them. Teoma is tenacious and hard-working. MSNbot is timid, needs instruction and some reassurance that it is doing the right thing, and picks up pages slowly and carefully. Slurp has an addictive personality and performs erratically on a random schedule. Googlebot takes a good long look and leaves. Who knows whether it will be back and when.

Now let's look at indexing by each engine. As of this writing on July 7, each engine also shows differing indexing behavior as well. Google shows no pages indexed although it crawled 250 pages nearly two months ago. Yahoo has three pages indexed in a clear aging routine that doesn't list any of the nearly 8,000 pages it has crawled to date (not all itemized above.) MSN has 187 pages indexed while crawling fewer pages than any of the others. Ask Jeeves has crawled more pages to date than any search engine, yet has not indexed a single page.

Each of the engines will show the number of pages indexed if you use the query operator "site:publish101.com" without the quotes. MSN 187 pages, Ask none, Yahoo 3 pages, Google none.

The daily activity not listed in the three weeks since June 16 above has not varied dramatically, with Teoma crawling a bit more than other engines, Slurp erratically up and down and MSN slowly gathering 30 to 50 pages daily. Google is absent.

The linking campaign has been minimal, with posts to discussion lists, a couple of articles and some blog activity. Looking back over this time, it is apparent that a listing delay is actually quite sensible from the view of the search engines. Our site restructuring and bobbled robots.txt implementation seem to have abruptly stalled crawling, but the indexing behavior displays distinctly differing policies at each major player.

The sandbox is apparently not just Google's playground, but it is certainly tiresome after nearly two months. I think I'd like to leave for home, have some lunch and take a nap now.

Back to class before we leave for the day, kiddies. What did we learn today? Watch early crawler activity and be certain to implement robots.txt early and adjust it often for bad bots. Oh yes, and the sandbox belongs to all search engines.

Mike Banks Valentine is a search engine optimization specialist who operates http://WebSite101.com and will continue reports of the case study chronicling search indexing of http://Publish101.com

SEO's Relationship With Website Architecture

Search engine optimization for today's search engine robots requires that sites be well-designed and easy-to-navigate. To a great degree, organic search engine optimization is simply an extension of best practices in web page design. SEO's relationship with web design is a natural one. By making sites simple and easily accessible, you are providing the easiest path for the search engine robots to index your site, at the same time that you are creating the optimum experience for your human visitors.

This approach ties well into the notion of long-term search engine marketing success. Rather than trying to "psych out" the ever-changing search engine algorithms, build pages that have good text and good links. No matter what the search engines are looking for this month or next, they will always reward good content and simple navigation.

Search Engine Robots

Search engine robots are automated programs that go out on the World Wide Web and visit web pages. They read the text on a page and click through links in order to travel from page to page. What this really means is that they "read" or collect information from the source code of each page. Depending on the search engine, the robots typically pick up the title and meta description. The robots then go on to the body text of the page in the source code. They also pay attention to certain tags such as headings and alt text. Search engine robots have capabilities like first-generation browsers at best: no scripting, no frames, no Flash. When designing, think simple.

Search Engine Friendly Design

Creating search engine friendly design is relatively easy. Cut out all the bells and whistles and stick to simple architecture. Search engine robots "understand" text on the page and hyperlinks, especially text links. The relationship of SEO and web design makes sense when you start with good design techniques for your visitor. The easier the navigation and the more text on the page, the better it is not only for the visitor but also for the search engine robots.

Obstacles For Indexing Web Pages

Search engine robots cannot "choose" from drop down lists, click a submit button, or follow JavaScript links like a human visitor. In addition, the extra code necessary to script your pages or create those lists can trip up the search engine robots while they index your web page. Long JavaScript in your source code means the search engine robots must go through all this code to finally reach the text that will appear on your page. Offload your JavaScript and CSS code for quicker access to your source code by the search engine robots, and faster loading time for your online visitors. Some search engine robots have difficulty with dynamically-generated pages, especially those with URLs that contain long querystrings. Some search engines, such as Google, index a portion of dynamically generated pages, but not all search engines do. Frames cause problems with indexing and are generally best left out of design for optimum indexing. Web pages built entirely in Flash can present another set of problems for indexing.

Depth Of Directories

Search engine robots may have difficulty reaching deeper pages in a website. Aim to keep your most important pages no more than one or two "clicks" away from your home page. Keep your pages closer to the root instead of in deeply-nested subdirectories. In this way you will be assured the optimum indexing of your web pages. Just as your website visitor may become lost and frustrated in too many clicks away from your homepage, the robots may also give up after multiple clicks away from the root of your site.

Solutions And Helpful Techniques

If there are so many problems with indexing, how will you ever make it work?

Using static pages is the easiest way to ensure you will be indexed by the search engine robots. If you must use dynamically generated pages, there are techniques that improve their chances of being indexed. Use your web server's rewrite capabilities to create simple URLs from complex ones. Use fixed landing pages with real content that, in turn, link to your dynamic pages. If you must use query strings in your page addresses, keep them as short as possible and avoid "session id" values.
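
For example, on an Apache server (an assumption; other web servers have their own rewrite tools), a couple of lines in an .htaccess file can present a clean URL to robots and visitors while quietly mapping it to the real dynamic script. The script name and parameter below are hypothetical:

RewriteEngine On
# /products/42 is the simple URL that gets linked to and indexed...
# ...while the server quietly serves the dynamic page behind it
RewriteRule ^products/([0-9]+)/?$ /catalog.php?item=$1 [L]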

When using Flash to dress up your pages, use it for a single element or message, but avoid building entire pages with that technology. Make sure the search engine robots can see all of the important text content on your pages. You want your message to get across to your human visitors as well, so give them enough information to interest them in taking the next step and making a purchase.

If you must use frames, be sure to optimize the "noframes" section of your pages. Robots can't index framed pages well, so they rely on the noframes text to understand what your site is about. Also include a small piece of JavaScript that reloads the full frameset when a single framed page is opened on its own from a search engine results page.
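
A bare-bones sketch of what that looks like (file names and wording invented for illustration): the noframes block carries real, keyword-rich text and a plain link, so a robot that can't process the frameset still has something to read.

<frameset cols="200,*">
  <frame src="nav.html" name="navigation">
  <frame src="main.html" name="content">
  <noframes>
    <body>
      <h1>Acme Woodworks - Handmade Oak Furniture</h1>
      <p>Acme Woodworks builds handmade oak tables, chairs and bookcases to order.</p>
      <p><a href="main.html">Browse our handmade oak furniture</a></p>
    </body>
  </noframes>
</frameset>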

Got imagemaps and mouseover links? Make sure your pages include text links that duplicate those images, and always include a link back to your homepage.

Use a sitemap to present all your web pages to the search engine robots, especially your deeper pages. Use plain text links on the sitemap page, with a sentence or two describing each page listed and a few of your keyword phrases worked into the text.
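
A sitemap page in this spirit can be nothing more than a plain list of text links with short descriptions; a hypothetical fragment:

<h1>Site Map</h1>
<ul>
  <li><a href="oak-tables.html">Oak dining tables</a> - Handmade oak dining tables built to order in standard and custom sizes.</li>
  <li><a href="oak-chairs.html">Oak chairs</a> - Matching oak dining chairs, benches and stools.</li>
  <li><a href="contact.html">Contact Acme Woodworks</a> - Showroom hours, directions and a quote request form.</li>
</ul>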

Remember that the search engine robots "read" the text on your web page. The more that your content is on-topic and includes a reasonable amount of keyword-rich text, the more the search engine robot will "understand" what the page is about. This information is then taken back to the search engine database to eventually become part of the data you see in the search engine results.

Last of all, it is very important to validate your pages. Errors from programming code and malformed HTML can keep the search engine robots from indexing your web pages. Keep your coding clean.

Check List For Success

* Include plenty of good content in text on your web pages

* Incorporate easy to follow text navigation

* Serve up dynamically generated pages as simply as possible

* Offload JavaScript and other non-text code (style sheets, etc.) to external files

* Add a sitemap for optimum indexing of pages

* Validate your pages using the World Wide Web Consortium's validation tool or another HTML validator

On Your Way To Indexed Pages

The best way to assure that your pages will be indexed is to keep them simple. This type of architecture not only helps the search engine robots, but makes it easier for your website visitors to move throughout your site. Don't forget to provide plenty of good content on your pages. The search engine robots and your visitors will reward you with return visits.

Resources

To learn more about how to work around optimization problems with JavaScript, dynamically-generated pages, Frames and Flash, read the following articles:

Optimizing Pages with JavaScript and Style Sheets for Search Engines
http://www.searchinnovation.com/optimizing-pages-with-javascript.asp

Optimizing Dynamic Pages (Part I) http://www.searchinnovation.com/optimize-dynamic-pages-1.asp

Optimizing Dynamic Pages (Part II) http://www.searchinnovation.com/optimize-dynamic-pages-2.asp

Optimizing Frames for Search Engines http://www.searchinnovation.com/optimizing-frames-for-search-engines.asp

Html validation tool
http://validator.w3.org/

Stylesheet validation tool
http://jigsaw.w3.org/css-validator/

Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Marketing company serving small businesses. She has specialized in Search Engine Optimization since 1998, including three years as the Search Engine Specialist for O'Reilly Media, Inc., a technical book publishing company.
Copyright © 2002-2005 Search Innovation Marketing. http://www.searchinnovation.com All Rights Reserved.
Permission to reprint this article is granted if the article is reproduced in its entirety, without editing, including the bio information. Please include a hyperlink to http://www.searchinnovation.com when using this article in newsletters or online.

Search Engine Saturation Tool - A Must Have SEO Tool

Search engines have become the soul of the Internet. They provide a means of aggregating, correlating, indexing and categorizing the vast amount of content in the wild world of the Internet. They have grown more complex over the years, with better algorithms to serve the people who want to find something and really find it. They have become extremely adept at spotting duplicates and hidden text, and at detecting and punishing search engine spammers. Every webmaster should take the utmost care over what gets listed in the search engines. Tricks that were once used to spam a search engine into high rankings will come back to haunt you if you don't clean them up. This article looks at one tool that makes the SEO's or webmaster's task much simpler, and offers some tips on how to use the saturation tool.

Search engine saturation tools provide a snapshot of what is currently indexed by, or known to, the popular search engines. They give you a way to see which areas of your website are indexed and which are not. They also tell you whether something you did not want indexed has slipped into the index or remains safely out of sight. In short, the tool exposes the weakest portions of your website. The next concept to understand is saturation density: the percentage of your website's pages that show up in the saturation tool results. The calculation should exclude the pages you deliberately wanted left out, as well as image and object files. Once you take inventory of the pages you want indexed and count how many of them actually appear, you have your personal saturation indicator, and that target should obviously be close to 100%.
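
A quick, made-up example of the arithmetic: if your site has 220 files, of which 20 are images or pages you deliberately keep out of the index, your target list is 200 pages. If the saturation tool finds 170 of them, your saturation density is 170 / 200 = 85%, and the missing 30 pages are where to focus your indexing work.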

The next factor to consider is the saturation density of your competitors. Look at the saturation index of your competition and compare it against yours. This gives you a pretty good idea of the odds of someone finding your web pages rather than theirs. If a competitor has 1,000 pages indexed, each targeting a unique keyword on top of the common keywords, they are going to get the lion's share of the traffic, and ultimately the PageRank that comes with it. You can learn far more about a competitor this way than you would by simply visiting their website. For example, your site might be very rich in content while the competitor seems thin, yet they still rank higher; look closer and you may find they run a vibrant flat-file forum that gets indexed by the search engines, giving them higher relevancy than yours. That is just one example; there is plenty of other goldmine data that can be collected simply by using the saturation tool.

Many search engine optimization websites and companies offer this tool for free over the web; the resource box contains one such site. The saturation tool is usually taken for granted. The average webmaster simply ignores what is indexed and what the competition has indexed. Looking at just the top 10 results and analyzing those won't suffice. Dig deeper and you will be amazed at what you can find out about the competition using these tools. Also remember what I said in the first paragraph: there may be material you don't want your competition to know about, such as a customer list stored in an unprotected flat file that has quietly been indexed. Everybody is learning; your competitors are reading this article too and may already be using the saturation tool to spy on you. The tool is great because it helps you run a careful, responsible business. Happy optimizing!

To get more SEO tips like this, please visit web-inspect.com. The author freelances for many search engine websites and can be reached through the no-fee freelance website freelancefree.com. The author recommends the following Photoshop tutorials website for your further development: tutorialized.com.

21 Search Engine Terms Every Web Marketer Should Know Part 1

1. Search Engine - A database of web sites ranked according to computerized criteria, decided upon by programmers, called an algorithm. Each search engine determines its rankings using its own factors of importance or relevancy. For the last few years Google was the most popular search engine, supplying the search results for Yahoo and, to a lesser extent, MSN and AOL. This all changed recently after Yahoo purchased several search engine companies and developed its own search engine. Soon MSN will enter this market with its own search engine algorithm.

Searchers input keyword queries into search boxes and are given results from the databases of the search engines in accordance with the ranking algorithm from whatever search engine they are using.

In other words, a search engine indexes sites it feels will be of value to its customers, the Internet surfers searching for information.

The most important concept to grasp about a search engine is that it uses an automated, computerized system to find and rank the sites within its database.

2. Internet Directory - An Internet directory is a large listing of categorized web sites; the key point to understand is that directories have human editors who decide what goes in. Remember, an Internet directory is compiled and managed by human editors while, in contrast, a search engine is ranked by a computerized algorithm or system. Directories are important to get links from because they will raise your rankings in the Google algorithm (which, to a great extent, is based on PageRank, i.e. links from other sites).

3. SEO - Search Engine Optimization. In the Tao of search engine promotion, SEO represents the Yin, or female, principle in that it is fluid and receptive to the algorithms of the search engines, which you of course do not control. SEO promotion means using known conventions and, in some cases, deconstructing the developers' algorithms and working with them. In other words, it is like judo: you use the momentum and power of the search engines to build your business. You will have to be constantly vigilant to stay abreast of the latest developments in SEO promotion.

4. Pay-Per-Click (PPC) - Advertising with the search engines by bidding on the keyword phrases or search terms that Internet users most frequently enter when looking for information in certain niches and sectors. Overture was the first great pay-per-click advertising sales channel for the search engines; it presently represents Yahoo, MSN, AOL, AltaVista, HotBot and related partners, allowing advertisers to bid on particular keywords. Google runs its own PPC advertising branch called AdWords Select. Since PPC advertising is so expensive, use it in moderation by setting daily and monthly budgets (otherwise, for high-volume keywords, you could go broke).

After setting budgets, which you do in the Overture and Google advertising interfaces, use PPC to test your conversion rates for subscribers, sales and so on, and constantly monitor and fine-tune it. You want to increase your conversion rates. This is critical because, as alluded to earlier, PPC is expensive: the typical major ad agency or corporate brand spends only $5 - $15 to reach a thousand people (CPM), though they may go higher, perhaps $100 or $150 per thousand, for a responsive direct mail list. In PPC advertising a small business is usually paying at least 35 cents a click (for any word with any real competition), or $350 to reach a thousand people. I have seen, and experienced, people paying $2.50 a click, which is $2,500 to reach a thousand people.

The true expense of PPC is never discussed by experts in that industry. Any small business using PPC should have an outstanding product ready for prime time in order to see any return on investment. Alternatively, they should only use it for limited testing or to get a product to market online quickly. Given the expense of PPC, some small business people might be better off buying classified direct-response advertising in a niche publication or obtaining ad space in ezines.

Another point to discuss in PPC is how the major companies hurt their advertisers (stab them in the back) with pop-up blockers, which hurt the advertisers' ability to collect email addresses on the front end, and with spam filters, which block even legitimate marketers' ability to reach opt-in subscribers on the back end. Google is more of a front-end pop-up blocker, AOL is more of an email blocker, while Yahoo is both. There are better ways to stop email and web site spam. The point is that a small business, if it is not careful with PPC, could be paying AOL, Google, and Yahoo/Overture to put it out of business.

5. Pay Per Inclusion - Paying to be included in the database of a search engine or Internet directory. Presently it is free to be included in the Google search engine: in many cases you just need links to your site from a few other sites already in the Google database and your site will be spidered. It was recently announced that Yahoo will charge a fee to update new content in its search engine database. From what I understand, if you are already listed in Yahoo it's all well and good, but any new content that you put on your site will not be indexed.

6. Search Terms (AKA Keywords or Keyword Phrases) - Search terms, and more specifically keyword phrases, are the words searchers put in a search box to find information on a particular product, service, or item. Keywords and keyword phrases come in different tiers. For instance, the top-tier keyword 'business' probably receives 500,000 searches a month, but it is so general that it would not be a good keyword to optimize your site for. The second-tier keyword phrase 'small business' would have fewer searches, but it is a more targeted search. Third-tier keyword phrases are even more targeted and would be ideal to optimize your site for, although there are fewer monthly searches for them.

7. Search Engine Algorithm - Algorithms are sets of rules according to which search engines rank web pages. Figuring out the algorithms is a major part of SEO. The thinking is that if you understand how they calculate relevance, you can make specific pages on your site super relevant for specific search terms. For free telecourses and more on algorithms and SEO in general, please check out http://www.searchengineplan.com.
About The Author
Kamau Austin helps small and minority businesses make more money by creating search-friendly web sites. He is the owner of http://www.Ebizbydesign.com, http://www.Einfonews.com, http://www.carolsartshows.com and the creator of the Free Search Engine Promotion Tele-course.

SEO Tips - Google Has Changed - Learn Why And What To Do

Google now checks the year your domain name was first registered.

This just makes sense. Those that care about their domain and their brand will register their domain for a long time. This demonstrates commitment. Of course it's not very hard for a spammer to do the same but the upfront costs are just that little bit higher and might help act as a deterrent. If anything a short domain registration period will be yet another flag in the Google system that will keep certain sites away from top rankings. Hopefully it will be the spammers that trigger the penalty when combined with all the other spam flags they trigger.

If you are anything like me and you like to register domains for an "idea" you have for the future, there is no way you will be investing in a 10-year registration for something you may never pursue. An idea is an idea, and I know half of the domains I buy amount to nothing. However, buying the domain also signifies *some* commitment to the project and on many occasions is the motivational spark I need to get the website built and a new project off the ground. A one or two year registration is not a significant cost. Securing a domain for 10 years is. The easy workaround is to initially register a domain for the minimum period; if things take off, then renew the domain for a longer period. Simple.

Google now places huge emphasis on links. They want to see a slow, gradual increase to the number of incoming links to your websites. Links need to have a variety of anchor text phrases. If all your anchor links are the same you could get de-listed or lose ranking position.

Ahh, natural linking patterns. I've mentioned this before - Google in many ways destroyed the very thing it relies on to create such a great search engine. Before Google, linking patterns were very natural and organic, with sites linking to each other in many different ways and with varied anchor text built up over a long period of time. Sites grew in popularity slowly, and incoming links increased at similar rates of growth. Google stepped in and used these patterns in a ranking algorithm to accurately value sites. It worked, almost too well. Google became so popular that people began to study what it takes to rank highly. They learnt it was all about incoming links, so they started an unnatural linking process, creating link farms and chasing incoming links with a vengeance.

Google wants natural linking back and will reward those sites that appear to be popular based on natural linking. This is not an exact science of course but if you do these things you are on the right track:

* Make sure your incoming links are not all carbon copies of the same keyword phrase. Vary the incoming anchor text with different phrases.

* Make sure you get links from many sites with varied PageRank. Assess link swaps based on the site (content, relevance) asking for the exchange, not solely on the green PR bar at the top of the site.

* Chase back links naturally, slowly increasing their number over time. If you go from 10 to 5,000 backlinks in one month, Google is going to think you are link farming and penalise you.

* Don't get paranoid. Stressing over why your site won't get listed and pressuring other sites for link exchanges can drive you crazy. Take it easy and work on building a great site slowly, tell the right people about it and they will spread the word for you.

Google is telling us that they look for relevant, quality content on your websites, which is no surprise, and that PageRank is a good indicator of a website's "authority" and relevancy.

The key term here is authority. Those sites that have been online for a long time with established authority in their field wield the power. One link from an authority site can boost you to the top of the rankings but take it away and you can just as easily fade to the bottom of the rankings. Again the emphasis here is on establishing links from many sites with various rankings. If you build an amazing site eventually the authority sites in your field may just link to you anyway and won't that make you feel special!

How do you build an authority site? With hard work, of course. Authority sites don't appear out of nowhere; they build their authority over time by consistently working on quality content and audience creation. Don't expect overnight success; if you want a popular website you have to work at it for years, not weeks.

By Yaro Starak
http://www.entrepreneurs-journey.com
Do you want to profit from your own successful home based Internet business?
Learn from Yaro Starak, a young entrepreneur from Australia. He works part time from home on several web-based businesses that generate between $2,000 and $8,000 per month. Get your free articles and audio now - visit his Internet Business Blog.

Saturday 23 August 2008

Search Bots, Crawlers, and Spiders

If you are a webmaster and you review your logs, you will often see a bunch of really strange hits. They aren't humans; you can't tell their operating system or their browser! Who are these pesky little creatures that rummage around the Internet all the time?

Not quite sure what I am talking about? Here are a few examples of various bots visiting my website:

207.68.146.40 (msnbot.msn.com)
msnbot/1.0 (+http://search.msn.com/msnbot.htm)
This is the MSN Search bot.

207.68.146.40 (lj2070.inktomisearch.com)
Mozilla/5.0 (compatible; Yahoo! Slurp;
http://help.yahoo.com/help/us/ysearch/slurp)
This is Yahoo's search bot.

66.249.65.147 (crawl-66-249-65-147.googlebot.com)
Mediapartners-Google/2.1
This is Google's bot that scans your web pages for AdSense.

What is a Bot, Crawler, Spider?
These terms all mean the same thing: an automated program that goes from website to website, caching and processing pages for the search engines. As you know, "WWW" means World Wide Web, so "spider" seemed like an appropriate term. Crawler is another term that simply describes what it does: crawling from site to site and page to page endlessly. Bot is short for "robot" and, again, is just an automated program that indexes websites.

What is the purpose of a Spider?
A spider looks at all the pages of your website and uses that information to rank you in the search engines (how high you will list in a search result) and to cache a copy of each page on the search engine's server, both for quick reference and in case your site ever goes down. Spiders jump from link to link on the Internet and run endlessly; even if you never submit your website to a search engine, odds are your site will still be spidered.

Can I stop bots and spiders from searching my website?
Yes and no. Legitimate spiders are run by reputable organizations that follow certain rules. For instance, most companies have a policy that their robot will look for a file called "robots.txt" in the root of your website. This text file tells the bots what they are and are not allowed to view. Unfortunately, there are also bad bots out there that search the Internet harvesting e-mail addresses for spam and other bad things, and these bots often don't comply with the "robots.txt" standard.
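
A simple robots.txt along these lines might look like the sketch below (the directory and bot names are made up for illustration); well-behaved bots read it and stay out of the listed paths, while the bad bots simply ignore it:

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

User-agent: BadBot
Disallow: /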

How many bots are there?
It's impossible to guess how many bots are out there searching websites. On any given day roughly 10 different ones check my website. Some of them only look at one or two pages; others go over my entire website. Not all of them give you a good description of what they do or who owns them. If you cut and paste their name and IP address into Google, quite often you can find more information about what they do.

How can I get my site spidered?
As I mentioned before, if your website is up long enough, it "will" get spidered eventually. However, if you want to make sure it happens within a few months, go to the various search engine websites and look for the "Add URL" or "Suggest a Link" pages. DMOZ is one of the big directories to which you should submit your site. When you sign up with these search engines, your website is automatically queued up to be spidered. It may take several weeks or months to actually start showing up in the search engine, even after you see the robot spidering your website.

What about pay search engines?
There are a number of search engines that make you pay to have your website listed. I personally don't support them; I find that most people use the big free search engines anyway. However, if you do wish to get included faster, many search engines have payment options that will get your site listed within a couple of days.


Ken Dennis
http://KenDennis-RSS.homeip.net/

1 Simple SEO Strategy To Get More Visitors To Your Site From Google

Did you know that you can dramatically increase the number of visitors who come to your site from Google every day? And it's not by constantly improving your position in Google's search engine result pages (SERPs) for your competitive keywords, which can take a long time even after working hard on your search engine marketing campaigns.

I take this example from Google because I experienced it myself some time back. Apart from concentrating on getting and maintaining a top 10 ranking in Google, there are lots of easy traffic sources that you haven't exploited yet. We are still talking about search engine traffic here.

What's that strategy, you ask? The answer might surprise you, but it's a technique that works and is pretty legitimate. It's not about creating stand-alone or doorway pages with practically no content, overly optimized with keywords and a link back to the homepage. Doorway pages work, but only if you know how to do them well, and this article will talk a bit about that topic as well.

The strategy is to search for overlooked keyword phrases which are not too competitive and create effective doorway pages related to these keyword phrases. These pages can be promoting a product for instance.

Just by adding a few effective doorway pages, I managed to make 9 sales in just a few short weeks and earned $364.59.

Imagine you come up with several keyword phrases that each generate a few monthly searches; you now have several pages. Each page targeting a specific keyword phrase brings in traffic, and not just any traffic but targeted traffic.

So if you have one page which brings you only 1 visitor per day and you have 50 pages, you can easily receive 50 visitors per day for free. You see the potential now.

Keywords that have about 1,000 searches on Yahoo! Search Marketing Solutions (previously known as Overture) at http://inventory.overture.com/d/searchinventory/suggestion are valuable keywords with lower competition.

If you are using Wordtracker at http://www.wordtracker.com, a keyword with only 10 searches per day can get you a good ranking and bring you traffic.

If you already have a website that is generating traffic, you can dramatically increase your numbers simply by adding relevant doorway pages, each targeting a specific keyword. These pages have real content and carry your optimized navigation menu along with their specific keyword phrases. They are simply an entry point to your site, nothing more, nothing less.

They should not be overly optimized with keywords but they must have some optimized content which is readable by your visitors and friendly for the search engines as well.

If you have a good website which is crawled by Google's robot called "Googlebot" often, your new pages will get spidered and indexed fast and will start bringing you small loads of targeted traffic.

But be careful when adding pages; don't go into a frenzy and add hundreds or even thousands of pages all of a sudden. For example, if your site has 50 pages already indexed in Google and ranking well and you add 100 new pages at once, your indexed pages might suffer a temporary drop in rankings. The key is to add pages on a regular basis, say 1 or 2 pages daily, until you reach the total number of pages to be added.

Why not capitalize on this free source of traffic? The key is to research your keywords well first, work on your content and create these pages afterwards.

Your website will grow bigger and bigger with time and it will attract loads of targeted traffic from multiple keyword phrases.

Good luck and happy research and optimizing.

This article can be freely published on a website as long as it's not modified in any way including the author bylines, plus the hyperlink must be made active just like below.

Jean Lam is the author of the highly acclaimed eBook Top Search Engine Ranking Secrets in Google Revealed, a concise, step-by-step guide to high search engine ranking for the beginner to intermediate level webmaster.

Search Engine Marketing 101 For Corporate Sites

When most people want to find something on the web, they use a search engine. Millions of searches are conducted every day on search engines such as google.com, yahoo.com, msn.com and many others. Some people are looking for your website. So how do you capture people searching for what your site has to offer? Through techniques called search engine marketing (SEM).

This tutorial provides foundational information for anyone looking to implement search engine marketing. It will also help you understand how the search engines work, what SEM is, and how it can help you get traffic.

What is a Search Engine?

All search engines start with a "search box", which is sometimes the main focus of the site, e.g. google.com, dmoz.org, altavista.com; sometimes the "search box" is just one feature of a portal site, e.g. yahoo.com, msn.com, netscape.com. Just type in your search phrase and click the "search" button, and the search engine will return a listing of search engine result pages (SERPs). To generate SERPs, the search engine compares your search phrase with the information it has about various web sites and pages in its database and ranks them based on a "relevance" algorithm.

Search Engine Classes

Targeted audience, number of visitors, quality of search and professionalism are what determine a search engine's class. Each search engine typically targets specific audiences based on interest and location. World-class search engines look very professional, include virtually the entire web in their databases, and return highly relevant search results quickly.

Most of us are familiar with the major general search engines: google.com, yahoo.com, msn.com. A general search engine includes all types of websites and as such targets a general audience. There are also the lesser-known 2nd-tier general search engines: zeal.com, ask.com, whatyouseek.com. The primary difference is that 2nd-tier engines are lesser known and generate significantly less traffic.

There are also several non-general or targeted search engines that limit the types of websites they include in their databases. Targeted search engines typically limit by location, by industry / content type, or both. Most large metro areas have local search engines that list local businesses and other sites of interest to people in that area. Some are general and some are industry specific, for example listing only restaurants or art galleries.

Many other targeted search engines list sites from any location but only if they contain specific types of content. Most webmasters are familiar with webmaster tools search engines such as webmasterworld.com, hotscripts.com, flashkit.com and more. There are niche SEs for practically any industry and interest.

Search Engine Models

There are two fundamentally different types of search engine back ends: site directories and spidering search engines. Site directory databases are built by a person manually inputting data about websites. Most directories include a site's URL, title, and description in their database. Some directories include more information, such as keywords, owner's name, visitor rankings and so on. Some directories allow you to control your website's information yourself; others rely on editors who write the information to conform to the directory's standards.

It is important to note that most directories offer browsable category listings as an alternative to the search box for finding websites. A directory listing uses hierarchical groupings, from general to specific, to categorize a site.

Spidering search engines take a very different approach. They automate the updating of information in their database by using robots to continually read web pages. A search engine robot/spider/crawler acts much like a web browser, except that instead of a human looking at the web pages, the robot parses each page and adds the page's content to its database.

Many of the larger search engines have both a directory and a spidering search engine, e.g. yahoo.com and google.com, and allow visitors to select which they want to search. Note that many search engines do not have their own search technology and contract the service from elsewhere. For example, Google's spider SE is its own, but its directory is the Open Directory; additionally, aol.com and netscape.com both use Google's spider SE for their results.

There are a few other search engine models of interest. Some search engines, such as dogpile.com and mamma.com, combine results from other engines. Others add extra information to searches, such as Amazon's alexa.com, which uses Google's backend but adds traffic data gathered from its toolbar about each site.

Getting In

One of the most important things to understand about the SE database models is how to get into each database and keep your listing updated. With a search directory, you must make a submission that provides the directory all the information needed for the listing. It is generally recommended that this be done by hand, either by you or by a person familiar with directory submissions. Many submission tools advertise that they automate the submission process. This may be fine for smaller directories, but for the major directories manual submissions are worth the time.

Not all search directories are free; many charge a one-time or annual fee for review. Many of the free search directories have little quality control. For free directories you may have to submit your site several times before being accepted.

There are three different methods for getting into spidering search engines: free site submission, paid inclusion, and links from other sites. Virtually all spidering SEs offer free site submission; for most, you simply enter your URL into a form and submit. Paid inclusion is normally not difficult, apart from the credit card payment. With free site submission there are no guarantees: the SE may send a spider to your site in the next few weeks, in a few months, or never. With paid inclusion you typically get a guarantee that the page you submitted will be included within a short amount of time. The other standard way to get included is to have links to your website from other web pages that are already in the SE's database. The SE spiders are always crawling the web and will eventually follow those links to find your site.

Once you are in a search engine database, you might change your site and need the search engine to update its database. Each directory handles this differently; generally each database will have a form for you to submit a change request. Spidering search engines will eventually find the change and add your updates automatically.

Getting High Rankings

Getting into a search engine database is only the first step. Without other factors you will not rank in the top positions, which is a prerequisite for quality traffic. So how do you get top positions? You can pay for placement with sponsored links, which is covered in the next section. To place well in the free, organic SERPs, you will need to perform search engine optimization.

Search engine optimization is one of the most complicated aspects of web development. Each search engine uses a different algorithm built on hundreds of factors, constantly changes it, and carefully guards it as a trade secret. Thus no one outside a search engine's employ knows with 100% certainty the perfect way to optimize a site. However, many individuals called search engine optimizers have studied the art and derived a set of techniques with a track record of success.

In general, there are two areas to focus on for top rankings: on-page factors and linking. On-page factors mean placing your target keywords in your site's content in the right places. The structure of, and technologies used on, your website also play a role in on-page factors. Linking refers to how other websites link to yours and how your site links internally.

Search Engines' Marketing Offerings

Search engines in the early days of the web were focused solely on serving the visiting searcher. They worked to capture as much of the web as possible in their database and provide fast, relevant searches. Many early website owners learned to reverse engineer the relevancy algorithms and to make their sites "search engine friendly" to get top rankings. They were the first search engine optimizers, manipulating the search engine's natural or organic SERPs as a means of generating free web traffic.

Oftentimes these optimized sites compromised the integrity of the SERPs and lowered the quality for the searcher. Search engines fought, and continue to fight, to maintain the quality of their results. Eventually, the search engines embraced the fact that they are an important means of marketing websites. Today most search engines offer an array of tools to balance website owners' need to market with quality for the searcher.

You can generally break search engine marketing tools into free and for-pay. Realize that these classifications are from the search engine's point of view; effort and expense are required to set up and maintain any search engine marketing campaign.

Organic rankings are still one of the most important ways to drive quality traffic. Search engines now seek to reward ethical, high-quality websites with top rankings and remove inappropriate "spam" websites. While organic rankings can produce continual free traffic, it takes time from an experienced individual to achieve optimum results. Additionally, organic placement offers no guarantees; it generally takes months to get listed and can be unpredictable once listed.

Some search engines offer services that add more control to your organic campaign. Most of these services will list / update your site faster or will guarantee that all essential content is listed. For integrity reasons, no major search engine offers higher organic rankings for a fee.

If you need top rankings quickly, pay-per-positioning (PPP) is the most popular way to go. PPP rankings appear on the normal SERPs alongside the organic results but are usually designated as "sponsored listings". PPP listings use a bidding process to rank sites. If you are the top bidder, i.e. willing to pay the most per click on a given phrase, you will have top placement. The 2nd-highest bidder is in position two, the next in position three, and so on. While most PPP works on this model, some search engines offer modifications, such as Google's AdWords, where bid price and click-through rate are both factors in positioning.
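
A made-up example of how the bidding works: if three advertisers bid $1.00, $0.60 and $0.25 per click on the same phrase, the $1.00 bidder takes the top sponsored position, the $0.60 bidder the second, and the $0.25 bidder the third, and each pays only when a searcher actually clicks on their listing.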

Search engines have many other marketing tools as well, such as search-specific banner ads, listings on affiliate sites, and more.

Getting Started

The majority of websites have sub-optimal search engine marketing. Most sites have no effective search engine marketing and are continually missing out on valuable leads. Many other websites are too aggressive, wasting money on low value traffic or harming the functionality of their site due to over optimization. Too many sites are even paying money and receiving no results because they have trusted unethical or inexperienced search engine optimizers.

All SEM campaigns should start with a strategic evaluation of SEM opportunities based on return on investment (ROI). You need to assess how much each lead is worth for each keyword phrase and determine which SEM tools will achieve the best ROI for the phrase.
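
A rough illustration of that assessment, with hypothetical numbers: if a lead from a given keyword phrase is worth $50 to you and about 1 visitor in 100 becomes a lead, then each visitor on that phrase is worth roughly 50 cents. Bidding more than that per click in a PPC campaign destroys the ROI, while an organic campaign that brings in the same visitors for less makes the better investment for that phrase.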

You also have to decide how much you want to do in-house versus retaining an expert. A qualified expert will typically produce better results faster, but the higher expense may destroy the ROI. Often it is best to work with an expert as a team, with the expert developing the strategy and internal staff handling implementation and ongoing management.

Tom McCracken is the Director of LevelTen Design, a Dallas-based e-media agency. He has over 14 years of experience in software engineering and marketing and has developed solutions to improve customer service and communications for some of the world's largest companies. With an education in chemical engineering and economics from Johns Hopkins University, his background includes web and software development, human factors engineering, project management, business strategy, marketing strategy, and electronic design.