SEO Updates

Tuesday, April 19, 2011

The importance of robots.txt


Learn how to create a robots.txt file that search engines can use to navigate and index a website effectively. A well-written robots.txt file helps improve search engine rankings by providing important information to the search engine bot.



Although the robots.txt file is very important if you want a good ranking on search engines, many Web sites don't offer one.
If your Web site doesn't have a robots.txt file yet, read on to learn how to create one. If you already have a robots.txt file, read our tips to make sure that it doesn't contain errors.

What is robots.txt?

When a search engine crawler comes to your site, it will look for a special file on your site. That file is called robots.txt, and it tells the search engine spider which Web pages of your site should be indexed and which Web pages should be ignored.

The robots.txt file is a simple text file (no HTML) that must be placed in your root directory, for example:
http://www.yourwebsite.com/robots.txt

How do I create a robots.txt file?

As mentioned above, the robots.txt file is a simple text file. Open a simple text editor to create it. The content of a robots.txt file consists of so-called "records".

A record contains the information for a specific search engine. Each record consists of two fields: the User-agent line and one or more Disallow lines. Here's an example:

User-agent: googlebot
Disallow: /cgi-bin

This robots.txt file would allow the "googlebot", which is the search engine spider of Google, to retrieve every page from your site except for files from the "cgi-bin" directory. All files in the "cgi-bin" directory will be
ignored by googlebot.
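You can check how a crawler interprets such a record with Python's standard-library robots.txt parser (the URLs below are hypothetical examples):

```python
# A quick check of the record above with Python's built-in robots.txt parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: googlebot",
    "Disallow: /cgi-bin",
])

# Ordinary pages may be fetched...
print(rp.can_fetch("googlebot", "http://www.yourwebsite.com/index.html"))       # True
# ...but anything under /cgi-bin is off-limits to googlebot
print(rp.can_fetch("googlebot", "http://www.yourwebsite.com/cgi-bin/form.cgi")) # False
```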

The Disallow command works as a prefix match. If you enter

User-agent: googlebot
Disallow: /support

both "/support.html" and "/support/index.html", as well as all other files in the "support" directory, would not be indexed by googlebot.

If you leave the Disallow line blank, you're telling the search engine that all files may be indexed. In any case, every User-agent record must contain at least one Disallow line, even if it is empty.

If you want to give all search engine spiders the same rights, use the following robots.txt content:

User-agent: *
Disallow: /cgi-bin

Where can I find user agent names?

You can find user agent names in your log files by checking for requests to robots.txt. Most often, all search engine spiders should be given the same rights. In that case, use "User-agent: *" as mentioned above.

Things you should avoid

If you don't format your robots.txt file properly, some or all files of your Web site might not get indexed by search engines. To avoid this, do the following:
  1. Don't use comments in the robots.txt file

    Although comments are allowed in a robots.txt file, they might confuse some search engine spiders.
    "Disallow: /support # Don't index the support directory" might be misinterpreted as "Disallow: /support#Don't index the support directory".

  2. Don't use white space at the beginning of a line. For example, don't write
        User-agent: *
        Disallow: /support
    but
    User-agent: *
    Disallow: /support

  3. Don't change the order of the commands. For your robots.txt file to work, keep the commands in the right order. Don't write
    Disallow: /support
    User-agent: *
    but
    User-agent: *
    Disallow: /support

  4. Don't use more than one directory in a Disallow line. Do not use the following
    User-agent: *
    Disallow: /support /cgi-bin /images/
    Search engine spiders cannot understand that format. The correct syntax for this is
    User-agent: *
    Disallow: /support
    Disallow: /cgi-bin
    Disallow: /images

  5. Be sure to use the right case. The file names on your server are case sensitive. If the name of your directory is "Support", don't write "support" in the robots.txt file.
  6. Don't list all files. If you want a search engine spider to ignore all files in a special directory, you don't have to list all files. For example:

    User-agent: *
    Disallow: /support/orders.html
    Disallow: /support/technical.html
    Disallow: /support/helpdesk.html
    Disallow: /support/index.html
    You can replace this with
    User-agent: *
    Disallow: /support

  7. There is no "Allow" command

    The original robots.txt standard doesn't define an "Allow" command (some crawlers support it as an extension, but don't rely on it). Only mention files and directories that you don't want to be indexed. All other files will be indexed automatically if they are linked on your site.

 

Tips and tricks:

1. How to allow all search engine spiders to index all files
    Use the following content for your robots.txt file if you want to allow all search engine spiders to index all files of your Web site:
    User-agent: *
    Disallow:
2. How to disallow all spiders to index any file
    If you don't want search engines to index any file of your Web site, use the following:
    User-agent: *
    Disallow: /
3. Where to find more complex examples
    If you want to see more complex examples of robots.txt files, view the robots.txt files of big Web sites.
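The difference between the first two records above can be verified with Python's standard-library robots.txt parser (the bot name and URL are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# Tip 1: an empty Disallow line allows everything.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])

# Tip 2: "Disallow: /" blocks everything.
deny_all = RobotFileParser()
deny_all.parse(["User-agent: *", "Disallow: /"])

url = "http://www.yourwebsite.com/page.html"
print(allow_all.can_fetch("somebot", url))  # True
print(deny_all.can_fetch("somebot", url))   # False
```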
Your Web site should have a proper robots.txt file if you want to have good rankings on search engines. Only if search engines know what to do with your pages can they give you a good ranking.

Wednesday, April 13, 2011

10 Essential SEO Strategies


Learn 10 Essential SEO Tips And Tricks For Getting A Higher Search Engine Ranking.


Search engine optimisation (SEO) is a vast topic. There are whole websites and books out there devoted to the subject. The world of SEO is also constantly changing as search engines tweak their algorithms, and new search engines come and go.
However, there are some timeless SEO techniques that will always come in useful. Here are 10 top SEO tips that you can use to improve your website's ranking on the search engines.

Info: If you're new to search engine optimization, check out SEO for Beginners first.

1. Write Good Content

    This is maybe the most important strategy of all. If your pages contain good, relevant, useful content, search engines will be more likely to rank your page higher for relevant searches. As an added benefit, good content will encourage more sites to link to your pages, thereby further increasing your search engine ranking.
    It's also good to update your content regularly. Visitors like fresh content, so they will visit your site more often. More visits lead to more links to your content, which ultimately results in more traffic.

    2. Do Your Keyword Research

    Don't target a keyphrase just because it sounds right to you, or because it gets a lot of searches.
    Think about what you ultimately want visitors to do on your site (your conversion goals), then find out what keywords people search for when they want to achieve those goals. Use tools such as Google Analytics to see which keyphrases result in the most goal conversions.
    Stay away from 1-word (or possibly even 2-word) keyphrases that have thousands of competitive sites in the search results. Instead, use tools such as Wordtracker and the AdWords Keyword Suggestion tool to find relevant niche keyphrases with high search volume and low competition.
    For example, if your online store sells Mega Widgets in the Boston area, target the keyphrase "mega widgets boston", rather than just "widgets".

    3. Use Your Keywords Wisely

    Once you have a good list of keyphrases, deploy them sensibly throughout your site pages. Make sure you've used your keywords in the following text blocks (these are in rough order of importance, most important first):
    •    The title tag
    •    The h1 and h2 headings in the page
    •    Link text (in links within the page, and in links from other pages)
    •    The page URL
    •    Image alt text
    •    Bold and italicised text
    Also make sure your keywords have a reasonable density (i.e. they appear fairly often in the above text blocks — but not too often) and prominence (place them near the start of each text block).
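    As a rough illustration, you can estimate keyword density with a short script like the sketch below (the sample text and keyphrase are made up for the example; real SEO tools use more sophisticated measures):

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in `text` taken up by occurrences of `keyword`."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    # Count sliding-window matches of the keyphrase against the word list.
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return hits * len(kw) / len(words)

text = "Mega Widgets Boston: buy mega widgets from our Boston store."
print(keyword_density(text, "mega widgets"))  # 0.4 (4 of 10 words)
```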

    Info: Make sure you use text rather than images in the page where possible. This is particularly true of navigation menus. If you must use an image, make sure it has keyword-rich alt text.

    4. Get Other Sites Linking To Yours

    Most search engines rank sites more highly if they're linked to by other, well-respected sites.
    The key here is "well-respected". Just getting linked to from hundreds of reciprocal link pages is not going to do much for your ranking. Target a few relevant, good-quality sites and directories that are full of useful info and rank well on the search engines, and try to get a link back from them.
    Some link directories let you submit your site for free, while others require a fee — either one-off, or recurring. While paying for submission can be expensive, it can be worth it, especially if you're running an online store that has a lot of competitors.

    Getting links from other high-quality sites can be a challenge. Here are some tips:

    •    Writing good content (see strategy #1) is one of the best long-term strategies for encouraging inbound links.
    •    A "link to this page" function on each page of your site makes it easy for other webmasters and bloggers to link.
    •    Get in touch with the site owner and strike up a friendship. You're much more likely to get a link back from someone who knows you.
    •    If the site in question has useful content relevant to your readers, go ahead and link to that site from your own pages. This in itself might encourage a link back!

    5. Structure Your Site For SEO

    Your site structure can play an important part in optimizing your pages. Make sure your pages contain plenty of links to other important pages in your site, and that it's easy to get to all sections of your site via your homepage or navigation menu.
    Not only does this make it easier for visitors and search engines to find your content, but it also helps to spread your site's authority score (such as Google PageRank) more evenly throughout your site pages.
    A sitemap can really help here, as it lists all your site content on one, easy-to-use page — great for visitors and search engine spiders alike.

    6. Analyse Your Site

    It's important to track your site's SEO performance so that you can see if your efforts are paying off. Make use of the many free analysis tools out there, including:
    •    Yahoo! Site Explorer — Lets you find out detailed search-engine-related info on each page of your site, including the most popular pages and the inbound links to each page.
    •    Google Webmaster Tools — Gives you all sorts of details about how Google sees your site, such as problems crawling your pages and suggestions for how to improve your HTML.
    •    Google Analytics — Reports on vital traffic data such as visitors, pageviews, traffic sources, keywords, and lots more. Also lets you set up goals so you can see how well your SEO campaigns are performing.
    •    SEO Book and SEO Chat — these 2 sites offer a large range of free, Web-based SEO tools, many of which give you useful information about how your site is faring in the search results.

    7. Keep Abreast of the Latest SEO News

    Search engine algorithms change constantly, and it pays to keep up to date with the latest changes and SEO strategies. The following SEO sites are well worth bookmarking or subscribing to:

    •    Search Engine Watch is a huge SEO resource, including articles and white papers on SEO and SEM (search engine marketing), as well as some busy SEO forums.
    •    Matt Cutts' blog — Matt is a Google engineer, and in his blog he frequently discusses the latest changes at Google that can affect SEO.
    •    Sphinn is a social bookmarking site for SEO topics. Great for finding out what's new and hot in the world of SEO.
    •    SEO Book features a comprehensive, paid SEO training program, a regularly-updated blog, and some handy free SEO tools such as keyword suggesters and rank checkers.
    •    SEO Chat contains a large number of SEO articles, a huge range of free online SEO tools, and a big forum community.

    8. Avoid Black Hat Techniques

    SEO techniques come in 2 forms:
    •    White hat techniques play by the rules of the search engines, and aim to create high-quality, relevant content.
    •    Black hat techniques attempt to "game" the search engines by using techniques such as keyword stuffing (overusing keywords in a page), hidden text, and cloaking (presenting different versions of a page to real visitors and search engines).

    Black hat SEO techniques can sometimes produce a short-term hike in traffic; however such sites invariably get weeded out — or, worse, banned altogether — by the search engines. Black hat SEO simply isn't worth the risk if you want to build a long-term stream of traffic from search engines.

    9. Watch Out For Duplicate Content

    Search engines dislike pages that basically contain the same content, and will give such pages a lower ranking. Therefore, avoid duplicate content URLs on your site.

    Many factors can result in a search engine seeing 2 URLs as duplicates of each other — for example:
    •    Articles republished from other websites
    •    Print-friendly versions of pages (make sure you exclude such pages from search engines with a robots.txt file)
    •    Similar product info pages that contain very little changing content apart from the product name and image
    •    Session IDs in URLs, or other URL parameters that result in different URLs for the same page
    •    Displaying your site at multiple domains — for example, www.example.com and example.com. Choose one domain or the other, then use 301 redirects to ensure that everyone (including search engines) is looking at just the one domain.
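    As a sketch of that last point: on an Apache server, a site-wide 301 redirect from the bare domain to the www domain can be set up with mod_rewrite in an .htaccess file (example.com stands in for your own domain; this assumes mod_rewrite is enabled on your host):

```apache
RewriteEngine On
# Permanently redirect example.com/... to www.example.com/...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```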

    10. Don't Forget the Description and Keywords Tags

    Many webmasters overlook the description and keywords meta tags, but they can give your site the edge over your competitors. Pay attention to these 2 tags in each page.
    •    The description tag should be a useful, compelling summary of your page content. This tag is often used to display a summary of your page in the search results, so it's worth making it keyword-rich and including a call to action.
    •    Most search engines ignore the keywords tag these days; however it doesn't hurt to create one (if nothing else it's another chance to insert your keywords in the page). Some directories also use the keywords tag to classify sites.
    •    Ensure that each page has unique description and keywords tags. If a search engine finds many pages with the same description and keywords, it can see those pages as less important.

    Info: Make sure your description and keywords tags aren't too long — they should be 1 or 2 lines of text.
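    For example, unique, keyword-rich tags for a hypothetical product page might look like this (the store name and keyphrases are made up):

```html
<head>
  <title>Mega Widgets Boston | Example Store</title>
  <!-- Often shown in search results; keep it compelling and include a call to action -->
  <meta name="description" content="Buy mega widgets in Boston. Free delivery on all widget orders. Shop online today!">
  <!-- Largely ignored by search engines, but used by some directories -->
  <meta name="keywords" content="mega widgets, widgets boston, buy widgets">
</head>
```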

    SEO is always a bit of a guessing game, with search engines changing their ranking algorithms constantly, but these 10 tips and techniques should be useful in any SEO situation. Good luck!

    Important Sites

    SEO Tools : www.webconfs.com
    SEO Update: Googleseostrategies.blogspot.com


    Tuesday, April 12, 2011

    Important Elements to Consider When Developing an SEO Strategy



    Ever wondered what determines how good at Search Engine Optimization (SEO) you can be? We all do at some point while trying to figure out its ins and outs. Here are some factors that will help you improve your SEO skills and get you ready to immerse yourself further in one of the most important elements of online exposure: search engines.

    The days are gone when most developers were not aware of the .htaccess tricks needed to handle basic SEO on their own. Companies spent thousands of dollars hiring SEO professionals, and .htaccess scripts were really hard to find, used only by well-known sites.

    After the arrival of revolutionary open source platforms like WordPress, Drupal and Joomla, among others, things have changed: the basics of SEO are no longer a mystery, and many free tutorials on Search Engine Optimization have appeared all over the Web.


    Today even the most basic website builder will offer you a variety of tools for on-page SEO and website structure optimization. But this is still only the tip of the iceberg. We hand-picked a few factors that affect the outcome of a well-planned policy, and some points to consider so you can develop a better feel for the course of action to implement.



    1. A good SEO person plans policies after going through many aspects of the client’s business.

    In actuality, SEO is not just about giving the site high ranks on search engines for a short time. It involves additional, ongoing work whose pieces keep changing from time to time.


    2. Always finalize a policy with broad research into current needs.

    Once done with implementation, don’t alter much. Continue with what was planned in the first place, and modify only as needed within the limits of the original policy, so links won’t change. Search robots may not like it if your link structure changes and they find links unavailable when they re-crawl your URLs. Stick to a consistent URL structure.

    3. If you think SEO is not a big deal, think again.

    You might not have gone through designing complicated projects yet. Sorry for being blunt, but you can fool your client, not the search robots: they have smart algorithms that filter for the quality and consistency of a policy.

    4. SEO gets more complicated, and more powerful, as you keep categorizing products on the site.

    Always categorize products and understand the nature of your audience, humans and robots alike, if you want to design a sound policy.



    5. Your previous work in SEO provides only limited help on the next project; never rely totally on it.

    The number of sites is increasing every day, and so is the importance of SEO for increasing visibility. Because of this, search engines are always reconsidering and enhancing their algorithms and the ways they rank sites, so research and stay informed about the latest changes they apply.

    6. Don’t think you can reach your goals by copying the same method across different projects.

    Policies differ because the nature and focus of each project is different, so plan every single project properly.

    7. There is no degree or official training in SEO.

    Policies and techniques keep changing, and the only way to stay up to date is to extend your research. Learn what you lost and gained from previous projects, and what other developers are doing, and why. Examine your sites’ traffic logs to better understand the requirements of robots and visitors, and how you can make exploration easier.


    8. Modern SEO is not just limited to improving a site’s link and keyword quality.

    It also includes marketing the site online through search engines. Modern SEO also means clear visibility on social networks, as they are in many cases major sources of traffic; many blogs report as much as 50% of their traffic arriving from social networks. That means utilizing social networks as channels to improve SEO relevance also matters.

    Conclusion

    The biggest problem with most SEO policies is that they overlook previous policies and keep pursuing new ones. This can also be a positive point, because it keeps a project’s SEO policy updated according to the latest needs. The real problem arrives when policies are switched over and over again, which mostly happens when a policy was designed without good research.

    Remember, once you change a site’s policy, it also loses a heavy amount of traffic from external links, as in most cases the URLs are altered too.

    Before getting started on an SEO policy, remember that every channel of a site should be categorized well. Many successful SEO people are enjoying six-figure incomes, but no one can be sure of earning the same in the future: things in the SEO field keep changing, and only those with up-to-date knowledge can thrive.

    There is no denying that blogs and books on SEO have made it easy to establish a basic SEO policy. But it should be understood that SEO is a wide field and needs time and focus, as the development and design of policies change with every site’s requirements.



    For more Important Elements of Developing SEO Strategies, please visit: http://googleseostrategies.blogspot.com/