SEO Updates

Sunday, September 18, 2011

Google Provides New Options for Paginated Content

At SMX Advanced earlier this year, a hot topic was the use of the rel=”canonical” attribute in conjunction with pagination. Maile Ohye of Google noted that the rel=”canonical” attribute was not intended to cluster multiple pages (articles, product lists, etc.) to page one of that series (although it can be used to cluster multiple pages to a “view all” page).

The discussion was fast and furious and we dove into the various pagination issues that content owners encounter. Maile took the feedback to Google and they got to work on some options. The goal? To present some new solutions at SMX East at a panel set up just to talk through pagination issues. Today, be impressed with my multitasking skills as I write this article that describes these new solutions while I moderate the session!

Google has done two things: evolved how they detect and cluster components of a series with a view all page and launched new rel attributes to enable content owners to specify components of a paginated series. They describe both in blog posts today.

[Image: Pagination]

 

New Handling of View All Pages

Google has been evolving their detection of a series of component pages and the corresponding view all page. When you have a view all page and paginated URLs with a detectable pattern, Google clusters those together and consolidates the PageRank value and indexing relevance. Basically, all of the paginated URLs are seen as components in a series that rolls up to the view all page. In most cases, Google has found that the best experience for searchers is to rank the view all page in search results. (You can help this process along by using the rel=”canonical” attribute to point all pages to the view all version.)

 

If You Don’t Want The View All Page To Rank Instead of Paginated URLs


If you don’t want the view all version of your page shown and instead want individual paginated URLs to rank, you can block the view all version with robots.txt or a meta noindex. You can also use the new rel=”next”/rel=”prev” attributes, so read on!
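For example, a minimal sketch of the meta noindex approach (the page and its URL are hypothetical):

```
<!-- In the <head> of the view-all page you don't want indexed -->
<meta name="robots" content="noindex, follow">
```

Alternatively, a robots.txt line such as "Disallow: /products/view-all" (again, a hypothetical path) keeps crawlers away from the view all version entirely.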

New Pagination Options


If you don’t have a view all page, or you don’t want the view all page to be what appears in search results, you can use the new attributes rel=”next” and rel=”prev” to cluster all of the component pages into a single series. All of the indexing properties for all components in the series are consolidated and the most relevant page in the series will rank for each query. (Yay!)

You can use these attributes for article pagination, product lists, and any other types of pagination your site might have. The first page of the series has only a rel=”next” attribute and the last page of the series has only a rel=”prev” attribute, and all other pages have both.  You can still use the rel=”canonical” attribute on all pages in conjunction.

Typically, in this setup, as Google sees all of these component pages as series, the first page of the series will rank, but there may be times when another page is more relevant and will rank instead. In either case, the indexing signals (such as incoming links) are consolidated and shared by the series.
Make sure that the values of rel=”next” and rel=”prev” match the URL as it is actually displayed (even if it’s non-canonical), since the values in the series have to match up; you will likely need to write the values dynamically based on the display URL.
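As a sketch, the markup for page 2 of a hypothetical four-page series might look like this (all URLs are illustrative):

```
<!-- In the <head> of http://www.example.com/article?page=2 -->
<link rel="canonical" href="http://www.example.com/article?page=2">
<link rel="prev" href="http://www.example.com/article?page=1">
<link rel="next" href="http://www.example.com/article?page=3">
```

Page 1 would carry only the rel=”next” link, and page 4 only the rel=”prev” link.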

There are lots of intricacies to consider here, and I’m working on an in-depth article that runs through everything that came up in the session, so if you have questions, post them here and I’ll add them in!


Follow her on Twitter at @vanessafox.

Monday, May 30, 2011

Is Your Site Under Google Penalty?


[Image: Google Penalty]



One of the most important aspects of taking care of a site’s search ‘appropriateness’ is knowing what can get you penalized by Google (or any other search engine, for that matter). Knowing how to assess the situation correctly, so that you can tell whether you have just been served a penalty, can significantly help you get the site back to the top for your search terms.

Unfortunately, it’s a sure thing that Google is not going to publish the criteria it uses for deciding who gets penalized. So we have to make an educated guess. In the SEO community, our opinions come from spending a lot of time–in some cases years–observing what does, and doesn’t, get good results. As with just about any other aspect of SEO, most of what I’m about to say here will be met with cyber-cries of ‘but I disagree,’ or ‘I can prove otherwise,’ or even expletives! That’s the nature of what we do–there’s always a lot of room for disagreement.

If you have been following SEO best-practices closely for some time, it’s highly unlikely that you’ll fall foul of the search engines to the degree that you get penalized. But sometimes as SEO warriors, we inherit a bad situation that someone else has created, and it’s not always obvious at first glance.

 

Google Sandbox or Penalty?

Before I continue, it’s worth mentioning that there’s a difference between a Google penalty and being flung in the Google sandbox. Actually, people have questioned whether the sandbox even exists. But I think it’s fair to assume that it does. It is a common phenomenon that a new site will simply fail to show up: it won’t get indexed at all for weeks, or even months. It seems that until a new site or new pages earn Google’s trust, they sometimes don’t show up (are not even indexed) for a long period of time; Ann Smarty has already explained that in detail so I’ll leave it to her. Then all of a sudden those pages or sites appear without the webmaster making any changes, much to everyone’s relief.

 

Penalized: Knowingly or Unknowingly!

Sometimes an unscrupulous marketer–and I don’t use the term SEO here because in my book, search engine optimization does not include underhanded tricks of any kind–will use a technique that he knows may have a backlash later on, in order to achieve short-term gains to impress site owners. He or she will do this on the assumption that by the time the penalty is served up by Google, he or she will be long gone and no one will know what happened (the site owner may even call said marketer back and pay them more money to sort it out).
More often though, a penalty is served simply because someone did something unknowingly.

So whether you are a freelance SEO or an in-house SEO, you will need to be aware of what it looks like when a site has been penalized, so that you can do a little detective work to find out what the problem is and quickly get your site back into the search stream.
The most obvious sign that you’re being penalized is if you’re not showing up in a search for keywords that you’re clearly optimizing for.  But perhaps the first thing you’ll notice is a sudden, drastic falling-off of traffic. Be careful here though: a sudden decline in traffic doesn’t necessarily mean that you’ve been penalized. It could just be that search trends have changed and the terms you were optimizing for are suddenly nowhere near as popular as they were. This does happen, and it’s one of the reasons why we recommend constant review of the search terms you use.

But there are other more subtle signs of search engine penalties.

 

The Specifics of Getting Penalized

  • If you feel that a Google penalty may have been incurred for any reason, the place to start is Google Webmaster Tools. Here you will find complete checklists to help you detect a problem if there is one. Go through the list and make sure that you are complying with Google’s list of best practices.
  • Compare your site’s rank on Bing and Yahoo with its rank on Google. If you are on page 1 over at Bing and Yahoo, yet you’re not even showing on Google, then chances are you have been penalized.

Just to recap, although you can find this information all over the Web, here are items that Google WILL CERTAINLY impose a penalty for:
  • Keyword stuffing: putting the same two or three keywords over and over again throughout your page will trigger alarm bells over at Google.
  • Cloaking: any form of disguising text is a huge no-no with Google.
  • Obviously-commercial content, where a few sentences that are usually not useful to anyone are woven around a set of keywords, purely for the purposes of Adsense, will probably get you penalized.
  • If you link to a website that is in a ‘bad neighborhood’ you could incur a penalty. Even sharing an IP address (as with shared hosting) can seriously damage your site if you have some notoriously bad sites on there. This is just one reason why it’s worth paying a bit extra to get the best shared web hosting: avoiding being associated with the spam and porn sites.
  • You’re acquiring links too fast and it doesn’t look natural: Google may assume you’re buying them or doing something else unethical to attract attention.
  • There has been disagreement lately over whether duplicate content will get you a penalty. I say it most definitely will (I’ve tested this one out many times myself). Even in the best-case scenario, where Google chooses to honor the most relevant version of the content (because it’s the most relevant to the website, the oldest and therefore original version, or for some other reason), why take that chance when you can choose to have fresh, unique content on your website or blog?
  • If all your pages have the exact same title tags, again you’re going to get penalized. Each title tag for every page of your site should be unique and carefully chosen.
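As an illustration of that last point, here is a sketch of the difference (the site and page names are hypothetical):

```
<!-- Risky: every page shares one title -->
<title>Acme Widgets</title>

<!-- Better: each page gets its own unique, descriptive title -->
<title>Blue Widgets | Acme Widgets</title>
<title>Widget Care Guide | Acme Widgets</title>
```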

Wednesday, May 18, 2011

8 Steps to Optimize Your Blog Post


If you’re writing and publishing blog posts, but not putting in the few extra steps to optimize and align them with an overall keyword strategy, then you’re not leveraging the full potential of that content, and you’re not making your website pages as visible to the search engines as they could be.


Blog Optimizing: Back to the Basics

[Image: SEO Strategy for Blogs, Blog Optimization]

Content is a form of online currency that is crucial to any business' online marketing. With consumers relying on search engines for product research and reviews, content is key for ranking among those search results because search engines largely determine the quality and relevancy of the Internet’s countless web pages by looking at the text on those pages.

Just having content, even great content, on your company's website isn't enough to grab the attention of search engines. Businesses must leverage this content using search engine optimization (SEO) tactics. Maintaining a corporate blog is a good SEO tactic that allows for rapid content creation without the constraints of website architecture and web development teams.

Here’s how you can optimize your blog post in eight steps.

1. Find a Compelling Subject

One method for differentiating your content from all the other writing available across the web is to offer a fresh perspective and a unique angle on a given subject matter. If you haven’t spent time working through this step, don’t bother with the rest of the optimization process.

2. Conduct Keyword Research

This step is the perfect litmus test for determining whether your blog post topic is aligned with what people are looking for. When developing your focused keyword list around the blog post topic, make sure to do a sanity check and confirm that consumers are actually using these keywords to search for your product/service.
Save yourself time in the long run and filter out visitors who are unlikely to buy your product by ensuring your keywords align with the purchasing intent of your target audience.

3. Select Keywords

In order to rank high for a given keyword phrase, it’s important that you designate no more than two to three keywords per website page. Limit your blog post to one primary keyword, plus two or three variations of that keyword (e.g. optimize blog post, optimize blog, blog post optimize, blog optimize).

4. Track Keyword Ranking Trends

Make sure your focus keyword is worth optimizing for. If there are only 10 searches for a given keyword per month, it might not be worth your while.
Look at how your target keyword phrase is trending, in terms of global monthly searches, how competitive the search term is, and whether any of your competitors or one of your pages are already ranking for it.

5. Optimize the Page

Page optimization is crucial for boosting the visibility of your blog post for the search engines. After you create the content, insert your keyword phrase throughout the blog post in specific locations where the search engines will be looking for information about your page (i.e. URL, title tag, H1, H2, emphasized text in the body of the post, alt tags on images).

From here on out, every time you mention this specific keyword phrase on your website, use an internal link to its corresponding blog page. There are also SEO plugins available for certain blog platforms, like WordPress’ popular “All in One SEO Pack,” to help you control these SEO elements.
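As a rough sketch, here is where a target phrase like “optimize blog post” might appear in those on-page locations (the URL and file names are made up for illustration):

```
<!-- URL: http://www.example.com/blog/optimize-blog-post -->
<title>8 Steps to Optimize Your Blog Post</title>

<h1>How to Optimize a Blog Post</h1>
<img src="optimize-blog-post.png" alt="optimize blog post checklist">

<!-- An internal link from another page on the site -->
<a href="/blog/optimize-blog-post">optimize your blog post</a>
```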

6. Syndicate via Social Channels

Syndicate your blog post externally by sharing it across your social networks like Twitter and Facebook. Additionally, post comments with your blog post link on relevant, external articles to attract clicks through to your site.

Make sure to use the blog post’s target keywords in your syndication via tweets and Facebook status updates. Help your audience share your content as quickly and easily as possible by including social sharing buttons on your blog post pages like the tweet, Facebook Like, LinkedIn Share, and AddThis buttons.

Consider adding Facebook's new comments plugin to drive engagement and sharing. Also, make your content available via RSS feed, so subscribers can regularly view your latest content on their news reader of choice.

7. Find Top Links

Inbound links are essential for boosting the search engine rank of a website page. A handful of relevant links will help you rank better. Use a link suggestion tool to help identify and track high-quality, relevant websites that you can reach out to with your blog post and request a link back to your page.

8. Track Keyword Performance

Monitor your blog post on a regular basis, in terms of rank, visits, and leads from its given keyword phrase over time. By checking back on your progress, you can understand what about your content is resonating with your audience and what to improve upon. Evaluate what worked and what didn’t, then repeat the successful tactics with your next piece of content.

Summary

SEO is a gradual process, but by just setting aside an hour a week, you can make a lot of progress over time.

While many view paid search as a quick and easy way to drive traffic without a large time investment, once you switch it off, you lose that traffic. SEO, on the other hand, when done well, can have a long-lasting, sustainable impact for your website.

See Original here (ref): http://searchenginewatch.com/article/2071301/8-Steps-to-Optimize-Your-Blog-Post


Get Meta Optimization Code For Posting in Blogspot here: http://googleseostrategies.blogspot.com/2011/05/meta-optimization-code-for-posting-in.html


Tuesday, May 10, 2011

Google +1 for Websites Nears Launch



Google’s answer to the Facebook Like Button will make its debut “in the coming weeks,” according to Google’s development team.

On Tuesday at the Google I/O developer conference in San Francisco, the search giant gave developers a sneak peek at the Google +1 button. It’s very similar to Facebook’s Like Button or the Twitter Tweet button — it provides a way for website visitors to endorse and share an article or web page.

According to Search Engine Land, the buttons will be available in seven different shapes and sizes with and without counters. Publishers can create one of these +1 buttons from a simple form where they can generate the embed code.

As you might expect, Google’s +1 button also comes with a suite of analytics that look similar to the Google Analytics dashboard. Once enough people have used a website’s +1 button, the data will be graphed. Demographic information such as age, gender, and location is recorded. The analytics even include +1 data from Google search pages, which could prove useful for publishers that want to improve their presence on the world’s largest search engine.

Google played coy with the exact launch date of the +1 button, but you can expect it to make its debut at the end of May or in early June. The company has a signup form if you want to get notified about the button’s launch.

Google Business: Introducing the +1 Button (video):



See original here: http://mashable.com/2011/05/10/google-1-websites/


How to implement the Google +1 Button on your webpage. Find the code below:



<!-- Place this tag in your head or just before your close body tag -->
<script src='http://apis.google.com/js/plusone.js' type='text/javascript'></script>

<!-- Place this tag where you want the +1 button to render -->
<g:plusone></g:plusone>


Tuesday, April 19, 2011

The importance of robots.txt


How to create a robots.txt file so that search engines can effectively navigate and index a website. A well-written robots.txt file helps improve search engine rankings by providing important information to the search engine bot.



Although the robots.txt file is a very important file if you want to have a good ranking on search engines, many Web sites don't offer this file.
If your Web site doesn't have a robots.txt file yet, read on to learn how to create one. If you already have a robots.txt file, read our tips to make sure that it doesn't contain errors.

What is robots.txt?

When a search engine crawler comes to your site, it will look for a special file on your site. That file is called robots.txt and it tells the search engine spider which Web pages of your site should be indexed and which Web pages should be ignored.

The robots.txt file is a simple text file (no HTML) that must be placed in your root directory, for example:
http://www.yourwebsite.com/robots.txt

How do I create a robots.txt file?

As mentioned above, the robots.txt file is a simple text file. Open a simple text editor to create it. The content of a robots.txt file consists of so-called "records".

A record contains the information for one specific search engine. Each record consists of two fields: the user agent line and one or more Disallow lines. Here's an example:

User-agent: googlebot
Disallow: /cgi-bin

This robots.txt file would allow the "googlebot", which is the search engine spider of Google, to retrieve every page from your site except for files from the "cgi-bin" directory. All files in the "cgi-bin" directory will be ignored by googlebot.

The Disallow command matches the start of the URL path, so it effectively works like a wildcard. If you enter

User-agent: googlebot
Disallow: /support

both "/support.html" and "/support/index.html" as well as all other files in the "support" directory would not be indexed by search engines.

If you leave the Disallow line blank, you're telling the search engine that all files may be indexed. In any case, you must enter a Disallow line for every User-agent record.

If you want to give all search engine spiders the same rights, use the following robots.txt content:

User-agent: *
Disallow: /cgi-bin

Where can I find user agent names?

You can find user agent names in your log files by checking for requests to robots.txt. Most often, all search engine spiders should be given the same rights. In that case, use "User-agent: *" as mentioned above.

Things you should avoid

If you don't format your robots.txt file properly, some or all files of your Web site might not get indexed by search engines. To avoid this, do the following:
  1. Don't use comments in the robots.txt file

    Although comments are allowed in a robots.txt file, they might confuse some search engine spiders.
    "Disallow: support # Don't index the support directory" might be misinterpreted as "Disallow: support#Don't index the support directory".

  2. Don't use white space at the beginning of a line. For example, don't write
      User-agent: *
      Disallow: /support
    but
    User-agent: *
    Disallow: /support

  3. Don't change the order of the commands. If you want your robots.txt file to work, don't mix up the order. Don't write
    Disallow: /support
    User-agent: *
    but
    User-agent: *
    Disallow: /support

  4. Don't use more than one directory in a Disallow line. Do not use the following
    User-agent: *
    Disallow: /support /cgi-bin /images/
    Search engine spiders cannot understand that format. The correct syntax for this is
    User-agent: *
    Disallow: /support
    Disallow: /cgi-bin
    Disallow: /images

  5. Be sure to use the right case. The file names on your server are case sensitive. If the name of your directory is "Support", don't write "support" in the robots.txt file.
  6. Don't list all files. If you want a search engine spider to ignore all files in a specific directory, you don't have to list them all. For example:

    User-agent: *
    Disallow: /support/orders.html
    Disallow: /support/technical.html
    Disallow: /support/helpdesk.html
    Disallow: /support/index.html
    You can replace this with
    User-agent: *
    Disallow: /support

  7. There is no "Allow" command in the original robots.txt standard

    Don't rely on an "Allow" command in your robots.txt file (some crawlers, such as Googlebot, recognize it as a non-standard extension). Only mention files and directories that you don't want to be indexed. All other files will be indexed automatically if they are linked on your site.

 

Tips and tricks:

1. How to allow all search engine spiders to index all files
    Use the following content for your robots.txt file if you want to allow all search engine spiders to index all files of your Web site:
    User-agent: *
    Disallow:
2. How to disallow all spiders to index any file
    If you don't want search engines to index any file of your Web site, use the following:
    User-agent: *
    Disallow: /
3. Where to find more complex examples.
    If you want to see more complex examples of robots.txt files, view the robots.txt files of big Web sites by requesting /robots.txt on their domains.
Your Web site should have a proper robots.txt file if you want to have good rankings on search engines. Search engines can only give you a good ranking if they know what to do with your pages.

Wednesday, April 13, 2011

10 Essential SEO Strategies


Learn 10 Essential SEO Tips And Tricks For Getting A Higher Search Engine Ranking.


Search engine optimisation (SEO) is a vast topic. There are whole websites and books out there devoted to the subject. The world of SEO is also constantly changing as search engines tweak their algorithms, and new search engines come and go.
However, there are some timeless SEO techniques that will always come in useful. Here are 10 top SEO tips that you can use to improve your website's ranking on the search engines.

Info: If you're new to search engine optimization, check out SEO for Beginners first.

1. Write Good Content

    This is maybe the most important strategy of all. If your pages contain good, relevant, useful content, search engines will be more likely to rank your page higher for relevant searches. As an added benefit, good content will encourage more sites to link to your pages, thereby further increasing your search engine ranking.
    It's also good to update your content regularly. Visitors like fresh content, so they will visit your site more often. More visits lead to more links to your content, which ultimately results in more traffic.

2. Do Your Keyword Research

    Don't target a keyphrase just because it sounds right to you, or because it gets a lot of searches.
    Think about what you ultimately want visitors to do on your site (your conversion goals), then find out what keywords people search for when they want to achieve those goals. Use tools such as Google Analytics to see which keyphrases result in the most goal conversions.
    Stay away from 1-word (or possibly even 2-word) keyphrases that have thousands of competitive sites in the search results. Instead, use tools such as Wordtracker and the AdWords Keyword Suggestion tool to find relevant niche keyphrases with high search volume and low competition.
    For example, if your online store sells Mega Widgets in the Boston area, target the keyphrase "mega widgets boston", rather than just "widgets".

3. Use Your Keywords Wisely

    Once you have a good list of keyphrases, deploy them sensibly throughout your site pages. Make sure you've used your keywords in the following text blocks (these are in rough order of importance, most important first):
    •    The title tag
    •    The h1 and h2 headings in the page
    •    Link text (in links within the page, and in links from other pages)
    •    The page URL
    •    Image alt text
    •    Bold and italicised text
    Also make sure your keywords have a reasonable density (i.e. they appear fairly often in the above text blocks — but not too often) and prominence (place them near the start of each text block).

    Info: Make sure you use text rather than images in the page where possible. This is particularly true of navigation menus. If you must use an image, make sure it has keyword-rich alt text.

4. Get Other Sites Linking To Yours

    Most search engines rank sites more highly if they're linked to by other, well-respected sites.
    The key here is "well-respected". Just getting linked to from hundreds of reciprocal link pages is not going to do much for your ranking. Target a few relevant, good-quality sites and directories that are full of useful info and rank well on the search engines, and try to get a link back from them.
    Some link directories let you submit your site for free, while others require a fee — either one-off, or recurring. While paying for submission can be expensive, it can be worth it, especially if you're running an online store that has a lot of competitors.

    Getting links from other high-quality sites can be a challenge. Here are some tips:

    •    Writing good content (see strategy #1) is one of the best long-term strategies for encouraging inbound links.
    •    A "link to this page" function on each page of your site makes it easy for other webmasters and bloggers to link.
    •    Get in touch with the site owner and strike up a friendship. You're much more likely to get a link back from someone who knows you.
    •    If the site in question has useful content relevant to your readers, go ahead and link to that site from your own pages. This in itself might encourage a link back!

5. Structure Your Site For SEO

    Your site structure can play an important part in optimizing your pages. Make sure your pages contain plenty of links to other important pages in your site, and that it's easy to get to all sections of your site via your homepage or navigation menu.
    Not only does this make it easier for visitors and search engines to find your content, but it also helps to spread your site's authority score (such as Google PageRank) more evenly throughout your site pages.
    A sitemap can really help here, as it lists all your site content on one, easy-to-use page — great for visitors and search engine spiders alike.

6. Analyse Your Site

    It's important to track your site's SEO performance so that you can see if your efforts are paying off. Make use of the many free analysis tools out there, including:
    •    Yahoo! Site Explorer — Lets you find out detailed search-engine-related info on each page of your site, including the most popular pages and the inbound links to each page.
    •    Google Webmaster Tools — Gives you all sorts of details about how Google sees your site, such as problems crawling your pages and suggestions for how to improve your HTML.
    •    Google Analytics — Reports on vital traffic data such as visitors, pageviews, traffic sources, keywords, and lots more. Also lets you set up goals so you can see how well your SEO campaigns are performing.
    •    SEO Book and SEO Chat — these 2 sites offer a large range of free, Web-based SEO tools, many of which give you useful information about how your site is faring in the search results.

7. Keep Abreast of the Latest SEO News

    Search engine algorithms change constantly, and it pays to keep up to date with the latest changes and SEO strategies. The following SEO sites are well worth bookmarking or subscribing to:

    •    Search Engine Watch is a huge SEO resource, including articles and white papers on SEO and SEM (search engine marketing), as well as some busy SEO forums.
    •    Matt Cutts' blog — Matt is a Google engineer, and in his blog he frequently discusses the latest changes at Google that can affect SEO.
    •    Sphinn is a social bookmarking site for SEO topics. Great for finding out what's new and hot in the world of SEO.
    •    SEO Book features a comprehensive, paid SEO training program, a regularly-updated blog, and some handy free SEO tools such as keyword suggesters and rank checkers.
    •    SEO Chat contains a large number of SEO articles, a huge range of free online SEO tools, and a big forum community.

8. Avoid Black Hat Techniques

    SEO techniques come in 2 forms:
    •    White hat techniques play by the rules of the search engines, and aim to create high-quality, relevant content.
    •    Black hat techniques attempt to "game" the search engines by using techniques such as keyword stuffing (overusing keywords in a page), hidden text, and cloaking (presenting different versions of a page to real visitors and search engines).

    Black hat SEO techniques can sometimes produce a short-term hike in traffic; however such sites invariably get weeded out — or, worse, banned altogether — by the search engines. Black hat SEO simply isn't worth the risk if you want to build a long-term stream of traffic from search engines.

9. Watch Out For Duplicate Content

    Search engines dislike pages that basically contain the same content, and will give such pages a lower ranking. Therefore, avoid duplicate content URLs on your site.

    Many factors can result in a search engine seeing 2 URLs as duplicates of each other — for example:
    •    Articles republished from other websites
    •    Print-friendly versions of pages (make sure you exclude such pages from search engines with a robots.txt file)
    •    Similar product info pages that contain very little changing content apart from the product name and image
    •    Session IDs in URLs, or other URL parameters that result in different URLs for the same page
    •    Displaying your site at multiple domains — for example, www.example.com and example.com. Choose one domain or the other, then use 301 redirects to ensure that everyone (including search engines) is looking at just the one domain.
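    The 301 redirect mentioned in that last point can be sketched in an Apache .htaccess file (the domain is a placeholder):

```
# Permanently (301) redirect the bare domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```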

10. Don't Forget the Description and Keywords Tags

    Many webmasters overlook the description and keywords meta tags, but they can give your site the edge over your competitors. Pay attention to these 2 tags in each page.
    •    The description tag should be a useful, compelling summary of your page content. This tag is often used to display a summary of your page in the search results, so it's worth making it keyword-rich and including a call to action.
    •    Most search engines ignore the keywords tag these days; however it doesn't hurt to create one (if nothing else it's another chance to insert your keywords in the page). Some directories also use the keywords tag to classify sites.
    •    Ensure that each page has unique description and keywords tags. If a search engine finds many pages with the same description and keywords, it can see those pages as less important.

    Info: Make sure your description and keywords tags aren't too long — they should be 1 or 2 lines of text.
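    As a sketch, unique tags for a hypothetical page might look like this:

```
<meta name="description" content="Learn how to optimize a blog post in 8 steps, from keyword research to link building and tracking.">
<meta name="keywords" content="optimize blog post, blog SEO, blog optimization">
```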

    SEO is always a bit of a guessing game, with search engines changing their ranking algorithms constantly, but these 10 tips and techniques should be useful in any SEO situation. Good luck!

    Important Sites

    SEO Tools : www.webconfs.com
    SEO Update: Googleseostrategies.blogspot.com


    Tuesday, April 12, 2011

    Important Elements to Consider When Developing an SEO Strategy



    Ever wondered what factors decide how good at search engine optimization (SEO) you can be? We all do at some point while trying to figure out its ins and outs. Here are some factors that will help you improve your SEO skills and prepare anyone to immerse themselves further in one of the most important elements of online exposure: search engines.

    The days are gone when most developers were unaware of the .htaccess tricks needed to handle basic SEO on their own. Companies spent thousands of dollars hiring SEO professionals. Scripts for .htaccess were hard to find, and were used only by well-known sites.

    After the arrival of revolutionary open-source platforms like WordPress, Drupal, and Joomla, things changed: the basics of SEO are no longer a mystery, and free tutorials on search engine optimization have appeared all over the web.


    Today even the most basic website builder will offer you a variety of tools for on-page SEO and website structure optimization. But this is still only the tip of the iceberg. We hand-picked a few factors that affect the outcome of a well-planned strategy, and some points to consider to build better judgment about the course of action to implement.



    1. A good SEO professional plans strategy after studying many aspects of the client's business.

    In reality, SEO is not just about giving a site high rankings on search engines for a short time. It involves ongoing work on elements that keep changing over time.


    2. Always finalize a strategy with broad research into current needs.

    Once you're done with implementation, don't alter much. Continue with what was planned in the first place, modifying only as needed and within the original plan's limits so that links won't change. Search robots may penalize your site if your link structure changes and they find those links unavailable when they re-crawl your URLs. Stick to a consistent URL structure.

    3. If you think SEO is not a big deal, think again.

    You may not have worked on complicated projects yet. Sorry to be blunt, but you can fool your client, not the search robots: their algorithms are smart enough to filter for the quality and consistency of a strategy.

    4. SEO becomes more complicated, and more powerful, as you categorize more products on the site.

    Always categorize products and understand the nature of your audience, humans and robots alike, if you want to design a sound strategy.



    5. Your previous SEO work provides only limited help on the next project; never rely on it entirely.

    The number of sites increases every day, and so does the importance of SEO for gaining visibility. Because of this, search engines are constantly reconsidering and enhancing their algorithms and the ways they rank sites, so keep researching and stay informed about the latest changes they apply.

    6. Don’t think you can achieve your goals by copying the same method across different projects.

    Strategies differ because the nature and focus of each project differ; plan every single project properly.

    7. There is no degree or official training in SEO.

    Techniques keep changing, and the only way to stay up to date is to deepen your research. Learn what you lost and gained on previous projects, and study what other developers are doing and why. Examine your sites' traffic logs to better understand the behavior of robots and the requirements of visitors, and how you can make exploration easier for both.


    8. Modern SEO is not limited to improving a site's links and keywords.

    It also includes marketing the site online through search engines. Modern SEO further means having clear visibility on social networks, which in many cases are major sources of traffic; many blogs report as much as 50% of their traffic arriving from social networks. So utilizing social networks as channels to improve SEO relevance also matters.

    Conclusion

    The biggest problem with most SEO strategies is that they overlook previous ones and chase new ideas. This can also be a positive point, because it keeps a project's SEO strategy updated to the latest needs. The real problem arises when strategies are switched constantly, which mostly happens when a strategy lacks good research.

    Remember, once you change a site’s strategy, it can also lose a large amount of traffic from external links, since in most cases the URLs are altered too.

    Before getting started on an SEO strategy, remember that every channel of a site should be well categorized. Many successful SEO people enjoy six-figure incomes, but no one can be sure of earning the same in the future: the SEO field keeps changing, and only those with up-to-date knowledge can thrive.

    There is no denying that blogs and books on SEO have made it easy to establish a basic SEO strategy. But understand that SEO is a wide field that needs time and focus, as the development and design of a strategy change with every site's requirements.



    For more Important Elements of Developing SEO Strategies, please visit: http://googleseostrategies.blogspot.com/