SEO Updates

Monday, July 23, 2012

Google limits SERP titles by pixel width, not by character count

The experiment’s 107-character test title: In thi till trill little litter fill! | Is it illicitly lil’ lilli! | If I fill ill jill I’ll frill thrill!!!!!!!!!!!!!!!!!



The original post is below, but for the impatient folks, here’s the quick answer. This post’s title is 107 characters long. In the past, Google would have cut off the SERP title after 70 characters. But now, Google shows the entire title, all 107 characters. You can verify this for yourself by clicking here. Based on this experiment, it is reasonable to assume that Google no longer cares how many characters are in your SERP title; all it cares about now is how wide (measured in pixels) your title is.
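If you want to estimate whether one of your own titles will fit, you can measure its rendered width in the browser. Here’s a minimal sketch using the canvas measureText API; the 16px Arial font and the 500-pixel cutoff are assumptions for illustration only, since Google has not published the exact limit:

<script type='text/javascript'>
// Estimate the rendered pixel width of a SERP title.
// The font and the cutoff below are guesses, not confirmed Google values.
function titlePixelWidth(title) {
  var context = document.createElement('canvas').getContext('2d');
  context.font = '16px Arial'; // approximate SERP title font
  return context.measureText(title).width;
}
if (titlePixelWidth(document.title) > 500) { // assumed cutoff
  alert('This title will probably be truncated in the SERP.');
}
</script>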

Sources:
http://www.seomofo.com/experiments/serp/google-snippet-01.html
http://www.seomofo.com/experiments/serp/google-snippet-02.html
http://www.seomofo.com/experiments/serp/google-snippet-03.html
http://www.seomofo.com/experiments/serp/google-snippet-04.html
http://www.seomofo.com/experiments/serp/google-snippet-05.html
http://www.seomofo.com/experiments/serp/google-snippet-06.html
http://www.seomofo.com/experiments/serp/google-snippet-07.html
http://www.seomofo.com/experiments/google-serp-max-characters.html




Snippet Optimization Tool

• Predict how your web page will look in Google's search results
• Optimize your SERP snippets for higher click-through rates

Wednesday, May 16, 2012

Google Analytics Custom Reports

Google Analytics Custom Reports can be incredible time savers if you have the right reports. Instead of spending time digging around for important metrics, you can find what you need separated neatly into columns, ready for analysis that leads to actionable insights.

1. Content Efficiency Analysis Report

This report is from none other than the master of Google Analytics, Avinash Kaushik. Brands all over the world are starting to double down on content so it's important to answer questions such as:
  • What types of content (text, videos, pictures, etc.) perform best?
  • What content delivers the most business value?
  • What content is the most engaging?
[Image: Content Efficiency Analysis Report in Google Analytics]
The Content Efficiency Analysis Report comes in handy by putting all the key content metrics into one spot.
Here are the columns that the report will pull in:
  • Page title
  • Entrances
  • Unique Visitors
  • Bounces
  • Pageviews
  • Avg. Time on Page
  • Per Visit Goal Value
  • Goal Completions


Click Here To Get the Content Efficiency Analysis Report!

2. Keyword Analysis Report

[Image: Keyword Analysis Report in Google Analytics]
If you're doing SEO, you want to make sure that your optimization efforts are working as intended. Is the right keyword pointing to the right page?
The first tab of this report, Targeting, will break things down by placing the title and keyword side by side. The four metrics you'll see are:
  • Unique Visitors
  • Goal Completions
  • Goal Conversion Rate
  • Avg. Page Load Time (sec)
Using the four metrics above, you'll be able to judge whether you need to make adjustments to your campaign.
The second tab, Engagement, will tell you how effective each page is by looking at the following six metrics:
  • Unique Pageviews
  • Pages/Visit
  • Avg. Time on Page
  • Bounce Rate
  • Percentage Exit
  • Goal Conversion Rate
The third and final tab, Revenue, will tell you how much money a keyword is bringing you based on three metrics:
  • Revenue
  • Per Visit Value
  • Ecommerce Conversion Rate


Click Here To Get the Keyword Analysis Report!

3. Link Analysis Report

[Image: Link Analysis Report in Google Analytics]
What websites are sending you the best traffic? If you're link building, what links are worth going back for more? Link building isn't all about rankings, it's about increasing traffic and conversions as well. If you find a few gems, it's worth looking into them more.
Here are the columns you'll see with the report:
  • Source
  • Landing Page
  • Visits
  • Goal Completions
  • Pages/Visit
  • Bounce Rate
  • Percentage New Visits


Click Here to Get The Link Analysis Report!

4. PPC Keywords Report

If you're paying for search traffic, you obviously want to discover high performing keywords. You can then take this data and use it for future SEO campaigns.
Here are the metrics in this report:
  • Visits
  • CPC
  • Goal Completions
  • Cost per Conversion
By breaking things down clearly, you'll be able to home in on which keywords you need to put on hold and which ones you need to pour more cash into.
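As a quick worked example: a keyword that cost $120 in clicks (200 visits at a $0.60 CPC) and produced 6 goal completions has a cost per conversion of $120 / 6 = $20. If another keyword converts at $5 per conversion, that's usually where the extra budget should go.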


Get the PPC Keywords Report Here!

5. Social Media Report

[Image: Social Media Report in Google Analytics]
Ah yes, a report that tells you how different social media channels are performing for you. This is a simple way to figure out which social channels deserve more of your time.
The social media report looks at:
  • Visits
  • Social Actions
  • Goal Completions
  • Goal Conversion Rate
  • Goal Value


Get the Social Media Report Here!

6. E-commerce Traffic Report

[Image: E-commerce Traffic Report in Google Analytics]
If you run an e-commerce site, it's important to break down your different traffic channels to see which one performs best. Why is one channel performing better than the other? Is it worth it to invest more in a campaign that is trending upwards? Is your investment with paid advertising effective?
This report answers some of your e-commerce questions by looking at the following metrics:
  • Visits
  • Percentage New Visits
  • Bounce Rate
  • Pages/Visit
  • Revenue
  • Average Value
  • Per Visit Value

7. Browser Report

[Image: Browser Report in Google Analytics]
This report will tell you how different browsers are performing for your site. You'll immediately see which browsers are your winners and which ones might have problems.
For example, if Chrome and Firefox seem to be doing OK but Internet Explorer has extremely high bounce rates, you might want to look into how your site performs in Internet Explorer. After all, Internet Explorer still accounts for a large share of browser usage.


Get the Browser Report Here!

Bonus: Custom Reporting in Google Analytics

Jaime from SEOmoz created a wonderful realtime Google Analytics report. Here's what it looks like:
[Image: Google Analytics real-time data spreadsheet]
Image Credit: SEOmoz
This spreadsheet allows you to compare different metrics of your choice with different start and end dates as well. You can easily see how your campaigns are performing from a high level all in the comfort of a clean Google Doc.


Get the Google Analytics Custom Reporting Spreadsheet Here!

Want even more custom reports? Make sure to read Greg Habermann’s top five most used Google Analytics Custom Reports to learn about and get custom reports for Unique Visitors by Page; Conversion by Time of Day; Customer Behavior; Top Converting Landing Pages; and Long Tail Converters.

Conclusion

Google Analytics Custom Reports ultimately save you a lot of time and help you make actionable decisions that improve your bottom line. Take a few minutes to set these reports up and explore them. You won't regret it.
What are some useful Google Analytics Custom Reports that you use?

Sunday, September 18, 2011

Google Provides New Options for Paginated Content

At SMX Advanced earlier this year, a hot topic was the use of the rel=”canonical” attribute in conjunction with pagination. Maile Ohye of Google noted that the rel=”canonical” attribute was not intended to cluster multiple pages (articles, product lists, etc.) to page one of that series (although it can be used to cluster multiple pages to a “view all” page).

The discussion was fast and furious and we dove into the various pagination issues that content owners encounter. Maile took the feedback to Google and they got to work on some options. The goal? To present some new solutions at SMX East at a panel set up just to talk through pagination issues. Today, be impressed with my multitasking skills as I write this article that describes these new solutions while I moderate the session!

Google has done two things: evolved how they detect and cluster components of a series with a view all page and launched new rel attributes to enable content owners to specify components of a paginated series. They describe both in blog posts today.

Pagination

 

New Handling of View All Pages

Google has been evolving their detection of a series of component pages and the corresponding view all page. When you have a view all page and paginated URLs with a detectable pattern, Google clusters those together and consolidates the PageRank value and indexing relevance. Basically, all of the paginated URLs are seen as components in a series that rolls up to the view all page. In most cases, Google has found that the best experience for searchers is to rank the view all page in search results. (You can help this process along by using the rel=”canonical” attribute to point all pages to the view all version.)
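For instance, each paginated component page could point to the view all version with a link element like this in its <head> (the URL is hypothetical):

<link rel="canonical" href="http://www.example.com/products?view=all">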

 

If You Don’t Want The View All Page To Rank Instead of Paginated URLs


If you don’t want the view all version of your page shown and instead want individual paginated URLs to rank, you can block the view all version with robots.txt or a meta noindex tag, as in the examples below. You can also use the all-new rel=”next”/rel=”prev” attributes, so read on!
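Either approach works; the path and file name here are hypothetical. In robots.txt:

User-agent: *
Disallow: /products-view-all.html

Or, in the <head> of the view all page itself:

<meta name="robots" content="noindex, follow">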

New Pagination Options


If you don’t have a view all page, or you don’t want the view all page to be what appears in search results, you can use the new attributes rel=”next” and rel=”prev” to cluster all of the component pages into a single series. All of the indexing properties for all components in the series are consolidated and the most relevant page in the series will rank for each query. (Yay!)

You can use these attributes for article pagination, product lists, and any other types of pagination your site might have. The first page of the series has only a rel=”next” attribute and the last page of the series has only a rel=”prev” attribute; all other pages have both. You can still use the rel=”canonical” attribute on all pages in conjunction.
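For example, page 2 of a three-page article (the URLs here are hypothetical) would carry both attributes in its <head>:

<link rel="prev" href="http://www.example.com/article?page=1">
<link rel="next" href="http://www.example.com/article?page=3">

Page 1 would carry only the rel=”next” line, and page 3 only the rel=”prev” line.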

Typically, in this setup, as Google sees all of these component pages as series, the first page of the series will rank, but there may be times when another page is more relevant and will rank instead. In either case, the indexing signals (such as incoming links) are consolidated and shared by the series.
Make sure that the values of rel=”next” and rel=”prev” match the URL as it is actually displayed (even if it’s non-canonical), since the next/prev values in the series have to match up; you will likely need to write the values dynamically based on the display URL.

There are lots of intricacies to consider here, and I’m working on an in-depth article that runs through everything that came up in the session, so if you have questions, post them here and I’ll add them in!


By Vanessa Fox. Follow her on Twitter at @vanessafox.

Monday, May 30, 2011

Is Your Site Under Google Penalty?


Google Penalty



One of the most important aspects of taking care of a site’s search ‘appropriateness’ is knowing what can get you penalized by Google (or any other search engine, for that matter). Knowing how to assess the situation correctly, so that you can tell whether you have been served a penalty, can go a long way toward getting the site back to the top for your search terms.

Unfortunately, it’s a sure thing that Google is not going to publish the criteria it uses for deciding who gets penalized. So we have to make an educated guess. In the SEO community, our opinions come from spending a lot of time (in some cases years) observing what does, and doesn’t, get good results. As with just about any other aspect of SEO, most of what I’m about to say here will be met with cyber-cries of ‘but I disagree,’ or ‘I can prove otherwise,’ or even expletives! That’s the nature of what we do: there’s always a lot of room for disagreement.

If you have been following SEO best-practices closely for some time, it’s highly unlikely that you’ll fall foul of the search engines to the degree that you get penalized. But sometimes as SEO warriors, we inherit a bad situation that someone else has created, and it’s not always obvious at first glance.

 

Google Sandbox or Penalty?

Before I continue, it’s worth mentioning that there’s a difference between a Google penalty and being flung into the Google sandbox. Actually, people have questioned whether the sandbox even exists, but I think it’s fair to assume that it does. It is a common phenomenon that a new site will simply fail to show up: it won’t get indexed at all for weeks, or even months. It seems that until a new site or page earns Google’s trust, it sometimes doesn’t show up (isn’t even indexed) for a long period of time; Ann Smarty has already explained that in detail, so I’ll leave it to her. Then all of a sudden those pages or sites appear without the webmaster making any changes, much to everyone’s relief.

 

Penalized: Knowingly or Unknowingly!

Sometimes an unscrupulous marketer (and I don’t use the term SEO here because, in my book, search engine optimization does not include underhanded tricks of any kind) will use a technique that he knows may have a backlash later on, in order to achieve short-term gains to impress site owners. He or she will do this on the assumption that, by the time the penalty is served up by Google, he or she will be long gone and no one will know what happened (and the site owner may even call said marketer back and pay more money to sort it out).
More often though, a penalty is served simply because someone did something unknowingly.

So whether you are a freelance SEO or an in-house SEO, you will need to be aware of what it looks like when a site has been penalized, so that you can do a little detective work to find out what the problem is and quickly get your site back into the search stream.
The most obvious sign that you’re being penalized is that you’re not showing up in a search for keywords you’re clearly optimizing for. But perhaps the first thing you’ll notice is a sudden, drastic falling-off of traffic. Be careful here, though: a sudden decline in traffic doesn’t necessarily mean that you’ve been penalized. It could just be that search trends have changed and the terms you were optimizing for are suddenly nowhere near as popular as they were. This does happen, and it’s one of the reasons why we recommend constant review of the search terms you use.

But there are other more subtle signs of search engine penalties.

 

The Specifics of Getting Penalized

  • If you feel that a Google penalty may have been incurred for any reason, the place to start is Google Webmaster Tools. Here you will find complete checklists to help you detect a problem if there is one. Go through the list and make sure that you are complying with Google’s list of best practices.
  • Compare your site’s rank on Microsoft’s and Yahoo’s engines with its rank on Google. If you are on page 1 over at Bing and Yahoo, yet you’re not even showing on Google, then chances are you have been penalized.

Just to recap, although you can find this information all over the Web, here are items that Google WILL CERTAINLY impose a penalty for:
  • Keyword stuffing: putting the same two or three keywords over and over again throughout your page will trigger alarm bells over at Google.
  • Cloaking: any form of disguising text is a huge no-no with Google.
  • Obviously-commercial content, where a few sentences that are usually not useful to anyone are woven around a set of keywords, purely for the purposes of Adsense, will probably get you penalized.
  • If you link to a website that is in a ‘bad neighborhood’ you could incur a penalty. Even sharing an IP address (as with shared hosting) can seriously damage your site if it also hosts some notoriously bad sites. This is just one reason why it’s worth paying a bit extra for the best shared web hosting: to avoid being associated with spam and porn sites.
  • You’re acquiring links too fast and it doesn’t look natural: Google may assume you’re buying them or doing something else unethical to attract attention.
  • There has been disagreement lately over whether duplicate content will get you a penalty. I say it most definitely will (I’ve tested this one out many times myself). Even in the best case, Google honors only the version of the content it considers most relevant, whether that’s the version most relevant to the website, the oldest (and therefore original) version, or some other choice. Why take that chance when you can choose to have fresh, unique content on your website or blog?
  • If all your pages have the exact same title tags, again, you’re going to get penalized. Each title tag for every page of your site should be unique and carefully chosen, as in the sketch below.
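For example, rather than repeating one site-wide title, give each page its own descriptive tag (these titles are made up for illustration):

<!-- homepage -->
<title>Acme Widgets | Handmade Widgets and Accessories</title>

<!-- a product page -->
<title>Blue Widget Specifications | Acme Widgets</title>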

Wednesday, May 18, 2011

8 Steps to Optimize Your Blog Post


If you’re writing and publishing blog posts but not putting in the few extra steps to optimize them and align them with an overall keyword strategy, then you’re not leveraging the full potential of that content, and you’re not making your pages as visible as they could be to the search engines.


Blog Optimizing: Back to the Basics

SEO Strategy for Blogs, Blog Optimization

Content is a form of online currency that is crucial to any business' online marketing. With consumers relying on search engines for product research and reviews, content is key for ranking among those search results because search engines largely determine the quality and relevancy of the Internet’s countless web pages by looking at the text on those pages.

Just having content, even great content, on your company's website isn't enough to grab the attention of search engines. Businesses must leverage this content using search engine optimization (SEO) tactics. Maintaining a corporate blog is a good SEO tactic that allows for rapid content creation without the constraints of website architecture and web development teams.

Here’s how you can optimize your blog post in eight steps.

1. Find a Compelling Subject

One method for differentiating your content from all the other writing available across the web is to offer a fresh perspective and a unique angle on a given subject matter. If you haven’t spent time working through this step, don’t bother with the rest of the optimization process.

2. Conduct Keyword Research

This step is the perfect litmus test for determining whether your blog post topic is aligned with what people are looking for. When developing your focused keyword list around the blog post topic, make sure to do a sanity check and confirm that consumers are actually using these keywords to search for your product/service.
Save yourself time in the long run and filter out visitors who are unlikely to buy your product by ensuring your keywords align with the purchasing intent of your target audience.

3. Select Keywords

In order to rank high for a given keyword phrase, designate no more than two or three keywords per website page. Limit your blog post to one primary keyword, plus two or three variations of that keyword (e.g. optimize blog post, optimize blog, blog post optimize, blog optimize).

4. Track Keyword Ranking Trends

Make sure your focus keyword is worth optimizing for. If there are only 10 searches for a given keyword per month, it might not be worth your while.
Look at how your target keyword phrase is trending in terms of global monthly searches, how competitive the search term is, and whether any of your competitors, or one of your own pages, is already ranking for it.

5. Optimize the Page

Page optimization is crucial for boosting the visibility of your blog post in the search engines. After you create the content, insert your keyword phrase throughout the blog post in the specific locations where the search engines will be looking for information about your page: the URL, title tag, H1 and H2 tags, emphasized text in the body of the post, and image alt attributes. A sketch of these locations follows below.
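Here’s what those locations might look like for a post targeting the phrase “optimize blog post”; the URL, file name, and headings are made up for illustration:

<!-- URL: http://www.example.com/blog/optimize-blog-post -->
<title>8 Steps to Optimize Your Blog Post</title>
<h1>How to Optimize Your Blog Post</h1>
<h2>Optimize the Page Elements</h2>
<img src="optimize-blog-post-checklist.png" alt="optimize blog post checklist"/>
<p>To <strong>optimize a blog post</strong>, start with keyword research...</p>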

From here on out, every time you mention this specific keyword phrase elsewhere on your website, use an internal link to its corresponding blog page. SEO plugins are also available for certain blog platforms, like WordPress’ popular “All in One SEO Pack,” to help you control these SEO elements.

6. Syndicate via Social Channels

Syndicate your blog post externally by sharing it across your social networks like Twitter and Facebook. Additionally, post comments with your blog post link on relevant, external articles to attract clicks through to your site.

Make sure to use the blog post’s target keywords in your syndication via tweets and Facebook status updates. Help your audience share your content as quickly and easily as possible by including social sharing buttons on your blog post pages like the tweet, Facebook Like, LinkedIn Share, and AddThis buttons.
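For reference, Twitter’s standard Tweet button embed looked roughly like this at the time of writing (check Twitter’s developer pages for the current markup):

<a href="http://twitter.com/share" class="twitter-share-button" data-count="horizontal">Tweet</a>
<script type="text/javascript" src="http://platform.twitter.com/widgets.js"></script>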

Consider adding Facebook's new comments plugin to drive engagement and sharing. Also, make your content available via RSS feed, so subscribers can regularly view your latest content on their news reader of choice.

7. Find Top Links

Inbound links are essential for boosting the search engine rank of a website page. Even a handful of relevant links will help you rank better. Use a link suggestion tool to help identify and track high-quality, relevant websites that you can reach out to with your blog post and request a link back to your page.

8. Track Keyword Performance

Monitor your blog post on a regular basis, in terms of rank, visits, and leads from its given keyword phrase over time. By checking back on your progress, you can understand what about your content is resonating with your audience and what to improve upon. Evaluate what worked and what didn’t, then repeat the successful tactics with your next piece of content.

Summary

SEO is a gradual process, but by just setting aside an hour a week, you can make a lot of progress over time.

While many view paid search as a quick and easy way to drive traffic without a large time investment, once you switch it off, you lose that traffic. SEO, on the other hand, when done well, can have a long-lasting, sustainable impact for your website.

See Original here (ref): http://searchenginewatch.com/article/2071301/8-Steps-to-Optimize-Your-Blog-Post


Get Meta Optimization Code For Posting in Blogspot here: http://googleseostrategies.blogspot.com/2011/05/meta-optimization-code-for-posting-in.html


Tuesday, May 10, 2011

Google +1 for Websites Nears Launch



Google’s answer to the Facebook Like Button will make its debut “in the coming weeks,” according to Google’s development team.

On Tuesday at the Google I/O developer conference in San Francisco, the search giant gave developers a sneak peek at the Google +1 button. It’s very similar to Facebook’s Like Button or the Twitter Tweet button — it provides a way for website visitors to endorse and share an article or web page.

According to Search Engine Land, the buttons will be available in seven different shapes and sizes with and without counters. Publishers can create one of these +1 buttons from a simple form where they can generate the embed code.

As you might expect, Google’s +1 button also comes with a suite of analytics that look similar to the Google Analytics dashboard. Once enough people have used a website’s +1 button, the data will be graphed. Demographic information such as age, gender, and location is recorded. The analytics even include +1 data from Google search pages, which could prove useful for publishers that want to improve their presence on the world’s largest search engine.

Google played coy with the exact launch date of the +1 button, but you can expect it to make its debut at the end of May or in early June. The company has a signup form if you want to get notified about the button’s launch.

Google Business: Introducing the +1 Button Video :



See original here: http://mashable.com/2011/05/10/google-1-websites/


How to implement the Google +1 button on your webpage: find the code below.



<!-- Place this tag in your head or just before your close body tag -->
<script src='http://apis.google.com/js/plusone.js' type='text/javascript'></script>

<!-- Place this tag where you want the +1 button to render -->
<g:plusone></g:plusone>


Tuesday, April 19, 2011

The importance of robots.txt


This article explains how to create a robots.txt file that lets search engines navigate and index a website effectively. A well-written robots.txt file helps improve search engine rankings by providing important information to the search engine bot.



Although the robots.txt file is a very important file if you want to have a good ranking on search engines, many Web sites don't offer this file.
If your Web site doesn't have a robots.txt file yet, read on to learn how to create one. If you already have a robots.txt file, read our tips to make sure that it doesn't contain errors.

What is robots.txt?

When a search engine crawler comes to your site, it will look for a special file on your site. That file is called robots.txt, and it tells the search engine spider which Web pages of your site should be indexed and which Web pages should be ignored.

The robots.txt file is a simple text file (no HTML) that must be placed in your root directory, for example:
http://www.yourwebsite.com/robots.txt

How do I create a robots.txt file?

As mentioned above, the robots.txt file is a simple text file. Open a simple text editor to create it. The content of a robots.txt file consists of so-called "records".

A record contains the information for a specific search engine. Each record consists of two fields: the User-agent line and one or more Disallow lines. Here's an example:

User-agent: googlebot
Disallow: /cgi-bin

This robots.txt file would allow the "googlebot", which is the search engine spider of Google, to retrieve every page from your site except for files from the "cgi-bin" directory. All files in the "cgi-bin" directory will be ignored by googlebot.

The Disallow directive matches by prefix, so it works like a wildcard. If you enter

User-agent: googlebot
Disallow: /support

both "/support.html" and "/support/index.html" as well as all other files in the "support" directory would not be indexed by search engines.

If you leave the Disallow line blank, you're telling the search engine that all files may be indexed. In any case, you must enter a Disallow line for every User-agent record.

If you want to give all search engine spiders the same rights, use the following robots.txt content:

User-agent: *
Disallow: /cgi-bin

Where can I find user agent names?

You can find user agent names in your log files by checking for requests to robots.txt. Most often, all search engine spiders should be given the same rights. In that case, use "User-agent: *" as mentioned above.

Things you should avoid

If you don't format your robots.txt file properly, some or all files of your Web site might not get indexed by search engines. To avoid this, do the following:
  1. Don't use comments in the robots.txt file

    Although comments are allowed in a robots.txt file, they might confuse some search engine spiders.
    "Disallow: support # Don't index the support directory" might be misinterepreted as "Disallow: support#Don't index the support directory".

  2. Don't use white space at the beginning of a line. For example, don't write
        User-agent: *
        Disallow: /support
    (note the leading spaces) but
    User-agent: *
    Disallow: /support

  3. Don't change the order of the commands. If your robots.txt file should work, don't mix it up. Don't write
    Disallow: /support
    User-agent: *
    but
    User-agent: *
    Disallow: /support

  4. Don't use more than one directory in a Disallow line. Do not use the following
    User-agent: *
    Disallow: /support /cgi-bin /images/
    Search engine spiders cannot understand that format. The correct syntax for this is
    User-agent: *
    Disallow: /support
    Disallow: /cgi-bin
    Disallow: /images

  5. Be sure to use the right case. The file names on your server are case sensitive. If the name of your directory is "Support", don't write "support" in the robots.txt file.
  6. Don't list all files. If you want a search engine spider to ignore all files in a special directory, you don't have to list all files. For example:

    User-agent: *
    Disallow: /support/orders.html
    Disallow: /support/technical.html
    Disallow: /support/helpdesk.html
    Disallow: /support/index.html
    You can replace this with
    User-agent: *
    Disallow: /support

  7. There is no "Allow" command

    Don't use an "Allow" command in your robots.txt file. Only mention files and directories that you don't want to be indexed. All other files will be indexed automatically if they are linked on your site. 

 

Tips and tricks:

1. How to allow all search engine spiders to index all files
    Use the following content for your robots.txt file if you want to allow all search engine spiders to index all files of your Web site:
    User-agent: *
    Disallow:
2. How to disallow all spiders from indexing any file
    If you don't want search engines to index any file of your Web site, use the following:
    User-agent: *
    Disallow: /
3. Where to find more complex examples
    If you want to see more complex examples of robots.txt files, view the robots.txt files of big Web sites, for example http://www.google.com/robots.txt.

Your Web site should have a proper robots.txt file if you want to have good rankings on search engines. Search engines can only give your pages a good ranking if they know what to do with them.