
Showing posts with label Google Tool. Show all posts

Tuesday, 25 June 2013

Google AdWords Keyword Tool Redirecting Some Users To Keyword Planner Instead

Frederic Chanut shared on Google+ an interesting Google AdWords bug with the keyword tool.
Some users are reporting that when they log in to their AdWords account, click on the tools drop-down, and select the keyword tool, Google won't take them to the keyword tool. Instead, Google takes them to the new Keyword Planner tool.
As you can imagine, this can be frustrating for advertisers, and also a bit scary: it might be a sign that Google plans to replace the keyword tool with the Planner tool.
Here is a screen shot of the behavior:
Personally, logged in or out, the keyword tool works and does not redirect me to the planner tool.

I am not sure if this is a sign that the keyword tool will be replaced by the Keyword Planner, or if this is a weird bug affecting a subset of advertisers.
Forum discussion at Google+.
Author Bio : Barry Schwartz is the CEO of RustyBrick, a New York Web service firm specializing in customized online technology that helps companies decrease costs and increase sales. Barry is also the founder of the Search Engine Roundtable and the News Editor of Search Engine Land. He is well known & respected for his expertise in the search marketing industry. Barry graduated from the City University of New York and lives with his family in the NYC region. You can follow Barry on Twitter at @rustybrick or on Google + and read his full bio over here.





Wednesday, 19 June 2013

How to Edit Google AdWords Sitelinks Descriptions

Hello Readers, you can now edit Google AdWords sitelink descriptions. Google announced on the AdWords blog that you can now "nominate specific text for your sitelink descriptions" within your AdWords ads.

To do so, you need to have upgraded to the enhanced campaigns.

Here is what you can control:


Here is where you can control it:


Tuesday, 11 June 2013

Google's Matt Cutts On Disavow Tool Mistakes


What are common mistakes you see from people using the "disavow links" tool?




Wednesday, 26 December 2012

See Google Analytics Data Clearly With These 3 Little Known Tricks

When looking at Google Analytics data, sometimes you're just one click away from an insight that will change your business.
One simple trick can be the difference between seeing something valuable and missing it entirely.
Here are three Google Analytics tricks that will help you quickly get the business insight you need from your data.

1)  Weighted analytics

I’ve been working to get our bounce rate down here on The Daily Egg.
So, I fired up Google Analytics and decided to take a look at some individual pages that were showing high bounce rates to see if I could find any similarities.
It seems like it would be easy to determine where to start, but not so much. When you are dealing with large amounts of data, you will usually have outliers, and these outliers can make it impossible to see anything of value.
In this case, I have a number of pages with a 100% bounce rate but very few visits. This isn't doing me any good; it wouldn't be worth my time to investigate a page with 1 or 2 visits.
This is the data with the "default" sorting method.

Fortunately, Google Analytics has me covered.  By changing the SORT TYPE from DEFAULT to WEIGHTED, I can suddenly see where to start my research.
I won’t go into the magic pixie dust that makes weighted sorting work but suffice it to say that it takes into account traffic volume to bring significant data to the forefront.
This is the data with the "weighted" sorting method.
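Google doesn't publish the formula behind weighted sorting, but the idea can be sketched with a simple Bayesian-style average that shrinks each page's bounce rate toward the site-wide average in proportion to its traffic. The numbers and the prior weight below are made-up assumptions for illustration, not Google's actual method:

```python
# A sketch of weighted sorting, assuming a simple Bayesian-average
# adjustment; Google does not publish its actual formula.
SITE_AVG_BOUNCE = 0.70   # hypothetical site-wide bounce rate
PRIOR_WEIGHT = 100       # hypothetical: visits' worth of confidence in the prior

pages = [
    {"url": "/a", "visits": 2,    "bounce": 1.00},
    {"url": "/b", "visits": 5000, "bounce": 0.85},
    {"url": "/c", "visits": 300,  "bounce": 0.95},
]

def weighted_bounce(p):
    # Blend the page's bounce rate with the site average, weighted by visits:
    # low-traffic outliers sink toward the mean instead of topping the report.
    return (p["visits"] * p["bounce"] + PRIOR_WEIGHT * SITE_AVG_BOUNCE) / (
        p["visits"] + PRIOR_WEIGHT)

for p in sorted(pages, key=weighted_bounce, reverse=True):
    print(p["url"], round(weighted_bounce(p), 3))
```

Note how the 2-visit page with a 100% bounce rate drops to the bottom of the list, which matches the behavior the weighted report shows.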

2) Data-over-time intervals

Back in the summer, I activated the dormant Crazy Egg Facebook Page.  Nothing too crazy, just posting our articles for our fans and occasionally a status update when I found something interesting to share.
As the end of the year approaches, I would like to see if it makes sense to ramp up our presence on Facebook in 2013.
But looking at data over time can be tricky if you don’t select the right intervals.
In this case, going back 6 months and displaying information by the day is basically worthless.  This view might be OK if I was trying to pinpoint the exact Facebook status updates that did well at driving traffic.
But I just want to know if posting to Facebook over the last six months has had any effect in the aggregate.
Data over time shown daily.

When I view the data by the week, I get a much clearer picture of this data.  This view would be perfect if I was, for example, trying to determine if a Facebook contest I ran generated more traffic than usual.  But, again, that’s not what I am looking for today.

Data over time shown weekly.

Viewing the data monthly shows me that our small amount of activity on Facebook has produced more traffic.  This view is (as Goldilocks would say) just right to answer this particular question.
Ahh, now that's better. Data over time shown monthly.

3)  Export more than 500 rows in Google Analytics

Sometimes you just need to get out of Google Analytics and into a spreadsheet to get the insight you need.
In my case, I wanted to take a look at our percentage of email subscriber conversions by organic keyword. Understanding the keywords that convert best helps me make a number of SEO and editorial decisions. When I do analysis like this, I like to get the data into Excel, where I can do some manipulation that can't be done in Google Analytics.
But Google Analytics limits the number of rows you can export to 500.
In October of 2012 alone, we had visits from over 15,000 different organic keywords. If I were to export them 500 at a time and stitch the files together in a spreadsheet, I would have to export more than 30 times. Ack!
That is, until I learned this easy hack.
First, select '500' as the number of rows to display in your report.
Then, find that 500 in the big long ugly URL string as pictured below.
Change that number to the number of rows you would like to export; in my case, 15,008. I believe 20,000 is the upper limit for this hack, and besides, if you go too crazy with the number of rows you are exporting, you will crash your browser on the next step. :)
After you alter the URL, press the Enter key to load the new URL.
Now you are ready to Export as usual.
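If you do this often, the URL surgery can be scripted. The sketch below rewrites a row-count query parameter using Python's standard library; the parameter name `trows` and the example URL are assumptions for illustration, so match them against the literal 500 you find in your own export URL:

```python
# Sketch: rewrite the row-count parameter in an export URL.
# The parameter name ("trows") and URL are assumptions for illustration.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def bump_row_limit(url, rows, param="trows"):
    # Parse the URL, overwrite the row-count parameter, and rebuild it.
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query[param] = str(rows)
    return urlunsplit(parts._replace(query=urlencode(query)))

url = "https://www.google.com/analytics/web/export?fmt=csv&trows=500"
print(bump_row_limit(url, 15008))
```

Paste the rewritten URL back into the address bar and press Enter, exactly as in the manual steps above.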
So, what do you think of these Google Analytics tricks?  Do you know any of your own you could share with us?

Source : http://blog.crazyegg.com/2012/12/14/google-analytics-tricks/ 

Wednesday, 17 October 2012

Google Launches Disavow Links Tool

Google has launched a new and widely anticipated “disavow links” tool. The tool was announced by the head of Google’s web spam team Matt Cutts, when speaking during a keynote at the Pubcon conference today.
The tool is live and can be found here. It has been beta tested by some selected SEOs already for the past few weeks. About 45 minutes after Cutts spoke, Google formally announced the tool on the Google Webmaster Central blog.
Earlier this year, Bing launched its own disavow links tool.

Disavowing Links

Cutts warned that the tool should be used with caution. He also warned that publishers should first try to remove the links they are concerned about by working with the site owners hosting those links, or with the companies they may have purchased links through.
The format is to list URLs in a text file, one per line. You can list pages individually, or exclude all links from a particular site using the domain: format, like this:
domain:google.com
domain:yahoo.com
domain:facebook.com
Both formats can be mixed into a single file, as shown below in an example from Google’s blog post about the new tool:

From the blog post:
In this example, lines that begin with a pound sign (#) are considered comments, and Google ignores them. The "domain:" keyword indicates that you'd like to disavow links from all pages on a particular site (in this case, "spamdomain1.com").
You can also request to disavow links on specific pages (in this case, three individual pages on spamdomain2.com).
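Before uploading, it can be worth sanity-checking the file against the format described above (# comments, domain: entries, full URLs) and the 2MB size limit. This is a hypothetical helper, not an official tool:

```python
# A quick sanity check for a disavow file before upload, based on the
# format Google describes: "#" comments, "domain:" lines, and full URLs.
from urllib.parse import urlparse

MAX_BYTES = 2 * 1024 * 1024  # Google's stated 2MB file-size limit

def check_disavow(text):
    problems = []
    if len(text.encode("utf-8")) > MAX_BYTES:
        problems.append("file exceeds the 2MB limit")
    for n, line in enumerate(text.splitlines(), 1):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # blank lines and comments are ignored by Google
        if line.startswith("domain:"):
            if not line[len("domain:"):]:
                problems.append(f"line {n}: empty domain")
        elif not urlparse(line).scheme:
            problems.append(f"line {n}: not a URL or domain: entry")
    return problems

sample = """# links from two spam sites
domain:spamdomain1.com
http://spamdomain2.com/page1.html
"""
print(check_disavow(sample))  # [] means no problems found
```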
Once you’ve created your file, you then access the disavow link tool through Google Webmaster Central. You’ll select your site (as the screenshot at the top of this story shows), go through warnings, then select your file and submit:

These Links Will Be Disavowed In … Several Weeks

The process of Google discounting the links to your site won’t be immediate. “It can take weeks for that to go into effect,” Cutts said. He also said that Google reserves the right not to use the submissions if it feels there’s a reason not to trust them. The blog post reflects the same:
Google reserves the right to trust our own judgment for corner cases, for example—but we will typically use that indication from you when we assess links.
Once submitted, there will be an option to download the file you submitted and resubmit it with changes. There's a file size limit of 2MB (and if you have more than 2MB of links you need to disavow, you should probably just start a new web site).
The delay in processing the file means that if you make a mistake, it may also take weeks to “reavow” links that you like. So be careful. The post addresses this:
To modify which links you would like to ignore, download the current file of disavowed links, change it to include only links you would like to ignore, and then re-upload the file. Please allow time for the new file to propagate through our crawling/indexing system, which can take several weeks.
In questions, Cutts said that using the tool is the same as using the “nofollow” attribute, which allows sites to link to other sites without passing ranking credit to those sites.

Who Needs To Disavow?

Who should use the new tool? It’s been primarily designed for those who were impacted by Google’s Penguin Update, which in particular hit web sites that may have purchased links or gained them through spamming.
In the wake of Penguin, panic ensued among some SEOs and publishers. Some wanted a way to ensure that they could discount bad links and start fresh. Others worried that people might point bad links at their sites in an attempt to harm them with “negative SEO.” A new business of people charging to remove links was even born.
Things got worse in the summer when Google released a new set of link warnings that didn’t clarify if publishers really had a problem they needed to fix — if they could — or not.

How Google Created Its Own Disavow Links Monster

Of course, Google wouldn’t need a disavow link tool if it hadn’t been shifting over the past months to consider bad links a type of negative vote against the site. In the past, Google typically had just ignored bad links.
But by counting bad links as negative votes, Google largely enabled some of the concerns about negative SEO that it hopes, in part, to calm with the new tool. Again from its post:
In general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.
I asked Cutts why Google doesn’t simply discount bad links, rather than considering some of them as potentially negative votes. After all, while it’s nice to have this new tool, it would be even better not to need it at all.
As I wrote earlier this year when covering the increasingly creaky link counting system that both Google and Bing rely on:
Links suck. It’s hard to get good links, and even when you do, you might find they don’t count. Meanwhile, who wants to be wasting time “disavowing” links? There’s got to be a better way.
Rather than answer my question, Cutts instead focused on the benefits the new tool brings, especially the ability for people to “clean slate” web sites that may have bad links pointing at them.

More Information

Cutts has also prepared a nearly 10-minute video about the tool.

Google also has a help page about the new tool here, and be sure to read the official blog post, which has a helpful FAQ section and other details.


Tuesday, 21 August 2012

Google Webmaster Tools Adds Clarity to Traffic Alerts: What To Do If You Receive One

For some time (since at least March), Google Webmaster Tools has been sending messages to site owners about substantial traffic changes (increases or decreases) to pages of a site. Some site owners found these messages perplexing, as they weren't sure exactly what was actionable about them. Were the messages just informational, intended to help those who didn't look at their analytics every day? Did they imply something specific about what Google knew about the site? For instance, some people thought perhaps this was a warning message about a new penalty on the site, or a penalty being removed from a site.
In reality though, these messages were just intended as a heads up to let site owners know when a page that previously brought substantial traffic to the site might be having issues of some kind (in the case of a traffic drop) or when a page may newly have become a traffic driver. Google’s latest blog post highlights that these messages have been revised to make that clearer. This is part of Google’s larger attempt to provide alerts of important issues related to sites.
If you receive a message about a traffic drop, your plan of action should be the same as any traffic drop investigation.

1. Is the page still indexed? If not, technical issues may be at play.

Check:
  • If the page has a meta robots tag in the source code preventing indexing (<meta name="robots" content="noindex">).
  • If the page is blocked from crawling via robots.txt (use Google’s robots.txt checker to test this).
  • If the URL is listed in the Google Webmaster Tools crawl errors report (the easiest way to search for a specific URL is to download the CSV and search across all errors at once).
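The first two checks above can be approximated in a few lines of Python using only the standard library. This is a hedged sketch: the sample HTML and robots.txt are made up, and in practice you would fetch the real page and robots.txt first:

```python
# Sketch of the first two checks: the meta robots tag and robots.txt.
# The sample HTML and robots.txt content here are made-up examples.
import re
from urllib import robotparser

def has_noindex(html):
    # Look for <meta name="robots" content="...noindex...">
    return bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        html, re.IGNORECASE))

def blocked_by_robots(robots_txt, url, agent="Googlebot"):
    # Parse robots.txt rules and test whether the URL may be crawled.
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)

html = '<html><head><meta name="robots" content="noindex"></head></html>'
robots = "User-agent: *\nDisallow: /private/"

print(has_noindex(html))                                          # True
print(blocked_by_robots(robots, "http://example.com/private/x"))  # True
print(blocked_by_robots(robots, "http://example.com/public"))     # False
```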

2. Is the page indexed, but with lower ranking?

The best way to check this is to go into Google Webmaster Tools and then navigate to Search Queries and then Top Pages. Once you find the page in the list, you can look at a couple of different things:
  • Choose a date range from before the date of the drop and look at the average ranking for the page, then choose a date range post-drop and compare the average rankings and search volume.
  • Click the expansion arrow to see the queries that brought traffic to that page and click the query that brought the most traffic to see detailed information for the selected date range. If the traffic dropped for that query, go to the Search Queries tab and compare the ranking for that query for a date range both before and after the drop to see if the rankings changed.
Below is an example with the Search Engine Land article on QR codes: http://searchengineland.com/what-is-a-qr-code-and-why-do-you-need-one-27588.
  1. First, we can look at the current average position of the URL (averaged across all queries that bring traffic to the page).
  2. Next, we can see if that ranking has changed over time by comparing two date ranges. In this case, we can see that the ranking dropped from an average of 5.6 to 6.7. This is likely what caused the click-through rate to decline from 5% to 3%. We can also see that overall search volume declined. Since in both ranking positions the result was on the first page of results, 100% of searches would have registered as an impression. Does this mean that overall interest declined over this period, or does the page rank for fewer queries? We can look at the query details to find out.
  3. For each date range, click the expansion arrow beside the URL to see the queries the page got traffic for.
  4. When working with a large number of queries, I find it can be helpful to copy and paste the list into Excel. By alphabetizing each list and then comparing them side by side, you can see what queries dropped out, what new queries are bringing traffic now, and how traffic has changed for each query. You can then determine if you've lost visibility for a high-volume query entirely, or if a lot of little long-tail queries are simply adding up to big traffic changes.
  5. For high-volume queries that have lost substantial traffic, you can check the query details report for that query. In this case, we'll check the highest-volume query, [qr code]. When you click the link in the UI, you see the details for that query (URLs that ranked, positions, and so on) for the selected date range. (You have to go back to the previous page to change the date range.) In this case, we can see that the ranking for just [qr code] went from an average of 4.4 to 6.1, but that's the average across all pages that ranked in any position for the query.
  6. To get more details on just this URL/query pairing, we need to go back to the Search Queries report. You can filter by the query to make it easier to find (and can filter on country/platform specifics if you'd like as well).
  7. In this case, the average ranking for the URL and the average ranking for the query are similar, but this may not always be the case. In some cases, a URL ranks for a number of varied queries, some at high positions and others at low, which makes the average ranking look fairly low. When looking at a specific query, you'll see the high (or low) ranking position that goes into the URL average.
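The side-by-side comparison in step 4 can also be done with plain Python sets instead of Excel. The query lists below are made-up examples, not real report data:

```python
# Compare two exported query lists (query -> visits) from two date ranges.
# These numbers are hypothetical, for illustration only.
before = {"qr code": 1200, "qr codes": 300, "what is a qr code": 150}
after  = {"qr code": 700,  "qr code generator": 90}

dropped = sorted(set(before) - set(after))
new     = sorted(set(after) - set(before))
changed = {q: after[q] - before[q] for q in set(before) & set(after)}

print("dropped:", dropped)   # queries that no longer bring traffic
print("new:", new)           # queries that started bringing traffic
print("delta:", changed)     # visit change for queries in both ranges
```

This immediately answers the question in step 4: whether you lost one high-volume query entirely, or many long-tail queries each lost a little.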

3. Are Rankings the Same? If so, have impressions or click through rate declined?

If rankings have held steady, then seasonality may be at play. As you can see above, at least part of the issue is due to fewer searches related to QR codes. If impressions haven't declined, then check for changes in click-through rate. At my company, Nine By Blue, we've built software that charts categorized query data (impressions, clicks, rankings, click-through rate, and number of queries in a category) over time, to make it easy to tell which factor may be causing traffic drops. In the example below, you can see that for all review-related queries, average ranking and click-through rate have held steady, but impressions are down substantially, due in part to fewer review-related queries bringing traffic to the site. In this case, you know that while seasonality may be at play, it's also worth investigating why the site is no longer getting traffic for some queries.
If click-through rate is down, check your search results display. Did you lose your rich snippet? Is a technical issue preventing the descriptive title from appearing? We often assume a rankings drop when any number of other factors might be causing issues. If you do find, as a result of this investigation, that a rankings drop is the problem, it's possible Google has penalized the site, but also check the queries that have lost ranking. Has the content on the page changed in a way that makes it less relevant to those queries? Has the page lost incoming links with that anchor text? Have you changed your internal link structure in some way?
If you’re planning to attend SMX East, and are interested in diving into this topic further, check out the performance and technical metrics sessions, which focus specifically on these issues