

Sunday, 2 June 2013

Robots Exclusion Protocol


The Robots Exclusion Protocol defines the format of the robots.txt file and of the “robots” meta tag. It was standardized and approved on June 30, 1994. Various extensions have been proposed since that date. As we go, we will indicate which parts belong to the original standard and which are extensions that may not be understood by all robots.
How the webmaster sends a message to robots
The Robots Exclusion Protocol defines two complementary techniques of communication between the manager of a website and the robots that visit it: the robots.txt file and the “robots” meta tag. They allow the webmaster to inform robots of his wishes. It is important to understand that robots do exactly what they want with this information.
If they are “polite” (like Google, Yahoo or Microsoft), they will do their utmost to honour your requests not to visit certain parts of your site. If they are malicious (like spam bots and hackers), they can use the contents of your robots.txt to find out exactly where to go to do their evil deeds.
At the end of this article, we present a complementary technique that can effectively block malicious robots.
Robots.txt
The robots.txt file consists mainly of a series of instructions indicating which pages you do not want visited by web robots. The file may include a series of instructions for all robots and specific instructions for any one particular robot.
It is merely a message addressed to the robots, not a device that would make a robot's visit technically impossible.
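As an illustration, here is a minimal robots.txt file; the directory names are hypothetical placeholders. The file must be placed at the root of the site, for example http://www.example.com/robots.txt:

    # Rules for all robots
    User-agent: *
    Disallow: /private/
    Disallow: /tmp/

    # Rules for one particular robot (here, Google's crawler);
    # a robot follows the most specific record that matches it
    User-agent: Googlebot
    Disallow: /not-for-google/

Note that an empty Disallow line means "everything is allowed", and that the paths are prefixes: Disallow: /private/ covers every URL that starts with /private/.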
See also our pages of detailed information about the robots.txt file.

META “robots”
The “robots” meta tag is a line of HTML code placed in the source code of a page. It tells search engine robots what they may or may not do with the content of the page. Again, this is a mechanism that relies on the cooperation of the search engine.
While the contents of the robots.txt file can apply to all robots, the “robots” meta tag is aimed only at search engine robots.
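For example, the following line, placed in the head section of a page, asks search engine robots neither to index the page nor to follow its links (a minimal sketch using the standard directive values):

    <head>
      <!-- Ask search engine robots not to index this page and not to follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>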
See also our detailed information on the meta tag “robots”.
X-Robots-Tag
The “robots” meta tag can only be placed in HTML pages. To obtain an equivalent result with PDF files, images, videos or other files, you can use the X-Robots-Tag in the HTTP header returned by the web server. This is an extension of the original protocol. It was introduced by Google in 2007.
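On an Apache server with mod_headers enabled, a directive like the following sends the header with every PDF file; this is a sketch, assuming you want to keep all your PDF documents out of the index:

    # Send "X-Robots-Tag: noindex, nofollow" with every PDF file served
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>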
.htaccess
If your site is installed on an Apache server, you can place a file called .htaccess in the root directory of your site. The file itself has nothing to do with the Robots Exclusion Protocol, but it is especially effective because it can be used to block access to the website for certain IP addresses or certain “user agents”. Here we are no longer asking the robots for anything; we are imposing a ban. This is the only technique presented here that is effective in stopping malicious robots.
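A minimal sketch of such a .htaccess file; the IP address and the "BadBot" user agent string are hypothetical placeholders, and the rules assume mod_rewrite and Apache 2.4:

    # Refuse any request whose user agent contains "BadBot"
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
    RewriteRule .* - [F,L]

    # Refuse every request coming from one IP address
    <RequireAll>
      Require all granted
      Require not ip 203.0.113.42
    </RequireAll>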
Special request for content removal
Google, through its “webmaster tools”, allows you to request the urgent removal of one or more of your pages from its index. Consider this request as practically irreversible, so use it with great caution.

Source : Robots Exclusion Protocol

Wednesday, 29 August 2012

Black Hat SEO Techniques to Avoid

There are many “SEO experts” out there who try to convince webmasters that black hat SEO techniques can be effective and long lasting. This is wrong.
Why? Simply because black hat SEO is unethical and can get you banned from search engines. Google will rank your site based on two simple elements:
  1. Content
  2. Back links
If your site has unique content with the correct relevant keywords, and other quality sites link back to you, your site will be ranked high in the results (SERPs). It is as simple as that… You cannot beat Google.
Google always finds a way to eliminate spammy tactics, so it is better to make Google like your site.
How? Avoid black hat SEO techniques that break the search engines' rules and regulations.

Top 10 Black Hat SEO Techniques to Avoid
1. No content. Content is king, especially when it is unique. Pages with no real content, just stuffed full of keywords and keyword phrases, will not work. This is most likely the reason that your squeeze page is not in the SERPs!

2. Duplicated Content. Google likes unique content and hates spam. Search engine spiders will probably treat your site as spam and ban it from the SERPs if it is full of duplicated content.

3. Link Farms. Backlinks are great only when they come from relevant, high quality websites. Creating useless backlinks on unrelated, low quality blogs and websites will get you banned.

4. Doorway and Gateway pages. Doorway pages are “fake” pages, invisible to your website visitors, which are used in an attempt to “fool” spiders. Google only ranks content-rich landing pages that are accessible to humans, so there is no reason for a doorway page at all.

5. Unrelated keywords. Choosing the correct keywords and optimizing for them can bring you tons of organic traffic, but irrelevant ones are totally useless. Let's assume, for example, that “free iPods” is a keyword with high demand and low competition, and that you manage to optimize for it. For a while Google will send you traffic: when someone searches for “free iPod” your site will appear in the results and you will get a visitor. But your site is about Canon cameras! Can you convert that visitor? Absolutely not… She is looking for a free iPod and you are trying to sell her a camera.

6. Blog Spamming. Commenting on blogs is a great way to build backlinks, but only when the blog is relevant and you have something to contribute to the conversation. Spamming other people's blogs using software that generates text full of keywords will get you banned.

7. Keyword Stuffing. This means using many long keywords and repeating them again and again. Google hates keyword stuffing and will flag your site as spam.
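A hypothetical illustration of the difference; the camera-shop wording is invented for the example:

    <!-- Keyword stuffing: the kind of title Google flags as spam -->
    <title>cheap cameras cheap camera buy cheap cameras cheap digital cameras</title>

    <!-- A natural title that uses the keyword once -->
    <title>How to Find Genuine Deals on Cheap Digital Cameras</title>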

8. Pingback Spam. Pingback spam means notifying ping servers several times per day/hour/minute to give the illusion that your content is new when it is not. Users will report you in no time.

9. Social Networking Spam. Finding targeted people and sending them spam messages about your site/product is considered social networking spamming. No one likes spam, so users will report you before you get any traffic.

10. Trackback Spam. Trackback spam means abusing trackbacks with links to irrelevant topics and/or websites. Link back only to relevant and legitimate sources.
 
Source : http://helpful-tips-for-seo.blogspot.in/2012/05/how-to-promote-your-website-common.html

How To Create Backlinks

What Are Backlinks And Why Are They Important?

One of the biggest search engines these days is Google. Google ranks sites using content, formatting, and linking. The higher Google judges the 'quality' of your website to be, the higher your ranking will be in Google searches. One way to have your website seen as high quality by Google is to have other websites link to it. This is known as backlinking (thus "how to create backlinks").

In this article I will show you how to create backlinks to your website and have it indexed by Google a bit faster (usually). This is by no means meant to be a long and wordy novel on the joys of how to create backlinks. What I'm hoping is to give some simple steps that you can take to jumpstart your site on Google using backlinks. Please comment and let me know if this article was of any help!

(To recap: Backlinks are links from other sites to your website. Backlinks are important because good backlinks increase your ranking in Google searches.)

NOTE: Nothing beats having great content! Make sure that you have something that will catch people's attention. What good is a link to something that doesn't make people want to keep coming back to your site?


1: Current High PR Websites

There are already plenty of sites that rank highly on Google. Why not use these to boost your site's ranking? Here are 3 ways to create backlinks from these websites:

Post regularly and then ask for a link, or post a link yourself eventually. Make sure that you add information and quality to their site before trying to create a backlink from it. (No one likes a freeloader!)

Find older, high PR sites that still have guestbooks set up. Simply enter a nice comment that features a link to your website.

Search in Google for high ranking blog sites that feature comments by typing in "[your keywords here]" "powered by wordpress" "leave a comment" -"no comments". Once you get some results, go through the sites and make sure that they do not use the "nofollow" code. (You can do this by pressing "Ctrl+U" and then "Ctrl+F" and typing in "nofollow". If the word search finds "nofollow" in the website code, then a link there will not pass any ranking value. Otherwise, you should be good to link to your website.)
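For reference, this is the difference you are looking for in the source code; the URL is just a placeholder:

    <!-- A "nofollow" link passes no ranking credit: -->
    <a href="http://www.yoursite.com/" rel="nofollow">your anchor text</a>

    <!-- A normal link does: -->
    <a href="http://www.yoursite.com/">your anchor text</a>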

2: Social Bookmarking Sites

A VERY good and popular way to get your website listed on Google is through social bookmarking websites such as Digg.com. Not only do these sites give your website exposure, but they also create tons of backlinks. These sites are indexed by Google pretty regularly, so your submissions stand a good chance of being indexed quickly. 3 simple steps for getting set up on social bookmarking sites are:

If it applies, make sure to optimize your article / post headlines for maximum interest. (see: "10 Surefire Headline Formulas That Work")

Go to www.socialmarker.com to sign up for, and submit your website to, social bookmarking sites. Make sure to submit your major articles and posts (but don't overdo it!). Make sure to add friends and comment on other posts / articles. The more friends you have, and the more quality posts and votes you've made, the more chance you have of getting traffic (and backlinks) to your website.

3: Hubpages & Squidoo

Squidoo and Hubpages are both websites that provide users with one webpage to write about any topic. This is pretty useful since, just like the social bookmarking sites, these sites are indexed regularly and rank pretty high on Google. A good strategy is to submit articles that include links back to your website to these sites as pages. Submit articles with enough quality and people will link back to your Squidoo and Hubpages pages.

4: Video Marketing

Video Marketing is a great way to advertise and create backlinks. A secret: there is usually a lot less competition for the ranking of videos in searches than for the ranking of websites, so there is A LOT of potential to get yourself noticed. Not only will people link to and view your videos, but they will sometimes put your videos on their websites as well! 3 steps to creating backlinks through videos are:

Sign up for the video website and enter information about your website in your profile. Post funny, shocking, or informative videos that have to do with your website. Make sure to use software like Camtasia to edit the video so that it includes a branded watermark (an example: BigBadBully: Teach Your Dog To Sit). Make sure to also post your videos to your blog or website with added text. (YouTube and Google Video are great websites for creating backlinks.)

5: The Basics

Another option is going back to the basics with a text link trade. I'm not talking about the trades that you're used to. This linking system adds links to sites that choose to have a link posted from your website. The good thing about it is that the code that THEY use will include a backlink to your site, and the code used by anyone who clicks through from their site will include a backlink to your website as well. Check it out at.

6: Get Creative!

Some creative ways to create backlinks include:

Create Software that includes links to your website.

Create Plugins that include links to your website.

Create Website Code that includes links to your websites.

Create Website Layouts / Templates that include links to your website (can you imagine how many backlinks this could make for you?!).

Donate To A Non-Profit Organization / Cause that posts up information about people who make donations. Have them post up information about your website.

Answer questions on "Yahoo! Answers" and other "answer sites". Make sure to put a link to your website in your answer (preferably to an article or page on your site that addresses the question). Join Twitter and have it follow your posts.

Create a Myspace And Facebook Profile for your website.

Be a guest author on blogs and news websites.

Create an online forum or Yahoo/Google group.

Create an ebook, that has plenty of links to your website, and that you can give away for free.

Submit your site to web directories.

Link inside of craigslist ads.

Link inside of forums.

Join the BBB (Better Business Bureau).

Network with friends and local businesses for links.


5 Tips to Save Your Website from Being Penalized

These days a large number of domain owners are finding it very difficult to get their websites ranked high on the search engines because of the Google Panda update, which prevents websites from ranking higher if their content is not rich enough. The quality of content plays a very important role these days, especially after several updates from Google like Google Panda and Google Penguin, so it is important to work on your own website before you get in touch with an SEO expert. Google Panda now penalises the entire domain if the content of the website is not found up to the mark. Mere optimization techniques do not work anymore. You can stay safe by following the points given below:




Unique Content

The content of your blog or website plays the most important role in ranking your website at the top. You can no longer achieve the top rank by mere keyword stuffing, by increasing the keyword density, or by optimising your website purely for the search engines. Many people used techniques like hidden keywords to attract the spiders of the search engines. With the development of Google Panda and Google Penguin, all these techniques have started failing. The only way to save your website from being penalised is to enrich the content written on it.



Good Title Tag & Meta Description that make sense

Writing proper titles and tags also means making sure the content is not merely stuffed under the headings. It should be properly framed and should make sense. Useless articles and senseless words are directly rejected by these hi-tech algorithms. So make sure that the content is properly organised and systematic, with proper headings and sub-headings. As on-page optimization usually requires in-depth knowledge and hands-on experience to develop good tags, it may be better to outsource SEO to India, which will save you time.
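As a sketch of what "titles and tags that make sense" means in practice, a sensible head section might look like this; the camera-shop wording is invented for the example:

    <head>
      <!-- A concise, descriptive title rather than a string of keywords -->
      <title>Canon Camera Reviews and Buying Guides</title>
      <!-- A meta description that reads like a sentence, not a keyword list -->
      <meta name="description"
            content="Hands-on reviews of Canon cameras and lenses, with buying advice for beginners.">
    </head>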

Don’t over-optimize – don’t always keep the same anchor text to build backlinks



Over-optimisation is a big cause of the penalties imposed on websites these days. The major purpose of writing the blog posts on your website should not be SEO. The algorithms easily catch such mistakes and penalise your website if they find them. The sole purpose of creating the website should be served within the content: it should be of high quality and written without SEO as the main intention. You should also make sure that the same anchor text is not used over and over when building backlinks.
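For illustration, here is the difference between a repetitive and a natural set of backlink anchors; the URL and anchor texts are hypothetical:

    <!-- The same anchor text on every referring site looks manipulative: -->
    <a href="http://www.example.com/">cheap web hosting</a>
    <a href="http://www.example.com/">cheap web hosting</a>

    <!-- Varied, natural anchor texts are much safer: -->
    <a href="http://www.example.com/">example.com</a>
    <a href="http://www.example.com/">this review of a hosting company</a>
    <a href="http://www.example.com/">cheap web hosting</a>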

Avoid link exchange



Another way to prevent your website from being caught by these hi-tech algorithms is to avoid too much link exchange. Exchanging too many links may also leave you in the middle of nowhere.

Avoid spam forum and blog comments

You should take care to avoid spammy blog comments and forum posts as much as possible, as they might be one of the reasons for getting penalised.

Source : http://helpful-tips-for-seo.blogspot.in/2012/08/5-tips-to-save-your-website-to-be.html