This is a place devoted to giving you deeper insight
into the news, trends, people and technology behind Bing.
In the exciting world of today's Internet, where the world's information is literally at your fingertips, where you can endlessly communicate, shop, research, and be entertained, spam is a big downer. The unwanted email spam that fills our inboxes also consumes huge portions of the available bandwidth of our routers and trunk lines. But email is not the only spam game in town.
Web spam is the bane (well, one of the banes) of the search engine and web searcher communities. Search engines want to provide search users with a great experience, helping them find what they want as quickly and as easily as possible. Search users want to use search engines to get the right information they seek as quickly as possible. And webmasters want search users to find their websites, but also to get those search user visitors to become conversions instead of bounces.
Web spam, those unwanted garbage pages that use overtly deceptive search engine optimization (SEO) techniques and contain no valuable content, is a frustration to search engines and search users alike, and ultimately works against the best interests of conversion-seeking webmasters (severely annoying a potential customer is rarely a great sales technique!).
In the previous article that defined web spam and discussed how it is different from junk content, we mentioned that there are two types of web spam. In this article, we're going to delve into the details of the first type: page-level web spam.
Definition of page-level web spam
Page-level web spam uses on-page SEO trickery (not to be confused with link-level web spam, which we'll discuss in an upcoming article). Webmasters and optimizers for these sites do this because they believe they can fool the search engines into giving their webpages a higher-than-deserved ranking based on their content relevancy, oftentimes for subject areas that are completely unrelated to the site's actual content. This is done in an effort to deceive searchers into visiting their spammy sites for a multitude of reasons, none of which usually benefit the end user.
The use of the following questionable SEO techniques will cause Bing to examine your site more deeply for page-level web spam. If your site is determined to be using web spam techniques, your site could be penalized as a result.
Note that Bing recognizes that the core concepts behind many of these techniques can have valid uses. No one is saying that their use always and automatically denotes web spam. The intent behind their use is the distinguishing factor in determining whether web spam is present and whether any site penalties are needed. Please understand that, from a search engine perspective, the web spam effort consistently provides little to no value whatsoever to end users. The entire effort is directed at fraudulently affecting search engine rankings. As Martha Stewart might say, that's not a good thing.
Keyword URL and link stuffing
Definition: This is the use of heavily repeated keywords and phrases with the goal of attaining a more favorable ranking for those words in a search engine index.
Problem: Keywords can be repeated to excess, so much so that they render any text in which they appear unintelligible from a natural language point of view. Those excessive repetitions can also be added in places that are not seen by the end user (meaning outside of displayed page text). Some web spam pages even use repeated keywords that are unrelated to the theme of the page. If any of these conditions are detected, these techniques will draw the attention of Bing as likely web spam.
What we look for: The purveyors of web spam use a variety of methods for keyword stuffing, including:
Note that stuffing the keywords <meta> tag alone is not a reason to be judged as web spam. But <meta> tag stuffing could be an indicator that other web spam techniques may be employed and could draw a search engine to take a closer look at such a site.
It is important that webmasters not overreact to this information. A small amount of relevant keyword repetition is common and is not considered web spam as long as it is used naturally within the page content language and the page provides useful, relevant content. The key message is always the same: develop your pages for human readers, not for search engine bots, for the best results. For more information on creating and using keywords wisely, see the blog articles The key to picking the right keywords and Put your keywords where the emphasis is.
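To make the distinction concrete, here is a toy sketch of how "natural repetition" differs from stuffing in raw numbers. The `keyword_density` function and the sample strings are purely illustrative assumptions for this post; Bing's actual ranking signals are far more sophisticated than a simple frequency count.

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of words in `text` that are `keyword` (case-insensitive).
    A toy heuristic for illustration only, not an actual search engine signal."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Natural, reader-focused prose repeats a topical keyword a modest amount...
natural = "Our tax agents help you file your tax return correctly and on time."
# ...while a stuffed page is dominated by it to the point of unintelligibility.
stuffed = "tax tax agent tax help tax cheap tax best tax tax services tax"

print(keyword_density(natural, "tax"))  # a small share of the words
print(keyword_density(stuffed, "tax"))  # the keyword dominates the text
```

The point is not any particular threshold (we don't publish one); it's that stuffed text reads as gibberish to a human long before it trips any automated check.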
Misspelling and computer-generated words
Definition: Pages populated with numerous variant spellings of targeted keywords, especially spellings unrelated to the theme of the page or the site, can indicate that the keyword lists are computer-generated.
Problem: Aggressive inclusion of large numbers of misspelled or rare word lists and phrases can be considered web spam when used to excess. The relevance of those words to the theme of the page or the site is the key distinguishing factor here.
What we look for: The Bing team commonly sees the following techniques on web spam sites:
Redirecting and cloaking
Definition: When a web client visits a website, certain traits can be used to identify the user and redirect them to a different page. These include, but are not limited to, redirects based on the referral code, the user agent (bot or human), and IP address.
Problem: Redirecting can be a legitimate technique in some cases such as if a web client is limited in what it can display on a mobile device web browser, or when a web server uses the client's IP address to determine the language in which to present the content (aka geo-targeting). However, problems arise when sites filter their content based on whether the user agent belongs to an end user web browser versus a search engine bot. This type of filtering can run the gamut between showing the bot a keyword-stuffed page to an entirely different set of content, all of which is an attempt to deceive. When used with this intent, this is web spam.
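To illustrate the pattern (not to endorse it!), here is a minimal sketch of what user-agent cloaking looks like server-side. The handler, the `BOT_SIGNATURES` list, and the page strings are all hypothetical names invented for this example; this is the deceptive shape search engines look for, not code anyone should deploy.

```python
# Hypothetical request handler illustrating user-agent cloaking.
# This is the deceptive pattern described above, shown only so it can be recognized.
BOT_SIGNATURES = ("bingbot", "googlebot", "slurp")

def is_search_bot(user_agent):
    """Crude check: does the user-agent string mention a known crawler?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def serve_page(user_agent):
    if is_search_bot(user_agent):
        # The crawler is shown a keyword-stuffed page built to rank...
        return "<html>cheap tax help cheap tax help cheap tax help</html>"
    # ...while the human visitor sees entirely different content.
    return "<html>Unrelated advertising and affiliate links</html>"
```

Contrast this with legitimate adaptation: serving a lighter layout to a mobile browser changes the presentation, not the substance, and the same content remains available to crawler and visitor alike.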
What the webmasters who implement these techniques don't understand is that search engines can detect this attempted deception. We do see when the content presented depends on the user agent, and when the differences between the content variations go well beyond legitimate adaptations such as those made between mobile and desktop browsers.
What we look for: Some webmasters design their websites to use the following deceptive techniques when the detected user agent is a search engine bot:
The problem for webmasters practicing these techniques is that their technical deceptions are not very effective. Search engines use a number of techniques to uncover such fraudulent practices as redirect and cloaking web spam. When they are revealed, the websites of the perpetrators are penalized, sometimes severely. Well-meaning webmasters or online business owners who hire unscrupulous consultants or carelessly take black hat SEO advice from indiscriminate sources on the Web are setting themselves up for trouble. Reviewing the issues identified in this article as well as the official webmaster guidelines for Bing, Yahoo, and Google, will go a long way to keeping a website on the right track for search.
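One simple way to see why cloaking is so detectable: fetch the same URL as a bot and as a browser and compare what comes back. The sketch below is a toy heuristic using Python's standard `difflib`; the function name, threshold, and sample pages are illustrative assumptions, and real search engine detection pipelines are far more elaborate.

```python
from difflib import SequenceMatcher

def cloaking_suspect(bot_html, browser_html, threshold=0.5):
    """Flag a URL when bot-served and browser-served content diverge sharply.
    A toy heuristic for illustration, not an actual detection pipeline."""
    similarity = SequenceMatcher(None, bot_html, browser_html).ratio()
    return similarity < threshold

# An honest page serves the same content to everyone, so it is never flagged;
# wildly divergent responses to bot vs. browser are an obvious red flag.
honest = cloaking_suspect("<p>Our tax services</p>", "<p>Our tax services</p>")
```

Minor, legitimate differences (ad rotation, mobile layouts, geo-targeted language) leave the two versions largely similar; wholesale substitution of content does not, and that gap is exactly what gives the deception away.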
In the next article on web spam, we'll discuss link-level web spam in detail. We'll also include some information on what to do if your site was pegged as web spam and, once the problems have been resolved, how to request reinstatement into the Bing index as a normal website. Stay tuned!
If you have any questions, comments, or suggestions, feel free to post them in our Ranking Feedback and Discussion forum. Until next time...
-- Rick DeJarnette, Bing Webmaster Center
That was a very interesting read. On my blog I once overused a keyword without realising it and had to start again. Sometimes I get carried away, and I've realised it's better to optimise the webpage for visitors and not search engines.
This is a great post! But I have a few questions in terms of keyword density. Sometimes the information presented in a webpage makes sense to the reader only when a certain word is used repetitively, in my case the word 'tax', for example: 'tax agent', 'tax professional', etc. (the meaning is completely different without the word 'tax', and would thus not be the intended one). So in these cases one can very quickly arrive at a large 'density' for certain words (without which the page would have no meaning or the wrong meaning). Thus the questions: what can be done in these cases, and how can Bing distinguish between legitimate cases and spam? Also, how can one know when his/her webpage is considered spam or borderline spam?
Taximise Pty Ltd
edmond has a very valid point. Also, some SEO experts put hidden links to their clients' websites in some very high-ranking webpages, e.g. they place a link to a client's website on a spacer image that isn't visible. How does Bing tackle that?
Great questions! The key here always comes back to how the content appears to the human reader. Is it logical? Is it readable? Does it make sense? In this particular case, the repeated use of the word "tax" in content regarding tax services offered is reasonably expected and thus is fine. In fact, including a solid set of explanatory content that defines these keyword phrases only strengthens the case for reasonably repeating this word. If the use of this repeated word makes sense to the reader and is not a clumsy attempt to stuff the word in where it's not necessary or helpful, and you have a good amount of supporting content to accompany it, you'll be fine. Our crawler sees this usage and understands it is legitimate. Just write for the reader's comprehension and the crawler will not penalize you for keyword stuffing.
The important thing to remember is that true web spam often involves multiple issue violations. As such, it typically takes more than one violation to trigger web spam consequences – having a slightly above average number of keywords won’t automatically torpedo you. Just as you need to do several things well to improve your ranking (build good content, build valuable inbound links, target several keywords, etc.), you need to do several things wrong to really hurt your ranking. That said, if it’s obvious that you are trying to abuse the system, even with just one egregious issue, then penalties will ensue.
Lastly, we don't define any borderline between acceptable and non-acceptable web spam. If you think what you've done might be considered web spam because you know you're trying to game the system, then take a different approach to optimizing your pages. I'll repeat my mantra: write content for the human reader, not the crawler. Develop good, unique content that is readable, understandable, and valuable. If you do this without involving any black-hat, SEO-style trickery in an effort to artificially boost your ranking, then you'll never have to worry about this being an issue.
Thanks for writing!
Bing Webmaster Center team
Thanks for your reply!
Your answer is pretty important for me as I am trying to understand if I did something wrong in terms of content of my webpage (as I have got only my main page indexed to date by Bing): one of the obvious things to question was "keyword stuffing" (I believe is called), as I cannot do anything about repeating the word 'tax' without altering the meaning for a human reader. With this out of the way, I will now focus on "repetition", increasing the value of content I provide, etc. Maybe this will help.
Thanks again, I appreciate.
Taximise Pty Ltd
Very interesting post... thanks for sharing it with the community.
Thanks for the post.
Hello, I congratulate you for this very helpful tip.
Quite useful information you've provided on link stuffing and web spam... thanks for all this information.
Thanks for this helpful information on spamming and link building.
Very useful information
© 2013 Microsoft