Bing Webmaster Guidelines
These guidelines cover a broad range of topics and are intended to help your content be found and indexed within Bing. These guidelines will not cover every instance, nor provide prescriptive actions specific to every website. For more information, you should read our self-help documents and follow the Bing Webmaster Blog. In your Bing Webmaster Tools account, you will find SEO Reports and the SEO Analyzer tool for on-demand scanning of individual pages. Both resources offer basic guidance and recommendations regarding site optimizations that you can apply to your site.
Bing seeks content. By providing clear, deep, easy to find content on your website, we are more likely to index and show your content in Bing search results. Websites that are thin on content, showing mostly ads or affiliate links, or that otherwise redirect visitors away to other sites quickly tend not to rank well on Bing. Your content should be easy to navigate, should be rich and engaging to the website visitor, and provide the information they seek. In many cases, content produced today will still be relevant years from now. In some cases, however, content produced today will go out of date quickly.
Links pointing to your site help Bing discover new pages on your site. They are also traditionally regarded as a signal of website popularity. A site linking to your content is essentially telling Bing it trusts your content. As a result, Bing rewards links that have grown organically; that is, links added over time by content creators on other trusted, relevant websites to drive real users from their site to your site. Abusive tactics that aim to inflate the number and nature of inbound links, such as buying links or participating in link schemes (link farms, link spamming and excessive link manipulation), can lead to your site being delisted from the Bing index.
Social media plays a role in today’s effort to rank well in search results. The most obvious part it plays is via influence. If you are a social influencer, your followers tend to share your information widely, which in turn results in Bing seeing these positive signals. These positive signals can have an impact on how your site ranks organically in the long run.
Being indexed is the first step to developing traffic from Bing. Following are pathways to being indexed on Bing:
- Links to your content help Bing find it, which can lead us to index your content
- Use of features within Bing Webmaster Tools such as Submit URL and Sitemap upload helps to ensure we are aware of your content
Managing how Bingbot crawls your content can be done using the Crawl Control feature inside Bing Webmaster Tools. This feature allows you to control when, and at what pace, Bingbot crawls your website. Webmasters are encouraged to allow Bingbot to crawl quickly and deeply to ensure we find and index as much content as possible.
Page Load Time (PLT)
This element has a direct impact on the satisfaction a user has when they visit your website. Slow page load times can lead a visitor to simply leave your website, seeking their information elsewhere. If your site was served in our search results, a slow-loading page may appear to Bing as an unsatisfactory result that we showed. Faster page loads are always better, but take care to balance absolute page load speed with a positive, useful user experience.
Robots.txt
The robots.txt file is a touch point for Bingbot to understand how to interact with your website and its content. Using the robots.txt file, you can tell Bingbot where to go and where not to go, and by doing so, guide its efforts to crawl your content. A best practice is to have a robots.txt file placed at the root of your domain (www.yourwebsite.com/robots.txt) and to maintain it regularly to ensure it remains accurate.
Robots.txt is very powerful and has the capacity to block Bingbot from crawling your content. Should you block Bingbot, we will not crawl your content and your site or content from your site may not appear in our search results.
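A minimal robots.txt might look like the following sketch; the paths shown are placeholders for illustration, not recommendations for your site:

```
# Rules for Bing's crawler
User-agent: bingbot
Disallow: /private/
Allow: /

# Point crawlers at your sitemap (or sitemap-index) file
Sitemap: https://www.yourwebsite.com/sitemap.xml
```

A stray `Disallow: /` here would block Bingbot from your entire site, which is why careful maintenance of this file matters.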
Sitemap
The sitemap file often resides at the root of your host, say, www.yourdomain.com/sitemap.xml, and contains a list of all the URLs from your website. Large sites may wish to create an index file containing links to multiple sitemap.xml documents, each containing URLs from your website. Care should be taken to keep these files as clean as possible, and to remove old URLs once you remove the corresponding content from your website.
Most websites have their sitemap files crawled daily to locate fresh content. It’s important to keep your sitemap files clean and updated to help us find your latest content.
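As an illustration of the expected structure, here is a minimal sketch that generates a sitemap.xml using only the Python standard library; the domain and page paths are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml string listing the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        # Each <url> entry needs at least a <loc> element
        ET.SubElement(entry, "loc").text = url
    body = ET.tostring(urlset, encoding="unicode")
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + body

# Placeholder URLs for illustration only
print(build_sitemap([
    "https://www.yourdomain.com/",
    "https://www.yourdomain.com/about",
]))
```

Regenerating this file whenever content is added or removed is one way to keep the sitemap current.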
If you move content on your website from one location to another, using a redirect makes sense. It can help preserve value the search engine has assigned to the older URL, helps ensure any bookmarks people have remain useful and keeps visitors to your website engaged with your content. Bing prefers you use a 301 permanent redirect when moving content, should the move be permanent. If the move is temporary, then a 302 temporary redirect will work fine. Please do not use the rel=canonical tag in place of a proper redirect.
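On an Apache server (an assumption for this sketch; other servers have equivalent directives), the two redirect types can be configured like this, with placeholder paths:

```
# Permanent move: passes value from the old URL to the new one
Redirect 301 /old-page.html /new-page.html

# Temporary move: the original URL stays the one search engines keep
Redirect 302 /seasonal-page.html /holding-page.html
```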
The rel=canonical element helps Bing determine which version of a URL is the original, when multiple versions of a URL return the same content. As an example, this can happen when you append a tracking notation to a URL. Two discrete URLs then exist, yet both have identical content. By implementing a rel=canonical, you can advise Bing which URL is the original one, giving us a hint as to where we should place our trust. Do not use this element in place of a proper redirect when moving content.
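For example, if a tracking-parameter version of a page serves the same content as the clean URL, the duplicate version can carry a tag like the following in its `<head>` (the domain is a placeholder):

```html
<!-- Tells search engines the clean URL is the original version -->
<link rel="canonical" href="https://www.yourdomain.com/article" />
```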
Search Engine Optimization (SEO)
Search Engine Optimization is a legitimate practice that seeks to improve the technical and content aspects of a website, making the content easier to find, more relevant, and more accessible to search engine crawlers. In the majority of instances, this work renders a website more appealing to Bing, though performing SEO-related work is no guarantee of improved rankings or increased traffic from Bing. Further, taken to extremes, some SEO practices become abusive and are penalized by search engines.
The main areas of focus when optimizing a website include:
- <title> tags – keep these clear and relevant
- <meta description> tags – keep these clear and relevant, though use the added space to expand on the <title> tag in a meaningful way
- alt attributes – use this attribute on <img> tags to describe the image, so that we can understand the content of the image
- <h1> tag – helps users understand the content of a page more clearly when properly used
- Internal links – helps create a view of how content inside your website is related. Also helps users navigate easily to related content.
- Links to external sources – be careful who you link to as it’s a signal you trust them. The number of links pointing from your page to external locations should be reasonable.
- Social sharing – enabling social sharing encourages visitors to share your content with their networks
- XML Sitemaps – make sure you have these set up and that you keep them fresh and current
- Navigational structure – keep it clean, simple and easy to crawl
- Graceful degradation – enable a clean down-level experience so crawlers can see your content
- URL structure – avoid using session IDs, &, # and other characters when possible
- Robots.txt – often placed at the root of the domain; be careful, as this file is powerful; reference sitemap.xml (or your sitemap-index file) in this document
- Crawl Control – define high crawl rate hours in Bing Webmaster Tools via the Crawl Control feature
- Fetch as Bingbot – verify that Bingbot is not accidentally blocked at the server level by doing a “Fetch as Bingbot”. You can find this tool under the Diagnostics and Tools section in Bing Webmaster Tools
- Webmasters are encouraged to use the Ignore URL Parameters (found under Configure My Site) tool inside Bing Webmaster Tools to help Bingbot understand which URLs are to be indexed and which URLs from a site may be ignored
- Site Structure
- Links – cross link liberally inside your site between relevant, related content; link to external sites as well
- URL structure and keyword usage - keep it clean and keyword rich when possible
- Clean URLs – no extraneous parameters (sessions, tracking, etc.)
- HTML & XML sitemaps – enable both so users and crawlers can each find what they need – one does not replace the other
- Content hierarchy – structure your content to keep valuable content close to the home page
- Global navigation – springs from hierarchy planning + style of nav (breadcrumb, link lists, etc.) – this helps ensure users can find all your content
- Head copy
- Titles – keep them unique, relevant and 65 characters (or so) long
- Descriptions – keep them unique, relevant, grammatically correct and roughly 160 characters or less
- Body Copy
- Use H1, H2 and other H* tags to show content structure on the page
- Use only one <H1> tag per page
- alt attribute usage – helps crawlers understand what is in an image
- Keyword usage within the content/text – use the keyword/phrase you are targeting a few times, and also use variations
- Anchor text – using targeted keywords as the linked text (anchor text) to support other internal pages
- Build based on keyword research – shows what users are actually looking for
- Keep content out of rich media and images – don’t use images to house your text content
- Create enough content to fully meet the visitor’s expectations. There are no hard and fast rules on the number of words per page, but providing more relevant content is usually better
- Produce new content frequently – crawlers respond to fresh content by visiting more frequently
- Make it unique – don’t reuse content from other sources – critical – content must be unique in its final form on your page
- Content management – using 301s to reclaim value from retiring content/pages – a 301 redirect can pass some value from the old URL to the new URL
- rel=canonical – helps engines understand which page should be indexed and have value attributed to it
- 404 error page management can help cleanse old pages from search engine indexes; the 404 page should return a 404 code, not a 200 OK code
- Plan for incoming and outgoing link generation – create a plan around how to build links both internally and externally
- Internal and External link management – execute by building internal links between related content; consider social media to help build external links, or simply ask websites for them; paying for links is risky
- Content selection – plan where to link to – be thoughtful and link to only directly related/relevant items of content internally and externally
- Link promotion via social spaces – can help drive direct traffic to you, and help users discover content to link to for you
- Managing anchor text properly – carefully plan which actual words will be linked – use targeted keywords wherever possible
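The head- and body-copy points above can be sketched together in one hypothetical page; the titles, description and image are placeholders for illustration:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Title: unique, relevant, around 65 characters -->
  <title>Blue Widgets – Buying Guide and Reviews | Example Store</title>
  <!-- Description: expands on the title, roughly 160 characters or less -->
  <meta name="description" content="Compare blue widgets by price, size and
    durability. Our buying guide covers the top models and how to choose one.">
</head>
<body>
  <!-- One H1 per page, describing the page content -->
  <h1>Blue Widget Buying Guide</h1>
  <!-- alt attribute tells crawlers what the image shows -->
  <img src="/images/blue-widget.jpg" alt="Large blue widget on a white table">
</body>
</html>
```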
Abuse and Examples of Things to Avoid
Sites which engage in abuse such as the practices outlined below are considered to be low quality. As a result, these sites can incur ranking penalties, have site markup ignored, and may not be selected for indexation. These Bing Webmaster Guidelines describe only some of the most widespread forms of inappropriate, manipulative or misleading behaviors. Microsoft may take action against you or your site for any inappropriate or deceptive practices, even those not described here. If you feel action has been taken against your site, you can use Bing Webmaster Tools to contact our support team. Additionally, users can report abuse of any of these practices using the feedback link in the footer of bing.com after performing a search which reproduces the issue.
Cloaking is the practice of showing one version of a webpage to a search crawler like Bingbot, and another to regular visitors. Showing users different content than what the crawlers can see may be seen as a spam tactic and could be detrimental to your website's rankings, leading to your site being de-listed from the Bing index. It is recommended to be extremely cautious about responding differently to crawlers than to regular visitors, and to avoid cloaking as a general rule.
Link Schemes, Link Buying, Link Spamming
While link schemes may succeed in increasing the number of links pointing to your website, they will fail to bring quality links to your site, netting your site no positive gains. Manipulating inbound links to artificially inflate the number of links pointed at a website can lead to your site being delisted from the Bing index.
Social media schemes
Like farms are similar to link farms in that they seek to artificially exploit a network effect to game Bing's algorithm. The reality is social media schemes are easy to see in action and their value is deprecated. Auto follows encourage follower growth on social sites such as Twitter, and work by automatically following anyone who follows you. Over time this creates a scenario where the number of people you follow is more or less the same as the number of followers you have. This does not indicate you have a strong influence. Following relatively few people while having a high follower count would tend to indicate a stronger influential voice.
Meta refresh redirects
These redirects reside in the code of a website and are programmed for a preset time interval. They automatically redirect a visitor when the time expires, redirecting them to other content. Rather than using meta refresh redirects, we suggest you use a normal 301 redirect.
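For reference, this is the pattern to avoid (URL and interval are placeholders); a server-side 301 redirect is preferred instead:

```html
<!-- Avoid: sends the visitor to another URL after 5 seconds -->
<meta http-equiv="refresh" content="5; url=https://www.yourdomain.com/new-page">
```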
Duplicate content
Duplicating content across multiple URLs can lead to Bing losing trust in some of those URLs over time. This issue should be managed by fixing the root cause of the problem. The rel=canonical element can also be used, but should be seen as a secondary solution to that of fixing the core problem. If excessive parameterization is causing duplicate content issues, we encourage you to use the Ignore URL Parameters tool.
When creating content, make sure to create your content for real users and readers, not to entice search engines to rank your content better. Stuffing your content with specific keywords with the sole intent of artificially inflating the probability of ranking for specific search terms is in violation of our guidelines and can lead to demotion or even the delisting of your website from our search results.
When sites use accepted tags or markups such as schema.org markup to convey information about their pages, they should be accurate and representative of the page that the tags are on. In particular, sites must not have markup which is:
- Irrelevant to the page it is on
- Inaccurate or misleading
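As one sketch of accurate markup, a page about an article might carry JSON-LD such as the following; every value here is a placeholder and must match what actually appears on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Blue Widget Buying Guide",
  "datePublished": "2021-06-01",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```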