This is a place devoted to giving you deeper insight
into the news, trends, people and technology behind Bing.
Imagine being a content developer for a website. You write a bunch of clever and informative articles, which should deliver a good dose of new visitors and ranking potential to the site. You submit them to the IT department for publishing online, and wait for good things to happen. But instead, it all falls flat. A look at your web analytics tools reveals that the number of site visitors has not increased over the time your new material was published. Further research reveals that your new content is not even in the search engine indexes! To quote the mighty Fred Willard, “Wha’ happened?” Perhaps some commonly seen site errors prevented your new content from being added to the index.
Just like a house, good web content needs a sturdy, reliable platform on which to reside. What good is a gorgeous, million-dollar home if it’s sitting on a foundation of rickety 2x4s? No housing inspector would ever climb into such an unstable house to review it. And when a search engine crawler (aka bot) comes across a website littered with coding errors and serious problems with structure and design, it, too, may abandon its effort to crawl it. If that happens, no matter how good and compelling the content might be, it will never make it into the index.
So how do you know if your site has a rock-solid foundation or is just barely standing up? You need to get into your code. You can use a few good tools to help detect problems, but ultimately you’ll need to understand what the tools are saying when they indicate things are broken so you can fix them. Let’s get into a few of the site errors that are either pretty common or pretty important, and cover what you need to know to avoid their deleterious effects.
Invalid mark-up code
If your page mark-up code is bad, you’re bound to have crawling problems. But you might not know that the problems exist if your testing merely consists of, “How does it look in my PC’s browser?” Modern desktop browsers are pretty adept at munging what you probably meant into a workable, on-screen presentation. They can often deal with code that is footloose and fancy-free when it comes to standards compliance. But the search engine bots are not as flexible as desktop browsers, and code problems can often trip them up and bring the crawling of your site to a halt. Mobile device browsers are not likely to be as accommodating of poorly written code as desktop browsers, either. Anything you can do to make your code solid and standards compliant is good, for both your users and the bots.
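To see how a strict parser trips over mis-nested tags, here is a minimal, illustrative tag-balance checker in Python using only the standard library’s html.parser. This is a rough sketch, not a substitute for a real validator, and the sample snippet and the list of void elements are simplified assumptions:

```python
from html.parser import HTMLParser

# Elements that never take a closing tag (simplified subset).
VOID_ELEMENTS = {"br", "hr", "img", "input", "link", "meta"}

class TagBalanceChecker(HTMLParser):
    """Reports mis-nested and stray end tags, roughly as a strict parser sees them."""

    def __init__(self):
        super().__init__()
        self.stack = []   # currently open tags
        self.errors = []  # human-readable problem reports

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_ELEMENTS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag, reporting anything skipped.
            while self.stack:
                top = self.stack.pop()
                if top == tag:
                    break
                self.errors.append(f"<{top}> not closed before </{tag}>")
        else:
            self.errors.append(f"stray </{tag}>")

checker = TagBalanceChecker()
checker.feed("<html><body><p>Hello <b>world</p></b></body></html>")
print(checker.errors)  # -> ['<b> not closed before </p>', 'stray </b>']
```

A desktop browser would quietly render that mis-nested `<b>…</p></b>` anyway, which is exactly why testing only in a browser hides the problem.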
To see where your code stands, use a good mark-up code validator. Most good development environments will offer either a built-in validator or references to such tools online. A particularly detailed validator is the W3C Markup Validation Service, a free, online HTML validator from the folks who bring you the coding language standards. It doesn’t validate entire websites recursively, just one page at a time, but it is still a very good source for detecting and identifying the issues behind coding errors.
Examine the results of the validator scan. What did it find? Check whether your pages contain any of the more common coding problems it reports, such as unclosed or mis-nested tags and missing required elements.
Tip: Test your pages in multiple browsers. One may be far more tolerant than another, and you really need to accommodate the least tolerant browser to allow the highest portion of your site’s visitors to have a good experience.
A bad link encompasses more than a mere mistyped or expired URL. It also includes structural and site design problems that may not break a site, but can prevent it from reaching its full potential. Bad links are a very common problem, if only because the target of an external link is typically outside of the linking webmaster’s control. The webmasters of the sites you link to rarely courteously let the rest of the known universe know when they’ve changed the URLs of their pages! OK, that was a bit facetious (as if doing so were even feasible), but the problem remains: pages you link to are regularly moved, deleted, or renamed without your knowledge, leaving those who linked to them looking foolish. And that’s not the only repercussion. Broken links are bad form for search engine optimization (SEO), which means this error can eventually affect your page’s rank.
You need to regularly run a link checker as part of your site maintenance work. There are a lot of tools out there on the Web to do this work, but some only check one link at a time. That’s fine if you have a three-page site and only six outbound links, but then you could just as easily click through your site in a browser! You need a tool that scans all the pages on your site in one fell swoop and gives a consolidated report on the findings. Many mark-up language development environments offer tools for this, so check there first. I also recommend that you look at the webmaster tools offered by the search engines. Bing’s Webmaster Center offers a Crawl Issues tool that provides feedback on detected broken links; use the File Not Found (404) filter to get a report on them. Note that Google and Yahoo! offer their own webmaster tools with similar functionality.
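As a sketch of what such a tool does under the hood, the following Python snippet extracts every link from a page and flags those whose HTTP status is an error. The page markup, the URLs, and the status lookup here are hypothetical stubs; a real checker would fetch each URL over the network (for example with urllib.request):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against the page URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))

def find_broken_links(page_url, html, fetch_status):
    """Return links whose HTTP status is an error (>= 400).

    fetch_status is any callable mapping a URL to a status code, such as
    a wrapper around urllib.request or, as here, a test stub.
    """
    extractor = LinkExtractor(page_url)
    extractor.feed(html)
    return [url for url in extractor.links if fetch_status(url) >= 400]

# Hypothetical page markup and a stubbed status lookup for illustration.
page = '<a href="/about">About</a> <a href="http://example.com/gone">Old</a>'
statuses = {"http://example.com/about": 200, "http://example.com/gone": 404}
print(find_broken_links("http://example.com/", page, statuses.get))
# -> ['http://example.com/gone']
```

Run over every page of a site, this is the consolidated 404 report described above, which is what the Crawl Issues tool hands you without the work.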
Check your site for common link issues: mistyped URLs, links to pages that have since been moved, renamed, or deleted, and redirects that return the wrong status code.
Tip: By the way, 404 errors are not limited to your outbound links. Other webmasters might incorrectly code their links or not keep up with changes to your site, resulting in 404s for users who want to see your content. Help keep those folks on your site by creating informative, useful custom 404 pages. For more on this, see our previous discussion of this issue in Site Architecture and SEO – file/page issues.
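What a custom 404 page looks like server-side can be sketched with Python’s standard http.server. The page body and the site-map link are illustrative assumptions, and a production site would normally configure this in its web server rather than in application code:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Illustrative custom 404 page that points visitors back into the site.
CUSTOM_404 = (b"<html><head><title>Page not found</title></head>"
              b"<body><h1>Sorry, that page has moved or no longer exists.</h1>"
              b'<p>Try our <a href="/sitemap.html">site map</a> or '
              b'<a href="/">home page</a>.</p></body></html>')

class Custom404Handler(SimpleHTTPRequestHandler):
    """Serves files as usual, but answers misses with a helpful 404 page."""

    def send_error(self, code, message=None, explain=None):
        if code == 404:
            # Keep the 404 status (so bots know the page is gone),
            # but give humans somewhere useful to go next.
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(CUSTOM_404)))
            self.end_headers()
            self.wfile.write(CUSTOM_404)
        else:
            super().send_error(code, message, explain)

# HTTPServer(("", 8000), Custom404Handler).serve_forever()  # uncomment to run
```

The important detail is that the response still carries status 404; serving a friendly page with a 200 status would create the “soft 404” problem search engines penalize.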
Other coding errors
There are other coding errors or omissions that can adversely affect the way the bot collects information about a site. Let’s cover these as well.
These tags — the &lt;title&gt; tag and the meta description and meta keywords tags — are key locations in the page for using keywords and key phrases to establish relevance for your content. The &lt;title&gt; tag is required in HTML and XHTML documents, and the other two might as well be (all three are very strategic for SEO). Use each one only once per page, and make sure each contains text (no images or blank spaces!) between the tags. The bot considers the text in those locations to be important keyword text (for it defines the content of the page), so make the most of it. Don’t duplicate text strings between these tags, either. That’s a wasted opportunity to define more keywords.
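A quick self-audit for these three tags can be sketched in Python with the standard library. This is only a rough approximation of what a validator or bot checks, and the sample page is hypothetical:

```python
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    """Counts <title> tags and captures meta description/keywords content."""

    def __init__(self):
        super().__init__()
        self.title_count = 0
        self.title_text = ""
        self.in_title = False
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.title_count += 1
            self.in_title = True
        elif tag == "meta" and attrs.get("name") in ("description", "keywords"):
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title_text += data

def audit(html):
    """Return a list of problems: missing, duplicated, or empty tags."""
    parser = HeadTagAudit()
    parser.feed(html)
    problems = []
    if parser.title_count != 1:
        problems.append(f"expected exactly one <title>, found {parser.title_count}")
    elif not parser.title_text.strip():
        problems.append("<title> is empty")
    for name in ("description", "keywords"):
        if not parser.meta.get(name, "").strip():
            problems.append(f"meta {name} missing or empty")
    return problems

page = "<html><head><title></title></head><body></body></html>"
print(audit(page))
# -> ['<title> is empty', 'meta description missing or empty', 'meta keywords missing or empty']
```

A duplicate-text check across the three captured strings would be a natural extension of the same sketch.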
The last issue I want to mention is the use of 302 redirects. We talked about 301 redirects before, and how to strategically use them. 302s are only temporary redirects, and unlike with 301s, no link juice credit is passed to the redirected page. Using a 302 redirect is not a coding error per se, but much of the time it is a strategic error from the perspective of SEO. Unless you have a genuinely temporary need to redirect a page, stick with 301s as an SEO best practice.
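To make the distinction concrete, here is a minimal http.server sketch in Python that answers a permanently moved path with a 301 and its new location. The paths are hypothetical, and most sites would configure this in their web server (for example via rewrite rules) rather than in application code; swapping `301` for `302` below is exactly the strategic mistake described above:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of permanently moved paths to their new homes.
MOVED_PERMANENTLY = {"/old-article": "/articles/new-article"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        new_path = MOVED_PERMANENTLY.get(self.path)
        if new_path:
            # 301 tells bots the move is permanent, so link credit follows.
            self.send_response(301)
            self.send_header("Location", new_path)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

# HTTPServer(("", 8000), RedirectHandler).serve_forever()  # uncomment to run
```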
If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. Until next time…
-- Rick DeJarnette, Bing Webmaster Center
Excellent blog, Rick. I think this explains some of the loss of visibility that my website has suffered on Google in the last few weeks. I am not a techie, so I’m not sure if this is definitely the case, but it may explain something. The only problem now is how to make those amendments and 301 redirects!
Great advice. Bad links are a big site killer.
So my question is: Google has no problems with the blog element of my site (WordPress).
Neither Google nor Bing has issues with the static HTML pages, as they are indexed.
But the articles I post on my WP blog are not indexed by Bing at all.
It all validates cleanly, and the other engines pick it up well. But Bing doesn’t.
How does one go about solving this issue?
That's interesting. I have always been told that markup language did not figure into the algorithm. Maybe it does with Bing more than Google.
I run a site that is all Flex except for a few CFML pages. Any advice much appreciated.
Yes, it is good advice.
A great blog post, loaded with a few things you must do to improve your search engine optimization. SEO efforts will yield nothing if your content does not get indexed. It is pretty simple.
A few coding errors and a few bad links could hurt your rankings.
In addition, one should also review the web server the website is hosted on. Good point on the 302s; choosing between 301 and 302 redirects is another issue many webmasters struggle with, and it’s well worth reviewing.
The monopoly of Google is hurting as more and more
How do I find backlinks in Bing search?
The link: command doesn’t work.
Sorry for the off-topic question.
Please answer my post; I really need this.
I guess title tags are not important, the title of this page being an example.
I am using HTML 5. Does anyone know if Bing can properly parse HTML 5 tags? I know it is not a finalized version of HTML, but because of my content (photography) it is the only one that allows for semantic mark-up of images.
My website: http://www.face-china.com
It validates with W3C, but there is nearly no index coverage in Bing, and the architecture is good.
Thanks for your suggestions. The specific difference I see here is the way you explained the guidelines. Anyhow, I am really happy that Bing has become much more user-friendly for everyone. Thanks for that.
And hey 'IceTi', you can see backlinks if you sign into Bing Webmaster Tools. Here is the link: http://www.bing.com/webmaste
© 2013 Microsoft