This is a place devoted to giving you deeper insight
into the news, trends, people and technology behind Bing.
Today we're pleased to announce an update to the Sitemaps Protocol, in collaboration with Google and Yahoo!. This update should help many new sites adopt the protocol by increasing flexibility in where Sitemaps are hosted.
Essentially, the change allows a webmaster to store their Sitemap files just about anywhere, using a reference in the Robots.txt file to establish a trusted relationship between the Sitemap file and the domain or folder.
Here's how it works. Say you run a web site like MSN.com, which has a number of sub-domains such as health.msn.com, travel.msn.com, and moneycentral.msn.com, and, due to a technical requirement, you would like to host all of your Sitemaps in one location, such as sitemaps.msn.com. Until now the protocol did not support this scenario: each Sitemap had to be hosted directly under the domain it described. This update introduces support for it, with the requirement that you simply include a reference to the Sitemap in your robots.txt file. For example, moneycentral.msn.com/robots.txt would need to include this line:
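A sketch of the directive in question, per the Sitemaps Protocol (the exact sitemap filename here is illustrative, not taken from the original post):

```
Sitemap: http://sitemaps.msn.com/sitemap-moneycentral.xml
```

The `Sitemap:` line in robots.txt is what establishes the trusted relationship: because only the owner of moneycentral.msn.com can edit that robots.txt file, the reference proves the remotely hosted Sitemap is authorized to describe that domain.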
The catch is that all the URLs in the Sitemap file need to be within the same domain as the robots.txt file (i.e., moneycentral.msn.com/* in this example). Note that this applies equally to Sitemap index files and to compressed files.
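The same-domain rule above can be sketched as a small check: every URL listed in the Sitemap must share the host of the robots.txt file that referenced it. This is an illustrative sketch, not Live Search's actual validation code; the function name and URLs are assumptions.

```python
from urllib.parse import urlparse

def urls_in_scope(robots_url, sitemap_urls):
    """Return True if every Sitemap URL shares the host of the
    robots.txt file that referenced the Sitemap (illustrative check)."""
    host = urlparse(robots_url).netloc
    return all(urlparse(u).netloc == host for u in sitemap_urls)

# URLs on moneycentral.msn.com are in scope for that domain's robots.txt...
print(urls_in_scope("http://moneycentral.msn.com/robots.txt",
                    ["http://moneycentral.msn.com/investing"]))   # True
# ...but URLs on a different sub-domain are not.
print(urls_in_scope("http://moneycentral.msn.com/robots.txt",
                    ["http://travel.msn.com/deals"]))             # False
```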
Here are a few other useful notes about our implementation:
http://www.bing.com/webmaster/ping.aspx?siteMap=[your sitemap web address]
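When building the ping address above, the Sitemap URL should be percent-encoded so its `://` and `/` characters survive as a query-string value. A minimal sketch, assuming a hypothetical sitemap address (the helper name is our own):

```python
from urllib.parse import quote

def ping_url(sitemap_url):
    """Build the Live Search ping address; the Sitemap URL is
    percent-encoded so it is safe as a query-string value."""
    return ("http://www.bing.com/webmaster/ping.aspx?siteMap="
            + quote(sitemap_url, safe=""))

print(ping_url("http://example.com/sitemap.xml"))
# → http://www.bing.com/webmaster/ping.aspx?siteMap=http%3A%2F%2Fexample.com%2Fsitemap.xml
```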
This change comes directly from feedback we received from webmasters; thank you for helping us improve our product! If you have any additional feedback or questions, please check out our Sitemap Discussion forum.
--Fabrice Canel, Program Manager, Live Search Crawler
Finally, good news in the sitemap world! As things stand, the lack of standardization makes webmasters' daily lives a big mess. Thank God you finally addressed this issue seriously.
I agree with the author above; it is always a hassle to get sitemaps working for all the different search engines. I am looking forward to this new standard.
Great news. This was a serious issue that Microsoft had to face.
Microsoft: Thanks for submitting your sitemap.
Great job! That is the best convenience! Thanks!
Great news. I think Live will gain a better market share in the future, and more people will use Live Search.
I have a question I want to ask.
My site has over 200,000 pages, so I divided it into ten XML Sitemaps, but in Live Webmaster I can only upload one Sitemap.
In this case, should I make a Sitemap index?
And when I upload my Sitemap to Live Webmaster, do I only need to upload the index?
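For reference, the Sitemaps Protocol handles this case with a Sitemap index file: one file that lists the individual Sitemaps, which can then be submitted on its own. A minimal sketch per the sitemaps.org schema (the filenames and domain are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at one of the individual Sitemap files. -->
  <sitemap>
    <loc>http://example.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```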
Great improvement, thanks. This has made webmasters' lives a lot easier.
Oh, I think it is the best way to submit sitemap and robots.txt files.
Thanks so much! It saves me time and effort.
Great! Thanks again!
Thanks for supporting multiple sitemaps! My site has over 15,000 pages.
Excellent improvements, good to see the older bugs fixed
Thanks for supporting multiple sitemaps! My site has over 5,000 URLs.
Great improvements; I hope to see more advanced functions.
It's amazing for webmasters.
My site has been running since 2007, and it now gets over 5,000 visits per day.
I just have to congratulate Live Search!
It's as good as Google Webmaster Tools, with so many great functions.
© 2013 Microsoft