When building or managing a website, it almost goes without saying that you need to be in control. You need to be able to make changes to certain elements if you hope to optimize the website properly. Let’s take a look at nine items you should be sure are firmly within your control when you start optimizing a website. And we’re not just talking search engine optimization here. Think broader than that. Think of truly building a better, more useful website, for both the engines and your visitors.
Rel=canonical
Being able to tell the engine which version of your URL you’d like to have attributed as the original is pretty useful. This handy command can help you build value on the version of the URL that matters most to you, and help combine value attributed to many versions of the URL into one location, helping boost the rank of that one, original version of the URL.
The tough part here is usually getting the code installed on each page, and of course, each instance needs to point to one selected URL for this to work. We’d rather you didn’t use the rel=canonical to cover issues where your CMS needs work. If the CMS is causing instances of duplicate URLs to occur, you should fix the problem. We see increasing usage of the rel=canonical across huge numbers of pages on large websites. While we don’t really like this, either, we can work with it, as we understand the need to balance the workload and the returns.
The bottom line, though, remains that you need to be able to manage the rel=canonical. If you don’t have control over when it’s deployed, which URLs it points to and when it is used, you need to work that out.
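As an illustration, the rel=canonical element is a single link tag placed in the &lt;head&gt; of each duplicate page, pointing at the one URL you want credited (the URLs below are placeholders):

```html
<!-- Placed on every duplicate or parameterized version of a page, e.g.
     http://www.example.com/product?sessionid=123&sort=price -->
<head>
  <!-- Point all variants at the one "original" URL you want ranked -->
  <link rel="canonical" href="http://www.example.com/product" />
</head>
```

Every variant of the page carries the same tag, all pointing to the single URL you have chosen as the original.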
Robots.txt
Seems like a no-brainer, this one, but so many websites remain without a robots.txt file. In some cases it’s purely a missed opportunity, or the site owner is unaware of what a robots.txt file is. In other cases, though, it’s the inability to place a file on the root of the domain.
Regardless of the blocker, the robots.txt file is one of the most important files you can place on a web server. Since it is the location search crawlers reference to understand how to interact with the website, it’s a pretty powerful document. If you do not have access to place your robots.txt in the correct location, you need to understand why that control is lacking. Then solve the problem.
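A minimal robots.txt lives at the root of the domain (e.g. http://www.example.com/robots.txt — the paths below are placeholders):

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of admin and internal search-result pages
Disallow: /admin/
Disallow: /search/

# Tell crawlers where to find your sitemap
Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line (or no robots.txt at all) means everything is open to crawling; the file is where you express any exceptions.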
Sitemaps
This is another file missing from a huge number of websites today. It’s another important file the search crawlers look for: one that is referenced inside the robots.txt mentioned above, and one which can help get more of your pages into the search index. Overall, it’s almost as important as the robots.txt file, and if you cannot place these files in a location the crawlers can find, you need to fix that.
This file typically lives on the root of the domain, but for larger websites, where multiple files may exist to capture all of the URLs present, maybe only a sitemap index file is on the root. Whichever your case, it’s important the crawlers can find it, and if you cannot access the root of your server to place files there, it’s a missed opportunity.
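A bare-bones XML sitemap, following the sitemaps.org protocol, looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-12-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
  </url>
</urlset>
```

Larger sites can split their URLs across many sitemap files and list those files in a single sitemap index file (a `<sitemapindex>` element) placed at the root.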
Marking up your content
Marking up your content has been around for a few years now, and with the launch earlier this year of www.schema.org, the major engines have made a clear statement that there is value in marking up your content. By embedding these elements in your page code, we can extract information more accurately and use that information to provide increasingly richer search results. You are credited as the source for the data used. This is important work for sites seeking to differentiate themselves from the pack as we move into 2012.
Websites need to balance the future value against the workload to implement, and still need to keep in mind that implementing these elements won’t immediately increase rankings. This work helps us better understand relevancy.
What’s important here is planning for the work and ensuring you have buy in across your organization.
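As a sketch of what this markup looks like, here is a business listing annotated with schema.org microdata (the business name and details are illustrative):

```html
<!-- Illustrative schema.org microdata for a local business -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Contoso Bakery</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main Street</span>,
    <span itemprop="addressLocality">Seattle</span>
  </div>
  <span itemprop="telephone">(555) 555-0100</span>
</div>
```

The visible text is unchanged for visitors; the itemscope/itemtype/itemprop attributes are what let the engines understand that this text is a business name, an address and a phone number.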
Title, Description, Alt Tags, etc.
Managing your title tags, meta descriptions, alt tags, etc. is still important. All these basic, on-page/technical SEO factors add up to help us understand what your content is relevant for. The bottom line with these items is that you need to be able to manage them. If you cannot change these elements on a per-page basis, you lack needed control. We mention only three here, but the same thinking applies to all of them.
That meta description you don’t care to alter, the one appearing across all your pages? While writing unique ones for each page won’t suddenly vault your pages to number 1 in the rankings, it can make the difference between a searcher clicking on your result in the search results or not. Think of the meta description as the “call to action” that, when read by a searcher, tells them why they should click your result. Better to have your words appearing in our search results than random text we take from the page because your meta description was low quality.
We use the meta description as an example, but the same level of thought should be applied to all of your on page optimization efforts.
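Per-page control over these elements looks like the following (the titles, descriptions and file names are illustrative):

```html
<head>
  <!-- Unique, descriptive title for this page only -->
  <title>Hand-Painted Ceramic Mugs | Contoso Pottery</title>
  <!-- Unique meta description: the "call to action" shown in search results -->
  <meta name="description" content="Browse our hand-painted ceramic mugs, fired in small batches and shipped worldwide." />
</head>

<!-- Descriptive alt text on images, unique to each image -->
<img src="/images/blue-mug.jpg" alt="Blue hand-painted ceramic mug" />
```

If your CMS forces one site-wide description or strips alt attributes, that is exactly the lack of control this section is warning about.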
Unique content
This might seem pretty obvious, but with so many websites still aggregating content or using article services to build out pages, it’s worth mentioning. We talked about how to build good content a little while back and the value of “article-site content”, but we still see websites trying to get ahead in the rankings by basing their websites on a thin content model. Such sites are often very polished looking, and while they may provide value in aggregating a lot of items in one location, they still aren’t adding anything new and unique to the conversation. A website designed primarily to hold affiliate links that get the user off the website quickly and into a shopping cart on another website is an example of a thin content website, though not the only example. Affiliate links on a website can be completely useful, but when the content on the site duplicates that of the original website, there’s simply no reason why that thin content site should outrank the real deal.
Control as applied to content means you can influence the creation of quality content on the website. If the website is not producing unique, quality content, it won’t last long in the search results.
Webmaster tools verification
This is pretty straightforward. You need to be able to put the verification code in place to use webmaster tools. This could be in the form of embedding a tag in the <head> code of your webpage, or by a notation added in the DNS for the website. No matter the option used, you need to have access to make this happen. If you do not have that access, you lack the control needed to access some of the richest data about your website online – our webmaster tools data.
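For example, the meta-tag verification option places a single tag in the &lt;head&gt; of your homepage (the content value below is a placeholder — use the one your webmaster tools account generates for you):

```html
<head>
  <!-- Bing Webmaster Tools ownership verification (placeholder value) -->
  <meta name="msvalidate.01" content="0123456789ABCDEF0123456789ABCDEF" />
</head>
```

The DNS option instead adds a record your webmaster tools account specifies, which is useful when you cannot edit page code but do control the domain’s DNS.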
User experience
If you don’t have a website your visitors love, you’re missing an opportunity. Get cracking on a user experience review and see where you’re bleeding users. By staying tuned in to what users like and dislike about your website, you can make the myriad small changes needed to field a UX-winning site. And if you keep visitors happy, they’ll share you more often with friends, netting you more links. Visitors are also more likely to come back to you in the future if they like the site and find it easy to get what they’re after.
It might seem like a small thing, to focus on the user experience, but UX directly influences the happiness of visitors, and the engines can see that in how searchers react to your site when they see it ranked in search results. If you don’t have input on UX improvements, you need to push. This is an important aspect of optimizing any website.
Social sharing integration
This almost goes without saying, but we still see so many websites not involved socially with their visitors. Social isn’t going away, folks, and while it does take work, skipping social integration is a missed opportunity. People share what they like with friends. If you have social sharing icons embedded in your pages, you are much more likely to get shared by visitors. At the very least, you need to cover this angle. Get the buttons embedded into your pages so your visitors can share your content with their friends and followers.
The next step is to engage with them socially by having conversations. That is entirely within your control.
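As a minimal sketch, sharing links can be as simple as anchors pointing at the networks’ widely documented share endpoints (swap the url/u parameter for the current page’s URL; most sites use the networks’ official button widgets instead, which work the same way underneath):

```html
<!-- Simple share links; the url parameter is the URL-encoded address of this page -->
<a href="https://twitter.com/share?url=http%3A%2F%2Fwww.example.com%2Farticle">Share on Twitter</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fwww.example.com%2Farticle">Share on Facebook</a>
```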
These nine items are not in any order and aren't meant to cover every single thing you need to be in control of, but should at least get you started down the right paths. And remember, control may mean many things ranging from you making the actual changes to you exerting influence over others who do the work. In the end, though, if your job is to optimize the website, you need to be in control of the things that influence your website.
Excellent article, Duane. After reading your post about the Fall updates to Bing Webmaster Tools, I finally took the plunge and joined the Bing community. Call it an early New Year’s resolution, but it’s time I get serious and buckle down on the learning curve. Your latest article drove home that point all too well. I was wondering if you could expound a bit on UX. What are some pointers you can pass along to help a website in this regard? Thanks!
Really great information, Duane, but there are 2 points I hadn’t paid attention to before: UX improvements and rich snippets. I will apply all of these things on my new site.
Simple but very effective!
Guilty of confusing canonical tags... this post showed me I've got some work to do!
Though most of the points are general, I am sure not all of us are taking care of these general points completely, and if we apply them fully, it will really be effective.
Very useful article and tips for optimizing, thank you very much.
This is a very useful article, great.
I recently developed my website and I want to know more about UX (user experience). I don’t know whether I am posting this question in the right place, but what should I do to improve the user experience?
Hello. How can I control the crawl rate on my site?
Great idea, it boosts our site on the search engine and is really helpful for improving our crawl rate.
nice article, thanks for sharing
Thanks for sharing, but is there any other option available for sitemap upload?
I only have some trouble with the "Social sharing integration". As far as I know, many social networks use their own robots.txt to disallow search engines, so when I share information on them, only visitors can see it and the search engine cannot. How do you get that data? I can only guess that you record the visitor’s access through the browser.
As I share my site www.laptopbattery4.co.uk on Facebook, I cannot see any analysis of whether it affects my rank on bing.com.
© 2013 Microsoft