
SEO For ASP.NET Web Site

Every ASP.NET developer (or at least most of us) wants a lot of visitors to their web site. Google, Yahoo and other search engines can send plenty of visits, especially if your web site is shown on the first page of their search results. And vice versa: if your web site appears on the thirtieth page, or is not indexed at all, you will see no benefit from search engines. Because everyone wants to win a place on the first page, competition is strong, and you need to take care of every factor that affects how friendly your pages are to search engines.


There are more than a hundred factors that search engines use to rank a page. Most of them are just speculation, since Google, Yahoo and the others don't want to reveal their ranking algorithms. Also, the algorithms change very frequently (hundreds of times a year) to improve user experience and return more relevant results. Even Google has no capacity to manually adjust the ranking of individual pages it thinks should rank better or worse. Instead, they try to find out what flaw in the algorithm caused the wrong ranking and correct it there. Because of that, Search Engine Optimization (SEO) is a very dynamic field, but the basics and the most important factors remain the same.

Create unique title for every page

Every page of a web site needs its own unique title. The title should be short, descriptive, meaningful, contain keywords and be relevant to the content of the page. Do not repeat a phrase like your company name at the beginning of every page's title; let the most relevant information appear first. The title tag can be edited at design time, but if you have a content management system you can also set it from code by using the Page.Title property:

[ C# ]

Page.Title = "My unique and keywords rich title";

[ VB.NET ]

Page.Title = "My unique and keywords rich title"

When someone uses Google search, the query terms are shown as bold text in the search results. Because of this, placing targeted keywords in the title makes your listing more noticeable and therefore gets more clicks to your site. Of course, to earn a visit your title must please human visitors too, not just search engines. You can't just list keywords in the title without any meaning. Instead, make your title tags precise descriptions of each page and you'll be fine with both search engines and people. Don't use overly long titles because search engines will truncate them anyway; keep titles under 65 characters.

Use description and keywords meta tags

Description and keywords meta tags were very important for search engine optimization in the past, but they have been widely abused. Today the meta keywords tag is practically useless, but the meta description tag is still important. Although it will not improve your position in search results, it is beneficial indirectly: Google often displays the meta description below the title in search results as a short description of your page. So, with a catchy meta description you can get more visits even if you are not first in the results. Set a unique description for every page. Like the page title, you can change meta tags in markup or dynamically in server-side code:

[ C# ]

protected void Page_Init(object sender, EventArgs e)
{
  // Add meta description tag (requires <head runat="server"> in markup)
  HtmlMeta metaDescription = new HtmlMeta();
  metaDescription.Name = "Description";
  metaDescription.Content = "Short, unique and keywords rich page description.";
  Page.Header.Controls.Add(metaDescription);
  // Add meta keywords tag
  HtmlMeta metaKeywords = new HtmlMeta();
  metaKeywords.Name = "Keywords";
  metaKeywords.Content = "selected,page,keywords";
  Page.Header.Controls.Add(metaKeywords);
}

[ VB.NET ]

Protected Sub Page_Init(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Init
  ' Add meta description tag (requires <head runat="server"> in markup)
  Dim metaDescription As HtmlMeta = New HtmlMeta()
  metaDescription.Name = "Description"
  metaDescription.Content = "Short, unique and keywords rich page description."
  Page.Header.Controls.Add(metaDescription)
  ' Add meta keywords tag
  Dim metaKeywords As HtmlMeta = New HtmlMeta()
  metaKeywords.Name = "Keywords"
  metaKeywords.Content = "selected,page,keywords"
  Page.Header.Controls.Add(metaKeywords)
End Sub

This approach will also work if you use master pages.

Using the H1 tag

The h1 tag is a very important and at the same time very easy way to improve your position in search results. It works best when the h1 tag has the same content as the title tag: place the same short, relevant, keyword-rich phrase in both, and this single effort can noticeably improve your ranking. Like any other HTML tag, you can edit the h1 tag directly in markup, or dynamically if you add runat="server" and set its id, as in the code below:

<h1 runat="server" id="MyPageHeader" ></h1>

Now you can manipulate h1 tag with ASP.NET server side code:

[ C# ]

MyPageHeader.InnerText = "This Is My Catchy Header";

[ VB.NET ]

MyPageHeader.InnerText = "This Is My Catchy Header"

There are also the important <h2> and <h3> tags that you can use for subheadings, and the <strong> tag to emphasize significant keywords in the text.
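For illustration, a sensible heading hierarchy (the content here is made up) might look like this, with a single h1 matching the title and h2 tags for subsections:

```
<h1>SEO For ASP.NET Web Site</h1>
<h2>Create unique title for every page</h2>
<p>Every page needs a <strong>unique, descriptive</strong> title ...</p>
<h2>Use description and keywords meta tags</h2>
```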

ASP.NET SEO Url Redirecting

Sometimes you need to move a page to another url, or move a complete web site to another domain. A common example: if you upgrade a web site created in classic ASP to ASP.NET, you need to change file extensions from .asp to .aspx. If a visitor comes to your old link from a search engine or directly, he or she should be redirected to the new url. There are two types of redirection:
1. Temporary redirection returns the status "302 Found". It should be used only when necessary, and very rarely for search engine optimization.
2. Permanent redirection returns the status "301 Moved Permanently". It tells spiders that a page or site has moved to another url, and it is used in SEO to transfer link popularity to the new address.

Response.Redirect returns a 302 redirection, so it can't be used for search engine optimization in ASP.NET. To redirect permanently, use code like this:

[ C# ]

Response.Status = "301 Moved Permanently";
Response.AddHeader("Location", "/your-new-url/");
Response.End(); // stop further processing so no page content is sent

[ VB.NET ]

Response.Status = "301 Moved Permanently"
Response.AddHeader("Location", "/your-new-url/")
Response.End() ' stop further processing so no page content is sent

This code is sufficient if you need to move a single page. But if you want to move a complete web site, it is best to do it in Internet Information Services (IIS). Open the old site's properties and select the Home Directory tab. Choose "A redirection to a URL" and enter the new url in the text box below, as shown in the image:


Don't forget to check the "A permanent redirection for this resource" check box to get a 301 redirection.

Avoid using postback

Some programmers place a Button or LinkButton control on a web form and call Response.Redirect in the control's click event to navigate to another page. That is a problem because ASP.NET controls use JavaScript to make postbacks. Since web spiders don't execute JavaScript, many such pages can't be indexed and will not appear in search results. To make web pages visible to search engines, avoid using postback for navigation. Instead of Button or LinkButton controls, place a simple <a> tag. Link text should be descriptive; avoid meaningless links like "Read more", "Click here" etc.

If you must use a postback, then provide an alternative way of navigation with plain hyperlinks. This can be implemented in the form of a site map: a page that contains links to all pages on the web site. When a web spider visits the site map page, it will find all the other pages easily.
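For example, a LinkButton used only for navigation (the control name, handler and target page here are illustrative) can be replaced with a plain anchor that spiders can follow:

```
<%-- Instead of a postback link like this: --%>
<asp:LinkButton ID="ProductsLink" runat="server" OnClick="ProductsLink_Click">Read more</asp:LinkButton>

<%-- ...use a crawlable hyperlink with descriptive text: --%>
<a href="/Products.aspx">Browse our product catalog</a>
```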

SEO Friendly URLs: Url Rewriting

Web spiders don't like query string parameters in urls. If you are getting data from a database, it is common to use a query string like ShowProduct.aspx?id=23445. Although this url looks logical from a programmer's perspective, it is not user friendly and is usually not ranked well by search engines. You should use urls that contain keywords separated by hyphens: instead of /ShowProduct.aspx?id=23445 use something like /My-Product-Name.aspx. A url that contains keywords is easier for human visitors to read and is ranked better by search engines. A site with SEO friendly urls is also more secure, since you can hide the record id or even the file extension.
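How you produce the friendly url is up to you. As a sketch (the Slugify helper is my own illustration, not part of ASP.NET), a product name can be turned into a hyphen-separated url segment like this:

```csharp
using System.Text;

public static class UrlHelper
{
    // Turn "My Product Name" into "My-Product-Name": keep letters and
    // digits, collapse every other character run into a single hyphen
    public static string Slugify(string text)
    {
        StringBuilder sb = new StringBuilder();
        foreach (char c in text)
        {
            if (char.IsLetterOrDigit(c))
                sb.Append(c);
            else if (sb.Length > 0 && sb[sb.Length - 1] != '-')
                sb.Append('-');
        }
        return sb.ToString().TrimEnd('-');
    }
}
```

For example, UrlHelper.Slugify("My Product Name") returns "My-Product-Name".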

Url rewriting in ASP.NET can be implemented on many levels: directly on the page, in Global.asax, with a custom HTTP module or HTTP handler, at the web server level, etc.

Url rewriting on the page level is a hard-coded approach, but it can be useful in some scenarios. Create the friendly-url page as a physical file and place just one line of code in it that uses the Server.Transfer method to execute the real page:

[ C# ]

Server.Transfer("ShowProduct.aspx?id=23445");

[ VB.NET ]

Server.Transfer("ShowProduct.aspx?id=23445")
Url rewriting in Global.asax uses the RewritePath method in the Application_BeginRequest event. The implementation could look like this:

[ C# ]

void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    if (app.Request.Url.AbsolutePath.StartsWith("/Friendly-Page/"))
        app.Context.RewritePath("/ShowProduct.aspx?id=23445");
}

[ VB.NET ]

Protected Sub Application_BeginRequest(ByVal sender As Object, ByVal e As System.EventArgs)
    Dim app As HttpApplication = CType(sender, HttpApplication)
    If app.Request.Url.AbsolutePath.StartsWith("/Friendly-Page/") Then
        app.Context.RewritePath("/ShowProduct.aspx?id=23445")
    End If
End Sub

If you are interested in url rewriting with an HTTP module or HTTP handler, check the URL Rewriting in ASP.NET tutorial, where both methods are explained.

To get fast results you can try http://urlrewriter.net/, a free open source URL rewriter for ASP.NET written in C#. It is easy to use and is used in many large and small sites, including this one.

ViewState and SEO in ASP.NET

There is speculation that search engines read only a limited number of bytes from each page (the first 100K). The ViewState value is a string rendered as a hidden field on the client side. If you have a large ViewState at the beginning of an ASP.NET page, it is possible that web spiders never reach your real content, which could harm your ranking in search results. The simple solution is to turn ViewState off if you don't need it, or at least not use it for every single control. If you really need ViewState, there is a newer option in web.config to place it at the bottom of the page:

<pages renderAllHiddenFieldsAtTopOfForm="false" />
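ViewState can also be turned off for a whole page or for individual controls directly in markup (the control shown here is illustrative):

```
<%@ Page Language="C#" EnableViewState="false" %>

<%-- Or per control, for controls whose state you never read back: --%>
<asp:Label ID="CopyrightLabel" runat="server" EnableViewState="false" Text="© Example" />
```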

Following the same idea, you should remove any unnecessary HTML, JavaScript and CSS to get a smaller page. You still can, and should, use JavaScript and CSS where needed, but load them from external files. This also reduces repetitive work.

However, this talk about limited loading of pages is just speculation; personally, I don't believe it is completely true. But search engines certainly try to please their users, and users like fast web sites, so pages that load faster are ranked better: loading time is one of the ranking factors. By removing or reducing ViewState, deleting needless HTML tags and comments, moving JavaScript and CSS to external files, and leaving only useful content on the page, you will improve your ranking both directly and indirectly. A final touch could be validating your HTML output at http://validator.w3.org.

SEO Friendly paging with GridView, Repeater and other data controls

Default paging in ASP.NET data controls uses postback and JavaScript. Search engine spiders follow links but usually avoid JavaScript, so the default pager is not an SEO friendly option. The solution could be to use custom paging for GridView, Repeater and other data controls, or, for more options and less work, our SEO Pager Control, specialized for search engine optimization and for dealing with large tables. You can find more about paging in ASP.NET data controls in the Data Paging In ASP.NET tutorial.

New SEO features in ASP.NET 4.0

Direct manipulation of meta description and keywords tags

ASP.NET 4.0 adds new options to change the meta description and meta keywords tags. Of course, you can already do this in any older version of ASP.NET by adding runat="server" to the meta tag and manipulating it from server-side code, but the ASP.NET 4.0 way is simpler and faster: it introduces the new MetaDescription and MetaKeywords properties of the Page object, used like this:

[ C# ]

Page.MetaDescription = "This is my great page";
Page.MetaKeywords = "great,page";

[ VB.NET ]

Page.MetaDescription = "This is my great page"
Page.MetaKeywords = "great,page"

Response.RedirectPermanent for 301 permanent redirection

ASP.NET 4.0 includes another interesting SEO feature. The Response.Redirect method returns a 302 temporary redirection; to get a 301 redirection with earlier versions of ASP.NET you have to use the code shown in the ASP.NET SEO Url Redirecting section above. With ASP.NET 4.0 this job is simpler:

[ C# ]

Response.RedirectPermanent("New-Page.aspx", true);

[ VB.NET ]

Response.RedirectPermanent("New-Page.aspx", true)

Dealing with www. subdomain

This is a common problem. Since www. is just a subdomain, search engines can see duplicate content and split the reputation between www.example.com/Your-Page.aspx and example.com/Your-Page.aspx. It is much better to have one link on the first page than two links on the thirtieth page. Allow only one url for the same content: if visitors can reach the same content through several different urls, use a 301 redirect to focus all ranking on one of them.
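A minimal sketch of such a redirect, assuming the canonical host is www.example.com, could live in Global.asax:

```csharp
void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    Uri url = app.Request.Url;
    if (url.Host.Equals("example.com"))
    {
        // Permanently redirect the bare domain to the www host,
        // preserving the path and query string
        app.Response.Status = "301 Moved Permanently";
        app.Response.AddHeader("Location", "http://www.example.com" + url.PathAndQuery);
        app.Response.End();
    }
}
```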

AJAX and Search Engine Optimization

Ajax uses JavaScript, and search engines don't like JavaScript, so an Ajax site has a big potential problem: it is quite possible that its content will not be indexed. You can disable JavaScript in your browser and try to access all the content on the site; if you can't see it without JavaScript, spiders will not see it either.

Possible solutions are to provide a site map with links to all content, to add an additional way of navigation that uses static links, or to make your content load initially without JavaScript.

SEO Sessions problems

By default, ASP.NET uses cookies to store the session id. Since web spiders usually don't accept cookies, some of your content may be invisible if session variables are required to show it. You can change the settings to store the session id in the url (in web.config, by setting the cookieless attribute of sessionState to true), but that is even worse, because link popularity will be divided among many different urls with duplicate content. There is also an option to set the cookieless value to AutoDetect, which means ASP.NET will use cookies when the browser supports them; for SEO purposes this does not solve the problem either.

The solution is to keep the cookieless attribute set to false and provide a way for spiders to find all content through simple links.
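In web.config that setting could look like this (ASP.NET 2.0 and later also accept the equivalent value UseCookies):

```
<system.web>
  <sessionState cookieless="false" />
</system.web>
```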


As you see, it is very easy to make an ASP.NET web site that is invisible to search engines :). Just use Button or LinkButton controls for navigation and nobody will find your site through a Google search. On the opposite side, to receive maximum traffic from search engines you need to follow their rules and be more search engine friendly. I hope I have explained the basics of search engine optimization (SEO) for an ASP.NET web site. If you want to learn more about this subject, especially if you build commercial web sites, check the book Professional Search Engine Optimization with ASP.NET: A Developer's Guide to SEO (Wrox Professional Guides). There are many other books that cover search engine optimization, but this one focuses on ASP.NET-specific SEO issues.

Finally, all these technical tips and tricks are irrelevant if you don't have quality content inside your body tag. If your content is great, people will link to your page naturally, and incoming links from relevant sites are the most important ranking factor. I hope you found some interesting ideas in this tutorial. Happy coding!
