Upgrading our website from ASP to ASP.NET

By Brian Shilling | April 8, 2008

About six months ago, I got this crazy notion that it was time for a website upgrade. Our company’s website simply did not reflect the quality of work in our client portfolio. However, I was scared to death of losing our great Google and Yahoo! SERP rankings, especially since moving from ASP to ASP.NET meant changing every page URL from .asp to .aspx (gulp). About 50% of our leads come from the web, so this was a big deal.

With our Seattle .NET expert, Promolab, we were able to maintain or improve our SERP rankings. Thanks to Jim and Cari Drake for providing the following steps, which describe how we went about upgrading from ASP to ASP.NET.

Our first step was to plan out the page structure for the new site. Not only is this an absolute must for determining the navigation structure, but it also helps define the needed content. This plan was also what we used to map out our 301 redirects. Our goal was not only to never have a broken link on the site, but also to direct search engines to crawl the appropriate pages for indexing and never come up short on a page that no longer exists. We mapped out each existing .asp page to its new corresponding .aspx page, and then built the site from the plan using the new page names.
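In practice, the mapping plan is just a two-column list of old and new URLs. The paths below are placeholders rather than our actual pages, but they show the idea, including a retired page being pointed at the home page:

    /default.asp     ->  /default.aspx
    /services.asp    ->  /services.aspx
    /portfolio.asp   ->  /portfolio.aspx
    /contact.asp     ->  /contact.aspx
    /old-news.asp    ->  /default.aspx    (retired page, redirected to the home page)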

Once the site was complete, our final checklist before going live was threefold.

First, using backup copies of all the old .asp pages, we replaced the content of each .asp page with a 301 redirect pointing to its corresponding new .aspx page. If an older page was being retired, we simply set the redirect to the home page or to a page with similar content. See http://www.webconfs.com/how-to-redirect-a-webpage.php for 301 redirect examples. You can test your code with one of the many free tools, such as the redirect checker at http://www.webconfs.com/redirect-check.php.
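For reference, a 301 redirect stub in classic ASP (VBScript) is only a few lines; this is a minimal sketch, and the destination URL is a placeholder that would change for each page:

    <%@ Language=VBScript %>
    <%
    ' Permanently redirect this old .asp page to its new .aspx equivalent.
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www.example.com/services.aspx"
    Response.End
    %>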

Second, we took the time to build a Google-compliant sitemap.xml file listing each new page on the site. For help with sitemaps, see http://www.sitemaps.org/protocol.php. The sitemap can then be registered with Google; see http://www.google.com/webmasters/.
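A sitemap following that protocol is a short XML file with one <url> entry per page. Here is a minimal sketch with placeholder URLs and dates; only the <loc> element is required, the rest are optional hints:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/default.aspx</loc>
        <lastmod>2008-04-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/services.aspx</loc>
        <lastmod>2008-04-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>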

And last, we added a robots.txt file to the site containing the location of our sitemap.xml, to ensure that the search engines crawl what we want them to crawl. See http://googlewebmastercentral.blogspot.com/2008/03/speaking-language-of-robots.html for more information.
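A robots.txt along these lines only takes a few lines as well. This is a sketch with a placeholder domain, and the excluded /admin/ folder is purely hypothetical:

    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml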

Happy upgrading!

Brian Shilling


Brian is our Executive Vice President of Client Operations with experience leading diverse teams of marketers and designers in strategic marketing, content creation, and crafting comprehensive messaging and positioning platforms for our healthcare and tech clients. To learn more about Brian's experiences and qualifications, visit our leadership team page.
