
Top 10 SEO Problems with Legacy Websites

Matt Stone


Looking into the workings of your old website can be a lot like looking in your attic – you don’t get up there very often, and when you do, you find plenty of things that are no longer any good and others you should probably fix or get rid of. Since your site was built and deployed, many changes may have taken place in the SEO world. Failing to account for them can leave your website performing well below its potential.

Search algorithms have changed a great deal over the past several years. Meanwhile, companies added content to their sites for specific campaigns or reorganized the management of their online presence. Any of these changes affects how well a website attracts online visitors. So, the older your site, the more important it is to conduct a thorough cleaning of your website’s “attic” and identify the issues that are holding back performance.

Items to Evaluate for Older, Legacy Websites

Here are ten of the top problem areas we see with older or legacy websites that inhibit the ability to attract visitors via the search engines.

Outdated URL Structure

Many websites were built once and then added onto over the years. Site operators often appended new content to the existing URL structure, leaving web addresses with many nested directories that are far too long to be useful.

When creating and posting content, many organizations simply added it in a fashion similar to:
yoursite.com/pages/content/newcontent/newercontent

URL structures with many “/” marks can dilute the page authority that a shorter, carefully crafted URL would retain. Some SEO practitioners estimate that each additional “/” level costs a page roughly 30% of the authority it might otherwise keep.

URLs should ideally contain the words and terms that readers use in a search. Older sites frequently use terms like “pages,” “products” or even “default” left over from the development process. These are terms readers will never type into a search, and they are sure signs that your site’s URLs need to be redeveloped and redeployed. And once you set up a new URL, don’t forget to set a 301 redirect from the old page to your new one!
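As a sketch of that last step, assuming an Apache server, a single line in the site’s .htaccess file can redirect the deep legacy URL from the example above to a hypothetical flattened replacement:

```apache
# Permanently (301) redirect a deep legacy URL to its shorter replacement.
# Both paths are placeholders; substitute your own.
Redirect 301 /pages/content/newcontent/newercontent /new-content
```

Other servers (nginx, IIS) have equivalent directives; the key point is that the redirect is a permanent 301, which passes page authority, rather than a temporary 302, which does not.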

Many Subdomains and New Websites

As companies grew and organizations changed, many assigned management of various parts of the website to different internal groups. This may have made content management easier in larger organizations, but many went too far, creating separate subdomains and even entirely separate websites so that each group could handle its own content more easily. By splitting the website into several pieces or installing subdomains, these companies shot themselves in the foot and diluted their own website strength.

Google generally treats subdomains as separate websites, and separate domains are, of course, fully separate websites. In this era of acute competition, all available website and domain authority is needed to compete effectively for high-traffic search terms. Many high-profile companies have spent years merging their subdomains and satellite websites back into a single core domain to combine and rebuild website strength and compete for more visitor traffic.

Old Redirects

www. and non-www Website Versions
Google also treats the www and non-www versions of a website as two separate sites. Having both available amounts essentially to “competing with yourself” for strength and visibility. Serving only one version – enforced by a universal redirect – combines all available page and domain authority into one, stronger version of the website.
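A universal redirect of this kind can be sketched as follows, assuming an Apache server with mod_rewrite enabled; yoursite.com is a placeholder, and the rule here forces the non-www version (swap the condition and target to prefer www instead):

```apache
# Force the non-www hostname site-wide with a single 301 redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://yoursite.com/$1 [L,R=301]
```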

Multiple Redirects
This is a fairly common but difficult-to-track problem. If the website is older and has been worked on by different people over the years, there may be redirects set from an original page to a newer one, and then from the newer one to a currently visible page. These multiple redirects do not pass as much page authority from the original URL on to the destination as a single redirect from the old page to the live one would. A free redirect checker can find these instances. If your company has had a number of developers work on the site over the years, a redirect review is a good process to undertake.
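As an illustration of what a redirect checker does, the small script below flattens a set of 301 rules and flags any chains. The paths are hypothetical, and a real audit would crawl live URLs rather than a hand-written mapping:

```python
def find_redirect_chains(redirects):
    """Given a mapping of {old_url: new_url} 301 rules, return every
    chain in which a redirect target is itself redirected again."""
    chains = []
    for start, target in redirects.items():
        hops = [start, target]
        seen = {start}
        # Follow the chain until a URL with no further redirect is reached,
        # guarding against loops with the `seen` set.
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
            hops.append(target)
        if len(hops) > 2:  # more than one hop: authority leaks at each step
            chains.append(hops)
    return chains

# Hypothetical legacy rules accumulated by different developers:
rules = {
    "/original-page": "/newer-page",
    "/newer-page": "/live-page",
}
for chain in find_redirect_chains(rules):
    print(" -> ".join(chain))  # /original-page -> /newer-page -> /live-page
```

The fix for each reported chain is to repoint the first redirect straight at the final live URL.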

Outdated XML Sitemap

The XML sitemap is a guide that helps search engines inventory and index all of the pages of a website. It is not meant for public viewing, but it must remain accessible to the search engine robots – and because it sits out of public sight, it is easy to forget about. Make sure every page in your site’s inventory is properly accounted for in the sitemap and that the listed pages carry reasonably current time-stamps.

Many free tools are available to generate a properly formed XML sitemap, including Screaming Frog and Free XML Sitemap Generator.
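For reference, a minimal, well-formed XML sitemap follows the sitemaps.org protocol and looks like the sketch below; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/new-content</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```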

Outdated Robots.txt File

Several changes have taken place recently in the way Google crawls a website. Not too long ago, it was advisable to prevent page and display assets from being crawled: CSS files, layout assets, and other formatting files that affect page display were deemed unimportant. Now, however, it is important to allow the search engines to crawl this information so that the entire page can be rendered properly. In the end, Google wants to know that your page displays well, and it needs access to all the page assets to make that determination. So, remove the restrictions on page-layout assets from your robots.txt file!
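As a sketch, a legacy robots.txt often blocks asset directories outright; the /css/ and /js/ paths below are hypothetical. Removing those Disallow lines is usually enough, and you can also open the assets up explicitly:

```text
User-agent: *
# Delete legacy lines such as:
#   Disallow: /css/
#   Disallow: /js/
# and, if you prefer, explicitly allow layout assets instead:
Allow: /*.css$
Allow: /*.js$
```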

Another update to the robots.txt file is to make sure that your file contains a reference to your xml sitemap. Adding a sitemap reference ensures that the crawlers will access your sitemap. To do this, just add this line to your robots.txt file.

Sitemap: http://www.yoursite.com/sitemap.xml

Old Keywords

A great deal can change in a short amount of time. Websites that deal in technical or entertainment content can see their relevant keyword terms shift over time. It sometimes seems that people hang around and just invent new ways to describe things! A page or site optimized for an outdated term may be limiting both your visibility and your traffic, so a keyword review should be conducted at least once each year.

A part of the keyword review process also involves identifying emerging terms for which your site may not be visible. Creating new content to focus on these newer terms and phrases is a good way to keep your site growing and accessible to readers seeking this information.

No Schema Markup

It isn’t that schema markup is anything all that new, but Google has let it be known that it “may” be using the presence of schema markup as a ranking factor in the future. Schema has been around for a number of years, but for many organizations without a full technical team, implementing it was a challenge. And since it was somewhat less important in the past, many older sites do not contain any schema markup at all.

Applying schema to your website may still be a challenge for smaller organizations, as it is a bit more technical, but it is a task that should be done. Consultants can help. For a little background reading, the schema.org page can provide some good details.
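To give a sense of what implementation looks like, here is a minimal JSON-LD snippet using the schema.org vocabulary, placed in a page’s head; the company name and URL are placeholders:

```html
<!-- Minimal schema.org Organization markup in JSON-LD form. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Company",
  "url": "http://www.yoursite.com/"
}
</script>
```

Richer types (Article, Product, LocalBusiness and so on) follow the same pattern with additional properties.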

Insecure Sites Using http://

Another change under way in the Google world is the shift to secure websites. Older sites were typically built on the plain http:// format. Google has announced that it favors secure (https) websites in its search algorithms, so converting yours over to https is an absolute must.
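Once an SSL certificate is installed, the cutover also needs a site-wide redirect so that old http links pass their authority to the https versions. A sketch, assuming an Apache server with mod_rewrite enabled:

```apache
# Send all plain-http traffic to the https version with a 301 redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

Remember to update the sitemap and canonical tags to the https URLs at the same time.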

Old Landing Pages

This is a fairly common problem for organizations that have undertaken numerous outreach efforts over the years. Many web managers created landing pages for specific marketing efforts, including near-duplicates designed for different campaigns. Old landing pages that receive few visits can be decommissioned, but each should still carry a 301 redirect that takes the occasional visitor to a meaningful page; those redirects also pass bits of page authority on to newer content.

Non-Existent or Incorrect Canonical Tags

When websites are expanded or built onto, a common omission is the canonical tag. In most cases, the canonical tag points to the URL of the page it sits on, but occasionally it directs the search engines to prioritize another page within the site. Wherever some duplication of content is present, it is very important to set a canonical tag that points to the single most important URL. Campaign landing pages are commonly deployed without canonical tags, so a thorough review of a legacy website should include a look at each canonical tag.
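The tag itself is a single line in the page’s head; the URL below is a placeholder for whichever version of the page you want the search engines to treat as primary:

```html
<!-- On a duplicate or campaign landing page, point search engines
     at the preferred version of the content. -->
<link rel="canonical" href="http://www.yoursite.com/preferred-page" />
```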

Time for a Website Redesign?

After looking at your older website, you may find there are simply too many problems to fix effectively one by one. Sites with pages that end in .aspx, .html or other file extensions are usually good candidates for a website redesign.

Website redesigns cost real money, however, and changing technologies make selecting a platform a challenge that many business managers would rather not deal with. Still, it may be time to bite the bullet and spend the time, effort and money to redesign your site so that your company can reach new readers and customers. If a redesign is not in the cards, then at the very least, review these ten items for anything that may be preventing your website from reaching its full potential as a marketing outreach tool.

For more information, please contact the staff at Parallel Path. We work with companies to utilize all tools at their disposal to increase website visibility and traffic.
