Managing a website redesign project for SEO

Without an understanding of the SEO implications, a website redesign carries significant risk: you could see a dramatic drop in your existing organic rankings, traffic and revenue. In our webinar “SEO for Website Redesign – Getting it Right From the Start”, we gathered great ideas from two people with deep experience in making sure redesigned websites don’t crash and burn with search engines.

First up was Jeff Muendel, the SEO Manager, eCommerce Group, at Grainger. Jeff explained what a manager in a large organization needs to do to make sure a new website redesign project stays on track and is properly optimized for search engines from the beginning. For Jeff, it comes down to three key principles:

1. Engage

• Get engaged in the redesign project and process early on.

2. Clarify

• Clarify the needs of natural search.

3. Verify

• Verify that your SEO requirements are understood and incorporated.

Engage

When should you get involved with SEO in the redesign process? If you consider that there are three main elements of SEO – website architecture, inbound links and content – then architecture is your primary focus at the start of the process. Jeff recommended engaging at the point of wireframe production, because this is where the primary website architecture is being defined.

Clarify

During each iteration of the website redesign project, clarify the needs of natural search and incorporate them into the project requirements.
– Document those attributes that make the site easy for search engines to index.
– Don’t just tell; explain why these requirements are needed.
– Clarify what needs to be avoided!
o If this is understood, then even if they don’t get it all right, at least they won’t get it all wrong.

Verify

At the stage where the site redesign is nearing completion in a “sandbox” setting, verify that the SEO requirements are in place.
– Is the site architecture compatible with search engine crawlers?
– Is the textual content populating correctly?
– Is there a 301 redirect plan in place? (A quick spot-check is sketched after this list.)
– Are XML sitemaps in place?
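To make the redirect check concrete, here is a quick spot-check script. It is a minimal sketch only: it assumes Python with the third-party requests library installed, and the URL pairs are hypothetical stand-ins for your own redirect plan.

    import requests

    # Hypothetical mapping of legacy URLs to their new locations.
    redirect_plan = {
        "http://www.example.com/old-category/widget.html":
            "http://www.example.com/widgets/widget.html",
        "http://www.example.com/about-us.php":
            "http://www.example.com/about/",
    }

    for old_url, expected in redirect_plan.items():
        # Fetch without following redirects, so we can inspect the raw
        # status code and Location header the server returns.
        response = requests.get(old_url, allow_redirects=False)
        status = response.status_code
        location = response.headers.get("Location", "")
        if status == 301 and location == expected:
            print("OK   %s -> %s" % (old_url, location))
        else:
            print("FAIL %s returned %s (Location: %s)" % (old_url, status, location))

Run it against the sandbox before launch; anything that comes back as a 302, a 404 or a redirect to the wrong target goes straight onto the clean-up list.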

Jeff also pointed out that you need to have a ‘clean up’ plan in place, too, because despite everyone’s best efforts, it probably won’t be perfect. Make it part of the plan to find issues over time, and correct as needed. And remember to take notes for the next redesign…because it will happen again.

Charlotte Bourne is a Senior Search Marketing Strategist at Mediative, and has lots of experience guiding marketers through web redesign projects and keeping the SEO on track.

Charlotte says you need to think like Googlebot.

The goal of the search engines is to provide the most relevant search results to a user, based on a given query. The goal of SEO is to make the search engines understand that your content is the most relevant search result for a given query.

Crawling

When you want search engines to index your website’s pages, think about how their bots crawl the web. Charlotte offers some essential tech tips:
– A 301 redirect plan MUST be in place if you are changing the location of content, so that page rank and authority pass to the new page (see the examples after this list).
– XML Sitemaps, which you can consider as a secondary crawl tool, are not a replacement for a clear site structure and logical information architecture, but they are really useful when:
o you have content but no links to it
o you have a very large website with deeper content, like deeper product pages on an e-commerce site
– HTML Sitemaps are an additional crawl tool, not just an aid to some of your human visitors.
– Use robots.txt to make sure search engines don’t crawl pages you want to remain private.
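To make a couple of these items concrete, here are two illustrative snippets. The domains, paths and rules are hypothetical, and the redirect example assumes an Apache server; translate for your own stack as needed. First, a robots.txt that blocks private areas and points crawlers at the XML sitemap:

    # robots.txt - keep crawlers out of private or low-value areas,
    # and point them at the XML sitemap.
    User-agent: *
    Disallow: /checkout/
    Disallow: /search-results/
    Sitemap: http://www.example.com/sitemap.xml

And a 301 redirect plan ultimately boils down to rules like these:

    # .htaccess - permanently redirect retired URLs to their new homes.
    Redirect 301 /old-category/widget.html http://www.example.com/widgets/widget.html
    Redirect 301 /about-us.php http://www.example.com/about/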

Be careless with these technologies and you may create SEO headaches:
– AJAX
– Flash
– iFrames
– JavaScript
– Dynamic parameters
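None of these technologies is forbidden; the trick is to make sure the content they deliver is also reachable through plain, crawlable HTML. As a hedged sketch (the loadPanel function and URL are invented), a script-driven link can still carry a real href for the bots to follow:

    <!-- The onclick handler drives the AJAX behaviour for visitors,
         while the ordinary href gives crawlers a real URL to follow. -->
    <a href="/widgets/blue/" onclick="loadPanel('/widgets/blue/'); return false;">
      Blue widgets
    </a>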

Indexing

Getting your pages indexed should be no trouble for certain file types:
– HTML and equivalents
– PDFs
– Optimized images and videos
– MP3

Beware: low-quality pages and duplicate content hurt your SEO, and the latest Google updates are removing them from the index. To know how your site is doing, monitor Google Webmaster Tools; keep an eye on your indexation rate, which can act as an early alarm that some of your web pages are being dropped from the index.
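As a rough way to track that rate, count the URLs you have submitted in a sitemap and compare the total against the indexed-page figure Webmaster Tools reports. A minimal Python sketch, with a hypothetical file name and indexed count:

    import xml.etree.ElementTree as ET

    SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    # Count the URLs submitted in a locally saved copy of the sitemap.
    tree = ET.parse("sitemap.xml")
    submitted = len(tree.findall(".//%sloc" % SITEMAP_NS))

    # Indexed-page count, read manually from Google Webmaster Tools.
    indexed = 1450

    print("%d of %d URLs indexed (%.1f%%)"
          % (indexed, submitted, 100.0 * indexed / submitted))

A falling percentage between checks is your cue to go looking for thin or duplicate pages.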

Ranking

Once your pages have been indexed, they have a chance to rank. Google considers over 200 factors, but the main ones are:
– Domain authority
– Page authority
Both are largely a function of how many people link to your site and to the individual page.

To optimize each page, pay attention to the page’s meta data, the header tags and, of course, the page copy (a simple markup sketch follows the list below). Keep in mind that there is more than one kind of search, so remember to take advantage of search verticals, such as:
– Blended search
– Social search
– Local search
– Mobile search
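As a simple illustration of those on-page basics, here is roughly what coordinated meta data, header tags and copy look like; the product and domain are invented:

    <head>
      <title>Blue Widgets | Example Store</title>
      <meta name="description" content="Hand-built blue widgets, in stock and ready to ship.">
    </head>
    <body>
      <!-- The main heading and the page copy reinforce the same topic
           as the title and description above. -->
      <h1>Blue Widgets</h1>
      <p>Our hand-built blue widgets are...</p>
    </body>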

Usability

Ultimately, a website that is easy for people to use is also good for SEO! The specific factors that influence search engine rankings change over time, but good usability never stops mattering. The search engines are trying to deliver relevant results, and websites that are engaging will benefit in the long run. Charlotte briefly touched on some methods of testing the user experience to get insights into improving a site’s usability, including:
– Remote User Testing
– Expert Usability Assessment
– User Testing
– Informal testing

Results

It’s important to analyze your website’s results, and you can start by establishing baseline metrics:
– Organic search traffic
– Rankings
– 404 errors (a simple log tally is sketched after this list)
– and on the usability side, measure:
o time on site
o bounce rate
o conversion rate
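A baseline for those 404 errors can come straight from your server logs. A minimal sketch, assuming a combined-format Apache or nginx access log at a hypothetical path:

    from collections import Counter

    counts = Counter()
    with open("access.log") as log:
        for line in log:
            # In combined log format the request line is the second
            # quoted field; the status code follows the closing quote.
            parts = line.split('"')
            if len(parts) < 3:
                continue
            request = parts[1].split()         # e.g. GET /old-page.html HTTP/1.1
            status = parts[2].split()[0:1]     # e.g. 404
            if status == ["404"] and len(request) > 1:
                counts[request[1]] += 1

    # The most-requested missing pages are your top redirect candidates.
    for path, hits in counts.most_common(10):
        print("%5d  %s" % (hits, path))

Take the same tally before and after launch; a sudden spike usually means a redirect rule is missing.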

After the webinar presentation, there was a little time to field some of the questions that had come in. Here are a couple of them.

Q: When is the best time to implement 301 redirects?
A: Jeff’s advice: whenever you open the floodgates to the new site, or possibly a little in advance (but not too far) so the bots start discovering the pages; just be careful not to confuse people by implementing them too soon. If you are merging two sites, migrate your top content to the new site first, then redirect.

There were some questions about a site’s crawlability…
Charlotte suggested making sure your IT team knows the difference between a 301 and a 302 redirect, and checking that your robots.txt file is correct!

Q: Can you suggest tools to help understand how crawlable your site is?
A: The search engines themselves can tell you something, using the site: command (e.g., site:example.com) and Google Webmaster Tools. Xenu’s Link Sleuth is a good tool for reporting on 301/302 redirects and broken links.

You can catch the full webinar on-demand now.