Preserving SEO

I hear far too often that “you can’t help but lose SEO when you refresh a site.” That theory is simply wrong. Unfortunately, I do often see people wreck their search ranking with a site rebuild, but it does not have to be that way.

If we take the time to understand how search engine optimization works, it becomes fairly easy to understand how to preserve that all-important search ranking.

First things first …

How Does SEO Work? (Spoiler alert, it’s just magic)

How SEO works is too complicated for any one person to understand completely, but many of us have our theories. I take the stance that, among other things, search engines are looking to reduce their costs by giving their clients (the people searching) the best experience possible.

Consider that Google processes over 3.5 billion searches per day. That volume requires a tremendous amount of expensive hardware. If a search engine can reduce search requests by even 1%, it saves a large amount of money. The number one way to reduce search requests is to provide the optimal result on the first query.

Once upon a time, relevant content was enough to earn top rankings. Today, however, things like mobile readiness, page errors, page load time and cross-browser compatibility all play a role. This multi-faceted view of SEO explains why the same search will produce different results, not only between desktop and mobile, but also per browser.

An organization as robust and advanced as Google or Bing can easily determine if a website will render poorly in a particular browser. If the person searching is using Internet Explorer, the search engine is highly unlikely to recommend websites that will not render well in that browser.

Keep this concept in mind as we think about preserving SEO. Consider what would constitute a good experience for the search engine’s client – the person searching for information on the internet.

To Cut Down a Tree in Five Minutes, Spend The First Three Minutes Sharpening Your Axe

Always start by cleaning up (or creating) the webmaster tools accounts for the current site. This step should be the very first task for any website refresh.

If the client has Google and Bing webmaster tools set up, update all the information there. If the client does not have these resources, set them up immediately. Create an up-to-date XML sitemap and submit it to both Google and Bing.

Add the XML sitemap to the site at /sitemap.xml, or /sitemap-index.xml if the sitemap is large enough to need splitting. Then point to the XML sitemap’s location in your /robots.txt. These sitemap updates are the fastest way to get pages indexed in any search engine.
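As a rough sketch, with a placeholder domain standing in for the real site, the robots.txt entry and a bare-bones XML sitemap look something like this:

    # /robots.txt: allow crawling and point crawlers at the XML sitemap
    User-agent: *
    Disallow:
    Sitemap: http://examplesite.com/sitemap.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- /sitemap.xml: one <url> entry per page you want indexed -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://examplesite.com/</loc>
      </url>
      <url>
        <loc>http://examplesite.com/about-us</loc>
      </url>
    </urlset>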

While you are in webmaster tools, correct all crawl errors. Crawl errors stem from either a moved page (a 404 not found error) or a misstep in your robots.txt. Fixing crawl errors will save you headaches later in the process, and it also lets search engines know that you are actively working on the site.

Make a notation in your analytics recording when the refresh started and when each update was made. It is great to be able to look back and know exactly when you made which changes.

If it has not been done already, set up in-site search tracking. In-site search data is an exceptionally valuable way to understand your user base. Anything users are searching for has value to them and is under-represented on the current site. Always set up in-site search tracking and be sure to check the data.

Make small changes to the current site

When you make the XML sitemap, also create an up-to-date HTML sitemap and add it to the website at /sitemap. The HTML sitemap is one of four “required” pages for all websites. If you do not have them, make the other three required pages – /linking-policy, /privacy-policy and /terms-and-conditions.

These “required” pages are all highly important for their own respective reasons, but many websites are missing them. Take the time to create good policy, and document it on the site where people and search engines can both find it.

Once you have created the four required pages, link to them from the footer. If possible, link to these pages from every page of the website, but at the very least, the home page.

On top of these “web 101” changes, I also check off as many site-wide QA/QC items as I can. I use a quality assurance and quality control checklist to ensure every build meets the highest standard I can deliver. I’ll tick off the easiest items, such as caching, compression, minification, keep-alive, Google Plus publisher integration, a favicon suite and a killer 404 page.
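On an Apache host, several of the easiest wins can be handled right in .htaccess. The snippet below is only a sketch and assumes the mod_deflate, mod_expires and mod_headers modules are available; adjust the file types and cache lifetimes to suit the site:

    # Compress text-based responses
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    # Tell browsers to cache static assets
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

    # Commonly used hint to keep connections alive
    <IfModule mod_headers.c>
      Header set Connection keep-alive
    </IfModule>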

I will then start to review the site architecture. Where it really makes sense, I’ll move and rename pages as needed to create a solid silo structure. Typically I am looking to un-flatten the sitemap or correct really bad URLs.

The most common correction I make to URLs is the proper use of dashes. Keep in mind that search engines read /about-us as “about us”, whereas /about_us is read as “aboutus”.

Any resource I move is immediately redirected with a 301 redirect. I never, ever want a user to encounter a 404 page. Monitor your webmaster tools, as they will flag any resources the crawlers were unable to find.
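To give a hypothetical example that covers both points, a single .htaccess line on Apache sends the old underscore URL to its new dashed home, so the old link keeps working:

    # 301 redirect the renamed page to its new URL
    Redirect 301 /about_us /about-us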

These small changes to the current website let search engines know you are working on it, so they crawl the site more often. Remember that most websites that need a refresh have been stagnant for years.

Search engines often do not re-crawl these static sites because doing so costs money and resources. They focus their crawling efforts on sites that are updated regularly. By making small changes to your site, you nudge yourself away from the old, stale sites and towards the regularly updated ones.

Now we start to build the new site

I always build on a server, under a development URL. If the website URL is examplesite.com, I’ll build on examplesite.co or example-site.com. Please note, this is just a development URL; when we go live, the URL is changed to match the current site.

As soon as I create the server, I password protect the root directory. The password protection keeps anyone from unintentionally seeing the site, and it also prevents search engines from indexing it.

We do not want to create a situation where the client is competing with themselves for search results. Furthermore, if a search engine indexes a development site, it could cause issues of duplicate content. Search engines will not index a site they cannot access, but you must protect it right away.
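On Apache, a minimal sketch of that protection is basic authentication in the development site’s root .htaccess; the .htpasswd path is a placeholder for wherever you store the credentials:

    # Require a login for the entire development site
    AuthType Basic
    AuthName "Development Site"
    AuthUserFile /path/to/.htpasswd
    Require valid-user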

While building out the new site, I pay close attention to URLs. Unless there is a major reason to move or rename a page, I keep every URL the same; I consider /services to carry exactly the same SEO weight as /our-services.

Consider that search engines now detect SEO over-optimization and punish it. So long as the page name is reasonable, I keep it. Where I discover a major flaw in page naming, I correct the page and 301 redirect from the old URL to the new one.

The one area where I typically do major renovations is silo structure. If you have a services page, you also want pages nested underneath it. I’ll move /example-service to /services/example-service. The same goes for staff, products, about us, etc.

Any content that can be grouped and stored in a silo should be. Silo structure makes it easier for search engines to crawl and index content because it un-flattens your sitemap.

If you have two hundred pages and they are all at the same level, /some-page, how would a search engine know which ones are important? However, if you have /services and ten service pages below /services, that page quickly stands out as a page of importance.

Ever wanted to create that menu of pages that appears in a search result, underneath the main URL? Silo structure, along with sideways or internal linking, is the number one way to earn this menu of pages. Because of the site structure, search engines are able to identify the most important pages and provide shortcuts to them.

While building, I am mindful of metadata. Again, where the metadata makes sense, keep it the same. I cannot overstate search engines’ ability to detect SEO over-optimization. Don’t overthink the metadata, but clean it up where it is warranted or missing.

Metadata is another great place to make small changes to the current site. Match the metadata on the current site to what you used on dev. This gives the search engines something small to chew on while you are building.
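To be concrete, the metadata in question is mainly each page’s title tag and meta description. A hypothetical example for an /about-us page might look like this:

    <title>About Us | Example Company</title>
    <meta name="description" content="Learn about Example Company, our history, and the team behind our services.">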

The same method applies to site content. Rewrite pages that need it; keep the old content that is good. Bring some of those content updates to the current site.

Ideally, we slowly roll the old site into the new one. It is much easier for search engines to take in and process small, gradual changes than one big bang. There is the added bonus of being able to test and monitor the SEO impact of your refresh.

If you are making notations in Analytics as you make small adjustments to the current site, you can track the effect those changes have.

Understanding the complexities of each client’s traffic patterns takes time. By being closely involved in the analytics data from day one, I gain insights I can use in the refresh. I’ll often discover trends that I can exploit or correct in the new build. As I spot these opportunities, I add the most impactful changes to the current site and monitor the progress.

Before you go live …

Then comes the big effort: check every indexed page and confirm you have accounted for it on the new site.

I use Google for this step. If you run a Google search for site:http://example.com/ (inserting your desired URL), you will get every indexed page for that domain.

I then go page by page, adding the trailing portion of each live URL to the end of my development URL. If the live URL is examplesite.com/about-us, I check for example-site.com/about-us. I do this with every single indexed URL.

If my URLs match and example-site.com/about-us is a page, I move on. If example-site.com/about-us is not a page and I get a 404 on dev, I correct this on dev with a relative URL redirect.

Because of the nature of the sites I build, I create my redirects in .htaccess. I would use something like Redirect 301 /example-service /services/example-service to send the old /example-service URL to its new home at /services/example-service. I do my redirects on dev so that the instant the new site goes live, every page is already redirected.

Think back to search engines wanting to provide their clients a good experience. If a page suddenly returns a 404 page-not-found error, a search engine will almost immediately stop listing that URL in search results. Why would it keep it?

Many people who land on a 404 page will go right back to the search engine. If a search engine sends people to a URL that resolves to a 404 error, it slightly increases the chances of a new search.

A new search is a new request, and it only adds to the massive quantity of searches the engine has to process. Therefore, knowing which pages result in a 404 and not listing those pages in search results is good for business, because it reduces cost.

Last step – create a new HTML sitemap and a new XML sitemap. Place them at their respective URLs, and have the XML sitemap ready to submit to your webmaster accounts.

Ready for Go Live

If everything has been built correctly, you should be able to push the new site live in minutes and with zero downtime. That is a different topic altogether, but it is worth mentioning. Go live should not be a scary or lengthy process. You should have a well-practiced, bulletproof method for an essentially instant go live.

The new site goes live

The second the new site is live, submit your new XML sitemap to Google and Bing webmaster tools. This officially alerts the search engines that major changes have been made to the site architecture. The search engine will then place you in a queue to have the site re-crawled and re-indexed.

If you started this process by cleaning up your webmaster tools, you will have a nice, tight place to start from. If you have been making small changes to the site during the course of the build, the site will already have been re-crawled several times. You have effectively hedged your bets in the indexing battle by telling search engines this day was coming.

The minute after you submit the XML sitemap, re-check all indexed pages. Perform the same site:http://example.com/ search and confirm all pages are accounted for. Create a 301 redirect for any pages as needed. Theoretically, you should not have any 404 errors, but it always pays to double-check after go live.

A stitch in time saves nine.

Preserving search engine optimization all comes back to the preparations you make long before go live.

Many developers launch a new site, then a week or a month later the client calls to complain that traffic is down, the phones are not ringing, or contact forms are not coming in.

The developer may then decide to make a few redirects. At this point it is a daunting and difficult task to determine which redirects need to be made. Furthermore, the developer has been paid and is off to the next project.

Often the developer tells the client, “You can’t help but lose SEO when you refresh a site. It’ll come back though.” Or, “Maybe you should contact an SEO expert; I’m just a developer.”

It is an all-too-common and unfortunate disservice to the client.