This website migration guide will help you identify and mitigate the SEO issues that commonly arise during a migration, such as URL changes, metadata, sitemaps, and structured data.
Page Contents
Mixed Resources
- Do not load resources from mixed schemes, i.e. both http and https; serve everything over https.
URL Structure
- Avoid changing the URL structure. If changes are unavoidable, keep track of every old-to-new mapping.
Similar URL
- If a new URL will serve similar information to an old URL on the website, set up a 301 redirect from the old URL to the new one
301 and canonical setup
- Crawl the old website URLs before migration
- Crawl the new website URLs from testing server or after the launch
- Map the above URLs
- Set up the right 301s and canonicals
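The mapping step above can be sketched in Python. This example matches old and new URLs by their final path segment and emits nginx-style 301 rules; the matching heuristic and example URLs are assumptions, and unmatched pages will always need a manual decision:

```python
from urllib.parse import urlparse

def slug(url):
    """Last non-empty path segment, used as a rough matching key."""
    path = urlparse(url).path.rstrip("/")
    return path.rsplit("/", 1)[-1].lower()

def map_redirects(old_urls, new_urls):
    """Pair each old URL with the new URL sharing the same slug."""
    new_by_slug = {slug(u): u for u in new_urls}
    mapping, unmatched = {}, []
    for old in old_urls:
        target = new_by_slug.get(slug(old))
        if target:
            mapping[old] = target
        else:
            unmatched.append(old)  # needs a manual decision
    return mapping, unmatched

def to_redirect_rules(mapping):
    """Emit one permanent (301) rule per pair, nginx-style."""
    return [
        f"rewrite ^{urlparse(old).path}$ {new} permanent;"
        for old, new in mapping.items()
    ]

old = ["https://old.example.com/shop/blue-widget", "https://old.example.com/about-us"]
new = ["https://www.example.com/products/blue-widget", "https://www.example.com/company/about-us"]
mapping, unmatched = map_redirects(old, new)
for rule in to_redirect_rules(mapping):
    print(rule)
```

Slug matching is deliberately naive; pages that were renamed as well as moved will land in `unmatched` and should go into the manual mapping spreadsheet.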
Metadata Changes
- Try to retain metadata such as the meta title and meta description from the old website as much as possible
- If you use any automation to generate meta titles or descriptions, review its output
- Crawl the old website's metadata before migration
- Crawl the new website's metadata from the testing server or from the launched website, compare the two, and avoid unnecessary changes
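The old-vs-new comparison can be sketched as a small diff over the two crawls. The `{url: {field: value}}` shape is an assumption here; adapt it to whatever your crawler exports:

```python
def diff_metadata(old_pages, new_pages, fields=("title", "description")):
    """Report per-URL metadata fields that changed between the two crawls.

    old_pages / new_pages: {url: {"title": ..., "description": ...}}
    (shape assumed -- adjust to your crawler's export format).
    """
    changes = {}
    for url, old_meta in old_pages.items():
        new_meta = new_pages.get(url)
        if new_meta is None:
            changes[url] = {"status": "missing on new site"}
            continue
        diffs = {
            f: (old_meta.get(f), new_meta.get(f))
            for f in fields
            if old_meta.get(f) != new_meta.get(f)
        }
        if diffs:
            changes[url] = diffs
    return changes

old_pages = {"/": {"title": "Acme Widgets", "description": "Buy widgets."}}
new_pages = {"/": {"title": "Acme Widgets | Home", "description": "Buy widgets."}}
print(diff_metadata(old_pages, new_pages))
```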
Structured data
- Review all structured data, as it may still reference the old brand names and URLs
- Structured data types worth revisiting include:
- WebSite
- Organization / LocalBusiness
- Product
- ContactPoint, etc.
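One way to catch stale references, assuming the structured data is embedded as JSON-LD: scan each block for string values that still mention the old domain or brand. The token list and example markup below are hypothetical:

```python
import json
import re

JSONLD_RE = re.compile(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def stale_references(html, old_tokens):
    """Return (schema type, string value) pairs still mentioning old tokens."""
    hits = []

    def walk(node, schema_type):
        if isinstance(node, dict):
            schema_type = node.get("@type", schema_type)
            for value in node.values():
                walk(value, schema_type)
        elif isinstance(node, list):
            for item in node:
                walk(item, schema_type)
        elif isinstance(node, str) and any(t in node for t in old_tokens):
            hits.append((schema_type, node))

    for block in JSONLD_RE.findall(html):
        try:
            walk(json.loads(block), None)
        except ValueError:
            continue  # skip malformed JSON-LD rather than crash
    return hits

html = ('<script type="application/ld+json">'
        '{"@type": "Organization", "url": "https://old-brand.example.com"}'
        '</script>')
print(stale_references(html, ["old-brand.example.com"]))
```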
Main and secondary navigation
- Discuss and finalize the main and secondary navigation items
- The navigation hierarchy also needs to be reflected in BreadcrumbList schema
XML Sitemap
- Each host needs its own unique XML sitemap. The tech team needs to refer to this URL for sitemap implementation.
HTML Sitemap – After Launch
- The HTML sitemap on the website also needs to be changed or updated
- Review the HTML sitemap on the testing server or on the newly launched, migrated website
Custom 404 pages
- There might be a lot of 404 pages initially. A custom 404 page needs to be developed, with a link to the HTML sitemap, so that users can find the right pages
Page-specific items that need to be updated
- Meta title
- Meta description
- h1–h6 headings
- Add or amend the default canonical tag
- Set the meta robots attributes (index/noindex, follow/nofollow)
- Add or edit the alt text of each image
- Include Open Graph fields for description, URL, image, type, and site name
- Include Twitter Card fields for card, URL, title, description, and image
- Bulk upload or amend redirects
- Update the robots.txt file
Identifying priority pages
- Usually around 20% of the pages are responsible for 80% of the traffic.
- In that case, we have to identify the top priority pages responsible for the majority of traffic.
- We can look at the following data to identify such pages:
- Organic visits during the last 12 months (Analytics)
- Revenue, conversions, and conversion rate during the last 12 months (Analytics)
- Pageviews during the last 12 months (Analytics)
- Number of clicks from the last 180 days (Search Console)
- Top linked pages (SEMRUSH and Search Console)
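The 80/20 selection above can be sketched as a cumulative-traffic cutoff over an analytics export. The `{url: visits}` shape is an assumption:

```python
def priority_pages(visits, threshold=0.8):
    """Smallest set of pages covering `threshold` of total organic visits.

    visits: {url: visit_count} from your analytics export (shape assumed).
    """
    total = sum(visits.values())
    ranked = sorted(visits.items(), key=lambda kv: kv[1], reverse=True)
    picked, covered = [], 0
    for url, count in ranked:
        picked.append(url)
        covered += count
        if covered >= threshold * total:
            break
    return picked

visits = {"/a": 800, "/b": 100, "/c": 60, "/d": 40}
print(priority_pages(visits))
```

Run the same cutoff against revenue, conversions, and Search Console clicks separately, then union the results, since the top pages by traffic and by revenue rarely coincide exactly.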
Internal URL Linking & Changes
- Internal links should be updated to point to the correct corresponding new URLs
- Ideally this is handled by server-level automated redirects
- If that is not possible, links have to be updated manually on a page-by-page basis
- Use a tool like SEMRush to keep track of the changes in the internal linking
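If server-level rewriting is not available, a bulk pass over the HTML can apply the old-to-new URL map. This regex-based sketch assumes double-quoted `href` attributes, so treat it as illustrative rather than production-ready:

```python
import re

def rewrite_internal_links(html, url_map):
    """Replace href targets found in url_map with their new URLs."""
    def repl(match):
        href = match.group(2)
        return f'{match.group(1)}{url_map.get(href, href)}{match.group(3)}'
    return re.sub(r'(href=")([^"]*)(")', repl, html)

url_map = {"/shop/blue-widget": "/products/blue-widget"}
html = '<a href="/shop/blue-widget">Blue widget</a> <a href="/contact">Contact</a>'
print(rewrite_internal_links(html, url_map))
```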
Backlinking reachout
- Reach out to the sites providing high-DA backlinks and request that they update the URLs
Keywords rank tracking
- Just like the top 20% of pages, the top keywords should be listed out separately.
- Keywords corresponding to the top-ranking pages should also be listed out separately
- Keyword rankings should be monitored on a high-priority basis
Old site crawl data
The following data from the old site should be stored somewhere for reference:
- page titles
- meta descriptions
- h1-h6 headings
- server status
- canonical tags
- noindex/nofollow pages
- inlinks/outlinks etc.
- Robots.txt
- XML sitemaps
- HREFLANG
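A minimal sketch of storing that reference data, assuming the crawl is available as a list of per-page dicts; the field names here are illustrative:

```python
import csv
import io

FIELDS = ["url", "title", "meta_description", "h1", "status", "canonical", "meta_robots"]

def snapshot_to_csv(pages):
    """Serialize crawl records (list of dicts, shape assumed) to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(pages)  # missing fields are written as empty cells
    return buf.getvalue()

pages = [{"url": "/", "title": "Home", "status": 200, "canonical": "/"}]
print(snapshot_to_csv(pages))
```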
Google Analytics
- A separate GA property should be created for the new site, with all the old views set up beforehand.
- Old GA data should also be retained for reference
Internal linking review
Internal links must be reviewed throughout the entire site, including links found in:
- Main & secondary navigation
- Header & footer links
- Body content links
- Pagination links
- Horizontal links (related articles, similar products, etc)
- Vertical links (e.g. breadcrumb navigation)
- Cross-site links (e.g. links across international sites)
Search Console data
Data worth exporting includes the following:
- Search analytics queries & pages
- Crawl errors
- Blocked resources
- Mobile usability issues
- URL parameters
- Structured data errors
- Links to your site
- Internal links
- Index status
Setting up redirects guidelines
- 301 Permanent redirects will tell search engines to index the new URLs as well as forward any ranking signals from the old URLs to the new ones.
- 302 (temporary) redirects should only be used in situations where a redirect does not need to live permanently and therefore indexing the new URL isn’t a priority.
- Meta refresh and JavaScript redirects should be avoided. Even though Google is getting better and better at crawling JavaScript, there are no guarantees these will get discovered or pass ranking signals to the new pages.
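The guidelines above can be turned into a simple audit over the old URLs' responses. The response-collection step (any HTTP client) is left out, and the `{url: (status, body)}` shape is an assumption:

```python
def audit_redirects(responses):
    """Flag old URLs whose redirect type may lose ranking signals.

    responses: {old_url: (status_code, body)} -- gathered with any HTTP
    client; the data shape here is an assumption for the sketch.
    """
    issues = {}
    for url, (status, body) in responses.items():
        if status in (301, 308):
            continue  # permanent redirect: ranking signals are forwarded
        if status in (302, 303, 307):
            issues[url] = "temporary redirect -- use a 301 if the move is permanent"
        elif 'http-equiv="refresh"' in body.lower():
            issues[url] = "meta refresh -- replace with a server-side 301"
        else:
            issues[url] = f"no redirect (HTTP {status})"
    return issues

responses = {
    "https://old.example.com/a": (301, ""),
    "https://old.example.com/b": (302, ""),
    "https://old.example.com/c": (200, '<meta http-equiv="refresh" content="0; url=/new">'),
}
print(audit_redirects(responses))
```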
Pre Launch testing
The purpose of this testing would be to:
- Make sure that the user journey is the same as on the old site
- Identify content-related issues between the old and new site
- Verify redirects, canonical tags, XML sitemaps, etc.
- Make sure that search engines do not index the new site before testing.
Launch Day
- Keep the old site's downtime to a minimum while replacing it with the new one.
- The web server must respond with a 503 to any URL request during the replacement.
Post Launch
- Check robots.txt file to make sure search engines are not blocked from crawling
- Check top pages' redirects (e.g. do requests for the old site's top pages redirect correctly?)
- Check top pages' canonical tags
- Check top pages' server responses
- Check for noindex/nofollow directives, in case they are unintentional
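The canonical and noindex checks above can be scripted against the fetched HTML of each top page. This regex-based sketch assumes attribute order (`rel` before `href`, `name` before `content`) and is illustrative only; a real crawler should use an HTML parser:

```python
import re

def page_checks(html, expected_canonical):
    """Spot-check a fetched page's HTML for canonical and noindex issues.

    Assumes rel appears before href (and name before content) in the tags;
    use a proper HTML parser for production crawls.
    """
    canonical = re.search(r'<link[^>]*rel="canonical"[^>]*href="([^"]*)"', html, re.I)
    robots = re.search(r'<meta[^>]*name="robots"[^>]*content="([^"]*)"', html, re.I)
    problems = []
    if not canonical or canonical.group(1) != expected_canonical:
        problems.append("canonical missing or wrong")
    if robots and "noindex" in robots.group(1).lower():
        problems.append("page is noindexed")
    return problems

html = ('<head><link rel="canonical" href="https://www.example.com/">'
        '<meta name="robots" content="noindex,follow"></head>')
print(page_checks(html, "https://www.example.com/"))
```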
Search Console actions
- Test & upload the XML sitemap(s)
- Set the preferred domain (www or non-www)
- Set the International targeting (if applicable)
- Configure the URL parameters to tackle early any potential duplicate content issues.
- Upload the Disavow file (if applicable)
- Use the Change of Address tool (if switching domains)
Post-launch SEO review
- Publishing URLs in bulk might leave the website crawled but not indexed. To address that, publish an HTML sitemap on the website
- Use tools such as SEMrush, Screaming Frog, or DeepCrawl to track the number of errors, warnings, recommendations, etc. in the first week after the launch
- Review crawl errors on a daily basis for the first week
Site migration performance measurement
- Check site speed on desktop and mobile
- Keep track of the soft user engagement metrics such as bounce rate, pages per session etc.
- Keep track of the hard metrics such as conversion rates, goal completion rate
- Keep track of the number of submitted vs crawled vs indexed pages, especially if there are multiple redirects and new URLs