
Client Profile: The Long+Short


Using Technical SEO To Boost A Popular News Website’s Performance in Google

The Long+Short is a quarterly online publication that delivers journalism and storytelling on the topic of innovation. The publication is free, and each season's content explores a particular innovation theme.


Project overview

The Long+Short website was redesigned and moved to a new platform in late 2015. The previous website, built on a legacy platform, had architectural issues that were hurting SEO performance. The site also suffered from poor user experience, as it was difficult to navigate through the content. The Long+Short team had already planned a design overhaul to increase traffic and engagement, particularly from social networks.


Website Migration & SEO

Web Method were approached during the project to assist with the migration to the new site, specifically with Search Engine Optimisation (SEO) in mind. The existing site already had a large amount of content on a non-keyword-friendly URL structure, which would need to be migrated seamlessly. The site also had a substantial number of deep backlinks, and these would need to be preserved on the new site to protect keyword rankings and organic traffic. Our tasks were as follows, and this article explains how we carried them out:

  1. Ensure a seamless migration to the new platform without any loss of SEO traffic
  2. Perform SEO healthchecks on the new website to make sure it was configured optimally for technical SEO
  3. Benchmark SEO performance before the migration and one month after, to measure the difference


URL Structure & Migration

For all the migrated URLs, 301 redirects needed to be put in place on the new platform, both to preserve link equity and to help the new pages rank more quickly in Google.

Firstly, we needed to decide on a new URL structure that would be optimised for SEO. Like many news and publication websites, the articles are grouped into themes (categories), and we stripped all unnecessary words to make the URLs as clean as possible. This created a set of canonical hub/theme pages which could be optimised for target keywords and linked to internally to boost their authority and rankings:

e.g. http://thelongandshort.org/<theme> —> http://thelongandshort.org/cities

Articles had a one-to-one relationship with categories, so we weren't faced with a common E-Commerce problem whereby a website lists a product in more than one location; this can result in duplicate content and hurt rankings if canonicals aren't employed correctly. Article URLs were structured as below. We decided to keep the <theme> segment of the URL because the article title often didn't include relevant keywords:

e.g. http://thelongandshort.org/<theme>/<article-title> —> http://thelongandshort.org/cities/le-corbusier-rebuild-paris-plan-voisin
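By way of illustration, a slug generator along these lines produces the kind of clean, keyword-carrying article URLs described above. This is our own sketch rather than the project's code, and the helper names and example title are hypothetical:

    import re

    def slugify(title):
        """Lower-case the title, strip punctuation, collapse spaces to hyphens."""
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop punctuation
        slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # whitespace runs -> single hyphen
        return slug

    def article_url(theme, title):
        # The <theme> segment stays in the path so the URL carries topical
        # keywords even when the article title doesn't.
        return f"http://thelongandshort.org/{theme}/{slugify(title)}"

    print(article_url("cities", "Rebuild Paris: the Plan Voisin"))
    # http://thelongandshort.org/cities/rebuild-paris-the-plan-voisin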

Canonicals were also employed across the site to remove any potential issues with duplicate content, which can easily be caused by development changes that accidentally expose the entire website in duplicate form.

e.g. if a canonical isn’t in place for http://thelongandshort.org/, then if GoogleBot found a URL such as http://thelongandshort.org/?track=1 then it’s possible that the whole site will be crawled with that ?track=1 querystring on it.


Web Crawlers & Backlinks – Finding The URLs to 301 Redirect

The previous website had no sitemap.xml in place and no easy way of exporting the URLs indexed by Google as a list. The only way to get a list of URLs was therefore to crawl the site, for which we used ScreamingFrog and DeepCrawl.

The site structure meant that our crawlers hit some dead ends, so we also captured landing page URLs from Google Analytics.

We also used Google Search Console and Ahrefs, among other SEO auditing tools, to capture the list of URLs with backlinks pointing to the site – including links to images.

We crawled all of these and compiled a comprehensive list of every page requiring a redirect for the Long+Short team to configure.
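As a sketch of how such a list can be assembled and then checked once the redirects go live (the file names here are hypothetical, and the requests library is assumed to be installed):

    import requests

    def load_urls(path):
        """One URL per line, as exported from a crawler or analytics tool."""
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    # Merge every source into one de-duplicated redirect worklist.
    legacy_urls = (load_urls("crawler_urls.txt")
                   | load_urls("analytics_landing_pages.txt")
                   | load_urls("backlinked_urls.txt"))

    # Once redirects are configured, each legacy URL should answer with
    # a single 301 hop -- not a chain, a 302 or a 404.
    for url in sorted(legacy_urls):
        response = requests.get(url, allow_redirects=False, timeout=10)
        if response.status_code != 301:
            print(f"{url} -> {response.status_code} (expected 301)")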


Project Summary – Why Technical SEO Matters For Websites

Read: SEO Differences between News and E-Commerce websites

We saw a significant uplift across core SEO KPIs after the Long+Short website re-platformed. There was no supplementary content creation, outreach or PR activity that could explain this increase. It's impossible to put the benefit down to a single factor; the gains from thorough technical SEO and digital marketing audits usually come from a combination of smaller improvements that make up the whole.

We know that improving the site structure, so that it was logically sectioned into a simple hierarchy, made the site more easily crawlable by Google. This, combined with clean, keyword-optimised URLs, increased its ranking potential. Moving to a new web host reduced page load time and increased the speed at which Google can crawl the site.

Finally, correct configuration of canonicals, sitemap.xml and robots.txt, with all legacy URLs 301 redirected, put the new site in a very healthy technical state. We were able to verify all of this by running advanced, deep crawls of the site to mine the necessary data.

Monitoring performance in Google Search Console and Google Analytics showed increased relevance for a far wider set of keywords, driving traffic to a wider set of landing pages.

HOW TO: Technical SEO Website Healthchecks

Aside from URL structure, redirects & canonicals, two other very important files for technical SEO should be set up and configured properly: sitemap.xml & robots.txt.

Sitemap.xml

A sitemap.xml gives search engines a definitive list of the URLs on a site that should be crawled and indexed.

For larger sites, sitemaps should ideally be split out using a sitemap index file. This will improve indexing in Google and also allow you to diagnose indexing issues in Google Search Console. In this case only one sitemap was required, but we checked that all the URLs we found on the new site were included in the sitemap and vice-versa. If URLs are found in one but not the other, it can indicate a larger problem with the site architecture.
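A sitemap-versus-crawl comparison like this is straightforward to script. The sketch below is our own illustration using only the Python standard library, with a hypothetical crawl export file:

    import xml.etree.ElementTree as ET
    from urllib.request import urlopen

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def sitemap_urls(sitemap_url):
        """All <loc> entries from a standard sitemap.xml."""
        tree = ET.parse(urlopen(sitemap_url))
        return {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

    def crawled_urls(path):
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    in_sitemap = sitemap_urls("http://thelongandshort.org/sitemap.xml")
    in_crawl = crawled_urls("crawl_export.txt")  # hypothetical crawler export

    # Either difference hints at an architecture problem: pages listed but
    # not reachable by crawling, or reachable pages missing from the sitemap.
    print("In sitemap but not crawlable:", in_sitemap - in_crawl)
    print("Crawlable but not in sitemap:", in_crawl - in_sitemap)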

Robots.txt

Robots.txt is a very powerful and often misunderstood configuration file. People often assume that URLs disallowed in robots.txt won't be indexed, which is wrong: if Google finds a URL on the site or in the sitemap, it can go into the index regardless of whether it's blocked in robots.txt. A very long or complicated robots.txt often indicates significant issues with the crawlability or architecture of a site, or an attempt to sculpt PageRank through the site by directing GoogleBot. Both situations are inadvisable, and it's worth getting an expert opinion on the configuration of this file.
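Python's standard library ships a robots.txt parser, which makes it easy to check what a given crawler is actually allowed to fetch. The paths below are purely illustrative:

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("http://thelongandshort.org/robots.txt")
    parser.read()

    # can_fetch() answers "may this user agent crawl this URL?" -- remember
    # that a disallowed URL can still be indexed if Google finds it linked
    # elsewhere or listed in the sitemap; it just won't be crawled.
    for url in ["http://thelongandshort.org/cities",
                "http://thelongandshort.org/admin/"]:
        print(url, parser.can_fetch("Googlebot", url))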


SEO Performance Measurement & Benchmarking

There are a number of tools and techniques you can use to measure or benchmark the SEO performance of a website. It doesn't need to be difficult, though, if you focus on the few metrics that matter.

Google Search Console & Google Analytics

Google Search Console is a great dashboard for identifying the SEO health of a website. It's far from perfect, but the information comes straight from the horse's (Google's) mouth, unlike SEO tools created by third parties, and it's free to use in full. It's good practice to continually resolve GSC-reported errors and keep a regular eye on performance.

Google Analytics should be used in conjunction with Google Search Console – they offer very different things, but in combination they are very powerful. Google Analytics doesn't give any meaningful keyword information, but it does give longer-term and more detailed reporting on user behaviour on site, e.g. Bounce Rate, Conversion Rate, Pages per Visit and Time on Site. Google Search Console (Search Analytics) gives up to 90 days' worth of keyword information, such as impressions, clicks and position, but no on-site performance data.

The two systems won’t match up exactly, but in the instance where you see traffic to a landing page increase in Google Analytics, it’s useful to use Google Search Console to get an idea as to which keywords are driving those clicks.

Other SEO Tools

For measuring core SEO performance, there’s really only one metric that matters – traffic (Sessions / New Users in Google Analytics). An E-Commerce site would also likely include Transactions and E-Commerce Conversion Rate to measure the value of that Organic traffic.

There are hundreds of decent SEO tools that will allow you to report on wider metrics – such as competitive analysis, search visibility, keyword rankings & backlinks. These are all supplementary to the core tools already offered by Google for free.

Third-party SEO tools will help you diagnose some issues and track other aspects of SEO performance, but they tend to be most useful for keyword research/planning and website crawling – areas where Google's own tools are limited.

Client Profile: The Long+Short

We provided Technical SEO Consultancy services to devise the optimal site architecture, migration strategy and Web Analytics Benchmark reporting.