Blog

Utilizing the Wayback Machine for SEO

2 March 2015
By: Andy Romain

The Wayback Machine, from the non-profit Internet Archive, is one of the coolest tools I use on a regular basis. It isn’t intended for SEO, but you can use it for search engine optimization with great results.

Brief History of the Wayback Machine

The Wayback Machine is a digital archive of the World Wide Web and other information such as books, movies and music. The tool was created by the Internet Archive, a non-profit organization based in San Francisco, California. The service enables users to see archived versions of web pages across time; the Internet Archive has been archiving cached pages of websites since 1996. Generally speaking, they revisit sites every few weeks or months and archive a new version if the content has changed. Prominent webpages like the homepages of Amazon.com and Google.com appear to be archived 30 or more times per day on average. The intent is to capture and archive content that would otherwise be lost whenever a site is changed or closed down. Their grand vision is to archive the entire Internet!

Wayback Machine Interface

The tool is incredibly easy to use: simply enter the URL and you will be presented with a neat graph and calendar with visual elements depicting when and how often the URL has been archived over time.

Wayback Machine interface showing the archive history of Amazon.com

Wayback Machine in Action

The two screenshots below of Amazon.com’s homepage in 2006 and in 2015 show just how powerful this tool is:

Amazon.com's Homepage in 2006

Amazon.com's Homepage in 2015

From the archived webpage you can even click through to other archived pages or view the page source, which means that the archived page is not simply a static image or screenshot.
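Each snapshot also lives at a predictable address that combines a 14-digit timestamp with the original URL, so you can link to or bookmark a specific archived version directly. For example, the snapshot of Amazon.com closest to 1 January 2006 can be reached at a URL of the form:

    http://web.archive.org/web/20060101000000/http://www.amazon.com/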

Usage in SEO

The Wayback Machine has three major uses in SEO:

  1. Researching older links that may have disappeared
  2. Tracking website modifications that resulted in traffic changes
  3. Using archived pages as evidence

Researching older links that may have disappeared

  • Combined with a broken link checker, you can get a visual of the broken link, which is far more insightful than trying to decipher what www.domainname.com/2014/product/1 is. This will allow you, for example, to 301 redirect the webpage to the correct page with more confidence, or to inquire why a valuable page or set of pages no longer links to your site while providing a visual to the person or organization in control of that website.
  • You can use the Wayback Machine to help revive deleted content. In most cases this content will be backed up in the cloud and/or offline, but I’ve found that quick formatting or wording changes that make it into the HTML version of a page sometimes aren’t made in the corresponding word processing document, so the latest archived version may be the most recent copy of the content. If you need to look up archived copies of many URLs at once, see the sketch below.
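The Internet Archive exposes a public availability endpoint that returns the closest archived snapshot for a given URL, which is handy for checking a whole link-checker export in one go. Below is a minimal Python 3 sketch; the broken URL in the list is just a placeholder.

    import json
    import urllib.parse
    import urllib.request

    def closest_snapshot(url, timestamp=None):
        # Query the Wayback Machine availability API for the snapshot
        # closest to the (optional) timestamp, e.g. "20140601".
        query = {"url": url}
        if timestamp:
            query["timestamp"] = timestamp
        api = "http://archive.org/wayback/available?" + urllib.parse.urlencode(query)
        with urllib.request.urlopen(api) as response:
            data = json.loads(response.read().decode("utf-8"))
        snapshot = data.get("archived_snapshots", {}).get("closest")
        return snapshot["url"] if snapshot else None

    # Placeholder URL from a broken link report
    for broken in ["http://www.domainname.com/2014/product/1"]:
        print(broken, "->", closest_snapshot(broken) or "not archived")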

Tracking website modifications that resulted in traffic changes

  • If you or your team was responsible for a site architecture overhaul, the Wayback Machine can be instrumental if you did not take screenshots prior to the redesign. Perhaps there is a correlation between a particular navigation structure and a sudden drop or increase in organic web traffic, and you need or would like to recall what the site looked like before the redesign.
  • Combined with a file or document comparison tool, you can compare the code side by side and highlight the differences (see the sketch after this list). Again, perhaps there are major or specific website changes that correspond to peaks and valleys in organic traffic.
  • You may also want to see how your site and competitor websites have evolved over time, and consider where the marketplace is headed in terms of SEO, marketing and development.
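As a sketch of that side-by-side comparison, Python’s standard difflib module can turn the saved page source of two snapshots into an HTML report with the differences highlighted (the file names here are hypothetical):

    import difflib

    # Page source saved from two archived snapshots (hypothetical file names)
    with open("homepage_2014.html") as old, open("homepage_2015.html") as new:
        old_lines, new_lines = old.readlines(), new.readlines()

    # Build a side-by-side HTML diff and write it out for review
    report = difflib.HtmlDiff().make_file(
        old_lines, new_lines, fromdesc="2014 snapshot", todesc="2015 snapshot")
    with open("diff.html", "w") as out:
        out.write(report)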

Using archived pages as evidence

  • You can use the date of an archived page as proof that you did not make a change to a website that is perceived to have had, or simply did have, a negative impact, for example because you or your team did not have access to the site when the change was made.
  • You could also use an archived page as proof of link acquisition for a link that was acquired after the webpage went live.

The above assumes, of course, that the webpages in question are archived. I have come across instances of inner pages not being archived, especially for less popular sites, and I have also come across a few sites which block the Wayback Machine in their robots.txt file altogether. Regardless, given that the Wayback Machine currently archives some 455 billion web pages, with top-level pages particularly well covered from what I can see, it has plenty to offer as a quality SEO tool.
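For reference, sites that opt out typically do so by disallowing the Internet Archive’s crawler, which identifies itself as ia_archiver, in robots.txt:

    User-agent: ia_archiver
    Disallow: /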

Slow and Steady Does Not Win the Race…

27 February 2015
By: Eve Tyler

Attention website owners: Google is reportedly introducing a red warning label in the search results for websites that take a long time to load.

On Tuesday, Google+ user K Neeraj Kayastha posted a screenshot of an Android search in which two of the websites included appeared with red “Slow” labels underneath their titles.

The label appears to indicate when a website will take a long time to load, warning users before they click the link.

So what could this mean? Essentially, it seems to be the latest message to site owners to review their user experience and make sure their sites are optimised for mobile, or risk losing significant amounts of traffic to better-performing competitors. When Search Engine Land reached out to Google for a comment on the update, Google rather cryptically answered: ‘we’re always experimenting.’

This news further supports the growing importance of mobile SEO: Google previously tested a “mobile-friendly” label in the mobile search results, which went live in November 2014. So take some time to evaluate your site so you don’t get stuck with a red label!

Are your Ads Relevant? Well, Facebook can now tell you!

12 February 2015
By: Kathryn Green

Facebook are changing the world of advertising forever by creating a new tool that will help advertisers compete for expensive and scarce ad slots. Unlike before, the tool will be able to say how relevant an advertiser’s ads are to their target market, making sure that ad space is not being wasted. This will allow advertisers to monitor their scores over time and tweak less relevant ads when needed, potentially lowering an ad’s price and boosting its performance.

The tool will score an ad’s relevance between 1 and 10, with 10 being highly relevant. The score reflects how positively or negatively the target audience responds, which depends on different signals: a positive response includes video views, shares and clicks on the advert, whereas a negative response includes the audience hiding the advert or reporting it as spam. Facebook will take all this into consideration, and once an ad has been served 500 times the tool will score its relevance. This stops irrelevant content flooding people’s news feeds and only shows what a user will find interesting.
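Facebook haven’t published the exact formula, but as a purely illustrative sketch, a 1-10 score could combine positive and negative signals once the 500-impression threshold is reached:

    def relevance_score(impressions, positive, negative):
        # Toy illustration only: Facebook's actual formula is not public.
        if impressions < 500:
            return None  # not enough data to score the ad yet
        # Net response rate, between -1 (all negative) and 1 (all positive)
        balance = (positive - negative) / float(impressions)
        # Map onto the 1-10 scale, clamping at the ends
        return 1 + int(round(9 * max(0.0, min(1.0, (balance + 1) / 2))))

    # e.g. 1,000 impressions, 120 positive actions, 10 hides/spam reports
    print(relevance_score(1000, 120, 10))  # -> 6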

Over the years the average price of a Facebook ad has risen steeply, while the number of ads Facebook serves has dropped. In the first quarter of 2014, for example, the price per ad rose by 335% while the number of ads served decreased by 65% – the competition for ads to be shown is fierce.

Facebook don’t just want the ad with the most expensive bid to show any more, as cost does not make an advert superior. Judging an ad on expense may not attract the right audience and will alienate users, and eventually – as happened with Myspace – they may move on to another social media platform. Facebook are going to become clearer with advertisers about how important relevance score is and why it matters when serving ads.

Now, with this new tool, Facebook will not just be looking at bids any more but will also take relevance score into consideration. However, it must be made clear that relevance score varies depending on each advertiser’s objective: an ad for an app download, for example, will prioritise clicks on the ‘Click to Install’ button when calculating a relevance score. The tool will have the biggest impact on brands that want users to take action when they see the adverts, such as clicking through to the site, whereas brand advertisers just aiming to gain attention will see a smaller impact on delivery and cost. Also, ads bought on a guaranteed basis (where Facebook agrees on a certain number of impressions) will not be affected.

Google To Launch Mobile Ranking Algorithm?

11 February 2015
By: Alice Riley

It goes without saying that mobile usability is a must for websites in 2015. However, there is now one more reason to make your website mobile-friendly: Google is thought to be in the process of creating a mobile ranking algorithm, which would factor in mobile usability when ranking websites in mobile search. This would make sense, as 1 in 7 searches now comes from a mobile device.

Google is yet to confirm this; however, over the past couple of years they have been taking steps leading up to it.

The Broken Mobile Site Penalty

Back in June 2013, Google introduced penalties in mobile search for websites with mobile usability issues. They announced that, to improve the search experience for smartphone users, they would roll out several ranking changes to address websites that are misconfigured for smartphones. They aimed to address the main problems experienced by smartphone users, such as faulty redirects and seeing an “error” screen when trying to access a website.

Mobile Usability Report

Last October, Google added a new feature to Google Webmaster Tools. The Mobile Usability Report shows users the common mobile usability issues with their website, enabling them to be fixed and the user experience improved. The most common issues flagged up in the report are Flash content, fonts that are too small to read, fixed-width viewports, content which isn’t sized to the viewport, and links which are too close together to tap.
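Two of those issues (a fixed-width viewport and content not sized to the viewport) are commonly addressed by adding the standard responsive viewport meta tag to the page’s head:

    <meta name="viewport" content="width=device-width, initial-scale=1">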

Mobile Friendly Test Tool

Google introduced a new tool to Google Webmaster Tools in November 2014 which enables website owners to check whether or not their website is mobile friendly. The tool will give a website either a “pass” or a “fail” grade and let the user know what the issues are with the site.

Mobile Friendly Label In Search Results

In November 2014, Google also introduced the “mobile friendly” label to its mobile search results. There is now a text label underneath the URL letting the user know if a website is mobile friendly, to help mobile searchers know which websites are best to click on. Google looks for features such as avoiding software not common on mobile devices, readable text, content which is sized to the screen, and links that are spaced far enough apart to tap.

Special Rankings For Mobile Friendly Sites

Around this time, Google also experimented with giving special treatment to websites that have earned the “mobile friendly” label. It is said that Google will give a “boost” in the mobile search results to websites that have been adapted for mobile.

Google Sending Mobile Visibility Warnings

Last month, Google began sending notifications to the owners of websites that are not mobile friendly. These notifications are being sent via Google Webmaster Tools and email, and warn owners to “fix mobile usability issues found on…” their sites, which will otherwise be “displayed and ranked appropriately for smartphone users”.

These are clear signs that a new development in Google’s algorithm is coming. These mobile visibility warnings and predicted changes in the mobile search could be the final stage in completing the transition from desktop PCs to a fully mobile internet.

Upgraded URLs

10 February 2015
By: Hayley Shannon

One of the ongoing frustrations we encounter as PPC specialists is losing months of hard-earned data following updates to destination URL tracking. When using ValueTrack parameters or a third-party tracking service, these changes are all too frequent. One workaround is creating duplicate ads with new URLs; however, this becomes time consuming and often difficult to manage (100 paused versions of the same ads with unique URLs? No thanks!). Google’s newly rolled out Upgraded URLs may be a partial fix to this issue.

While Upgraded URLs will not help with complete landing page changes, they will allow us to flexibly update tracking parameters while maintaining data and Quality Score. Google has redefined ad URLs by replacing Destination URLs with the newly created Final URLs, treating the landing page and its tracking as separate entities. By setting the Final URL, you are telling Google what the landing page is (similar to Display URLs), and by using the newly developed tracking template and custom parameter fields you can easily append tracking parameters to the Final URL.
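As an illustrative sketch (the domain and the custom parameter name here are hypothetical), a Final URL, custom parameter and tracking template built from Google’s ValueTrack parameters might look like this:

    Final URL:         http://www.example.com/products
    Custom parameter:  {_campaigntag} = spring_sale
    Tracking template: {lpurl}?utm_term={keyword}&device={device}&tag={_campaigntag}

Because the tracking template can be edited without touching the Final URL, tracking changes no longer require recreating the ads themselves.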

This not only provides access to new insights into ad interaction, but it will also (hopefully!) save advertisers considerable time in managing URL tracking updates.

I am eager to see if this is another great Google innovation, or if it will be sunsetted like many of its other Beta roll-outs.

Google and Twitter’s New Deal: The Lowdown and Potential Impact

6 February 2015
By: Mark Pitt

It has been reported by Bloomberg that Google have struck a deal with Twitter to allow Google to index tweets immediately after they are published. This reignites a deal which ended without renewal in 2011.

Currently, Google does a decent job of crawling Twitter to find popular tweets but does not have the capability to capture and index all tweets in real time. Therefore only a small handful of tweets currently appear in Google search results compared to the number potentially available.

Twitter will now be giving Google access to its firehose feed of tweets (around 6,000 every second), meaning tweets may appear in Google search results very quickly after being posted.

Twitter links have been appearing in Google search results since 2011, but mostly to account pages rather than individual tweets.

Potential Impact

The future impact of this deal is yet to be seen, but we can speculate that Twitter may become more utilised by SEO professionals in their regular activities. Tweets may become more optimised towards specific keywords and phrases.

This could for example make organic search a channel for driving traffic towards temporary offers, products and discounts announced on Twitter, or for grabbing organic search traffic towards topical and trending stories.

Ultimately, tweets could be opened up to a whole new audience of search engine users, rather than just Twitter users. Even logged-out or unregistered users will be able to view tweets in some form.

The importance of using Twitter for SEO could grow further if Twitter were to extend the firehose to include things like Twitter Cards and location data.

On the negative side, it appears that ‘black hat SEOs’ (aka spammers) are getting a little excited, with comments in black hat forums such as:

“I see so much potential with this. I’ve already started snatching up vanity urls… As most have i’m sure but going for more longtail…”

“let the spam games begin”

Let’s hope we don’t see spammy tweets filling our search results in the near future, as can sometimes happen after new Google updates – the Google Pigeon update, for example, initially hurt big local brands while spammers prevailed. Fortunately, though, follow-up Google updates are usually able to iron out such bugs and prevent things like this from continuing.

In the long-run, the re-integration of Tweets in search results could be a major change in organic search engine results, and perhaps the first of many tweaks by Google to give social signals more weight as a ranking factor.
