Developer and SEO/marketing teams can proactively test web pages for SEO impact
April 28, 2020 07:00 AM Eastern Daylight Time
NEW YORK–DeepCrawl, the world’s leading cloud-based technical SEO platform, today announced the launch of Automator, a tool that provides SEO quality assurance (QA) by allowing developers to test their code for SEO impact before pushing to production. Automator is designed to enable improved collaboration between developers and SEO/marketing teams so they can easily and proactively mitigate any risks that may lead to a loss in site traffic. As a smart, automated and frictionless tool, Automator provides greater efficiency and significant cost savings for customers.
According to the Systems Sciences Institute at IBM, a bug found at the implementation stage costs approximately six times more to fix than one identified during design. For any brand making constant website deployments and updates, human error increases the risk of impacting search visibility, rankings and traffic. Automator can run more than 160 SEO tests in multiple pre-production and QA environments and flag any critical issues the new release may cause. This helps developers and SEO/marketing teams work in tandem to avoid linking to or creating broken pages, ensure metadata meets best practices, and stay alerted to any SEO regressions. Using Automator, brands can mitigate the risk of deindexing revenue-driving pages, which can impact the bottom line.
“DeepCrawl Automator is a very reliable tool. We used Automator, for example, to check if anything is redirecting where it shouldn’t be,” said Sebastian Simon, Senior SEO Manager at Heine. “Before, we had to check everything manually, but with Automator, we can set up tests beforehand and really see what happens. It’s a great relief to know there is something that will notify us if anything has changed.”
Developer teams can deploy Automator as part of the QA strategy to run in parallel with manual, unit and integration tests. During four months of beta testing, Automator found SEO defects with the potential to impact revenue in approximately 35% of releases. When QA engineers have access to a 360-degree SEO QA analysis, they can make an informed decision on issuing a release without impacting the production environment. Automator can be customized to automatically block a release if tests fail, or to proceed with the release while issuing warnings via email or Slack.
“Automator is a unique tool designed to help agile organizations protect revenue from the largest digital channel – SEO,” said Michal Magdziarz, CEO of DeepCrawl. “Especially during this time when brands’ online presence is so critical, Automator can be implemented to reduce the risks associated with a decline in rankings due to code changes which can disrupt overall site performance.”
Automator integrates with CI/CD platforms such as Jenkins, GitHub Actions and TeamCity via native integrations, shell scripts, or a GraphQL API.
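The release-gating behavior described above can be sketched as a generic CI shell step. This is a minimal, hypothetical illustration of the pattern only: the `run_seo_tests` function is a placeholder and does not represent DeepCrawl's actual CLI or API; a real pipeline would invoke the Automator integration (native plugin, shell script, or GraphQL call) at that point.

```shell
#!/bin/sh
# Hypothetical CI gate step. run_seo_tests is a stand-in, not
# DeepCrawl's actual tooling; a real pipeline would call the
# Automator integration here instead.
run_seo_tests() {
  # Placeholder: pretend all checks passed so the gate logic is visible.
  echo "all SEO checks passed"
  return 0
}

if run_seo_tests; then
  echo "SEO QA passed: proceeding with release"
else
  # A non-zero exit status makes the CI platform block the release.
  echo "SEO QA failed: blocking release" >&2
  exit 1
fi
```

The same exit-code convention applies across Jenkins, GitHub Actions and TeamCity, all of which treat a non-zero step exit status as a failed build, which is how a failing SEO check can automatically block a deployment.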
DeepCrawl is a pioneering technical SEO platform that helps brands to accelerate growth and mitigate losses in organic search performance. Its enterprise cloud-based web-crawling technologies and solutions help brands to diagnose and fix technical and performance issues to generate increased profitability. DeepCrawl has offices in London, New York and Poland. DeepCrawl’s investors include Five Elms Capital and Beringea. For more information, visit https://www.deepcrawl.com/.