Natural Search Traffic Decreases
At iProspect, we believe that data fuels performance, and that information drives results faster than anything else. But what should a brand do when the data exposes an unexplained anomaly? We encountered just such a case with a client earlier this year when, after a long period of steady increases in natural search traffic, we saw a sudden decrease that we couldn’t attribute to changes in Google’s algorithms. To help us get to the bottom of the issue, we mobilized our natural search team to dive into the analytics in collaboration with our data & insights (D&I) team. Working together, these teams regularly provide critical insights by combining data analysis with either SEO efforts or, as in this case, outside industry factors. This multi-faceted research approach enabled us to quickly determine that traffic from the Google App was being miscategorized in some analytics platforms. Based on this initial discovery, we decided to investigate further.
The Issue: “Dark Search”
The miscategorization was the result of something called “dark search.” Usually, dark search refers to natural search traffic that has been erroneously categorized as direct traffic in analytics platforms, though the term can also apply to natural search that has been bucketed with referring traffic or other channels. Dark search is especially problematic because, if it goes unnoticed for long periods of time, it can significantly understate natural search performance.
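To see how dark search arises mechanically, consider a simplified sketch of the kind of referrer-based bucketing an analytics platform performs. The function and the search-engine list below are illustrative assumptions, not any platform’s actual rules; the point is that a referrer the rules don’t recognize, such as an app identifier, falls through to the referring bucket.

```python
from urllib.parse import urlparse

# Hypothetical (abbreviated) list of referrer hosts the rules recognize
# as search engines; real platforms ship much longer built-in lists.
KNOWN_SEARCH_ENGINES = {"www.google.com", "www.bing.com", "search.yahoo.com"}

def bucket_visit(referrer):
    """Assign a visit to a marketing channel based on its referrer."""
    if not referrer:
        return "direct"          # no referrer at all -> direct traffic
    host = urlparse(referrer).netloc
    if host in KNOWN_SEARCH_ENGINES:
        return "natural search"
    return "referring"           # unrecognized referrer falls through

# A search done in the Google App sends an app identifier rather than a
# google.com referrer, so these visits land in "referring" -- dark search.
print(bucket_visit("android-app://com.google.android.googlequicksearchbox"))
# -> referring
```

Because the fall-through happens silently, nothing in the platform flags these visits as misclassified; the only symptom is the channel-level shift in the trend lines.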
As we continued to look for influencing factors, we discovered that an update made to the Google App for Android (sometimes called Google Now) on April 27, 2016 may have caused an increase in dark search traffic. The analytics showed a spike in referring traffic and a dip in natural search traffic that coincided with the Google App update. The close collaboration between the iProspect natural search and data & insights teams allowed us to not only discover the anomaly and diagnose the problem, but also to come up with an effective solution for the client.
In this case, referring traffic includes any site visit that comes from a 3rd-party website or channel that is not already categorized as direct, natural search, paid search, display, etc. The green line in the chart below represents referring traffic that is shown to increase at the time of the Google App update. Referring traffic excluding Google Search App is represented by the orange line.
Re-Attributing the Traffic
Several factors must be considered when analyzing a shift of traffic across channels, ranging from market-driven factors like seasonality to technical factors like tag management. In this case, however, the influencing factor was an outside force beyond our client’s control. Early in the analysis phase, it was unclear whether this traffic was not being counted at all or was simply shifting to another channel. This was an instance in which looking only at the data from the analytics platform would not have been enough to fully identify the issue. iProspect’s D&I team typically identifies data issues by analyzing the data in question, but in this instance the anomaly was showing up in clickstream data, indicating that the problem originated elsewhere. Outside research by the natural search team was the primary factor that led us down the correct path to actionable insights concerning the natural search traffic.
Once we identified the channel to which this traffic was being redirected, we were able to work with the client to adjust how Adobe Analytics attributed these visits. It was obvious that referrals were increasing, but outside research was needed to bring the numbers into focus so we could create a client action plan to realign the data with reality. With analytics, it’s dangerously easy to get bogged down looking only at the data coming out of the platform. In this case, however, researching other aspects of the situation provided a more qualitative approach to the quantitative problem, and that proved to be the key to identifying and fixing this potentially far-reaching issue.
After identifying all other affected clients, we informed them of the issue and arranged to meet with their IT and analytics teams to discuss it and identify solutions. The client teams’ deeper dive into the problem found that the Google App traffic in Adobe Site Catalyst was labeled as “com.google.android.googlequicksearchbox.” Their research also revealed that the traffic was coming through as a Typed/Bookmarked referrer type instead of a Search Engine referrer type, which is why it was being incorrectly bucketed as “Referring Domain.” Based on this additional insight, the internal team was able to resolve the issue by updating the marketing channel definitions in Adobe Analytics.
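The effect of the channel-definition fix can be sketched as a re-attribution rule: any visit sitting in the referring bucket whose referrer carries the Google App identifier belongs to natural search instead. The data structure and field names below are hypothetical stand-ins, not Adobe’s schema; the rule itself mirrors the logic described above.

```python
# Identifier the Google App traffic carried in Adobe Site Catalyst.
GOOGLE_APP_ID = "com.google.android.googlequicksearchbox"

def reattribute(visits):
    """Move misbucketed Google App visits from referring to natural search.

    `visits` is a hypothetical list of dicts with "referrer" and "channel"
    keys; real platforms apply an equivalent rule in channel processing.
    """
    for visit in visits:
        if (visit["channel"] == "referring"
                and GOOGLE_APP_ID in visit.get("referrer", "")):
            visit["channel"] = "natural search"
    return visits

visits = [
    {"referrer": f"android-app://{GOOGLE_APP_ID}", "channel": "referring"},
    {"referrer": "https://example.com/blog", "channel": "referring"},
]
reattribute(visits)
# The Google App visit moves to natural search; the genuine referral stays put.
```

In practice the change was made in the platform’s marketing channel definitions rather than in post-processing code, but the classification logic is the same.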
As a result of surfacing this issue, Adobe will be adding the uncovered logic to their built-in natural search detection rules across all their platforms. This case provides a great example of successful cross-functional teamwork that led to not only the solution to a complicated client issue, but also to a universal enhancement on Adobe’s platform.
On average, the glitch was incorrectly bucketing 22,500 visits per day into referring traffic when they should have been credited to natural search. Below is the chart reflecting what Adobe Analytics was reporting for natural search (represented by the orange line) compared to the correct natural search traffic adjusted for Google Search App (represented by the green line).
If this issue had gone unnoticed, it would have added up to over 8MM incorrectly categorized visits per year. That figure represents approximately 8% of total natural search traffic. These substantial numbers can make a big difference in performance, especially when the team is trying to reach hefty client goals and KPIs for yearly traffic increases. An 8% loss due to a tagging issue could have severely damaged our efforts and greatly hindered our ability to meet client expectations. Research, planning, and collaboration with different client teams played a huge role in uncovering the issue and providing a swift solution.
Any time an anomaly or irregular change occurs in a client campaign, it’s important to look not only at clickstream data, but also at outside factors. By examining both, a brand gains much greater visibility into the actual cause and effect of a data issue. The scenario described above is a perfect example of how uncovering an outside factor had a massive effect on data performance. Although this may appear to be an isolated incident, we strongly recommend that brands regularly review their referring traffic report, regardless of the analytics platform being used. And if traffic from the Google App is found to be miscategorized, it’s important to contact the analytics platform rep to discuss rolling out a fix or an update to prevent data from being skewed in the future.
Nick Morrelli, Analytics and Research Manager, also contributed to this blog post.