
Big Data and the Hierarchy of Analytical Needs

Mar 30th 2019 · Google Analytics

It's been nearly a decade since the term big data went viral, with blogs and publications touting how data is a competitive advantage. Fast-forward to today, and data certainly is a competitive advantage, but only if your organization has staffed, tooled, and built the processes necessary to extract information from that data – including identifying what needs to be measured.

When it comes to Google Analytics, a robust implementation can result in even a modestly trafficked website producing a staggering amount of data, all for the hard-to-beat price of free. We often see organizations (large and small!) do at least this much, and that's undoubtedly a great start. But what next?

The Hierarchy

Over time, we've found that there are two common mindsets blocking organizations from making effective use of their data:

  1. Decision makers often expect ad hoc insights given that the data (or analyst) exists, but without much, if any, structure in place to guide research and analysis.
  2. Shiny objects like A/B testing tools, machine learning features, and the like distract organizations from the wealth of information they already have available.

To combat these mindsets, we offer a mental model for leveling up the analytical research capabilities of your organization by building key foundational layers.

The hierarchy represents what an organization should focus on establishing as a practice before moving on to the next layer. If, for example, you are pushing for hypothesis testing before your baseline measurement models are in place, your organization is not using data as effectively as it could be.

Layer #1: Baseline Measurement

Baseline measurement means a few things.

First and foremost, it means having democratized access to the basics. Without a process in place to support a culture of data-driven decision-making, stakeholders will, of course, default to making decisions in whatever way they can. Today, in your organization, how long does it take for stakeholders to determine, for example, whether their design needs to support legacy browsers? What breakpoints should front-end developers support? What languages should the product be localized for? In practice, even if stakeholders have access to this information, answering the original question still means finding the data, processing it, and summarizing it in a way that supports decision-making.
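
To make "access to the basics" concrete, here is a minimal sketch of answering the legacy-browser question with a single query against the Google Analytics Reporting API (v4). The key file path and view ID are placeholders for your own credentials; the 90-day window is an arbitrary choice for illustration.

```python
# Sketch: answer "do we need to support legacy browsers?" with one query.
# KEY_FILE and VIEW_ID are placeholders for your own credentials.
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service-account.json'  # placeholder: your service-account key
VIEW_ID = '12345678'               # placeholder: your GA view ID

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE, SCOPES)
analytics = build('analyticsreporting', 'v4', credentials=credentials)

response = analytics.reports().batchGet(body={
    'reportRequests': [{
        'viewId': VIEW_ID,
        'dateRanges': [{'startDate': '90daysAgo', 'endDate': 'today'}],
        'metrics': [{'expression': 'ga:sessions'}],
        'dimensions': [{'name': 'ga:browser'}, {'name': 'ga:browserVersion'}],
        'orderBys': [{'fieldName': 'ga:sessions', 'sortOrder': 'DESCENDING'}],
    }]
}).execute()

# Report each browser/version's share of total sessions.
rows = response['reports'][0]['data'].get('rows', [])
total = sum(int(r['metrics'][0]['values'][0]) for r in rows)
for r in rows[:10]:
    browser, version = r['dimensions']
    sessions = int(r['metrics'][0]['values'][0])
    print(f'{browser} {version}: {sessions / total:.1%} of sessions')
```

Democratizing access means stakeholders can get an answer like this in minutes – whether through a script, a dashboard, or a trained analyst – rather than waiting on an ad hoc research request.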

Second, baseline measurement means understanding which behaviors your organization is trying to effect change in. Understanding and building consensus around goals and target behaviors is no small feat, but it is an absolutely critical step in leveling up your organization's analytical savvy and moving toward a data-driven culture.

Has your organization defined goals and target behaviors? If not, it may be because stakeholders don't have democratized access to the basics. By providing tools and training to work data into lower-level decision-making, stakeholders will organically see how data can aid decision-making, and more abstract goal- and behavior-based conversations can start to happen with increased prioritization, active attention, and all-around less friction.
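
What does a "defined goal with target behaviors" look like once written down? One lightweight way to capture it is a measurement plan that maps each goal to the behaviors that signal progress and the Google Analytics measurement backing each one. The sketch below is purely illustrative – every goal, behavior, and GA reference in it is a hypothetical example, not a prescribed schema:

```python
# Hypothetical measurement plan: every name below is illustrative.
# Each goal maps to target behaviors and the GA measurement backing them.
measurement_plan = {
    'grow_qualified_signups': {
        'target_behaviors': {
            'views_pricing_page':   {'ga_metric': 'ga:pageviews',
                                     'ga_filter': 'ga:pagePath==/pricing'},
            'starts_trial':         {'ga_goal': 'Goal 1: Trial Start'},
            'completes_onboarding': {'ga_event': 'onboarding / complete'},
        },
        'baseline': '90-day rolling average',
        'owner': 'growth team',
    },
}
```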

Layer #2: Hypothesis Testing

It's tempting to dive right into hypothesis testing without formalizing baseline measurement, and if your organization has done that, it doesn't invalidate your testing – functional employees likely have a good idea about where the pain points are.

When we suggest hypothesis testing as the next layer, we do so from the perspective of widespread cultural adoption. Pockets of A/B testing and hypothesis generation are a good thing, but how does testing become a first-party practice in every team's toolkit, ultimately maximizing the ROI on yet another tool investment? By understanding goals and target behaviors.

By defining goals and target behaviors, it becomes clear which specific products, features, workflows, and interactions need attention, because all product development winds up pointing back to key performance indicators the same way that business intelligence data points back to revenue and profit. These goals and target behaviors ensure team members focus on effecting change in the areas of the user experience that most benefit users. This focus, in turn, drives positive financial outcomes by iterating toward experiences that reduce frustration, increase adoption and retention, and maximize conversion on your website's key actions.
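
Once a test runs against a target behavior, deciding whether a variant actually moved the needle is a statistics question. A minimal sketch using a two-proportion z-test – the counts below are made-up placeholders standing in for numbers exported from your testing tool:

```python
# Sketch: did variant B beat control A on a target behavior?
# Counts are made-up placeholders for numbers from your testing tool.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 468]      # A, B: sessions completing the target behavior
sessions    = [10000, 10000]  # A, B: total sessions exposed to each variant

stat, p_value = proportions_ztest(count=conversions, nobs=sessions)
print(f'A: {conversions[0]/sessions[0]:.2%}  '
      f'B: {conversions[1]/sessions[1]:.2%}  p={p_value:.3f}')
# A small p-value (e.g., < 0.05) suggests the lift is unlikely to be noise.
```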

Layer #3: Exploratory Analysis

Exploratory analysis is where organizational expectations tend to start, and it makes sense! Your organization invested in collecting data that describes nearly every interaction your website had with your users. You painstakingly ensured cross-domain tracking goes off without a hitch. You removed (or added!) query parameters to strike that perfect balance between informative URLs and manageable page cardinality. It's been what – a year?! Tell us what you learned!

Sadly, it's not that simple. Insights don't jump out of data, yet organizations set the expectation that analysts bring actionable insights to the table on a rolling basis. On top of that, budgets may limit hiring to entry-level analysts with limited domain expertise, little knowledge of historical decisions, and no seat at the meetings where valuable context is shared.

This expectation is simply nonsensical, even if the occasional gem leads to a profitable decision. Organizational success mandates repeatable processes, so how can we make exploratory analysis repeatable? Simple: narrow the scope.

If your organization has baseline measurement figured out and continuously tests hypotheses, exploratory analysis is finally a reasonable extension. Analysts no longer have to sit down in front of the Google Analytics Audience Overview, filled to the brim with anxiety, wondering how they'll find an insight worthy of justifying their salary. Instead, the organization's baseline measurement will indicate specific problems, and testing results will indicate specific areas where minor iterative tests just don't seem to effect change. This gives an exploratory analysis purpose and direction: we've identified a bottleneck, and we can't iterate the problem away. What questions can we ask to better understand the users who reach this bottleneck? Now we have actionable exploratory work and a story worth presenting!
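
As a sketch of what that narrowed scope can look like in practice: suppose baseline measurement flags a checkout step as the bottleneck. The snippet below slices drop-off at that step by device and traffic source to surface who is getting stuck. The file and column names are hypothetical, standing in for session-level data exported from Google Analytics:

```python
# Sketch: who drops off at a known bottleneck step?
# Assumes session-level data exported from GA; column names are hypothetical.
import pandas as pd

sessions = pd.read_csv('sessions.csv')  # placeholder export

# Keep only sessions that reached the bottleneck step.
reached = sessions[sessions['reached_checkout_step_2']]

# Completion rate by segment, worst segments first.
dropoff = (
    reached.groupby(['deviceCategory', 'source'])
    .agg(sessions=('session_id', 'count'),
         completed=('completed_checkout', 'mean'))
    .sort_values('completed')
)
print(dropoff.head(10))
```

A segment with an unusually low completion rate is exactly the kind of focused finding – "mobile users from paid search stall at step 2" – that turns open-ended exploration into a story worth presenting.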
