Software leaders have experienced no shortage of major shifts and revolutions over the years:

Emerging technologies and tool sets to drive greater voice of customer integration

The transition from the era of quality engineering 1.0 to the rapidly approaching QE 3.0

Increased pressure to excel not only in product quality and time to market, but in holistic customer experience

And now, the tidal wave of data that is not merely headed our way but is already here

Leaders of software factories, applications, and products are navigating a range of data-driven initiatives, including the use of AI/ML algorithms, incorporating DataOps within a DevOps working model, and reporting on new metrics within the SDLC. Whereas data might have been somebody else’s responsibility years ago (the business’s), it is now landing on software leaders as well.

While it seems like this revolution came fast, in truth, the data tidal wave has been mounting for well over a decade. I remember the first time I truly felt the data imperative, more than 15 years ago.

I was working with one of our clients (a large healthcare company) on process and transformation, and the concept of what I refer to as “data mining, morphing and making” started to become a part of our scope of work. Suddenly, we were as concerned with building data tools and using data for testing and verification as we were with guaranteeing quality of product and quality of process. Among our focuses was ensuring the data we were creating could serve a multitude of purposes. We had to make the data unique and usable so that it could truly help the business. At the time, the metrics we were collecting were mostly related to quality of product: test coverage, state of test completion and results, defect recidivism rate, find/fix trending, and so on.

Fast forward more than a decade, and we’ve moved well beyond the requirement that software leaders merely have usable data sets. Today, it’s all about leveraging data for true business analytics.

In addition to operational metrics, software teams are now concerned with process and efficiency metrics: team velocity and sprint backlog completion, time lag from “feature done” to “feature in production,” increase in team velocity over time, and defects that escape containment (slip past the sprint and reach production). And it’s not just about knowing how to develop metrics from the data you own, but also from the data you pull in from outside your four walls. It’s about making sure that data can feed the metrics you use to derive next-level insights into how to drive future improvements to your quality of product and quality of process.
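As a minimal sketch of two of the metrics above, here is how sprint backlog completion and defect escape rate might be computed. The record fields and sprint data are hypothetical, not drawn from any particular tool:

```python
# Hypothetical sprint records; field names are illustrative, not from any specific tracker.
sprints = [
    {"name": "Sprint 41", "committed_points": 30, "completed_points": 27,
     "defects_found_in_sprint": 8, "defects_escaped_to_prod": 2},
    {"name": "Sprint 42", "committed_points": 32, "completed_points": 31,
     "defects_found_in_sprint": 6, "defects_escaped_to_prod": 1},
]

def backlog_completion(sprint):
    """Completed story points as a share of committed story points."""
    return sprint["completed_points"] / sprint["committed_points"]

def escape_rate(sprint):
    """Share of all defects that slipped past the sprint and reached production."""
    total = sprint["defects_found_in_sprint"] + sprint["defects_escaped_to_prod"]
    return sprint["defects_escaped_to_prod"] / total if total else 0.0

for s in sprints:
    print(f'{s["name"]}: completion {backlog_completion(s):.0%}, '
          f'escape rate {escape_rate(s):.0%}')
```

The same shape extends naturally to the other metrics mentioned (e.g., lead time from “feature done” to “feature in production” as a timestamp difference per feature).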

So where should quality leaders begin when it comes to navigating their own data lakes, as well as the larger data ocean? There are five core things that I believe should be top of mind:

  1. DevOps as Table Stakes: We already know DevOps — arguably one of the most impactful software development trends — is becoming ubiquitous. Research shows 83% of IT decision makers say their organizations are implementing DevOps practices, and it is quickly becoming the most popular build-and-release methodology globally. A first step to navigating the data ocean is to make sure you have a DevOps foundation in place so that you can trust how you collect data. It’s about ensuring data quality: making sure you have a foundation for pulling data into your tools automatically and reliably before you try to do anything with it. Once your DevOps foundation is square, it’s time to focus on data provenance.
  2. Data Provenance = Trust: If we want our teams, end users, and business partners to trust the data we are providing them, we need to be able to answer questions about data origin, generation, attribution, sources, journey/chain of custody, etc. Attention and focus on data provenance allow us to answer questions like when, why, and how our data was produced. By being able to answer these questions, we enable our business peers to feel comfortable using our data to make decisions that ultimately impact things like customer experience, product development, and the software lifecycle.
  3. Data Literacy: Data literacy is fundamental to DataOps. It takes a consensus view from data users and consumers on shared definitions of data states and terms (e.g., in healthcare, what is an “approved” provider? When is a claim “completed”?) that are then captured in a Data Dictionary, which represents the universal source of truth for data.
  4. Fit For Purpose: We need a way to ensure that the test data we create is “fit for purpose”: i.e., it can be used for all scenarios and test cases and will behave the same as real-world data. This is where the “workflow” aspect of data quality comes into play. We need to focus on how to measure data quality at any point along its journey and make sure we can identify the points within the operational workflow — both up- and downstream — where data state can change and data quality can thus be unintentionally impacted.
  5. Self-Service: A final core component of navigating the “data ocean” (i.e., all the data you own and don’t own) is enabling self-service capabilities in our data tools so that our teams can work with data on their own. We want our teams (from our testers to our architects to our engineers… and their business partners!) to be able to access the data they need at the moment they need it. We would be fumbling the football on the two-yard line if we did not think about the front-end presentation of our data: the tools we use, the visualizations we depend on, and so on. We want to make sure our data is accessible and dependable in the way that is most useful for our teams and business partners.
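To make items 2 through 4 concrete, here is one way a data dictionary, a provenance stamp, and a fit-for-purpose check could fit together in code. Everything here is an illustrative sketch: the field names, allowed states, and source identifiers are hypothetical, not a prescribed schema:

```python
from datetime import datetime, timezone

# Illustrative data dictionary: the consensus definitions of data states
# (the healthcare examples from the text; the exact values are hypothetical).
DATA_DICTIONARY = {
    "provider_status": {"approved", "pending", "rejected"},
    "claim_status": {"submitted", "in_review", "completed"},
}

def validate(record):
    """Fit-for-purpose check: flag any field whose value falls outside
    the agreed definitions in the data dictionary."""
    errors = []
    for field, allowed in DATA_DICTIONARY.items():
        if record.get(field) not in allowed:
            errors.append(f"{field}={record.get(field)!r} not in {sorted(allowed)}")
    return errors

def with_provenance(record, source, producer):
    """Attach a minimal provenance stamp answering where, by whom,
    and when the data was produced."""
    return {**record,
            "_provenance": {"source": source, "producer": producer,
                            "produced_at": datetime.now(timezone.utc).isoformat()}}

claim = {"provider_status": "approved", "claim_status": "completed"}
print(validate(claim))  # an empty list means the record is fit for purpose
stamped = with_provenance(claim, source="claims_feed_v2", producer="etl_job_17")
```

Running `validate` at each hand-off point in the operational workflow is one way to catch the up- and downstream state changes described in item 4 before they silently degrade data quality.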

Once you have done these five things, you as a quality leader are prepared to handle the data ocean. You will be able to stay ahead of lifecycle trends, your competitors, and the latest methodologies and, most importantly, have the answers to the business’s questions at the exact moment they are asked.

Remember: the tidal wave is NOT mounting. It’s already here, and navigating the ocean is entirely our responsibility.
