Data Science x Public Health

You’ve Been Using Dashboards Wrong — Here’s What Actually Happens

BJANALYTICS


Dashboards are supposed to make decision-making easier.
They make data visible, trends accessible, and performance look measurable in real time.

But what if that visibility is giving leaders the wrong kind of confidence?

In this episode, we break down why dashboards often fail to improve real decision-making—especially in public health and data science. From missing context and misleading metrics to surveillance lag and what I call metric theater, this episode explains why seeing more data is not the same as understanding more.

👉 Enjoyed the episode? Follow the show to get new episodes automatically.

If you found the content helpful, consider leaving a rating or review—it helps support the podcast.

For business and sponsorship inquiries, email us at:
📧 contact@bjanalytics.com

YouTube: https://www.youtube.com/@BJANALYTICS

Instagram: https://www.instagram.com/bjanalyticsconsulting/

Twitter/X: https://x.com/BJANALYTICS

Threads: https://www.threads.com/@bjanalyticsconsulting

SPEAKER_01

Welcome to today's deep dive into our source text, Beyond the Dashboard, Cultivating Analytic Clarity. And, you know, our mission for you listening today is to uncover this completely backwards truth, which is, well, why the very tools designed to give you total visibility at work might actually be destroying your underlying understanding.

SPEAKER_00

Right, because when you picture a modern control room or, like, an executive suite, you immediately think of these massive glowing screens, right? Endless charts, real-time tracking, maps constantly updating. I mean, it projects this aura of absolute mastery.

SPEAKER_01

Oh, totally. You naturally equate that visibility with having everything under control. But this text argues that while dashboards collect incredible amounts of data and make leaders feel informed in real time, they actually increase visibility while decreasing actual understanding.

SPEAKER_00

Exactly. They are built to display activity. To understand why this happens, we have to look at what dashboards are fundamentally built to do, which is to compress complexity. Because during an operational crisis, you know, you don't have time to read a 40-page statistical report. Scanning a screen gives you immediate speed.

SPEAKER_01

Okay, let's unpack this. Because if I'm managing, say, a major supply chain bottleneck, speed is literally everything. I need a quick visual summary to know exactly where things stand.

SPEAKER_00

Yeah, but the problem is the massive trade-off between speed and validity. So a rising line on a graph, it doesn't tell you why it's rising. A red alert indicator doesn't mean the threshold that triggered it actually makes sense in the real world.

SPEAKER_01

Right. It's a bit like looking at a bright green light on your car's console that says engine okay without realizing the mechanic never actually hooked up the sensors.

SPEAKER_00

That is exactly it. That is the speed versus validity trap. People start trusting what is easily visible on the screen over what is actually valid in reality. I mean, the dashboard only shows what it was designed to show.

SPEAKER_01

Here's where it gets really interesting though. But wait, if I'm under extreme operational stress, I don't have time to dig into the raw data. Wouldn't putting all my key performance indicators, like those crucial KPIs and little sparkline trend charts onto one giant screen, give me more of a grip on the situation.

SPEAKER_00

What's fascinating here is that because users naturally crave that comforting illusion of control you just described, designers feel this intense pressure to put easy-to-measure metrics on the screen just to fill the space. They throw in everything, even if those metrics don't actually help you make a specific decision. The text calls this metric theater. Teams end up optimizing for what is easy to display, confusing frequent measurement with meaningful intelligence.

SPEAKER_01

And the public health examples really expose how dangerous this is. Meaning, you know, you might see a terrifying spike in COVID cases on your map, but the dashboard doesn't tell you it's just because the county doubled its testing capacity that day, not because the disease is actually spreading faster.
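The testing-capacity confound described above can be illustrated with a small sketch. The numbers here are hypothetical, not from the episode: raw case counts double when testing doubles, while the positivity rate, which normalizes by tests performed, stays flat.

```python
# Hypothetical two-day snapshot for one county (illustrative numbers only).
days = [
    {"day": "Mon", "tests": 1_000, "positives": 80},
    {"day": "Tue", "tests": 2_000, "positives": 160},  # testing capacity doubled
]

for d in days:
    rate = d["positives"] / d["tests"]
    print(f'{d["day"]}: {d["positives"]} cases, positivity {rate:.1%}')
# Mon: 80 cases, positivity 8.0%
# Tue: 160 cases, positivity 8.0%
```

Raw cases jump from 80 to 160, but positivity holds at 8.0% on both days: the apparent spike reflects testing capacity, not faster spread. A dashboard showing only the count line would miss this entirely.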

SPEAKER_00

Or consider reporting lags. If a county lab doesn't process data over the weekend, your dashboard is going to show a massive surge every single Tuesday when the backlog finally clears.

SPEAKER_01

Wait, really? Just from a weekend backlog?

SPEAKER_00

Yeah. And if a dashboard doesn't actively communicate that context, the messy reality turns, you know, a simple administrative delay into a red alert policy signal. The text points out that we accidentally turn surveillance artifacts into policy signals.
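One common way to keep a weekend backlog from reading as a Tuesday surge is a trailing 7-day average, so every day of the week appears in each window exactly once. A minimal sketch with made-up numbers (roughly 100 true cases per day, with weekend reports landing on Tuesday):

```python
# Hypothetical daily "reported cases": ~100 true cases/day, but the lab
# reports nothing Sat/Sun and clears that backlog on Tuesday.
# Pattern is Mon..Sun, repeated for 4 weeks.
reported = [100, 300, 100, 100, 100, 0, 0] * 4
# (Tue = 300 because Tue's own 100 plus Sat's 100 and Sun's 100 land together)

def rolling_mean(series, window=7):
    """Trailing average: one value per day once a full window exists."""
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

smoothed = rolling_mean(reported)
print(max(reported))  # 300 -> a raw daily chart screams "Tuesday surge"
print(smoothed[0])    # 100.0 -> the weekly average is perfectly flat
```

Because every trailing 7-day window contains exactly one of each weekday, the reporting rhythm cancels out and the flat underlying trend reappears.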

SPEAKER_01

So you end up making sweeping policy decisions based on a Tuesday paperwork jam.

SPEAKER_00

Exactly.

SPEAKER_01

So what does this all mean for how we actually use these tools? Since polished data can so easily trick us into making terrible decisions, how do we actually build a useful dashboard?

SPEAKER_00

It really comes down to design intent. A good dashboard shouldn't act like some all-knowing oracle. It has to be a decision support tool.

SPEAKER_01

Meaning what, showing less data?

SPEAKER_00

Yeah, instead of a wall of 50 numbers, it emphasizes a small handful of truly meaningful metrics.

SPEAKER_01

Right. Forcing yourself to choose what actually matters for the decision at hand rather than visualizing every piece of data you own.

SPEAKER_00

It also has to explicitly communicate uncertainty. So instead of plotting a single definitive hard line on a graph that projects false confidence, a good dashboard uses confidence intervals.

SPEAKER_01

Showing like a shaded range of where the real number likely falls.

SPEAKER_00

Exactly. It forces the user to acknowledge the margin of error. It clearly states definitions and flags reporting lags right on the screen.
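As a rough illustration of why that shaded range matters, here is a sketch of a 95% confidence interval for a positivity rate using the simple normal approximation. The numbers and the choice of approximation are assumptions for illustration; a production dashboard might prefer a Wilson interval for small samples.

```python
import math

def ci_band(count, population, z=1.96):
    """95% CI for a proportion via the normal approximation
    (a deliberate simplification for illustration)."""
    p = count / population
    se = math.sqrt(p * (1 - p) / population)
    return p - z * se, p + z * se

# Same 5% positivity rate, very different certainty:
lo_small, hi_small = ci_band(5, 100)      # 100 tests
lo_big, hi_big = ci_band(500, 10_000)     # 10,000 tests
print(f"n=100:   {lo_small:.3f} .. {hi_small:.3f}")
print(f"n=10000: {lo_big:.3f} .. {hi_big:.3f}")
```

Both counties show the identical 5% point estimate, but the small-sample band is roughly ten times wider. Plotting only the hard line hides exactly that difference.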

SPEAKER_01

It makes the limitations of the data visible to the user.

SPEAKER_00

Most importantly, it tells users where deeper analysis is required. It's meant to provoke better questions rather than pretending to provide every final answer.

SPEAKER_01

And that is the crucial takeaway for you today. Dashboards are interfaces, not analysis. They show movement, but you still have to bring the interpretation.

SPEAKER_00

Right. They can support your judgment, but they can never replace it.

SPEAKER_01

Think about the slickest, most polished dashboard you rely on at work right now. Is its authoritative, professional design actually stopping you from asking better questions about the messy reality underneath?