Science of Justice
Our science, your art.
You've got the vision; we've got the data.
Is our science the right fit for your practice? Is the earth round? Let's find out. We have created a unique suite of machine intelligence solutions that provide you with the best information in your legal cases. We surface insightful results through our proprietary algorithms, guided by experts with decades of experience in behavioral science and in collaborating with legal advisors for successful case outcomes.
Architect The Decision, Or The Jury Will
We argue that the “strong facts equal strong case” formula is broken, and lay out a new model where trial lawyers become decision architects who win the heart first, then the mind. Using research, case studies, and tools, we map how to design stories that resist bias, reduce cognitive load, and produce engineered settlements.
• why facts alone fail under modern juror psychology
• system one drives early moral judgments
• confirmation bias and the Hannah study
• narrative drift and the epidural defense verdict
• intake as strategy and the power of language
• invisible ceilings created on day one
• sequential virtual focus groups over gut instinct
• cognitive load management and moral anchors
• moving from demographics to psychographics
• using SJQs and scaled questions to reveal bias
• predictive modeling to value and negotiate cases
• redefining what makes a “good case”
https://scienceofjustice.com/
You know, there's this foundational belief that has driven civil litigation for decades. It's an equation that most plaintiff attorneys learn in law school and they really cling to it. It's almost like a comfort blanket.
SPEAKER_01:It is. And the equation is simple. Strong facts equal a strong case. Right. If liability is clear, the negligence is on paper, the injury is undeniable, the verdict should just take care of itself.
SPEAKER_00:It's a comforting idea, isn't it? But there's a huge problem with that equation today.
SPEAKER_01:It's broken.
SPEAKER_00:Completely.
SPEAKER_01:It is fundamentally broken. We're seeing this widening gap between what the evidence says should happen and what juries are actually doing.
SPEAKER_00:So you have cases that should be they look like slam dunks.
SPEAKER_01:Ironclad liability. Cases we used to call slam dunks, and they're just drastically underperforming.
SPEAKER_00:And on the other side.
SPEAKER_01:And conversely, you see these messy, really complex cases that, you know, on paper, they probably shouldn't work, and yet they're the ones resulting in massive record-breaking verdicts.
SPEAKER_00:So the failure here isn't about effort. It's not about advocacy.
SPEAKER_01:No, it's a failure in how we even define a good case anymore.
SPEAKER_00:That's the core frustration we need to really get into. You can do everything right, you know, legally, and still completely lose the room psychologically. We're moving from a model where the lawyer is just a presenter of facts to, well, to a new reality where the elite trial lawyer has to be a decision architect.
SPEAKER_01:And that's not just a fancy title. It represents a total shift in how we understand the human beings who are actually sitting in that jury box.
SPEAKER_00:So to understand why that old facts first model is failing, we have to look at the juror.
SPEAKER_01:The modern juror. Exactly. This isn't the same person from 30 years ago. I mean today's juror is.
SPEAKER_00:Yeah.
SPEAKER_01:They're media saturated, cognitively overloaded.
SPEAKER_00:They've been trained by Netflix and TikTok.
SPEAKER_01:Right. And the 24-hour news cycle, they expect instant, coherent stories. They just don't process information the way a lawyer drafts a brief. It's not linear, it's not strictly logical.
SPEAKER_00:This is where we really have to bring in Daniel Kahneman's work, right? The idea of system one and system two thinking.
SPEAKER_01:Absolutely. It's the operating system of the human brain. And it explains so much of what goes wrong and right in a courtroom.
SPEAKER_00:So break that down for us, System One.
SPEAKER_01:System One is fast. It's intuitive. It's that gut level reaction. It's the part of the brain that makes these snap judgments about whether someone is trustworthy or if they're dangerous.
SPEAKER_00:And lawyers, we tend to think we're always talking to System Two.
SPEAKER_01:That's the mistake. System two is slow, it's deliberate, it's logical. And the mistake lawyers make is assuming that jurors use system two to decide the verdict. They don't.
SPEAKER_00:They use system one for.
SPEAKER_01:They use system one to decide who is the good guy or the bad guy. And they often do it within the first few minutes of the case.
SPEAKER_00:And system two just comes in later to clean up.
SPEAKER_01:It just arrives later to rationalize that gut decision that's already been made. It finds the facts to support the feeling.
SPEAKER_00:So the mission for the modern plaintiff lawyer has to change then. You're not just there to litigate, you're there to architect the entire decision-making environment.
SPEAKER_01:That's it. You have to structure the entire life cycle of the case. I mean, from the moment that client first calls your office all the way to the final closing argument, you have to align it with how these human beings actually process information.
SPEAKER_00:And if you can do that.
SPEAKER_01:Well, that's the promise of looking at litigation through this lens. If you can use data and psychology to understand these mechanisms, you can prevent those invisible ceilings on your case value.
SPEAKER_00:And you can reduce the massive stress that comes from all that uncertainty.
SPEAKER_01:Ultimately, you win the heart first. Because if you don't win the heart, the mind will never ever follow.
SPEAKER_00:Okay, so let's get specific. Let's talk about why the traditional approach falls apart. There are, let's say, three failed assumptions that most firms operate on, and they're frankly dangerous.
SPEAKER_01:The first one we touched on, the idea that facts speak for themselves. A raw fact is completely inert. It has no power on its own. It only gains meaning when it passes through the filter of a juror's personal experience, their life, their biases.
SPEAKER_00:There's a study that illustrates this perfectly, the Hannah case study.
SPEAKER_01:It's a famous study, and it's so powerful. Researchers had participants watch a video of a young girl, Hannah, taking an academic test.
SPEAKER_00:And everyone saw the exact same video, right?
SPEAKER_01:Identical video. And in it, Hannah's performance was well, it was ambiguous. She'd get some really difficult questions right, but then she'd miss some easy ones. It was a mixed bag.
SPEAKER_00:But here's the twist.
SPEAKER_01:Here's the twist. The researchers split the participants into two groups. Group A was told beforehand that Hannah came from an affluent, educated background.
SPEAKER_00:Okay.
SPEAKER_01:Group B was told she came from a poor urban background.
SPEAKER_00:And that one little piece of preframing.
SPEAKER_01:It changed everything. It changed their entire reality. Group A, the ones who thought she was wealthy, they rated her academic potential as very high.
SPEAKER_00:And they focused on what?
SPEAKER_01:They focused entirely on the difficult questions she got right. And they just sort of explained away the misses. But group B, who thought she was poor, they rated her potential as low.
SPEAKER_00:And they did the opposite.
SPEAKER_01:The complete opposite. They fixated on the easy questions she missed. This is confirmation bias in action. The evidence didn't change their minds. Their bias shaped how they saw the evidence.
SPEAKER_00:That is that's terrifying for a trial lawyer. It means if a juror walks in with a bias against your client, maybe they think, you know, people who sue are just looking for a payday.
SPEAKER_01:They will literally filter the evidence to support that belief. They will ignore your expert witness, they'll ignore the documents, and they'll focus on one tiny contradiction just to prove to themselves that they were right all along. Which brings us right to that second failed assumption. Persuasion happens at trial.
SPEAKER_00:Right. We love to think that the closing argument is where the magic happens, or that some brilliant cross-examination is going to win the day.
SPEAKER_01:But that's often too late.
SPEAKER_00:The reality is that impressions form almost immediately. If a juror forms a negative gut check, that's that system one response during opening statements, or maybe even just from seeing your client in the hallway, the game is largely over.
SPEAKER_01:And once that gut decision is made, it's not like contradictory facts are just ignored. It's worse than that. They're often distorted. They get twisted to fit the belief that's already been locked in. The juror stops being an observer and becomes an advocate for their own gut feeling.
SPEAKER_00:Okay. And the third assumption might be the most insidious, especially for successful firms, and that's that experience substitutes for testing.
SPEAKER_01:Oh, you see this all the time. A senior partner sits at a conference table with the associates. They look at the file, they look at the facts, and they all agree this is a slam dunk. We know how this plays out.
SPEAKER_00:That's the internal consensus trap. And it feels safe, right? Because everyone in the room agrees.
SPEAKER_01:But everyone in that room is a lawyer. They are not representative of the jury pool in that venue.
SPEAKER_00:So it's an echo chamber.
SPEAKER_01:It creates this feedback loop of internal bias amplification. The legal team's unexamined assumptions start to shape the entire strategy rather than the actual reality of how a juror in that specific city or county is going to react.
SPEAKER_00:You're projecting your own logic onto a group of people who just they don't share your training, they don't share your worldview.
SPEAKER_01:And that disconnect, that gap, opens the door to something that we call narrative drift.
SPEAKER_00:And that is what keeps lawyers up at night. It's when the story you think you're telling isn't the story the jury's actually hearing.
SPEAKER_01:Because when a story has gaps, or when it just doesn't feel true to a juror's moral view of the world, they don't just sit there confused.
SPEAKER_00:They fill in the blanks themselves.
SPEAKER_01:They invent a new narrative that makes sense to them.
SPEAKER_00:There's a medical malpractice case that is the perfect and honestly the scariest illustration of this. It was considered an unlosable case.
SPEAKER_01:I know the one you're talking about. A woman goes in for an epidural and is rendered quadriplegic.
SPEAKER_00:And the imaging was just it was indisputable. Her spinal cord was pristine before the procedure and clearly damaged immediately after. Liability seemed as clear as it could possibly get.
SPEAKER_01:And yet the jury returned a defense verdict. On paper, it makes absolutely no sense.
SPEAKER_00:So what happened?
SPEAKER_01:Well, when they interviewed the jurors later, they found out there was a stealth juror on the panel, a young woman who worked as a medical assistant. And in the jury room, she created a completely new narrative out of thin air. She told the other jurors that the injury was actually an age-related degenerative condition.
SPEAKER_00:But there was zero evidence for that. I mean, no expert testified to that. Nothing.
SPEAKER_01:It didn't matter. The jury was deeply uncomfortable with the idea of blaming the doctor. Yeah. It just felt harsh. It felt wrong to them. They were looking for an exit ramp.
SPEAKER_00:And she gave it to them.
SPEAKER_01:She gave them a story that alleviated that discomfort. And they chose her convenient fabrication over the uncomfortable facts because it aligned with their system one desire to believe that the world is fair and that doctors are good people.
SPEAKER_00:That is narrative drift just completely destroying a case. So, okay, if we accept that facts aren't enough and that trial is often way too late to fix these kinds of perception issues, then we have to go upstream.
SPEAKER_01:Way upstream.
SPEAKER_00:We have to look at the very, very beginning of the case. And this brings us to this idea of intake as strategy.
SPEAKER_01:This is a perspective from Keith Clark, who founded Jury Analyst. His argument is that most firms treat intake as an administrative task.
SPEAKER_00:Just filling out forms.
SPEAKER_01:Right, getting signatures. But he argues it has to be treated as architecture. Intake is the first point of narrative construction for the entire case.
SPEAKER_00:And Clark talks about the invisible ceiling. Explain that.
SPEAKER_01:The idea is that poor decisions, choices made during that first phone call or that first meeting, can set a hard cap on the case's potential value. And no amount of trial brilliance three years down the road can ever remove that ceiling.
SPEAKER_00:A big part of this is just the language you use, right? The linguistic framing.
SPEAKER_01:The words you use from day one matter so much. Take the simple difference between using the word accident versus collision.
SPEAKER_00:Okay.
SPEAKER_01:If you put accident on that intake form, and then everyone in the firm starts calling it the accident file, you are subtly and unconsciously reinforcing a narrative of blamelessness.
SPEAKER_00:Right. But collision is different. It implies violence. It implies physics and force and preventability.
SPEAKER_01:It plants a seed that someone is responsible. It sets a completely different tone.
SPEAKER_00:So the intake checklist needs to be more than just the basic facts of liability.
SPEAKER_01:Way beyond that. You have to assess the human factors immediately. You have to be asking, what will it be like working with this person for the next three years?
SPEAKER_00:Is the client credible? Are they a good historian of their own life?
SPEAKER_01:And even more importantly, does this client have psychological baggage that is going to alienate a jury? Are there things in their past that are just going to be deal breakers?
SPEAKER_00:And we also have to look at what you call narrative stability.
SPEAKER_01:Yes. Can this person tell the same core story today, tomorrow, and then again in a deposition six months from now under pressure? Because we know jurors punish inconsistency much, much more severely than they punish a weak fact. A story that changes is a story they can't trust.
SPEAKER_00:This is how you start to identify what you call ceiling compression.
SPEAKER_01:Exactly. Let's say you take on a case against a major corporate defendant. Your internal consensus, your gut, might tell you that the jury will hate the corporation and automatically side with your client.
SPEAKER_00:So you lock into that David versus Goliath narrative right from the start.
SPEAKER_01:But what if you haven't tested that assumption? What if the jurors in that specific venue actually view your client as money hungry and see the corporation as a major local job creator?
SPEAKER_00:If you don't catch that at intake, you've locked yourself into a losing story before you've even filed the complaint.
SPEAKER_01:You have. You limit your discovery, you limit your settlement demands, and you ultimately limit the verdict because you're fighting a battle that you can't win with that jury.
SPEAKER_00:So this is why the modern decision architect has to move from just guesswork to actual data.
SPEAKER_01:The toolkit has to change. We can't rely on gut feelings anymore. Discovery needs to become a feedback loop, not just a checklist of things to do.
SPEAKER_00:And the primary tool here would be the virtual focus group. Now we're not talking about the old school mock trial.
SPEAKER_01:No, not at all. The one where you rent a hotel conference room the weekend before trial and spend $50,000. Right. Those are autopsies. By the time you do a mock trial right before the real trial, it's way too late to change the fundamental architecture of the case. You just find out how you're going to lose.
SPEAKER_00:So we're talking about something different.
SPEAKER_01:We're talking about sequential testing, starting early in the discovery process. You run a virtual panel to test a hypothesis, you get data back, you refine the narrative based on that data, and then you test it again. It's iterative.
SPEAKER_00:There's a case study about an 18-wheeler collision that really shows the power of this method.
SPEAKER_01:It's a great example. You had a large truck that makes a left turn and gets struck by a small car.
SPEAKER_00:Yeah.
SPEAKER_01:The plaintiff, the driver of the car, has no memory of the crash at all.
SPEAKER_00:And the initial facts looked really bad for the plaintiff.
SPEAKER_01:They were terrible. The police report blamed the plaintiff. The truck driver claimed the plaintiff was speeding and on their phone. So the legal team, they ran an initial baseline focus group.
SPEAKER_00:And the result.
SPEAKER_01:It was a disaster. The mock jurors assigned 80% of the fault to the plaintiff.
SPEAKER_00:So in the old model, you might just settle that case for pennies on the dollar or maybe even drop it.
SPEAKER_01:It looks like a total loser, right? Yeah. But this team, they acted as decision architects. They didn't give up. They decided to dig into the data.
SPEAKER_00:So what did they do?
SPEAKER_01:They ran four sequential virtual focus groups over a period of about 16 months. They tested different ways of presenting this story, and what they found was completely counterintuitive.
SPEAKER_00:How so?
SPEAKER_01:When the lawyers tried to defend the plaintiff against the speeding and the phone allegations, the jurors actually punished them for it. Why? Because it just highlighted the defense's main argument. It's that don't think of an elephant problem. If I stand up and tell you my client wasn't speeding, the first thing you start thinking about is my client speeding.
SPEAKER_00:So what did the data show them was the right path?
SPEAKER_01:The data showed that the jurors only responded positively to rules of the road violations by the truck driver. So things like improper route planning or making an unsafe turn into oncoming traffic. It was about the truck driver's choices, his professional responsibilities. So the team completely pivoted their strategy.
SPEAKER_00:They stopped defending their own client.
SPEAKER_01:They stopped defending the plaintiff and focused 100% on the truck driver's moral violations as a professional.
SPEAKER_00:And I understand they also used video evidence in a clever way.
SPEAKER_01:They did. They tested, showing surveillance video of their client, even if he looked okay, alongside a very honest interview to boost his credibility. It made him human.
SPEAKER_00:And the final result of all this testing?
SPEAKER_01:A complete flip. Yeah. By ignoring all the defensive arguments and focusing only on the truck driver's choices, the final verdict assigned only 20% fault to the plaintiff.
SPEAKER_00:Wow.
SPEAKER_01:The data moved the needle 60 points. Intuition would have lost that case. Data won it.
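[Editor's note: the sequential-testing loop described in this case study can be sketched in a few lines of code. All figures below are hypothetical, loosely echoing the 80%-to-20% shift discussed above; a real workflow would feed in actual panel data rather than hard-coded lists.]

```python
# Illustrative sketch of sequential virtual focus-group testing.
# Each round tests a refined narrative and records the fault
# percentages mock jurors assigned to the plaintiff.

def mean_plaintiff_fault(panel_responses):
    """Average percentage of fault mock jurors assigned to the plaintiff."""
    return sum(panel_responses) / len(panel_responses)

# Four sequential panels run over ~16 months (numbers invented).
rounds = {
    "baseline (defend speeding/phone claims)":   [85, 80, 75, 80],
    "round 2 (less defense, some driver focus)": [70, 65, 60, 65],
    "round 3 (rules-of-the-road violations)":    [40, 35, 45, 40],
    "round 4 (driver's choices only + video)":   [20, 15, 25, 20],
}

previous = None
for narrative, responses in rounds.items():
    fault = mean_plaintiff_fault(responses)
    note = "" if previous is None else f" (shift: {previous - fault:.0f} pts)"
    print(f"{narrative}: {fault:.0f}% plaintiff fault{note}")
    previous = fault
```

Each iteration's output tells you whether the latest narrative refinement moved the needle, which is the feedback loop the speakers contrast with a one-shot mock trial.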
SPEAKER_00:This really links back to another key concept, which is cognitive load management. Lawyers, we love detail.
SPEAKER_01:We love it. We want to explain every single minute of the timeline. But jurors, jurors shut down when they're overwhelmed.
SPEAKER_00:And that benefits the defense.
SPEAKER_01:High cognitive load is a gift to the defense. When a juror is confused or overwhelmed, they look for the easiest exit. And the easiest exit is usually, you know what? It was just an accident. Nobody's really to blame.
SPEAKER_00:So the job of the decision architect is to simplify.
SPEAKER_01:You have to simplify that complex expert testimony into simple moral violations. You need to use what we call anchors, mental shortcuts that the jury can grab onto and hold on to when the testimony gets dense or confusing.
SPEAKER_00:Okay. So we've architected the case from intake, we've tested the narrative. Now let's talk about the people we're actually trying to persuade. Jury selection. For a long time, this has been driven by demographics. You know the thinking. We want women or we don't want engineers.
SPEAKER_01:The demographics trap. It's a major, major source of error in jury selection.
SPEAKER_00:I see attorneys assuming that an older white male is automatically going to be conservative and pro-defense.
SPEAKER_01:And that becomes a self-fulfilling prophecy. You start asking questions based on that assumption, you alienate that juror, and then you get the result you expected. But it wasn't based on reality, it was based on your own bias.
SPEAKER_00:So we need to move from demographics to what you call psychographics. What does that look like in practice?
SPEAKER_01:Psychographics focus on stable underlying attitudes and worldviews. We want to know: is this person authoritarian or are they empathetic? Do they believe in individual responsibility above all else, or do they see systemic issues at play?
SPEAKER_00:And that's not tied to age or race.
SPEAKER_01:Aaron Powell Not at all. You can have a 25-year-old barista who is a strict authoritarian, and you can have a 70-year-old veteran who is deeply empathetic. Demographics miss that completely.
SPEAKER_00:And there are tools now that can help with this. Things like the Jury Analyst simulator use machine intelligence to build these venue-specific personas based on psychographics. But how do you get this information out of potential jurors in a courtroom? Because we all know that standard question, can you be fair, is useless.
SPEAKER_01:It's worse than useless. It triggers something called social desirability bias. No one wants to stand up in a public courtroom and admit that they're unfair or biased.
SPEAKER_00:The social pressure is too great.
SPEAKER_01:It forces them to say yes. So you get zero useful data from that question.
SPEAKER_00:So how do we combat that? What's the solution?
SPEAKER_01:The solution is using supplemental juror questionnaires or SJQs that use scaled questions. So instead of asking for a binary yes or no on fairness.
SPEAKER_00:You reframe it. You ask: On a scale of one to ten, how hard would it be for you to assume the defendant is not guilty right now?
SPEAKER_01:That's brilliant. Because it removes the judgment. It lets them admit their bias without having to admit they're a bad person.
SPEAKER_00:Exactly. If someone circles a seven or an eight on difficulty, you know you have a problem, regardless of what they say out loud in open court. It forces the revelation of bias without attacking them.
SPEAKER_01:When you put all these pieces together, we're really painting a picture of a total transformation. And for the attorney, this must shift the entire emotional landscape of the job.
SPEAKER_00:Oh, it moves you from a state of constant anxiety to a state of control. The old model is just filled with uncertainty.
SPEAKER_01:You're lying awake at night wondering if the client is going to blow it on the stand or if the jury is going to buy the defense's story. You're managing your client based on hope. But with predictive modeling, with this kind of data, you know where the landmines are before discovery even ends. You're not guessing anymore.
SPEAKER_00:And this has to change settlement negotiations completely.
SPEAKER_01:It makes settlement an engineered outcome. You're not negotiating based on what you think the case is worth. You're negotiating based on simulated juror response curves from that specific venue.
SPEAKER_00:So you can actually show the defense data.
SPEAKER_01:You can show them data that says in this venue with this demographic, this specific damage category resonates at this level.
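[Editor's note: the "simulated juror response curves" idea can be sketched as summarizing a distribution of simulated awards into a negotiation band. The dollar figures below are invented; a real valuation would be built from venue-specific virtual panel data.]

```python
# Rough sketch: turn simulated per-juror award responses for one damage
# category in one hypothetical venue into an empirical valuation range.
import statistics

simulated_awards = [250_000, 400_000, 500_000, 650_000, 750_000,
                    800_000, 900_000, 1_100_000, 1_200_000, 1_500_000]

def valuation_range(awards):
    """Summarize a simulated response curve as a negotiation band."""
    qs = statistics.quantiles(awards, n=4)  # quartile cut points
    return {
        "conservative (25th pct)": qs[0],
        "median":                  qs[1],
        "aggressive (75th pct)":   qs[2],
    }

for label, value in valuation_range(simulated_awards).items():
    print(f"{label}: ${value:,.0f}")
```

This is the "arguing math" the conversation turns to next: a demand anchored to percentiles of a measured response curve rather than to an opinion.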
SPEAKER_00:Their numbers lose power when you have empirical valuation data. You're not arguing opinions anymore. You're arguing math. And ultimately, all of this allows you to achieve that core goal we talked about: winning the hearts, then the minds. You win the system one gut check.
SPEAKER_01:Because you have tested your narrative to make sure it creates the right emotional reaction. And then you provide the system two logic, the facts, the evidence to arm those jurors to fight for you in the deliberation room.
SPEAKER_00:It really does require abandoning the ego, though, doesn't it? You have to let go of that feeling of, I know this case.
SPEAKER_01:And embrace the humility of let's test what this case actually is.
SPEAKER_00:That's the hardest part for so many senior lawyers.
SPEAKER_01:Admitting that their intuition might be wrong. But the data shows again and again that even the most experienced, brilliant lawyers are subject to their own confirmation bias. The testing protects you from yourself.
SPEAKER_00:So as we bring this all together, let's look at that final recalibration. We have to redefine what a good case really is.
SPEAKER_01:It's no longer just a stack of paper with favorable facts.
SPEAKER_00:It's something more.
SPEAKER_01:A good case is one where the perception of the facts has been architected to align with human belief systems. It's a case where those invisible ceilings have been identified and removed through smart intake and linguistic framing.
SPEAKER_00:It's a case where the narrative has been inoculated against that dangerous drift. I think I want to leave our listeners with a final thought to mull over. We all have that inventory of cases sitting in the firm right now, the cases we're so sure about, the cases where the internal consensus is this is a winner. Ask yourself this. Where is your firm's internal consensus creating an invisible ceiling right now? Which case are you banking on that hasn't actually been tested against the reality of a modern jury? Scary question. But asking it might be the difference between a compromise settlement and a record breaking verdict. It's time to stop being just a presenter of evidence.
SPEAKER_01:And start being an architect of decisions. The science is there, it's up to us to use it.