MINDWORKS

Mini: The Data-Theory Balance (Steve Kozlowski, Tara Brown and Samantha Perry)

January 19, 2021 Daniel Serfaty

In the digital age, the way we collect data has been revolutionized. There are now a multitude of ways we can collect data, from a plethora of sources, all with astonishing speed. However, the speed at which we formulate new theories has not changed nearly as much. This raises the question: can theories keep up with data? Join host Daniel Serfaty for a MINDWORKS Mini featuring Dr. Steve Kozlowski, Dr. Tara Brown, and Dr. Samantha Perry as they tackle this provocative question.

Daniel Serfaty: I think, as in many other fields, we are seeing a major shift between data and theory. It's almost as if the data, or the data that is potentially available, is getting ahead of the theory. Before, we needed a theory first, and we would go and collect data in order to test hypotheses derived from that theory. Now the data is already here, and we are trying to build, in a sense, the theory or the guidance, the prescription, because the data is already here. That's an interesting tension that is pervasive these days in many scientific fields involving human behavior and performance.

Samantha Perry: When I think about some of [crosstalk 01:00:34] that we have, and some of the psychological phenomena that we've measured with self-reports, I feel like we've taken it for granted that that is how that construct is measured. But I think some of the theories lend themselves to more in-depth data than that, and we haven't really considered it in some of our theoretical development. We laid out what cohesion is, and it has an inherently longitudinal, inherently behavioral component as it emerges over time, but we measure it with snapshots, and that's something that we're comfortable with. Thinking back on the theory, can we not associate it with the data that we're capturing now in a much more realistic way? But it's difficult to make that case in the literature, even though it really does tie to the original theory. I just thought that that was-

Stephen Kozlowski: I would comment but I know Tara has something that she wants to say.

Daniel Serfaty: I would love to hear your comment after Tara's, Steve. By all means, Tara.

Tara Brown: I have a lot of thoughts on the data-theory balance. My personal experience is that the data that's available to us now is a bit ahead of the theory. And what that puts us in danger of, I think, is becoming too atheoretical. We're in this interesting tension of having to create theory. But as Sam said, there are concepts and conceptual information within the existing theory that we can't lose track of. One of the ways that we handle it, in thinking through these unobtrusive and novel measurement approaches for cohesion and other team states, is really taking a top-down and bottom-up approach: really grounding what behaviors or characteristics exist that align with how we conceptualize something like cohesion, and then matching those to the data that's available, to make sure that the indicators and the unobtrusive data that we are pulling into our measurement of cohesion are at least grounded in theory, even if the way that we compile them into the assessment is more data-driven or diverges from what we typically do within the literature.
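
[A minimal sketch of the top-down/bottom-up mapping Tara describes, assuming Python and entirely hypothetical indicator and data-stream names: theory-grounded cohesion indicators are declared first, and only those whose unobtrusive data streams were actually collected get passed downstream, so whatever a data-driven model later does with them remains traceable to a construct.]

```python
# A theory-first indicator registry: each unobtrusive signal is declared together
# with the cohesion facet it is assumed to reflect, and only indicators whose
# required data streams are actually available get used downstream.
# Indicator and stream names are hypothetical illustrations, not a validated taxonomy.

from dataclasses import dataclass

@dataclass(frozen=True)
class Indicator:
    name: str      # e.g., "reply_latency"
    facet: str     # the theoretical facet the signal is assumed to tap
    streams: tuple # unobtrusive data streams needed to compute it

THEORY_GROUNDED_INDICATORS = [
    Indicator("communication_density", facet="task cohesion", streams=("chat_log",)),
    Indicator("reply_latency", facet="task cohesion", streams=("chat_log",)),
    Indicator("speaking_turn_balance", facet="social cohesion", streams=("audio_diarization",)),
    Indicator("colocation_overlap", facet="social cohesion", streams=("badge_proximity",)),
]

def usable_indicators(available_streams):
    """Keep only indicators whose required data streams were actually collected."""
    available = set(available_streams)
    return [ind for ind in THEORY_GROUNDED_INDICATORS
            if available.issuperset(ind.streams)]

if __name__ == "__main__":
    collected = ["chat_log", "audio_diarization"]  # what this study happened to capture
    for ind in usable_indicators(collected):
        print(f"{ind.name} -> {ind.facet}")
```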

I think having that grounding within the literature, within the theory, is really important. If we become too atheoretical and say, we have all of this data available from teams, and we're just going to throw it into some machine learning algorithms, see what spits out, and call that cohesion, I think we are in danger of ending up in a place where we can't really explain what we're finding. But the other challenge is that even when you go through the process of developing these theoretically driven indicators and gathering data on them, it's still an extremely rich but complex set of data that, as organizational psychologists, I don't think we have a way of making sense of without bringing in data scientists and folks, like Steve was saying, who can help us think about that data in a way that allows us to think outside the box analytically.

But I think there's a decision point, and assumption upon assumption, that has to happen when you get that kind of data: how do you aggregate it? Not just to the team level, but across time. And what are the assumptions you're making that drive those decisions? I would say it's easy for us to fall back on "that's too complicated," and that's why I think our field continues to stick with the tried and true. But it's also the fact that we're bringing in these novel methods and approaches that are theoretically driven, but at a different level of granularity than we have typically used to measure these constructs. Therefore there's a lot of resistance in the journal and publication outlets: are we really getting at the same construct? Is what we're getting at really cohesion, or is it some behavioral result of cohesion that shouldn't really be called cohesion? I think we open up Pandora's box in a good way, but there are a lot of questions that emerge as soon as you start going down this innovative path.
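
[To make the aggregation point concrete, a small sketch assuming Python with pandas and hypothetical column names: member-level indicator readings are rolled up into team-by-window trajectories. The window length and the use of the mean as the composition function are exactly the kinds of assumptions Tara is flagging.]

```python
# Illustrative aggregation of individual-level indicator readings to team-level
# trajectories. The window length and the team mean are explicit modeling
# assumptions, not recommendations; column names are hypothetical.

import pandas as pd

def team_trajectory(readings: pd.DataFrame, window: str = "1D") -> pd.DataFrame:
    """Aggregate member-level readings into a team-by-window time series.

    readings: columns ['team_id', 'member_id', 'timestamp', 'indicator', 'value']
    window:   pandas offset alias, e.g. '1D' for daily bins (an assumption).
    """
    readings = readings.assign(timestamp=pd.to_datetime(readings["timestamp"]))
    return (
        readings
        .groupby(["team_id", "indicator", pd.Grouper(key="timestamp", freq=window)])
        ["value"]
        .mean()  # composition model: the team mean (one assumption among many)
        .reset_index(name="team_value")
    )

if __name__ == "__main__":
    demo = pd.DataFrame({
        "team_id": ["A", "A", "A", "A"],
        "member_id": [1, 2, 1, 2],
        "timestamp": ["2021-01-01 09:00", "2021-01-01 10:00",
                      "2021-01-02 09:00", "2021-01-02 10:00"],
        "indicator": ["reply_latency"] * 4,
        "value": [30.0, 50.0, 20.0, 40.0],
    })
    print(team_trajectory(demo))
```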

Stephen Kozlowski: I like to think of myself as a theorist, so theory should reign supreme. But I would also point out that methods constrain theory. Most of the theory in my own field is basically constrained by this pattern: you're going to turn your thinking into a hypothesis with measures of those constructs, and then you're going to use some correlation-based techniques. And the ability to correlate the data is, at base, limiting the way I think about how things work, which is why most of our theories are static and really don't think about how things play out over time.

How does a phenomenon emerge? That's not a correlation that you can examine. You really have to look at the underpinnings and some different ways of visualizing that data, or that phenomenon, as it manifests. Rather than insisting that theory has to lead everything, we have to appreciate where theory comes from. Where did Darwin come up with his theory of evolution? Not by sitting in a chair, drinking a scotch or whatever, and coming up with evolution. He observed, he collected a lot of data, and then he tried to make sense of the data.

I agree with Tara, there's a danger of relying too much on machine learning techniques where we don't know what the machine knows, so we don't know how it came to its conclusion. Of course, the quality of the data then becomes really critical. But there's value in having that data and using those and other techniques to try to figure out what in fact is going on, to begin to inform theory and, quite frankly, to get theorists thinking more dynamically. Because most of the theories are really static. Even when people think about dynamics, they think, here's the theory at time one, and at time two, and at time three, which is not dynamics. Dynamics is complex connections, feedback loops, things of that nature.
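
[One way to picture the dynamics Steve contrasts with time-one/time-two/time-three snapshots: a toy Python simulation, with invented coefficients, of a feedback loop in which cohesion and performance influence each other over many small time steps rather than at a few measurement occasions.]

```python
# Toy dynamic model: cohesion and performance feed back on each other over many
# small time steps. Coefficients are invented for illustration only; the point is
# the feedback structure, not the numbers.

def simulate(steps: int = 100, dt: float = 0.1):
    cohesion, performance = 0.2, 0.1  # arbitrary starting states on a 0..1 scale
    history = []
    for _ in range(steps):
        # Performance gains feed cohesion; cohesion supports performance; both decay.
        d_cohesion = 0.4 * performance - 0.1 * cohesion
        d_performance = 0.5 * cohesion - 0.2 * performance
        cohesion = min(1.0, max(0.0, cohesion + dt * d_cohesion))
        performance = min(1.0, max(0.0, performance + dt * d_performance))
        history.append((cohesion, performance))
    return history

if __name__ == "__main__":
    trajectory = simulate()
    print("final cohesion=%.2f, performance=%.2f" % trajectory[-1])
```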

I really think that the methods and the data can help push theory to begin to catch up with these techniques. And we're at that point; we're at that point where that needs to be happening. I think it's an exciting time if you're interested in the dynamics of phenomena and systems, because we are now beginning to see this kind of cross-informing between disciplines that really helps each of them out in ways that I certainly didn't get when I was trained as an I/O psychologist.

Daniel Serfaty: Yeah, I think the three of you make excellent points. This is a debate that is certainly not just about team research; it's not even just about psychological research. You see it again and again now in pharmaceutical research and in other places, where the data advocates, not the theorists, say quantity will trump quality. And there is an elegance, at least for those of us who were educated in the classical way, an elegance in theory that you don't have in a massive amount of data. But that tension, as you said, Steve, is very current. We can turn it into a creative tension, and it's a very exciting time to be a scientist, because now you actually have multiple tools at your disposal. You have the data, you have the theories, you have the models, and you have the methods. And all of these together can lead to a deeper understanding of teams.