Diagnosing The Workplace: Not Just An HR Podcast

Should We Use Personality Tests In Hiring And Promotions?

Roman 3 Season 3 Episode 18

In this episode, we explore the truth and reality behind personality tests and psychometrics. As great a tool as psychometrics are (learning styles, communication preferences, etc.), they are only good for the single purpose they were designed for.

Our prescription for this episode: avoid commercial adaptations of academically validated assessments; most are bad science. If you want something that will actually work, reach out to us!

To talk more about psychometrics and resources, reach out to us at info@roman3.ca or through our LinkedIn page at https://www.linkedin.com/company/roman3

Don't forget to sign up for our New Quarterly Newsletter that launched this fall!

About Our Hosts!
James is an experienced business coach with a specialization in HR management and talent attraction and retention. 

Coby is a skilled educator and has an extensive background in building workforce and organizational capacity. 

For a little more on our ideas and concepts, check out our Knowledge Suite or our YouTube Channel, Solutions Explained by Roman 3.

[ANNOUNCER]:

Breaking down everyday workplace issues and diagnosing the hidden sickness, not just the obvious symptom. Our hosts, James and Coby.

[COBY]:

Did we lose a patient?

[JAMES]:

No, that's just my lunch.

[COBY]:

Hey, thanks for joining us. I'm Coby, he's James,  and let's get started with a question. Should we use personality tests in hiring and promotions?

[JAMES]:

Okay, so I really have to start with a quick story, because my wife was asking me this morning what we were doing our podcast episode on, and I replied, very excitedly, that we were going to be talking about the use of psychometric assessments in talent acquisition and mobility. And I'm pretty sure that she fell asleep before I even finished the title of the episode. So "personality tests in hiring and promotions" is a much better question for this conversation. But really, on to the question itself. Should you be using them? No, because you're probably doing it wrong. Can you use them? Absolutely, no reason that you can't. Can they be effective? Absolutely. I think they can be a really useful tool in how we learn about people and provide different elements, whether that's training programs or development opportunities. So really, today, I think I want to focus our conversation on how psychometrics are commonly used, why we so often get them wrong, and whether we should use them in our recruitment processes and as part of our internal performance management and talent development programs.

[COBY]:

Right. And I think a good thing to start us off with is, you know, you listening maybe asking the question: hey, are you guys even qualified to talk about this? And that's a fair question, because one of the things we're going to rail against is people being unqualified in using them, customizing them, creating them, and the whole thing. So I think it's a fair question to ask of us. Our opinion is, of course we are, and the reason why we say that is that it's something we have a good background in. Specifically, and I'm going to pat myself on the back here, this has been a big part of my career, and the focus of my graduate degree was in using psychology and educational systems to improve workforce development and organizational efficiency. So this is well within the scope of the work I've done; the cognitive research and educational best practices I've worked with forever fit really nicely into it. And then you and I have built tools. You have a great aptitude and a technical background around the user experience, to improve, you know, testing reliability and completion rates and the whole thing. And we use adult learning principles in all the work that we do. We're very familiar with Universal Design for Learning. So this is something that we don't take lightly. We are constantly checking and rechecking and validating. We don't take lightly the fact that we say we're very qualified to talk about this, because it's so easy for people to assume that these tests are really easy to do: you know, I've done tests in school, or I have this kind of experience, I'm sure I can do it. But this is something that we take very, very seriously.

[JAMES]:

Well, we might as well just jump into the conversation, because that's a great setup for why we are talking about it. We use psychometrics in our work regularly. It's part of the assessment program that we use. We use them in management training tools. We use them in a number of different environments, and we'll get into that. But I do want to talk about how people are using them now, because most of the time, people are using some sort of personality test, which is a form of psychometric assessment, as part of the recruitment process. When they do that, they tend to default to a couple of primary, very commonly well-known programs like Myers-Briggs, DiSC, StrengthsFinder; True Colors or Personality Dimensions is another one that kind of gets tossed in the mix as well. And those are well-regarded, academically validated theories that people have built assessments around. And the problem that I have with just picking up a DiSC assessment and giving it to somebody as part of your recruitment process is that your result in no way, shape, or form is going to predict performance. Your Myers-Briggs personality style is in no way, shape, or form going to predict performance. Same with True Colors. Same with StrengthsFinder. Identifying these aspects of a person's personality, or how they approach things, even from a behavioral standpoint, is not going to predict whether or not that person is a good fit for the job, because it has no direct tie-in to job performance, job duties, job outcomes.

[COBY]:

Right.

[JAMES]:

So, I mean, I said in my intro: should you be using them? No, because you're probably doing it wrong. And if you've just bought Myers-Briggs, DiSC, or any one of the 1,000 variations that DiSC has, and you're just plugging it into your recruitment process and using that to inform your decisions, then stop, because you're doing it wrong.

[COBY]:

Well, the thing is, some people might say, well, one thing that you said was that these are academically valid theories. And some people might ask: are these legit? Do personality tests work? And I can say with a fair amount of confidence that a lot of them achieve the goal that they were created for,

[JAMES]:

Originally created for.

[COBY]:

What they were originally created for. That's not the same thing as saying they work. Because, again, with the research: there are a few different methods for assessing a test's accuracy and validity. And one common, really great check for accuracy is the test-retest method, where, basically, if someone takes the test multiple times, will the results be consistent? Right. And a lot of the original tests, created for their intended purpose a long time ago by the original designers, did well on that. Myers-Briggs is a good one, and DiSC; the two of them, as originally developed, had great test-retest reliability.
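
[EDITOR'S NOTE]:

The test-retest check Coby describes is usually quantified as a correlation between two sittings of the same assessment. Here is a minimal sketch of that computation in Python; the scores are made-up, illustrative numbers, not data from any real instrument.

```python
# Test-retest reliability: give the same people the same assessment twice
# and correlate the two sets of scores. The closer r is to 1.0, the more
# consistent the instrument is across sittings.
from statistics import correlation  # Pearson's r; Python 3.10+

# Hypothetical trait scores for six people, two sittings a month apart.
first_sitting = [12, 18, 7, 15, 9, 20]
second_sitting = [11, 19, 8, 14, 9, 21]

r = correlation(first_sitting, second_sitting)
print(f"test-retest reliability: r = {r:.2f}")
```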

[JAMES]:

I've done Myers-Briggs assessments many times, and I come up with the same result; it's usually one of two profiles, depending on the mood I'm in that day, whether I'm feeling more introverted or extroverted. So, I mean, you're right, but you're talking about the original intent of them. What is the original intent of a lot of these personality assessments?

[COBY]:

Well, a lot of them were created as self-awareness tools. They were for people doing a bit of self-discovery. They were about trying to find ways to improve intrapersonal intelligence. Not interpersonal intelligence, where you know other people, but intrapersonal intelligence, where you can learn more about yourself. And again, a lot of them were designed as applications of theories in personality psychology, as a way of helping people be placed into buckets, a way to smooth around the edges of the complicatedness that is a person. Right. So a lot of it is about trying to create a bit of that self-awareness, whether you're trying to change your habits or trying to identify how you can progress through therapeutic actions, making sure that you have enough self-awareness that you actually know how you can do things effectively. It's really about that. And so when people say these things don't work, that can be a valid criticism, because they're not designed for how many people are using them. So if you're someone listening thinking, yeah, well, I've taken those things and they're junk and they don't work, well, that could be. The test could be either not valid or not accurate, absolutely. Or the user experience can be so complicated and long and boring that you don't finish it, so you don't actually answer the questions truthfully and honestly. But probably the biggest reason people hate on personality tests is that they're overused for things beyond their intended purpose.

[JAMES]:

Yeah. Well, this is the point: they're being used to determine something that they were never designed to determine. And you cannot pigeonhole people based on an assessment. Like, one of the issues that I have: the theory behind DiSC is really cool. We've dug into it ourselves. We actually use a modified DiSC assessment in one of our assessment programs. But the problem I see is that because DiSC is so well known, and because it's, and this isn't the right term in the academic sense, but essentially open source, anybody can use it and modify it without a specific commercialization license.

[COBY]:

Well, yeah, so here's the thing. There is DiSC, capital D, small i, capital S, capital C, that is owned by the chain of ownership running from its development to today. That line is the one with the small "i". The problem is most people don't realize the difference between small-i DiSC and big-I DISC. Small i sticks more closely to the original intellectual property. Whereas big I is what you're talking about: someone cloned it. And that's the stuff that circulates, that people think is open source, because it's a clone of the original.

[JAMES]:

And it gets pretty close. But it's why you end up with these, like, I've attended business meetings and sessions where it's "what type of bird are you?", and it's essentially a DiSC clone where, instead of D, I, S, C, they slap a type of bird on it to indicate your personality. And that's why I think a lot of people are so frustrated with these assessments. People go quickly into commercializing their own and just selling a product, and they don't really care about the validity. As long as the person who takes the test can be convinced that it's reflective of themselves, then that validates the results in that person's mind. And I'm an eagle, because eagles are cool and powerful and I want to be cool and powerful.

[COBY]:

Yeah, absolutely. And you get a bit of what's called the halo bias and stuff like that when those kinds of things happen, and things are stacked in their favor: because, oh, I've attached to something, I'm going to assume that because I like this one thing, I like everything else. The whole idea of the different biases in play when those things happen. But the problem is, those specific ones, they're not necessarily clones of the original; they're clones of clones. And so you're almost playing the telephone game, where three people down you're hearing something different. That's where the validity and the accuracy fall apart, because what happens is a lot of people will sacrifice the validity and the accuracy for the user experience.

[JAMES]:

Yeah, it's like user validation and user experience are all that matter.

[COBY]:

Yeah. So it's easy to use, it's quick, it's flashy, so who cares if it works? That's kind of the attitude. It's easier to sell. Because, again, with the halo bias, the halo effect, there's that idea that it doesn't really matter if it works, because people like it, and that's kind of all that matters. But the thing, too, with the people doing bad versions that shouldn't be out there: it's a bad product, but it looks like it works. It's the performative solution. Right. And we are on record for hitting that kind of stuff. But even stuff that does come from the original. So, again, we use some stuff around emotional intelligence and values assessments and those types of things, and we use the original academic version of some of these tests. And one thing that we always try to be aware of, too, is how a lot of them, without using things like adult learning principles and Universal Design for Learning, are designed for a specific audience or a specific target. They're not designed for mass consumption. So you can get into inadvertently discriminating against people, or just making it not equally accessible to everybody. And again, that's not the test's fault originally, because it was designed for a specific use. But then we take it out of that, make it universal, and attach people's careers and hiring opportunities to it. It's just a breeding ground for discrimination.

[JAMES]:

Yeah. I like this aspect of it, because this is the same problem that we have with any real standardized testing. Right. Because oftentimes, unless the questions are crafted with incredible care, even using common jargon or colloquialisms can cause false positives or false negatives in the test taking. Right. You know, if you're using common slang that you think is common to your particular area or region, and somebody is not familiar with that, it can throw off the entire basis of the test. Like, some of these tests will use language like, I don't know, "I've got great get up and go." Right. For a lot of us, we know what that means. We don't really need to think about the intent behind that question. But what if English is not your first language? Right. And for a lot of Canadians, it's not.

[COBY]:

Yeah. And some colloquialisms are generational.

[JAMES]:

That's right.

[COBY]:

So again, that's one of the things, too. As someone that really tries to embed as much Universal Design for Learning as possible into all of the work that we do, being really aware of the accessibility of stuff, right, is something that we take very seriously as well. That's always a challenge. But when colloquialisms, and like you said, "get up and go" is a pretty good one, are tied to whether I get a job or not, that is huge, and it's problematic. Right.

[JAMES]:

Yep. And the other thing: think about the mentality that we have when we're going through that recruitment process as an applicant. It doesn't really matter what level of job you're applying for. The recruitment process is stressful.

[COBY]:

Oh, you got it right there.

[JAMES]:

You are put on the spot. You have to, you know, you want to perform your best,  you want to give the best impression. There's a lot of pressure on you already.

[COBY]:

Right.

[JAMES]:

And then you get into a situation where you are forced to go through this personality test. Or it may be a personality test, it may be something else, maybe another form of psychometric, that is going to impact your ability to get this job.

[COBY]:

Right.

[JAMES]:

What happens when the questions are not clear? Right. What happens? Because, I mean, I've seen a lot of questions around these things. Like, "my co-workers would say that I do this," right? Well, no. Well, my co-workers never actually said that. So if they've never said that, am I supposed to respond to it? If I think they would say that, how am I supposed to respond to it? You know, all of these different interpretations mess with the validity of the question itself.

[COBY]:

Right. But they also increase the stress response. So, cortisol is the stress hormone, the chemical that's in our brains when we're experiencing stress. When our cortisol levels are really high, that's when we have that stress fog. Right. We don't have a clear mind. And that's one of the things that I despise about tests in the recruitment process: it's, hey there, stressed-out job applicant, take this test, because it's going to define whether or not we want to hire you. Right. So you're taking someone who's already experiencing high levels of cortisol and has the stress response, and then you're tying how likely they are to move on to the next phase to this thing that, like you say, they might be confused over. Right?

[JAMES]:

Yeah.

[COBY]:

And again, something that, with a clear head, they might have been able to go, well, I'm pretty sure what they're saying is this. With a stress response, it could spin them out or get them frustrated. And some of them are timed, too, which is always weird. If you're doing any of this, don't do timed tests. That's just stupid. But, you know, then they're on a clock as well. And so, again, you're stacking up all of these problematic pieces.

[JAMES]:

And you can see some people saying, well, I want to put my new applicants under pressure because I want to see how they perform. Well, if…

[COBY]:

The job requires them to take personality tests under stress? Absolutely.

[JAMES]:

Yeah. But this is the thing. I could see somebody making that argument. Right. Well, it's okay to stress them out because the job is stressful. But it still has no bearing on the job itself. So if you have a very stressful environment, then you need to prepare people for that. If you don't have an exceptionally stressful environment, and you're using these things, and you're saying to yourself, well, we like to have this information, but we don't really use it in our hiring practice, then why are you forcing somebody to go through it at the recruitment stage? Why are you forcing them to do it at the height of stress and anxiety? If it's just good information for you to have, whether you use that as part of team-building exercises or, you know, helping managers to understand their teams better, cool. Do it during the onboarding.

[COBY]:

Yes, it's an onboarding tool.

[JAMES]:

It's an onboarding activity, not a recruitment activity.

[COBY]:

Absolutely. And the other thing, and I think a really important thing for people to be aware of, is that if you are using psychometrics, if you're customizing them, if you're hiring someone to build them, an essential aspect is that you have to have an equal balance of accuracy and validity with user experience. Because those are the two vital things that are needed to make sure that the test works and that people complete it with the clearest mind.

[JAMES]:

Yeah.

[COBY]:

And one of the things that we see is the shift too far one way. Like what we were saying before: if they're modifying the Myers-Briggs and the DiSC, copies of copies, and making them about, like, birds, and trying to tap into the halo effect and those kinds of things, they're leaning too heavily on the user experience.

[JAMES]:

Yeah.

[COBY]:

But then others are like, well, no, we actually hired psychologists and a lab to do this. But then they create tests that maybe are too long, too complicated, and that creates test-taker fatigue, or…

[JAMES]:

They're not using plain language, which creates confusion about the intent of the question.

[COBY]:

It can also create test-taker fatigue, for sure. Yeah. Because the thing is, with test-taker fatigue, if the test is too long or too convoluted, if the user experience is not held at equal weight with the validity and the accuracy, then people will just want to get it over with: okay, I'm getting tired of this, let me just scroll through. I'm going to answer, but I'm not going to give it the thought and the depth that I need to. Which throws off the score, throws off the accuracy, throws off the validity. Right. So that's a major problem. And when you add the stress response to it, it just compounds it.

[JAMES]:

Yeah, I've experienced that myself. I find psychometrics really interesting. I've done a ton of them. We use them ourselves. I kind of like them as a tool. And I've gone through, fairly recently, testing out software, testing out different programs and stuff, that had a psychometric as part of the process. And it was like 73 questions of ranked statements, and by, you know, halfway through, even for somebody who enjoys this stuff, it was: I'm bored. I just want this to end so that I can see the results.

[COBY]:

Right.

[JAMES]:

And that is a legitimate problem: trying to make them too comprehensive, trying to get to validity by stuffing them full of more questions rather than making the questions better.

[COBY]:

Right. And that's just it. One of the things that we always aim for, whenever we're brought in to modify or recreate someone's psychometrics for whatever reason, is: we want it to be as long as it can be without being too cumbersome, because we know that too short a test doesn't have accuracy or validity. But there's stuff you can do with questions, like getting multiple data points out of single questions, to limit the number and eliminate the test-taker fatigue while keeping it just as valid. But that's the difference between just pulling something off the shelf and having someone that's actually, you know, trained and knows what they're doing. Right.

[JAMES]:

Well, I mean, how many times have we railed against boxed solutions of any variety? And again, this is another example of it. It's not that these tests are inherently wrong. It's not that Myers-Briggs is a bad assessment, or that DiSC is a bad assessment, or StrengthsFinder is a bad assessment. It's: are you picking up a boxed program and trying to force a square peg into a round hole? Right. It's that you're using it wrong, so stop using it in hiring.

[COBY]:

Right. So to summarize the thoughts around hiring before we move on a little bit: I think the core thing is that many of these tests are not designed for hiring. It's not why they were created. And because of that, some people created commercial adaptations of the original tests. And the problem is that a lot of those don't meet the same rigor for accuracy and validity as the original tests. But one thing that we have seen is these commercialized adaptations marketed, and I'm saying that in air quotes for those who can't see me, as "academically valid", because, well, the original DiSC assessment is academically valid and this is an adaptation. That's what they're marketing. But their test isn't the same.

[JAMES]:

It's like the made-for-TV movie based on a true story.

[COBY]:

Right. Based on.

[JAMES]:

It's based on academically validated theories, but it has no basis in reality.

[COBY]:

Yeah. So, long story short, when it comes to hiring, my opinion, and I'm gonna say my professional opinion, is that using personality psychometrics for hiring is bad science, and people shouldn't do it.

[JAMES]:

Yeah. I can't think of a situation where there's a direct relevance to the job. Right. I think one intended use is that companies think of it as: we want to learn more about you. And if you want to learn more about your employees, psychometrics can be an excellent tool to help you understand your employees better. So if that's the case, just move it to your onboarding process and you will have a much better result overall. But that also leads us nicely into the next piece, which is: okay, if hiring is out, how can we use them effectively? And psychometrics as part of talent development is, in my opinion, a great tool, especially when you're getting into leadership capacity, capabilities, leadership styles, understanding how somebody approaches different situations. Their personality, these aspects, even emotional intelligence, can have a really substantial impact on the way that a person leads others. Right. And so understanding a person's baseline, so that you can put supports in place for them to continue their upward mobility through their career and through your organization, I see as a positive use of psychometric assessments.

[COBY]:

Yeah, one thing that I do think is probably a good sign is when companies look to improve their hiring, promotions, and talent development by adding elements like psychometrics. Like: oh, we want to do something, we want to evolve our practices beyond the standard, traditional ones. That is a great goal, and I applaud the goal. I just always caution people not to grab an off-the-shelf solution, or to think that a commercialized product is as good as it's marketed to be. Right.

[JAMES]:

Yeah.

[COBY]:

But you're right. I do think the point there is that there is a very good use for psychometrics. I mean, we use them and we build them and we customize them, and we validate a lot of them for other people. So there is a real value to it. But again, it's a tool that has to be used right. That's kind of, I think, the thesis statement: maybe the answer to the question should be, it's a tool that has to be used right. And so, as much as I don't think there's a real role for them in hiring, maybe there's a 1% chance that there are situations we haven't thought of where it could be used. I don't want to…

[JAMES]:

Or somebody has done an excellent job of tying a customized psychometric to actual job functions and job duties. That's possible. I haven't seen it, but I'm not saying it's not possible. If that's the case, great, good on you. And I'd love to actually hear about what you're doing, because that sounds flipping cool.

[COBY]:

Yeah, absolutely. But I think as far as talent development, even internal mobility like promotions, there is a place for them. But the place is that the assessments should be used as baseline profiles, benchmarking where somebody is today, not as the deciding factor of a person's fit. And that's something that we just have to get right out there. That's the role of it. Right. Because, like I said, their intended purpose was about self-reflection, self-awareness, self-discovery.

[JAMES]:

If you understand more about yourself and where your strengths and weaknesses lie, you can build on your strengths and you can work to mitigate your weaknesses.

[COBY]:

Yeah, I mean, a lot of psychology is really putting people into buckets. That's kind of a reductive statement, but if you boil it down, that's kind of what it comes to. It's about identification of trends, of labels, of, again, putting people in buckets, and then figuring out: if they're in this bucket, then there's stuff we can do with them, we can understand them better. And that's really the core of it. Right. But it's not about saying, well, once you're put in that bucket, it's immovable. It's really about: here is my current baseline understanding of you, and now every interaction may be informed by that, and tested and retested and evolved and modified as the relationship grows. Not: this is who you are, and it's unchangeable. And that's the problem with hiring: you don't have the time, the experience, the interactions to evolve from that baseline understanding. It's: you're labeled, so that's who you are; we don't want that label; you're out. But that's the idea of what psychometrics are supposed to be. And again, one thing we probably could have done a little better job of early on is clarifying the different kinds of psychometrics. The personality ones are what we started off with. But learning styles: there's absolutely tons of validity around learning styles, learning preferences. There's communication styles…

[JAMES]:

And if you're doing internal training and development, understanding  how people learn best allows you to tailor your training in the most effective manner  for the greatest amount of knowledge transfer. Right. Wonderful application of a psychometric.

[COBY]:

Right. There's communication styles, there's preferences around how people feel accepted and acknowledged, there's different team roles and how people can actually fit into a more natural role as part of a complex team. There's lots of different types of them. Personality is one of the most common; it's a bit of a bucket term. But really, there are so many different other psychometrics, and all of them are really about creating a mirror to help better understand ourselves and understand those around us. And that really is the answer, if you want to know how to use them: if you're in a situation where you, as the organization, need to know people better and create a deeper awareness of who they are as an individual, as a person, and/or you as a person want to know more about yourself and do a bit of self-discovery, self-awareness work, that is the time to use them. And promotions and talent mobility and talent development are ripe areas for that type of self-awareness creation.

[JAMES]:

Yeah. For me, in my opinion, the two best use cases in the workplace are like you said. I think there's a tremendous amount of value for a manager or a leader to understand themselves and understand their teams. Because, whether you're talking communication styles, personality, behavioral assessment, whatever, the more you understand yourself, how you default to responding to things, and how your team works, the better team dynamic you're going to create, and you're going to have better performance outcomes. That to me is one fantastic use case, as you said. The other one is really creating that benchmark of: okay, you know what, I see Coby, he's been working hard. I think that with a little bit of training and support, he would make an excellent candidate for this role that we know we're going to have to replace in six months' time, a year's time. So we're starting to groom Coby to take on more responsibilities. We need to create a baseline to understand where he's at, compare that against where we want him to be, and then put supports in place to make sure that over the next six months, or whatever that timeframe is, Coby is able to fully jump into that new role and hit the ground running.

[COBY]:

Yeah, absolutely. And that's the great thing. Because when we're looking at promotions and talent mobility within our organizations, understanding what gaps employees have today, that they need to address or that should be part of their development, to get them to where we want them to be, is a great use of it. Because we already do technical assessments: they don't have this training, they don't have this experience, they don't have this much time on the machines. That's something that we do already. But where psychometrics are so valuable is that they allow us to assess the employability side of it. To assess things like emotional intelligence, which is kind of a loose, hard-to-grab-onto, nebulous, vaporous thing. Right. Or things around communication styles, or how they handle autonomy, or what motivates them and creates a sense of purpose in them. These types of very nebulous things can be captured with well-defined and well-validated psychometrics. Right?

[JAMES]:

Yeah. Even team dynamics is a big one as well. Right: their preferred roles, how they tend to operate within a team, whether they take a more authoritarian style. There are so many great uses of psychometrics that it's really irritating to see them used poorly.

[COBY]:

Yeah. And the thing that I really hate is, if you look up psychometrics, you'll see lots of comments and lots of people's opinions about how they don't work, how they're junk science. Given the way they get used, and the iterations of the versions out there, I'm gonna say a lot of them are bad science, but it taints the real use.

[JAMES]:

The real good ones. The intended use versus the actual use are miles apart.

[COBY]:

Well, that's as fair as saying screwdrivers are awful: I tried to hammer in these nails with the screwdriver, I tried to cut a piece of wood with the screwdriver, the screwdriver is a piece of junk. And that's what we're seeing out there, right? But it's the idea of having people that know how and when to use them, how to customize them, how to modify them, how to validate them. Having an expert to help you is the game changer, compared to having someone that isn't one. So, I mean, that's kind of an obvious statement, but that's the situation that we're in. These are not anyone-can-use things. These are things that require experience and background and insight and, you know, interpreting them properly…

[JAMES]:

And then creating a plan from that requires a depth of knowledge that doesn't come with a prepackaged system.

[COBY]:

No. Okay, so we kind of alluded to it, but I think it's helpful to state it. We said a lot of things don't work. We said there are environments that they do work in. But what does work in those environments? What kinds of tests? If I'm listening, what's something that could be done if we want to actually explore using these effectively? What are the things that we should know about? And I think one good, almost just little piece of information for people is this: when the assessments are like, "by answering this kind of vague statement, it means something unrelated," where they're trying to tap into your subconscious, those types of tests are really not that good. Because the cognitive pinball you have to be able to predict for that is, again… I've seen tests where it's like, pick one of these four animals, and if you pick the oxen, then that means that you're a workhorse, or whatever.

[JAMES]:

So yeah, you pick an animal, and because you chose a particular animal out of a very limited selection, that automatically means that you have some particular trait or preference. Yeah. That's dumb.

[COBY]:

Yeah, those are junk science. Avoid those.

[JAMES]:

If the thing reads like a Facebook "find out your Harry Potter character" quiz, you've probably bought into a really crappy system.

[COBY]:

Yeah. I mean, the unsexy, uncomplicated answer is that the more obvious what the question is going after, the more accurate the question is.

[JAMES]:

Yeah, right.

[COBY]:

Like, "I like working in a group." What does that tell you about the person?

[JAMES]:

That the person likes working in a group.

[COBY]:

Yeah, I mean, that's how assessments should be. They really should be that literal, because there are no gotcha questions. Or if there are, it's bad science.

[JAMES]:

Yeah, but I mean, you can have those obvious statements. So one of the methods that I learned from you, when we really started developing these years ago, is the idea of preferred ranked statements. Right. You may have something as simple as "I prefer to work as part of a team" compared against another statement, and then, over time, all of these statements are compared against each other. So you're reading the same, you know, 10 or 12 statements over and over and over again, and there might be like 50 questions or more. But by going over it and choosing the same "I like to work as part of a team" when it's compared against other types of statements, and it comes out on top, then yeah, you really do prefer to work as part of a team. Right. It's that type of reinforcement that can make it work. It sounds really simple because it is.
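
[EDITOR'S NOTE]:

To make the mechanics of that concrete: pairing N statements against each other yields N(N-1)/2 forced choices, and the tally of "wins" per statement produces the ranking James describes. A minimal sketch in Python; the statements and the demo chooser are purely illustrative, not from any real instrument.

```python
# Pairwise preference inventory: each statement is paired against every
# other statement, the taker picks one from each pair, and the win tally
# ranks their preferences. A true preference keeps resurfacing across
# many comparisons, which is the reinforcement described above.
from itertools import combinations
from collections import Counter

STATEMENTS = [
    "I prefer to work as part of a team",
    "I prefer clear written instructions",
    "I prefer to set my own schedule",
    "I prefer frequent feedback",
]

def run_inventory(choose):
    """choose(a, b) returns whichever statement the taker prefers."""
    tally = Counter({s: 0 for s in STATEMENTS})  # start everyone at zero
    for a, b in combinations(STATEMENTS, 2):     # 4 statements -> 6 pairs
        tally[choose(a, b)] += 1
    return tally.most_common()                   # most wins = strongest

# Demo: a taker who always favours the teamwork statement when shown.
for statement, wins in run_inventory(lambda a, b: a if "team" in a else b):
    print(f"{wins} win(s): {statement}")
```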

[COBY]:

Well, that's just it. So when we build different types of assessments for people, or we build them for ourselves, because we have some of our own proprietary assessments: preference inventories, which are what you're referring to, are a great, simple, effective, accurate method with a high test-retest result that works. And the thing is, there's an art form in crafting the questions, and there's an art form in balancing the question ratio and the user experience. It's not as simple as just asking a bunch of questions; there's an art form to it, and there's science behind it. But the more conceptually simple the assessment approach is, typically, the more academically valid it is. Right. But again, just having good questions doesn't necessarily make a good test. And what I mean by that is that it's not just about the questions themselves; it's about how the questions are asked, how accessible the language is. Right. For the accuracy of the results, you don't want too many assumptions baked into what the results mean, and you want to make sure you're using things like Universal Design for Learning so it's equally accessible to everyone, regardless of whether English is your first language or not. And the idea is to make sure we're not leaving so much open to inference, or to the kind of vague questions you were describing earlier, the "my co-workers say" kind of questions. Well, what if they don't say it? Or what if they might say it? Or what if they say something similar? It's not it.

[JAMES]:

How would I know if they're going to say that?

[COBY]:

Right, right. Removing a lot of those pieces is just as important as what the questions themselves are. Right. Because you want there to be, as much as you possibly can, only one way to interpret the question, because that improves the accuracy and the validity of it. Right.

[JAMES]:

So if they question the question, then the question's flawed.

[COBY]:

Exactly. So part of it, too, is that it's not just about having a simple approach. Like we talked about, things like preference inventories and ranking systems and those kinds of things: that methodology has a lot of simplicity and validity to it, but it doesn't do all the heavy lifting for you. Right. And again, go back to my point from earlier: you have to have equal priority of the validity and the accuracy with the user experience. And that's not an easy thing to do. And that's why you and I work together on these types of things. Because, yes, I can ensure that we're meeting the accuracy and the validity and the psychological and adult learning principles of it. But you bring it back to reality, to make sure that it's something the user can finish, something the user can answer with a clear mind.

[JAMES]:

Well, and it's the interpretation. Because when you're writing, I mean, we've had this argument, well, discussion, argument, many, many times. When you're writing, when you create, you, Coby, you, anybody, you have a particular intent in mind, and it's really hard to objectively take a step back and look at how other people might interpret that question. Right. We've had this discussion many, many times when we've been working on psychometrics, where you go hard into the science behind it and crafting the questions, and we fight about it. Well, no, this question doesn't make sense. I don't read it the same way that you read it. Right.

[COBY]:

I think you and I probably spend 20% of our working time together arguing about wordsmithing.

[JAMES]:

Yeah, probably. Like, legitimately, it's probably more like 22%. Can we spend 20% of the time in this podcast arguing about wordsmithing?

[COBY]:

But the thing is, that's the kind of care that has to go into it: making sure that you're not just putting it out there because it makes sense to you, and then moving on. Crafting this stuff, again, there's an art to it. Right. It's not just science; it's art.

[JAMES]:

The language that we use is very important. Communication is very important.  And communication needs to be tailored to your audience. So, if you don't know the very specific  target audience that you are trying to communicate with, you're probably not doing it as effectively  as you could be. Right. Which is why we spend so much time arguing about a particular word.

[COBY]:

Well, one thing you remind me of a lot, usually as you're massaging the stress headache away while we're in these arguments, is: it doesn't matter how good the test is if they're too stressed to finish it.

[JAMES]:

Yeah.

[COBY]:

And that is so accurate, because that's just it. Right. That's why the equal priority of accuracy and validity and user experience has to be ingrained in anything that you build. And that's one of the things, going back to why so many of these commercialized adaptations of tests don't work, why so many things on the market right now are bad science: they lean one way or the other. Some of them ignore one side entirely; some of them are just the one thing. Right. And if that's not taken into consideration, then the tests are bad science. So, again, going back to the original question: should we be using personality tests in hiring and promotions? In hiring, I think our answer is no. And in promotions, yes, if you're using them as baseline profiles.

[JAMES]:

You're using it as a way to understand people, not as a decision-making tool.

[COBY]:

Right, absolutely. And there's so much power that can come from a good psychometric and understanding people. And that's the thing: psychometrics are really great in talent mobility, helping people scale through their organization, in promotions and talent development. Absolutely. And that's where they should be living in our businesses. They should not be something that a job applicant ever sees.

[JAMES]:

Yeah. I agree.

[COBY]:

Sure. Okay. I think that's kind of the summary right there.

[JAMES]:

That's a good summary. Yeah, yeah. Don't do it in hiring.

[COBY]:

Yeah.

[JAMES]:

If you want to learn more about your incoming employees, use it as part of your onboarding process to understand people better. If you want to use it as part of a performance management and internal succession planning, upward mobility, talent development piece? Cool. Great. Use it to understand people. Don't use it to make decisions about people.

[COBY]:

Right. And if you want to know the difference between a good test and a bad test, it's really going to be a matter of whether the UX, the user experience, is given equal priority with the validity and the accuracy of the test itself.

[JAMES]:

Cool.

[COBY]:

There you go. Yeah. So yeah, I don't know if we need to dig any more.

[JAMES]:

Done.

[COBY]:

All right. So that about does it for us. For a full archive of the podcast, and to access the video version hosted on our YouTube channel, visit Roman3.ca/podcast. Thanks for joining us.

[ANNOUNCER]:

For more information on topics like these, don't forget to visit us at Roman3.ca. Side effects of this podcast may include improved retention, high productivity, increased market share, employees breaking out in spontaneous dance, dry mouth, aversion to the sound of James' voice, desire to find a better podcast…
