Offer Accepted
Welcome to Offer Accepted, the podcast that elevates your recruiting game. Your host, Shannon Ogborn, interviews top Talent Acquisition Leaders, uncovering their secrets to building and leading successful recruiting teams. Gain valuable insights and actionable advice, from analyzing cutting-edge metrics to claiming your seat at the table.
Safeguarding Interviews from ChatGPT Misuse with Aline Lerner @ interviewing.io and Michael Mroczka [Reshare]
How can recruiters keep up in an AI-driven hiring world where technical interviews face challenges like cheating using ChatGPT?
Aline Lerner, Founder & CEO of interviewing.io, and Michael Mroczka, Software Engineer and Dedicated Coaching Interviewer, explore the evolving world of recruiting and the surprising implications of AI tools like ChatGPT on technical hiring.
In this conversation with Shannon, Aline and Michael share insights from their groundbreaking study on cheating in interviews, the systemic flaws that encourage it, and why rethinking your interview questions is critical to safeguarding hiring integrity. They also share actionable strategies to create fair, efficient processes that work for both candidates and hiring managers.
Key Takeaways:
- Adapt interview processes: With tools like ChatGPT making it easier for candidates to cheat, traditional technical interview questions are becoming less effective. Shifting to customized, real-world problems gives you a more accurate assessment of skills and also reduces the ability of a candidate to misuse AI.
- Prioritize fairness and transparency: Candidates often feel disenchanted by outdated or unfair processes. Thoughtful, well-designed interview questions not only improve hiring outcomes but also help build trust and create a better candidate experience.
- Custom questions drive better hiring decisions: Verbatim questions from resources like LeetCode are more susceptible to AI-assisted cheating. Instead, tailor your questions to be rooted in actual work challenges so you get stronger signals about the candidate’s capabilities.
Timestamps:
(00:00) Introduction
(01:03) Overview of interviewing.io
(03:12) Challenges in talent acquisition
(04:26) AI in interviews and its implications for technical hiring
(07:10) How technical interviews have become an arms race of memorization
(10:13) Designing a study to test the effectiveness of ChatGPT in technical interviews
(12:07) Verbatim vs. modified vs. custom questions
(16:27) The tools making cheating easier
(19:29) Rethinking interview questions to reduce AI vulnerability
(23:58) Real-world questions for candidates and hiring teams
(28:06) How often teams should refresh their questions and make them scalable
(32:01) Engaging interviewers with custom questions
(33:31) Incentivizing interviewers and making hiring collaborative
(39:00) Balancing fairness, empathy, and efficiency in modern recruiting
(42:36) Advice on achieving hiring excellence
Aline Lerner [00:00:00]:
To me, excellence means taking a little longer to read people's resumes and looking not just at the brands, but looking at what I think is called distance traveled. Where did they start and where did they end up? Did they kind of have an outsized mobility from point A to point B? Even if they started somewhere that's not very impressive, is it impressive that they got to where they got?
Shannon Ogborn [00:00:25]:
Welcome to Offer Accepted, the podcast that elevates your recruiting game. I'm your host, Shannon Ogborn. Join us for conversations with talent leaders, executives, and more to uncover the secrets to building and leading successful talent acquisition teams. Gain valuable insights and actionable advice from analyzing cutting-edge metrics to confidently claiming your seat at the table. Let's get started. Hello and welcome to another episode of Offer Accepted. I'm Shannon Ogborn, your host and this episode is brought to you by Ashby the all-in-one recruiting platform. Empowering ambitious teams from seed to IPO and beyond.
Shannon Ogborn [00:01:03]:
I'm really excited for the folks that are here today and the topic we have: Aline Lerner and Michael Mroczka. Aline is founder and CEO of interviewing.io, an anonymous mock interview and recruiting platform that has helped tens of thousands of engineers practice technical interviewing and land their dream jobs. And Michael is a Google software engineer and one of the highest-rated mentors at interviewing.io. He has a decade of coaching experience, having helped engineers get into popular tech companies like Google and Facebook. And after receiving multiple offers from tech companies early in his career, he really enjoys teaching others proven techniques to pass technical interviews. Michael, Aline, thank you so much for being here with us today.
Aline Lerner [00:01:46]:
Amazing pleasure to be here.
Michael Mroczka [00:01:47]:
Thanks for having us.
Shannon Ogborn [00:01:48]:
So just to give folks a little bit of a brief background on you all or what you're currently up to, let's start with Aline, and then we will go to Michael.
Aline Lerner [00:01:57]:
Well, I've been running interviewing.io for the last nine years. Our mission is to make hiring efficient and fair. And what that means to us is making it so that great engineers, regardless of how they look on paper, can have access to any company that they want. It also means cutting down on time to hire and cost per hire and helping companies talk to the right people.
Shannon Ogborn [00:02:20]:
Yes, I love that. What about you, Michael?
Michael Mroczka [00:02:22]:
Yeah, I'm one of the interviewers on interviewing.io, and the goal is just to help people get hired faster. I think a lot of people spend a lot of time studying and doing interview prep. Honestly, it's really effective to just get in front of somebody and get feedback from companies or people at companies. So it's like, what was I missing?
Shannon Ogborn [00:02:43]:
Totally. Well, I'm super excited for both of your perspectives. But Michael, you're actually the first software engineer who has been on the podcast. So I think people will be really excited to get some really interesting perspectives here. But, Aline, I know that you all write a lot of articles and blogs and you come across certain challenges that are happening in the space today. What big challenges do you think are really facing the TA space as of late?
Aline Lerner [00:03:12]:
I can think of two big ones and I think we'll be talking about one of them today. But I think the other one is also really important to mention. And I got the numbers for this actually from the Ashby report, so it's coming full circle. I think the first big problem is these days, because of the vast and very unfortunate recruiter layoffs, there are fewer recruiters dealing with a lot more work, both in terms of inbound candidate volume and sourcing. I think for inbound, it's like there are 2x or 3x fewer recruiters dealing with like 2x the candidate volume. So that's insane. And that requires a lot of workflow changes. The second thing is the effect of AI on recruiting, I think both good and bad.
Aline Lerner [00:04:01]:
But one thing that Mike and I have learned through some of the experiments we've done at interviewing.io is that it's surprisingly easy to cheat in technical interviews using tools like ChatGPT. We don't know how many people are cheating, but we expect those numbers are going to go up and it's probably already higher than you think. So how will interview processes have to change to accommodate that?
Shannon Ogborn [00:04:26]:
Yeah, that's so interesting because in another one of our reports, so I think you were talking about the recruiter productivity report, and in another one of our recent reports about candidate sourcing, we found that folks who used Ashby's AI tokens for outreach had a significantly higher response rate. And so it's this interesting push and pull of should I be scared of AI? Should I not be scared of AI? And I personally think, and I would love to hear both of your thoughts, it's just an eyes wide open situation. You have to really understand the good, the bad, the ugly. You have to understand what you're dealing with and work around that, not avoid it. But I think if you go at it kind of full force, get into it and know the circumstances, you're just going to be better off.
Aline Lerner [00:05:13]:
I couldn't have said it better myself.
Shannon Ogborn [00:05:15]:
Amazing. Well, I would love to get both of your perspectives on this, especially when it comes to cheating and cheating in technical interviews, there's some roles that I think it would be a little bit more difficult for. And I know the study that you all did was really focused on technical interviews. Michael, maybe you first. I would be curious to hear about why do you feel like this is an important topic to focus on, or why should recruiting teams really, and hiring managers really be looking into how this impacts their team?
Michael Mroczka [00:05:48]:
Yeah, for decades now, software engineers specifically have been given technical interviews to go through, and it's been a contested way to sort of screen candidates, but still very effective. We ask technical questions, and depending on how the person does, it gives us really strong signal as to how they'll do on the job. And the issue with that now is anyone can go and ask the question to ChatGPT and get an answer. And if it's a publicly available question, I'm tipping off what the study reveals, but if it's a publicly available question, ChatGPT is really good at answering it, and that's really dangerous for hiring. So if we have people hiring based off of this and they're not carefully checking if people are cheating, they're going to get a lot of maybe just unfit and unqualified people for jobs, which isn't good for anybody, a hundred percent.
Shannon Ogborn [00:06:34]:
I like that you use the word contested. That's probably the best word that I've heard used for these types of technical interviews. When I was working with candidates, they would push back and say, this doesn't demonstrate job performance. I'm like, but so much data, Google's included, suggests that it does. So contested. Amazing. Love that.
Shannon Ogborn [00:06:52]:
It's such an interesting thing. And I think we're going to get into the results here in a second and review what happened and what it means for recruiting teams. But I would love if one of you could walk through what the experiment was, what the setup was, and what you were really hoping to answer by doing the experiment.
Aline Lerner [00:07:10]:
I can start. And Mike, please jump in. So I think contested is the right word. A lot of our users have very strong feelings about these kinds of interviews, and I think a lot of our interviewers, including Mike and others, do as well. I loved what you said about going into things with your eyes open, because I think interviewing is similar. Right. Technical interviews are a tool and they can be used well or they can be used poorly. One of the unfortunate things that we've seen in recent years, especially as the market has gotten more and more difficult for candidates and recruiters as well, is that interviews have gotten a little bit more ridiculous.
Aline Lerner [00:07:52]:
I would say that there's a little bit of an arms race that comes with them. Right. There are fewer jobs, candidates are practicing more, they're memorizing more and more questions over time. It's like the frog being boiled.
Shannon Ogborn [00:08:06]:
Right.
Aline Lerner [00:08:06]:
The bar is slowly going up, even though interviewers may not even be consciously aware of it. If they see more and more people that are answering questions faster, that's going to be what they start to expect. And then over time, it starts to become a contest of who has memorized the most stuff rather than who is a good engineer. And for that reason, I think these interviews are getting a lot of vitriol, and it's something that bothers us as well. So with that as background context, when we saw what ChatGPT was capable of, we started to ask ourselves, is there a world where this can actually have some good side effects for our industry? Can ChatGPT as a cheating vehicle be a forcing function for improving and de-escalating the technical interview arms race?
Shannon Ogborn [00:08:53]:
Wow, that's a really interesting thought, Michael. Anything to add to that?
Michael Mroczka [00:08:57]:
I think that's well said. In this whole experiment, we were trying to see whether or not ChatGPT could accurately handle something like a new question that it hasn't seen before, because a good engineer that understands the principles can take the same idea and apply it in a different way. It's like math: if you know a formula and you use different numbers, you should still be able to do it well. So it's a good way to see, is ChatGPT going to be able to handle it? And is this a good screening tool to still filter out candidates? So I think it's been fun so far.
Aline Lerner [00:09:31]:
So we were selfishly, and I can talk about the setup, but we were kind of selfishly hoping that ChatGPT would be excellent at cheating because that would be the push that many companies would need to improve their interview process and questions. So here's what we did. We have hundreds of interviewers on interviewing.io, and we recruited some of them to be part of a study. We didn't tell them what the study was about. I think, Mike, we might have told them that it has something to do with AI or maybe. No, not even. No, we didn't even tell them that.
Aline Lerner [00:10:05]:
No, we just told them that we were trying to evaluate the efficacy of, like, different kinds of interview questions.
Shannon Ogborn [00:10:11]:
Yes.
Michael Mroczka [00:10:12]:
Over time.
Aline Lerner [00:10:12]:
Over time, yeah.
Aline Lerner [00:10:13]:
So nothing about AI. And then we recruited some candidates and said, you can get a free mock interview, whatever it is, if you participate in our experiment. Then we split our interviewers into three groups. So the first group was told to ask questions from LeetCode verbatim. Many companies actually do this, right? I think Meta is a big offender in this realm; there's literally a list of questions on LeetCode that Meta asks, and you just memorize them. Not to trivialize.
Aline Lerner [00:10:52]:
It's still a lot of work and it's pretty painful, but. So group one, asking questions verbatim pulled from LeetCode. Group two was take a verbatim LeetCode question and then give it a little bit of a twist. Not a lot of a twist, just a little bit of a twist. Finally, Group 3 was asked a completely custom question, a question that you would not be able to find on LeetCode. Mike, I'll tag you in. What happened next?
Michael Mroczka [00:11:22]:
Yeah, at this point, there were a handful of things we were curious about. Like, we wanted to know, could people actually get away with cheating? The interviewers didn't know they were going to try, so they weren't tipped off about it. But could the interviewees do it in such a way that it wasn't very obvious that something sneaky was going on? That was the first question we wanted to know. And then the obvious second question is, is there an impact from the question type that we ask? Is ChatGPT better at one or worse at the others? The control for this, just to give you a baseline: on interviewing.io in general, the pass rate is about 53%, so a little better than half of people pass. When we asked verbatim questions, the study showed that 73% of people were passing. So verbatim questions definitely.
Aline Lerner [00:12:07]:
Let me just jump in. We had told candidates that it was their mission to use ChatGPT during these interviews and to try not to get caught.
Michael Mroczka [00:12:17]:
Yes. And we purposely asked them, don't rely on your own skill set, use ChatGPT, and be as sneaky as possible. We even give some, even gave some tips for how to cheat.
Aline Lerner [00:12:26]:
How to prompt, yeah.
Michael Mroczka [00:12:27]:
So like I said, 73% of people pass. That's almost a quarter more than what we'd expect. When we looked at the modified group, that second group where it was a LeetCode question that was just slightly modified, about 67% passed. So that's still actually quite high. So if you just modify the question a little bit, ChatGPT is really good at still getting the correct answer. And then if you ask entirely custom questions that do not exist online, this is the crazy bit: 25% pass.
Michael Mroczka [00:12:58]:
So it's actually a better screening tool to screen out candidates than even the normal control: 25% versus 53% is a big difference. So that was a huge revelation in and of itself that we could talk about for a second. But like I said, there was one other piece to this, and it was how many people were caught cheating? We polled people ahead of time and asked, how many do you think will get caught? 10%? 50%? 40%? There were a lot of different numbers thrown out, but believe it or not, there were zero. Out of all of the interviews that we had, zero percent of them were caught. No one was caught cheating, which is a huge revelation and definitely a scary thing to say, so it's actually.
Shannon Ogborn [00:13:37]:
Kind of alarming.
Michael Mroczka [00:13:39]:
Yeah.
Aline Lerner [00:13:39]:
I think we even asked interviewers some pretty priming kinds of questions: was anything weird? Did you notice anything? How confident do you feel about your assessment? And nothing out of the ordinary.
Michael Mroczka [00:13:54]:
Nothing at all crazy. And when we asked, we've actually prompted them in three separate places just to see if anything odd was going on. And there wasn't anything weird in the responses. At best, people thought that they were maybe a little bit nervous or were perhaps a little slow, but nothing to indicate cheating, which is scary when you think about it.
Shannon Ogborn [00:14:13]:
That is. In these cases, I actually also wonder about what motivates a candidate to cheat, because will they really be set up for success in the job if they can't pass the interview? But that's a whole other can of worms. So you get these results. Did you report the results back to the people who did the interviews? Were they, like, blown away?
Michael Mroczka [00:14:38]:
We did reveal at the end, and we said, sorry, gotcha a little bit, and apologized. And we did let them know when the article dropped. And a couple people wrote back and were super surprised by it. But I think for the most part, we treat our interviewers like guinea pigs. And they knew it was part of some kind of experiment. So I don't think they were too shocked at the fact that it wasn't what we said it was going to be.
Shannon Ogborn [00:15:00]:
Definitely. Wow. I would be like, damn it, I can't believe I didn't catch that. Because I think a lot of us would like to think we would know, right? We would recognize it, we would know it. And the reality is, based on this, you really don't. Even if you're primed, which is even more concerning. And I know that there's a lot of technical teams out there that are wondering and technical recruiters and hiring managers and executives. Like, moving forward, what should companies really do to resolve this? What is going to move the needle so that they still can hire the right person for the right role at the right time?
Aline Lerner [00:15:43]:
I think maybe one other note on cheating, and I'm not going to name these tools because I think they're gross and I don't want to draw any attention to them. But in our experiment we literally had people just have ChatGPT open in another window. There are tools out there that will listen to the audio of your interview, figure out what questions are being asked, and then just feed you stuff. So you don't even need to interact with ChatGPT. It'll just be this streaming cheating thing in front of you. Again, I'm not going to name it, I think it's gross. But we didn't use that in our experiment.
Aline Lerner [00:16:27]:
I think it's even more damning that even with having to cycle back and forth between ChatGPT and the actual interview, nobody could tell that they were cheating. But with something like this, it's probably even easier. And this is out there.
Shannon Ogborn [00:16:39]:
Yeah, that's. I don't even know what to say. That's very concerning. But my mind goes back to what is in it for somebody to get hired in a role that they won't be successful in. But I think the mindset may go back to the contested part. Right. I don't believe that these types of interviews would predict if I could do well in this job or not, so I'm gonna cheat, because I don't believe in prepping for these questions.
Shannon Ogborn [00:17:09]:
I know I could do the job if I just get it. But I'm like, oh my gosh, the mental gymnastics my mind is going through right now, just hopping all over the place.
Aline Lerner [00:17:17]:
I think that's exactly the mentality, that is exactly what people are thinking. They're like, well, this is an unfair system. The system is screwing me, so I'm going to screw it back.
Michael Mroczka [00:17:26]:
And just to add to that, you've said it twice now that you're confused by the motivation. But I want to remind you that some of these are six-figure, high six-figure salaries at the end of these interviews. I've heard people rationalize this on our Discord and say, well, heck, if I could get a $500,000 salary and just last for half a year, that'd be really great. Some of these people are just saying, if I can dupe four companies into hiring me for half a year, that's worth it. So I think sometimes motivation is a little short-sighted. And there's also the flip side to this, where I think a lot of people think the only people passing these interviews are people that went to Stanford and had private tutors coaching them on this kind of stuff. That's not true, but it is sort of a perception that there's no way to get in unless you already have some sort of elite education.
Aline Lerner [00:18:17]:
Yeah, like I could see somebody thinking this is the way to level the playing field. And there are so many flaws about the system that I could see why people, I have so much empathy for, why somebody would think that.
Shannon Ogborn [00:18:27]:
Yeah, there's this quote, I don't remember who it's from, but it's: if you're not cheating, you're not trying. Maybe some people are taking that approach, which again, it's not right, and I'm not justifying it, but certainly, like you both, I can see why somebody would do it, why they would be motivated to do it. It's just, man, what a difficult position to be in for recruiting teams, because not only does that hurt the trust between you and the candidate, the company, and the hiring managers, I think recruiting teams really have to take the lead on how they can help their hiring teams reduce their risk of this. So tell me more about moving forward. What do companies need to do? I don't think you're ever going to remove the risk a hundred percent. I just don't know if that's possible, even before ChatGPT. Right. But how can companies de-risk this situation?
Aline Lerner [00:19:29]:
So the best idea that we could come up with, and this is not an easy thing to do, and there's a reason that companies aren't doing it now, is to rethink the kinds of questions that you ask and invest more into your interview process. So it's very easy to rip questions from LeetCode and just say this is how we interview. And I think many startups do this because they look at the FAANGs and they're like, well look, the FAANGs are hiring all these great people and they're doing this. So if I use the same process, surely I will have the same outcome, but I like to think of that as cargo culting. Right. Just because you copy the process does not mean you're going to get the same results. And I would even argue that the FAANGs are successful despite their processes and not because of them. So this would involve taking a critical look at what questions you're asking during your process and removing anything that you can find on Stack Overflow.
Aline Lerner [00:20:30]:
Because that's what ChatGPT was trained on, right? Remove stuff that's easily available on LeetCode. That's what ChatGPT was trained on. And think about the kinds of work that you do every day. I'm not saying you should move away from algorithmic questions. I believe that in the right hands, and when asked well, with a good interviewer, those questions can carry tremendous predictive signal. So you can take the work that you're doing and ask, what is the algorithmic kernel in here? All right, let's take that and let's put it in the context of stuff that's unique to my company. One thing that I've suggested to companies in the past is for their ENG team to create a shared doc.
Aline Lerner [00:21:14]:
And every time an engineer on the team does something they're proud of or is interesting or is cool, or just something that makes them want to brag to their team, just jot a little note in there. And then those notes can become the jumping off point for crafting truly unique questions. I'll turn it over to Mike now. I'm sure there's a lot more to say.
Michael Mroczka [00:21:34]:
I totally echo what Aline says on that, but one of the important things I want to really touch on even more strongly is that it's been misinterpreted. If you were to go and look and see who's talking about this study that we did, this little experiment has got a lot of people making headlines on blind.com and other anonymous software engineering forums, where it's like, this is the end of data structures and algorithms, finally we don't need to care about these stupid interviews, things like that. And that couldn't be further from the truth. Remember that the custom-questions arm that did really well was actually still data structures and algorithms. It just happened to be data structures and algorithms that was divorced from toy problems and focused a lot more on real-world problems. It was like the Netflix engineer that had solved a bug on the front end, then took that front-end bug, simplified it, and put it into an interview question. It was using actual experience from the job, showcasing on-the-job skills, but then also wrapping it into a data structures and algorithms kind of focus.
Michael Mroczka [00:22:34]:
So we can't abandon those altogether. Companies can't do that, but they need to stop asking questions that are so easily seen online. I agree with Aline: if it's online, it's what ChatGPT was trained on, so it's not going to be a good thing to test candidates on. There's also the other obvious thing, which is we could do in-person interviews again. I don't think they have totally gone away, and there are certainly pros and cons of each, but that's the other obvious option. If you're really not sure and you want to be super sure about a candidate, maybe have at least one of those rounds be in person.
Michael Mroczka [00:23:06]:
But more than anything, I think you'll find that different companies have vastly different processes around gathering questions. And some of them care an awful lot about asking unique questions. Google actually isn't really affected by this very much because they strive to ask questions that do not yet exist on LeetCode. They're well known for asking new, unique questions. It's what makes their interviews so difficult. But other companies, and I don't want to pick on any in particular, but there are even FAANG companies where it's well known that if you solve the top 100 questions on LeetCode, you're going to get one of those in the interview, and that'll be sort of the secret trick in. Now that's no longer the case. And if they stop asking those and pivot away from that, they're not only going to prevent people from getting in because they've memorized an answer, but they're also going to prevent people from getting in because they've cheated.
Aline Lerner [00:23:58]:
I think if companies were to move in this direction, candidates would feel less vitriol toward this process. Right. Because if they're actually solving real work, you can't say, well, this isn't the work I'm doing every day. No, we actually just solved this problem. I think another added benefit is that interview questions that harken back to the real work are better selling tools. If you can see the kind of work you'd actually be doing, it's going to stick in your head and you're going to get more excited. And especially as a smaller company, if you're fighting with larger companies for candidates, one of the ways that you can win that war is to have a bespoke, thoughtful process that feels more personal and feels like you put more effort into it.
Aline Lerner [00:24:39]:
That's probably not going to override below-market comp. But if you're paying decently well, this is the kind of thing that can tip candidates into accepting your offers rather than a FAANG. One thing we haven't touched on yet that I think is just important to call out because it's kind of the elephant in the room is I've seen some recruiting teams work very well with their partner ENG teams. And then in other cases, I've seen a lot of tension between the two. Right. There's tension about scheduling and tension about how much time the hiring manager can put into things. So I think it's a lot to ask to say the recruiting team has to be at the forefront of changing what kinds of questions engineers ask. When those questions have to come from engineers and you have to get ENG leadership buy-in for this.
Aline Lerner [00:25:26]:
But I'm hopeful that some of this data will help you get that buy-in and at least start that conversation.
Shannon Ogborn [00:25:34]:
Recruiting teams can be the catalyst without necessarily being the owner of creation and things like that. Aline, to one of your points about it being a selling point, I actually had a great previous conversation on the podcast with Heather Doshe from SignalFire about structured interviewing. How she approaches structured interviewing is, when you get into an interview, you explain a little bit about what the intention of this interview is. Companies with these custom questions could say, this is something that we solved just last week, and so these are some of the types of problems you'd be encountering, and then you have the custom question based on that. And I do think it's a selling point, or at least an awakening point, where people can be like, this is interesting to me, or actually, this is not an interesting problem for me to solve at all.
Aline Lerner [00:26:27]:
Yeah. And it also shows some kind of value symmetry, right? Where if a company has put effort into creating these questions, it shows that they value and respect your time. Whereas if they pull some stuff off of LeetCode, they're saying, we're not even going to take an hour to come up with questions that would excite you because we're so confident you're going to want to work for us anyway, that we can treat you like shit. That's kind of the underlying message of some of these questions.
Michael Mroczka [00:26:57]:
I really think it's maybe undersold just how refreshing it is for engineers to see questions like this. One of the things we did was let people leave comments in a survey at the end of the experiment, and consistently, basically everyone that got a custom question said things like (this is a quote): always nice to get questions that are more than just plain algorithms, way more fun and interesting. People really enjoy this kind of thing. It's, let's actually do work together and solve a problem. It's also cool because, like I said with that Netflix example, they see a Netflix engineer, they want to work at Netflix, and then they actually get to do a problem that a Netflix engineer did and see: could I do this?
Michael Mroczka [00:27:41]:
It's personally validating in so many different ways that I think we're not really thinking about in this industry quite yet.
Shannon Ogborn [00:27:46]:
100%. One nagging question I have is: how often should teams change their questions? Because the questions can end up on Glassdoor, on Stack Overflow, on various other websites where people talk about interviews, even Reddit.
Aline Lerner [00:28:06]:
If you choose good questions, and if they have several parts, and if those parts are a little bit fungible depending on what the candidate says, it matters less if the candidate knows the question going in. The more open-ended the questions are, the less it matters. Those questions take a lot of time and effort to craft, but the good news is you can use them for a lot longer than a fairly straightforward question with maybe one or two possible responses. One kind of question we've seen our users enjoy is what's called layering complexity: you start with something fairly simple, and then you say, now let's switch it up this way, what would you do? Let's switch it up that way, what would you do? That can be much more engaging for the interviewer as well, and, to Mike's earlier point, it's a better way to see if two people can be smart together and collaborate. So the more open-ended the question is, and the more add-ons and twists it can have, the longer that question can live.
Michael Mroczka [00:29:09]:
And I think time is maybe not the best way to think about retiring a problem, either. A lot of things go into it, depending on hiring volume: if you're not asking the question a lot, or if it's for final-round interviews on a whiteboard, you can go much longer than in a setting where candidates could potentially cheat. If we're worried about cheating, I think it's more important to focus on whether the question is, as Aline says, scalable. Can I change it a little bit? Can we get something different out of it? And I just remembered one other thing I wanted to touch on. You'd asked previously what companies can do differently. One of the favorite things we've heard of people doing is inviting an engineer to log into a specific server and, from that server, play with building a small toy application. Engineers can reset the server so it's the same environment for every candidate; people get to see what the code base actually looks like and solve the bug in the exact same way. That's super cool. And if you take the time to set up something that complex, it also has a huge longevity component, because it's much harder for someone to just take a screenshot, say, this is everything I did, and get past it.
Michael Mroczka [00:30:23]:
So, yeah, lots of interesting tidbits there, I think.
Shannon Ogborn [00:30:25]:
Yeah, that's such an interesting point. So when I was at Google, I was working specifically on hiring women in tech, and I remember some feedback from interviewers so vividly: there's many ways to answer this question; the person didn't answer it how I would have, but they answered it really well. I actually learned something today because they answered it in a completely different way than I would. When there are multiple possible answers, I feel like that makes it more interesting, not just for the candidate and their thought process but also for the interviewer. I think it's really important that our technical interviewers have a really positive experience and association with the interviewing process, and aren't thinking, I don't want to be here.
Shannon Ogborn [00:31:08]:
I'm not motivated to be here, this is my worst nightmare. Because then that interviewer, I feel, will be more motivated to notice if a person is cheating or something seems off, and more invested in that person joining the company. So I feel it's really important to think about how this also impacts the interviewer side. And Mike, you may have some thoughts on this, being a technical interviewer: I feel like these questions could positively impact someone on the interviewing side. Or they could say, oh, is it really worth it if people are going to be cheating, even with these other types of questions?
Michael Mroczka [00:31:41]:
I completely agree with that. When you're asking a stock question, you have to ask the same question continuously, every time, and you end up just watching like a robot, right? They're stumbling in the same place every time, and you're giving the same hint every time. It's kind of boring.
Aline Lerner [00:31:56]:
Of course you're gonna sit there with Reddit open in another window.
Michael Mroczka [00:32:01]:
Right, right. Unfortunately, I know a lot of interviewers that have done that. Not so much at Google, but even at Google, unfortunately, it does happen, where people say, I'm only half paying attention because this is the 400th time I've asked this question and this person's not doing particularly well. But I do think there's a huge advantage to being able to bring in your work and custom-build a question about something you did, because you're naturally more invested in it. You want to see how they do. And if you're actually talking brass tacks about what you would do, you can easily modify the question: oh, that wouldn't have been possible, because there's also this server configuration here, or something like that. You have more domain knowledge and you can talk about it more intelligently.
Michael Mroczka [00:32:41]:
It's just more fun overall.
Aline Lerner [00:32:43]:
This makes me think of one other thing companies can do as well, and perhaps this could be its own podcast. But to be succinct, I think part of the reason that candidates are disenchanted with the current interview process, and interviewers are checked out, is that companies don't provide good incentives to their interviewers. Very few companies include interviewer performance in their promotion packet. There's actually no incentive to conduct interviews, because it takes you away from your core day-to-day work, and there's a big cost to being interrupted when you're an engineer. So, rationally, the fewer of these you do, the better for your career. How can companies rethink that and incentivize question creation and engagement?
Aline Lerner [00:33:31]:
I expect Ashby tracks interviewer metrics, right? How can you tie bonuses, comp, or promotions to that? The companies where I've seen interviewers care a lot are the ones where there's one engineering leader who just really, really cares about this; it comes from the top and it's infectious. I think early Dropbox was excellent at this. They put so much work into their interview process. I'm saying early because maybe it's still great now, I just don't know. But there has to be support for this from the top, and it has to practically come through in how people are incentivized, financially and otherwise.
Shannon Ogborn [00:34:06]:
A hundred percent agree. It's such a good point, and as you said, that could definitely be a whole other episode. Tying a bow on this before we move on: any other last thoughts about AI cheating?
Michael Mroczka [00:34:21]:
Yeah, I think that's a good place to end this, or at least on my end. With the cheating study, there was that percentage: close to 75% of people were able to cheat with the verbatim questions. But I do want to add a caveat for people listening to this podcast who might be getting ideas. When it doesn't work, it fails really badly. It becomes very obvious you don't know what you're doing if you start typing out a solution you think ChatGPT totally nailed, and then the interviewer says, hang on, but why this line? This line makes no sense to me. And then you're forced to explain it. You don't have time to read through ChatGPT's reasoning on it.
Michael Mroczka [00:35:03]:
It's either go with it or not. So this can really screw you up in an interview. Just a caution to the people who think this is a sure shot. People weren't caught, but there's a video.
Aline Lerner [00:35:13]:
They look like idiots.
Michael Mroczka [00:35:14]:
Yeah, there's a video, too, of one person really making a fool of themselves in the interview for a couple of seconds, because they're like, oh, I can't really explain why I did that. Sometimes my fingers just do things without thinking.
Shannon Ogborn [00:35:29]:
That is hilarious and amazing, but it actually is such a fair point, and true. When I used to read a significant amount of feedback on technical interviews, there would be times it would say: this person just could not get to the solution, but their thought process was very strong. For this specific question (I obviously can't speak to other questions), they just could not get to the right answer, or a right answer, but they talked through it. They had a good thought process. And that is definitely spoiled when you're trying to copy something verbatim. This actually reminds me of a situation all the way back in ninth grade. My teacher pulled me out of class one day and asked, did you let so-and-so cheat off of your paper? And I was like, no.
Shannon Ogborn [00:36:20]:
They were like, well, you got a hundred and they got a hundred, and this person is not the type of person who would get a hundred, but they showed no work. It was literally just answers. So how do you explain that? Don't set yourself up for failure by cheating. I understand we talked about the motivations, but you're going to make a fool of yourself, and unfortunately word travels pretty quickly, even when it shouldn't.
Michael Mroczka [00:36:50]:
Absolutely.
Aline Lerner [00:36:51]:
It's a good threat to end on.
Shannon Ogborn [00:36:55]:
So, okay, zooming out a ton: we talk a lot at Ashby about hiring excellence. You all have unique positions and perspectives on this that are outside of the day-to-day of recruiting or talent leaders. I'm just curious: when you hear the phrase hiring excellence, what does that mean to you, or how would you describe it in the context of what you're doing?
Aline Lerner [00:37:21]:
I think two things. One is top-of-funnel excellence, and that to me means making good choices about your inbound candidates and about whom you source. For us specifically, because we've seen so much data over the last nine years, over-indexing on brands on people's resumes is pretty limiting and not the right way to go. I'm convinced of this, but I don't think most people are. So to me, excellence means taking a little longer to read people's resumes and looking not just at the brands but at what I think is called distance traveled: where did they start and where did they end up? Did they have outsized mobility from point A to point B? Even if they started somewhere that's not very impressive, is it impressive that they got to where they got? It means giving those candidates more consideration and having a process that finds them. And then, for the rest of the funnel, knowing your metrics and being thoughtful about what's working and what's not: knowing metrics by source, including engineering time and your cost per hire, things like that.
Shannon Ogborn [00:38:32]:
100%. Michael, you're in this in a completely different context, being in software engineering, so maybe hiring excellence means something completely different to you. I would love to hear it.
Michael Mroczka [00:38:43]:
Yeah, I think Aline's answer is definitely much more meta than mine. I just think back to the different interviews I've had, and some of the best recruiters I've worked with are the ones that had those empathetic responses when it's like: holy crap, this is my dream job, did I do well, did I not do well? Being able to handle the good news and the bad news, like I said, empathetically, makes a huge difference. And I imagine this is really hard for recruiters, because they're dealing with people swearing at them, crying on the phone, stuff like that. That emotional bandwidth, I'm sure, is crazy, but it's definitely appreciated. And I don't think it's appreciated enough by software engineers, honestly. There could be a whole separate podcast on how to wrangle software engineers.
Michael Mroczka [00:39:30]:
Stop being such whiny babies, I sometimes think. But yeah, that's maybe a little bit of what I think about it.
Shannon Ogborn [00:39:39]:
Yeah, no, I love that. The whole reason I left IC recruiting is that it was becoming so taxing for me personally. I have so much respect for everyone I know who is still in recruiting, doing the hard, oftentimes thankless, work. So I honestly really appreciate that shout-out, and I think all of the recruiters will as well. Now, last question here, and again, it's a different perspective.
Shannon Ogborn [00:40:06]:
So I'm really excited about this. Michael, we can start with you if you want: what is your recruiting hot take?
Michael Mroczka [00:40:14]:
My recruiting hot take? Hoping I'm not going to steal Aline's here, because she turned me onto this one. Resumes don't mean anything. I think you're better off talking to a hiring manager directly, whether that's through LinkedIn or elsewhere. I mean, don't go stalking them in person, but find a way to connect with them and try to get in that way. Because, kind of like what Aline said, the brand name of the company, those bullet points, they just don't mean much in today's day and age.
Michael Mroczka [00:40:41]:
So many people look the same. So that's my hot take.
Aline Lerner [00:40:44]:
I agree with that, and I'll talk about it through the lens of the company. I wish that recruiters would spend more time reading resumes, less time focusing on brands, and would actually read more of the bullets and the achievements. Does this person have good spelling and grammar? Do they sound thoughtful? Can you clearly tell what they did? I know that's asking a lot, but that's my hot take, because we have a lot of data showing that that stuff matters more.
Shannon Ogborn [00:41:11]:
I mean, on the flip side of that, on the candidate side: I review probably a hundred resumes for free every year. It's just something I feel really passionate about. I guess my hot take is that I don't think people should have to pay for that, though I respect people who do it as a profession and are really, really helping people. But the thing I've noticed is that probably 80 to 90% of those resumes are just so bad that it's no wonder people aren't getting contacted. And that's not necessarily a knock against the person; there are just so many things you can do as a candidate to help your resume.
Shannon Ogborn [00:41:50]:
And if all of these takes combined together, I think we would be in a much better place in recruiting. But that's yet to be seen, so we'll just remain hopeful for the future.
Aline Lerner [00:42:07]:
Sounds good.
Michael Mroczka [00:42:08]:
Yeah, it's this weird balance, right? As a candidate applying to places, you want to try to make your resume look custom, but you don't want to invest too much in it, because what if this place doesn't even look at it? And on the other side, you want to look at it, but this person is one of many people you're screening. So, yeah, definitely a balance.
Shannon Ogborn [00:42:27]:
Okay, we are coming up to the end of our time. Where should people go to learn more about you all and your work?
Aline Lerner [00:42:36]:
Well, one, our company is called interviewing.io. We actually recently started doing a thing that might be useful to your audience. We found that, especially in this market, practice is more important for candidates than ever because of that technical interview arms race. So we have a link that recruiters can give their candidates, where those candidates can get a free mock interview on our platform. I think it might build some goodwill with candidates. We also have some great resources to share with candidates: we have a free book, basically, about system design interviews, and a bunch of other stuff.
Aline Lerner [00:43:10]:
So I'll share all of that with you. And of course, if you want to know more about this study, you can read our blog post.
Michael Mroczka [00:43:18]:
Yeah, and one other additional shout-out: we have a Discord community too. If you ever want to ask a question, we've got a lot of interviewers who look at that and answer questions for free. It's not going to be full coaching, but if you're curious about something, feel free to check out that Discord.
Shannon Ogborn [00:43:32]:
I love it. This has been such an interesting conversation. I really think that part of leaning into AI in any profession is understanding what the possible implications are, and obviously cheating is a huge implication. So I have found this very informative, and I think everyone else will too. Aline and Michael, I cannot thank you enough for joining us on Offer Accepted.
Shannon Ogborn [00:43:56]:
Really appreciate you spending some time with us. Thank you all for listening and we'll see you next time.
Aline Lerner [00:44:00]:
Thank you.
Michael Mroczka [00:44:01]:
Thanks a lot, Shannon.
Shannon Ogborn [00:44:04]:
This episode was brought to you by Ashby, what an ATS should be: a scalable, all-in-one tool that combines powerful analytics with your ATS, scheduling, sourcing, and CRM. To never miss an episode, subscribe to our newsletter at www.ashbyhq.com/podcast. Thank you for listening, and we'll see you next time.