Offer Accepted

Building Interviewer Scorecards to Raise Your Talent Bar with Vanessa Paladini, Nubank

Ashby


Great hiring outcomes start with great interviewers.

Vanessa Paladini, Global Talent Acquisition Director at Nubank, joins Shannon to share how the world’s largest digital bank outside of Asia raised its hiring bar without slowing down growth. She walks through Nubank’s four workstreams, including structured assessments, technical bar raisers, and a data-backed interviewer scorecard.

Vanessa breaks down how her team measured interviewer assertiveness, linked decisions to 90-day ramp and 12-month performance, and reduced the interviewer pool to improve quality.

Key takeaways:

  1. Data builds credibility: Linking interview feedback to long-term performance helps TA influence the business.
  2. Use AI thoughtfully: Evaluate how candidates use AI tools during exercises instead of prohibiting them.
  3. Measure assertiveness: Track interviewer decisions against ramp time, retention, and performance data.
  4. Hiring is collective: Moving accountability beyond recruiters creates stronger business partnership.

Timestamps: 

(00:00) Introduction

(01:03) Meet Vanessa Paladini

(01:40) Three pillars of Nubank's hiring talent bar

(03:15) Four workstreams to raise the talent bar

(06:15) Why companies should invest more in interviewer quality

(08:19) Building and measuring interviewer scorecards

(11:19) Defining the traits of a good interviewer

(14:40) Who has access to the scorecards?

(17:30) Additional learnings from the scorecard program

(20:53) How the scorecard data is expected to impact conversions

(23:44) Where to connect with Vanessa


Vanessa Paladini (00:00):

The one thing that sometimes is not very intuitive for people is that by reducing the interviewer pool, you can increase quality. If we have the selected group of interviewers, they will guarantee that the quality of the talent that you're bringing is actually met. So you go ahead and you reduce to the very best interviewers. They are there, they're interviewing, and then you start expanding the group of interviewers.


Shannon Ogborn (00:27):

Welcome to Offer Accepted, the podcast that elevates your recruiting game. I'm your host, Shannon Ogborn. Join us for conversations with talent leaders, executives, and more to uncover the secrets to building and leading successful talent acquisition teams. Gain valuable insights and actionable advice, from analyzing cutting-edge metrics to confidently claiming your seat at the table. Let's get started. Hello and welcome to another episode of Offer Accepted. I'm Shannon Ogborn, your host, and this episode is brought to you by Ashby, the all-in-one recruiting platform empowering ambitious teams from Seed to IPO and beyond. Today we are joined by Vanessa Paladini, who is the global talent acquisition leader at Nubank, the world's largest digital bank outside of Asia. Vanessa brings deep experience leading recruiting through IPOs, M&As, and really rapid scale-ups, with a passion for building high-performing teams and applying that data-driven strategy to elevate hiring standards.


(01:23):

At Nubank, she leads talent initiatives that span from early career talent to executive search, so kind of running the gamut of things that she's working on, but really with the goal to drive sustainable growth across complex and diverse markets. Vanessa, thank you so much for joining us today.


Vanessa Paladini (01:38):

Thank you so much for having me.


Shannon Ogborn (01:40):

I know that you all have just gone through this sort of transformation process. Before we get into our topic today around interviewer quality, would love to just briefly hear from you on how Nubank sees its hiring talent bar.


Vanessa Paladini (01:59):

Yeah, sure, of course. I think that we approach that in probably three pillars. So we think about, of course, having a good quality of process. So making sure that you have really the objective methods in place to measure talent density. So you are consistent about the way that you assess candidates, you are probably using criteria tied to career levels and expectations, you know what you're looking for in terms of skills, you have the frameworks to do that, and then you execute that process very well. So I think this is the second pillar, which is really about the quality of the execution that you run. So you need to make sure that you sustain this rigorous and standardized global hiring bar, applying that consistently across all functions that you have, all pipelines, all countries. And then also with a limited yet very good group of interviewers, and we're going to talk more about that in the conversation.


(02:52):

But then finally, with that in place, so having quality of process and quality of execution, you can then guarantee that you have a positive outcome. And so you have actually the talent that you need, not only for the immediate need, but also talent that is going to adapt and is going to perform and probably be ready for future needs that the organization is going to have. So this is how we see it, and we can talk more about it as we go.


Shannon Ogborn (03:16):

I think the adaptability piece is so important, especially now. Technology's changing very quickly. And I think technology will only continue to change, whether it's AI or other things, at breakneck pace. So hiring for that adaptability and longevity is super important. And I know that you had mentioned there are sort of four workstreams at Nubank that you started to implement to raise the talent bar. What did those look like?


Vanessa Paladini (03:43):

Yeah, sure. I think the four workstreams were really to guarantee that we were actually raising the talent bar a little bit, also to pursue that adaptability and that skill that is so important as everything is evolving so fast. And so what we did was to implement, first of all, a structured assessment at the top of the funnel. So making sure that we were applying cognitive tests for everybody as the very first filter. And also we are using a tool for technical exercises. So we make sure that it's consistent enough and applied to all the technical roles at least, right? Second, we adapted our process to match the career ladder. So I was mentioning this before. You need to make sure that you know exactly the skills that you are assessing for a level, for a function. And so if you have the career ladder as this very first thing and the baseline of how you define the process, splitting all the skills that you need across the process in a way that makes sense, you guarantee that you have a much better decision making at the end.


(04:48):

And so the third workstream was really implementing a program that we call technical bar raisers. So bar raisers are very well known in the market. We decided to implement them for technical skills specifically. So it's this very small group of strong bar raisers. They're very aware of our technical bar. They know what they're assessing for in each one of the levels, and they are part of the processes at the end. So they join the debriefings to guarantee that we are applying this consistent bar across the board for all our positions at the moment. And you need to make sure that the way you implement it doesn't get in the way of scalability, because it can, because it's a small group of people that needs to be in your processes. But if it's well done, it's a super strong and powerful tool. And then finally, we had the fourth workstream, which was really the interviewer quality.


(05:38):

So guaranteeing that you have a group of well-calibrated and engaged interviewers, which is tricky. It's hard to do, but we'll talk more about how we make that happen. I think it's a total game changer when you have that group.


Shannon Ogborn (05:52):

There are definitely a lot of hot takes in many directions on bar raisers, which is maybe a future hot take. But on interviewer quality, which obviously we're going to dive deeper into today: why should companies consider doing and measuring this?


Vanessa Paladini (06:15):

I think it's very dangerous when you don't have a skilled and well-calibrated group of interviewers, because in the end, they should be empowered enough to make decisions during the process. And so you need to make sure that that group really holds that very high bar, but also a standardized, fair bar, and also that they are representing the culture of the company. They need to know what the values are, what they should talk about, and they need to be very well trained. So I think that it is very important, to guarantee the good outcome of a good hire, to have that group well calibrated and engaged. And I think that we tried to create a way to measure the performance of those interviewers. So really what we called the interviewer scorecard was actually understanding how they were performing that role as an interviewer: if they were being assertive, if they were well calibrated in terms of leveling, if they were making good decisions, if the actual result of that hire was positive for the organization.


(07:17):

So there was a lot around that, but also these people need to provide a very positive candidate experience. It's so hard to hire talent in technology especially, so you need to make sure that you're providing the best experience possible, so that people really want to join and work with that group of people. So I think that's also very important in that angle.


Shannon Ogborn (07:35):

100%. I think it's the intersection that you're talking about of quality, but also consistency, because that is what elevates a positive candidate experience. If we know that we're delivering the same experience, and that it's an exceptional experience, to every candidate, you're in such a better position, especially for a company that is doing B2C. The candidate experience is such a big piece of being successful, because every person that you talk to could be a customer.


Vanessa Paladini (08:06):

Absolutely. And we hope they are.


Shannon Ogborn (08:10):

Right. With building the structure of the interviewer scorecards, what was sort of the first step there?


Vanessa Paladini (08:19):

I think it's very long work. It's been at least 18 months now that we've been building this, and we're still trying to get to the end of it and measure results. But what we did was to start by first understanding the data that we had available. So how much we had registered in the scorecards, interview notes and all of that. And we realized that we needed to do a shift and revamp the type of data that we were accumulating about candidates. We found out that it was actually very bad data, first of all, and then we had to revamp the way that we were doing that. Now with AI note takers, I think it is getting easier and easier to have a good amount of data, but at first this was a challenge. But then after solving that first problem of having the available data, then we started to measure the results.


(09:10):

So we were measuring the decisions that they made during the process, at any given stage they were participating in, against the outcome of the hire. So they said yes, and the person in the end was approved in the process. So this was one measurement. And then second, when the person started: after 90 days, were they already doing a good ramp-up? If yes, great, and this is positive; if no, then that's also another measurement that we were taking into account. And then finally, and this takes time, of course, because you need to wait for people to be here for a period of time: when this new hire completed six months and then 12 months, what was the actual performance? And then we had the formal performance cycle to back this up. And with that, we could understand and go back to the interviewers' first feedback during the process to know how assertive they were actually being in terms of really getting us the best talent that we can have.


(10:07):

And so this was the way that we started to build that scorecard and understand how they were performing overall. So now we have a measurement that says how assertive they are, like a percentage of cases, and we can actually feature the most assertive interviewers to bring them into the process, and also use that information to train the ones that are actually lagging behind and need to be better calibrated. So I think this is very powerful as a tool for any organization.
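The assertiveness measurement Vanessa describes — comparing each interviewer's "yes" decisions against whether the hire happened, ramped well at 90 days, and performed well at 12 months — can be sketched as a small calculation. This is purely an illustrative sketch by the editor, not Nubank's actual system; the record fields and success criteria here are assumptions:

```python
from dataclasses import dataclass

@dataclass
class InterviewRecord:
    interviewer: str
    said_yes: bool        # interviewer's decision at their stage
    hired: bool           # candidate approved at the end of the process
    good_ramp_90d: bool   # positive 90-day ramp-up signal
    strong_12m: bool      # positive 12-month performance review

def assertiveness(records):
    """Per interviewer: share of their 'yes' decisions that led to a hire
    who ramped well at 90 days and performed well at 12 months."""
    by_interviewer = {}
    for r in records:
        by_interviewer.setdefault(r.interviewer, []).append(r)

    scores = {}
    for name, recs in by_interviewer.items():
        yes_calls = [r for r in recs if r.said_yes]
        if not yes_calls:
            continue  # no positive decisions to score yet
        good = sum(r.hired and r.good_ramp_90d and r.strong_12m for r in yes_calls)
        scores[name] = good / len(yes_calls)
    return scores
```

With a score like this per interviewer, the most assertive ones can be surfaced for mentoring duty and the lagging ones flagged for calibration, as described above.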


Shannon Ogborn (10:37):

Definitely. And I think the peer mentorship is such a big piece of success, because as talent people, we can do a lot of coaching, but I think people typically respond a bit better to their peers: okay, let's look at this example of someone who's doing this exceptionally well. Because it's not really about, we want you to feel guilty or bad that you're not doing this right; we want everyone to be set up for success. But in terms of the inputs, you had mentioned assertiveness. Can you explain a little bit about how you got to defining these traits of a good interviewer? Who did you work with and how were they sort of established?


Vanessa Paladini (11:19):

Yeah, I think we had a partnership with the business for sure. So we were working directly with each and every function that we have. And when I say function, I'm talking about engineering, product managers, business analysts. We had people from each one of them to help us figure that out. But also I think assertiveness really goes in the direction of: they hired people that actually are performing here, not only in the short term, six months, but also in the midterm, 12 months; they stayed, they are retained in the company, and they are also performing very well. So I think this is the most important piece. They actually made a good hiring decision, and this is how we define assertiveness. But then you have the experiences. So we had feedback from candidates, like how was their experience with the interviewers, if it was well conducted, if they felt comfortable with those people.


(12:09):

So this is also an important piece of data, more qualitative data, that you need to take into account to define a good interviewer as well, and build that with the business in partnership with them so they don't feel like you are inspecting their work. And so I think this is something that you just mentioned, so important: that they are actually getting better at that job that is so key for each and every company. The one thing that I want to comment about this is that you need to make sure, especially if you are in rapid growth, that you do that in a scalable way. So we now have a tool that is an AI note taker, but it also has an embedded coaching system. And so the tool actually gives feedback to the interviewers on how they're assessing the skills and how they can do better.


(12:58):

So this is helping us a lot to scale that a little bit more and not do one-on-one coaching only.


Shannon Ogborn (13:05):

When you were making these changes, did you start with a pilot or did you just go sort of full broad scope right away?


Vanessa Paladini (13:14):

We did a pilot at first. We started, at the time, with business analysts, because they're very into data. And so it was very easy to have that conversation with them. When we said that we were trying to measure this, they were super open. And so we built a whole dashboard and the scorecards with them, with their help also to correct the data, extract the data, analyze it. And so we had a lot of help, and this was our very first pilot. We piloted this for at least six months with them. And I think that the one thing that sometimes is not very intuitive for people is that by reducing the interviewer pool that you have, you can increase quality. Because what we found out is, if we have this selected group of interviewers, of course they have to be enough to meet your demand. So we need to make sure that you know your funnel so you're not creating a bottleneck here.


(14:08):

But if you have this very well-trained and calibrated group of people, they will guarantee that the quality of the talent that you are bringing is actually met, and then they can also train other new interviewers. So you go ahead and you reduce, and this is what we did, right? We reduced the group of interviewers; the very best interviewers are there, they're interviewing, and then you start expanding the group of interviewers, as they are also very well calibrated and can now train others. So this was an approach that we took during this whole process.


Shannon Ogborn (14:41):

That's great. Who has visibility into the scorecards? Do they have visibility into their own scorecard? Is it just talent and their manager?


Vanessa Paladini (14:49):

Yeah, it's just talent acquisition. Yeah, just talent acquisition. We decided not to use that as a performance indicator, for example. So we were doing, and we're still doing that, we're still working with this information and figuring out the way that we hold people accountable for a good outcome. Right now for interviewers, our only sort of requirement in terms of hiring is that they actually show up. So if they don't, if they're declining interviews, if they're not engaged with hiring and we need them, then this is something that affects their performance. But the fact that they're the best interviewers or not-so-good interviewers doesn't actually affect their performance. So the managers don't have visibility; it's talent acquisition that manages that information. But then as we go, and as this becomes a more mature thing, we can start to do that for sure and see how we use more of that information.


(15:40):

One thing that we realized during this whole process was that it was much easier for them to perform well as an interviewer when they were tied to a unique profile or level. So if you have recurring profiles to hire in your organization, you can actually tie them to a unique level. So they are hiring only senior software engineers, for example. That group of interviewers is only hiring for that level. That makes them much more calibrated and making better hiring decisions than having them do a junior software engineer one day and a super senior the other. So this was also something that we learned with that data. And now we are structuring, especially for engineering, which is the highest volume of hires that we have at Nubank, so that we always have a group of interviewers tied to a unique profile and level.


Shannon Ogborn (16:31):

Yeah, that makes total sense to me, because it's easy to forget that while hiring is an important job, it's not the interviewer's only job. And when you're in talent and you live, breathe, sleep, and eat interviewing and candidates and experience and all of these things, we're so absorbed in that work that I think it can be hard to see past the fact that it might be hard for someone to interview for three different levels in a week. And the context switching is a pretty high cognitive load. So being able to break that out into the unique profiles I'm sure has been incredibly helpful. On the results note, I know that you had mentioned that, counterintuitively, reducing the interviewer pool can increase the quality, which is super interesting. And then sort of tying to those unique profiles, what else has come out of this program?


Vanessa Paladini (17:30):

I think there were many learnings, especially about how much we needed that data to better train them. And then you saw people that were in the same process with the same candidate, and one is a strong yes and the other one is definitely not. So I think that data overall was very powerful in just making us move in the right direction and having a better group of interviewers. But then we also saw how much some interviewers are doing such an amazing job. So we have people already with us for more than 12 months and they are performing well, they're doing amazing. And so I think it was also a relief to see that we are actually making good hiring decisions. So we are measuring this data also with a correlation, as I said, to the first performance cycle that the candidate has, but also 12-month retention and then promotions within 24 months.


(18:25):

So this is a way that we are seeing the success of those hires. And I think that we found out that there are several interviewers that are so assertive and making amazing hiring decisions. So let's use these people to train others and to understand how they're doing and why they're so good at what they do.


Shannon Ogborn (18:42):

Yeah, because you really need interviewers to show up. People think that recruiters have a lot, I think especially candidates think recruiters have a lot of power in this situation. And really, recruiters are facilitators of an amazing process that hinges on the experience someone has with an interviewer. And we've all had experiences where we've shown up to an interview, or we've seen feedback from an interviewer as a recruiter, where it's like, you just didn't show up today, you didn't want to be there, you weren't excited, you didn't demonstrate the values. And I think everyone has bad days and that's totally fine, but when it becomes consistent and you can see the data for that, then you can take that person out and say, "Hey, here's the feedback." And now, obviously with these AI note takers for interviews, it's becoming more interesting not to inspect or over-govern interviewers, but just make sure that everyone has what they need to be successful.


Vanessa Paladini (19:45):

That's it. This is extremely important. And I think that we learned a lot about it along that whole path. And I think it's longer than this; I think it's at least three years ago that at Nubank, we were tying the hiring outcome only to the recruiter. So it was the recruiter's responsibility to make a good hire. And then we had to do that mindset shift of: this is actually a collective effort, and we are making those decisions, as you said, sometimes together, but most of the time the hiring manager is the decision maker. And so we are just facilitating that. And so we had to change that mentality in the organization. And people now really understand that this is a collective effort. They need to show up. They need to do a very good job at this. And we as enablers, we need to make sure that we create the least friction possible, that we have the processes structured, that the scheduling actually works, that they don't need to switch context all the time.


(20:43):

So there's this cognitive load, as you were saying. And so I think that this is something that has changed throughout the years, and especially in these last 18 months with this whole interviewer scorecard project.


Shannon Ogborn (20:54):

For sure. I know you talked a little bit about how this will go looking forward and some of the metrics there. Anything else that you all will be looking at going forward? This project started 18 months ago. It takes a long time; these things do not happen overnight. Just starting the data collection is such a big piece. What does this look like in 12 months, 24 months, 36 months?


Vanessa Paladini (21:23):

I think that we will see, and we hope to see, an impact also on the funnel conversions. And so what ended up happening with this whole revamp of the hiring talent bar is that we went from one conversion rate to the other extreme of having a very low approval rate. And that can happen, because as you are adjusting the pendulum, sometimes we go to the opposite direction. So now we're being very rigorous with our talent bar, but then, are we being too harsh? I don't know. So we will see. And as we calibrate people against the level that they are assessing for, we will know better and adjust that better. So there were a couple of changes in some functions where we saw a big jump in the decline rates, spiking, and we were like, wait a minute, let's understand better what is happening.


(22:13):

So further down the line, as we continue to calibrate people and bring people along, and only use this selected group of people tied to one job requisition, then we will have better conversion rates, I think. Probably not to what we were before, because we were trying to increase the bar, so naturally you're going to approve fewer people, but to a level that makes sense, and this is expected, because we don't want to reject everybody. We want candidates to succeed. And so I think the evolution of this is continuing to do that so well that you actually get to the optimum place, instead of swinging to the complete opposite direction.


Shannon Ogborn (22:56):

Yeah, you certainly don't want to ... It's bad to have false positives, and then you have to terminate and there's all of that, but it's also bad to have false negatives. You don't want to have so many false negatives that you're missing out on what could be one of your best employees. So the balancing act is definitely there. Well, we are going to get to some of our more fun questions. For those who aren't a regular listener, these have moved to our extended version on YouTube if you're listening on Spotify, Apple, et cetera. We are coming up on our time. Where should people go to learn more about you and your work at Nubank?


Vanessa Paladini (23:37):

Just go to my LinkedIn profile, I guess. Vanessa Paladini, you'll find me there. Please send me a message if you want to catch up, and I'm super available.


Shannon Ogborn (23:46):

Amazing. Well, I know that there are a lot of companies right now that are focusing on this sort of quality-of-hire talent bar, and I think this is going to be really helpful in them understanding just one of the steps to get there with the interviewer scorecard. So really appreciate you taking the time to share that with us.


Vanessa Paladini (24:06):

No problem. Thank you so much for having me and happy to come again and talk about the other things. So count on me.


Shannon Ogborn (24:13):

This episode was brought to you by Ashby, what an ATS should be: a scalable all-in-one tool that combines powerful analytics with your ATS, scheduling, sourcing, and CRM. To never miss an episode, subscribe to our newsletter at www.ashbyhq.com/podcast. Thank you for listening and we'll see you next time.