The Rundown with Kansas Legislative Division of Post Audit

Reviewing the Department of Commerce's Process for Reviewing Building a Stronger Economy (BASE) 1.0 Grants

Legislative Post Audit

The Department of Commerce evaluated BASE 1.0 grant program applicants using a standardized scoring rubric, but it didn’t consistently follow its process or document the Secretary’s final award decisions. The Building a Stronger Economy 1.0 grant program (or BASE 1.0) awarded almost $100 million in federal funds to Kansas entities for infrastructure development. Commerce received 445 BASE 1.0 grant applications requesting a total of about $1.7 billion. Of the $99 million Commerce awarded to 35 recipients, about half went to 10 projects in Butler and Johnson counties. Commerce reviewed applications to the BASE 1.0 grant program and determined which applicants would receive funding using 3 main steps: eligibility review, application scoring, and final selection. Commerce completed an eligibility review for all 10 applications we reviewed and eliminated 1 that wasn’t eligible. Commerce didn’t consistently follow its application scoring process for the 9 eligible applications we reviewed. We couldn’t review the Secretary of Commerce’s final award decisions because this process wasn’t documented.

SPEAKER_00:

Welcome to The Rundown, your source for the latest news and updates from the Kansas Legislative Division of Post Audit. I'm Maury Exline. In May 2025, LPA released a limited-scope performance audit reviewing the Department of Commerce's process for awarding Building a Stronger Economy (BASE) 1.0 grants. I'm here with Andy Brienzo, principal auditor at Legislative Post Audit, who completed this audit. Welcome to The Rundown, Andy.

SPEAKER_01:

Thank you, Maury.

SPEAKER_00:

First, what is the BASE 1.0 grant program?

SPEAKER_01:

So the Building a Stronger Economy, or BASE 1.0, grant program was a pandemic-era program that awarded almost $100 million in federal funds to Kansas entities for infrastructure development. The Strengthening People and Revitalizing Kansas Executive Committee created this program in December 2021 using federal American Rescue Plan Act, or ARPA, pandemic funds to help with infrastructure development associated with economic development projects that were delayed or slowed because of the COVID-19 pandemic. This funding was open to a wide variety of entities: local governments, economic development organizations, community foundations, nonprofits, private developers, Native American tribes. Virtually anyone you can think of would have been eligible to apply for one of these grants. And a recipient of a BASE 1.0 grant could use it for a wide variety of purposes, including things like renovating business parks, parking facilities, industrial offices, water infrastructure, and other utilities. Recipients were generally required to complete these projects within two years of signing a grant agreement, and they also had to provide at least 25% in matching funds. The Kansas Department of Commerce administered this program and was responsible for reviewing applications and selecting recipients. They reviewed applications submitted between January 31, 2022, and February 28, 2022. The next year, in 2023, they administered a second round called BASE 2.0. It was a similar program, but it was a separate grant program, and we're not talking about that today.

SPEAKER_00:

So how many grant applications did commerce receive? And how much did it award in grant funds?

SPEAKER_01:

Commerce received 445 BASE 1.0 grant applications requesting a total of about $1.7 billion, which is much, much more than they had capacity to hand out. The requests they received varied significantly. On average, each applicant requested about $3.8 million, but the range was between $7,500 and $25 million per grant application. And they came from all over the state; they got applications for projects in 84 counties. Johnson County, as is often the case with programs like this, was the most common. Projects in Johnson County accounted for 55 of the 445 total, or 12%. Sixty of the 84 counties had five or fewer applicants. So it was pretty concentrated in a couple of counties, but there were applicants from all over the state. And they got all kinds of different applicants. As I mentioned before, this was open to a wide variety of different types of entities, and the types of entities and the types of projects really ran the gamut. So, as I mentioned, they got a lot more in requests than they could hand out. They had $99 million to award, and they awarded it to 35 recipients. On average, each recipient received about $2.8 million, but the awards, again, ranged from $13,500 to $10 million each. And again, these were spread throughout the state. Twenty-four counties were represented among the recipients, and 20 had just one approved project. In alignment with how many applicants they had, Johnson County had the most, with six approved projects out of the 55 applications it submitted.

SPEAKER_00:

The main part of the audit dealt with evaluating whether Commerce followed its process for reviewing BASE 1.0 applications. First, walk me through the process that Commerce described using.

SPEAKER_01:

The process that Commerce described had three basic steps. The first step was an eligibility review. Commerce used a third-party vendor called Witt O'Brien's to review all the BASE 1.0 grant applications to ensure they met basic eligibility requirements. This included a basic risk assessment that looked at things like internal controls. The vendor completed an eligibility review for all 445 applications received and determined right off the bat that 195 didn't meet the eligibility requirements. The second step, if an application was determined to be eligible, was application scoring. Commerce told us that two department staff members reviewed each application using a standardized scoring rubric. The rubric allowed a maximum of 155 points for each application based on 10 scoring categories, and the two staff members' scores were averaged to determine each application's final score. The department scored 250 of the applications they received. I'll get into a little more detail on what those scoring categories are in just a minute. The final step of the process was the Secretary of Commerce's final selection. The secretary used professional judgment to decide which applicants received a grant award and how much they would receive. After that point, selected projects received a more comprehensive post-selection risk review, again conducted by Witt O'Brien's, the third-party vendor. And as I mentioned before, the secretary ultimately selected 35 of the 445 applications to receive funding.

SPEAKER_00:

So how did you approach your evaluation and what did you find?

SPEAKER_01:

So essentially, we wanted to review a selection of 10 applications to assess whether Commerce followed the process it had outlined and followed it consistently. We selected those 10 applications based on a variety of factors to ensure we had variation in things like geography, the funds applicants requested and were awarded, and the scores applications received. I'll talk about the results in the three categories I mentioned before. We'll start with the eligibility review. All 10 of the applications we reviewed did receive the eligibility review; Commerce's third-party vendor completed those reviews for all 10. They looked at things like whether the applying entity was an eligible type, had requested a valid funding amount, and was proposing an allowable use of the funds. And again, the review included that basic risk assessment. Staff determined one of the 10 applications in our sample was ineligible because it proposed an unallowable activity, and we saw that it wasn't given further consideration; it didn't move on to the scoring phase. So of the 10 applications we reviewed, nine were deemed initially eligible. The second step of the process, as I mentioned, was application scoring. This is where two Commerce staff members scored each application based on 10 factors and then averaged the scores together to get the final score. The 10 factors, worth up to 155 points in total, are a description of the project and its scope, why the project needed BASE 1.0 grant funding, the project's budget, the applicant's matching funds, the project timeline, project-related bids and estimates from contractors, project-related architectural and engineering reports, the applicant's business and marketing plan, community letters of support for the project, and the expected impact of the project. As I mentioned, nine of the ten from our selection passed the eligibility review and received the scoring review.
And among those nine applications, we saw that Commerce didn't consistently follow its process. Eight of the nine eligible applications received two scores, as they were supposed to. One application, though, was scored by only one person, which Commerce staff said was human error. That application was not subsequently approved for grant funding. Since one of the nine eligible applications was reviewed by only one person, these nine applications received 17 rather than 18 reviews. They should have been reviewed 18 times, two apiece, but one was reviewed only once. We also saw that 14 of the 17 reviews, the vast majority, had blanks instead of scores for that last element I mentioned, the expected impact of the project, which is worth up to 20 points. Department staff said the reviewers probably believed the projects deserved zero points and that blank means zero. But because they didn't enter a score of zero, we can't tell with certainty whether the reviewers intended the point value to be zero or simply skipped the question unintentionally.

We also noted that the two reviewers for the nine applications we looked at often disagreed on objective application factors, like how many community support letters were provided or whether there was evidence of matching funds or project-related bids and estimates. Since these are objective things, we would have expected the scores to be the same. Commerce staff said this was human error, again, simply due to the volume of applications received in one month and the short timeframe staff were working within to get the money out the door. These all sound kind of minor, but inconsistencies in an application review process matter, in our estimation, because they could affect which applications are approved. For instance, the application that received one review wasn't scrutinized as thoroughly as the applications that received two. And skipping the question worth 20 points, the expected impact of the project, consistently decreased the total points available to those applications, which could have disadvantaged them compared to other applicants. We don't know how frequent these issues are because we didn't review all of the applications, but we think they're probably indicative of issues beyond the 10 we reviewed. It's unlikely that we happened to select the only 10 that had these issues. Finally, the last step of the process, as I mentioned, was the Secretary of Commerce's final award decision. We couldn't review that because it wasn't documented. We don't know which projects the secretary considered, why he approved the projects he did, or how the amounts of funding were determined, and of course that limits the public transparency of the program.
Department staff told us the secretary considered the application scores and other factors like geography, project type, risk, and the amount of funding when making the final selections. They said this was a verbal process that happened during meetings with the program manager, but we don't know how many meetings they held, what they talked about, or ultimately how they made their selections. Because of this, we don't know why the secretary didn't select some of the higher-scoring projects over lower-scoring ones. We saw several instances, 11 projects in fact, that weren't approved but that received the same or higher scores as a project that was approved. We don't know why that happened because none of this was documented. Part of it could have been that the secretary reviewed applications on a continuous basis and didn't require a minimum score threshold for approval, which of course increased the risk of this type of outcome. And finally, just as a final note, all six approved applications we reviewed did appear to receive that more thorough post-selection risk review I mentioned earlier. So that part of the process was consistent.

SPEAKER_00:

So finally, what's the main takeaway of this audit report?

SPEAKER_01:

I would say the main takeaway of this audit report is that although the Department of Commerce created a process for consistently and fairly reviewing each BASE 1.0 grant application, it didn't follow that process as faithfully as it could have, and it didn't document key parts of that process, like the final selection. This is similar to another audit we completed recently of the Department of Commerce, on the Community Service Tax Credit Program, in which we had very similar findings. Key elements of that process, again, were not documented, particularly the final selection by the Secretary of Commerce.

SPEAKER_00:

Andy Brienzo is a principal auditor at LPA. He evaluated Commerce's process for awarding BASE 1.0 grants. Thanks again, Andy.

SPEAKER_01:

Sure. Thank you.

SPEAKER_00:

Thank you for listening to The Rundown.