IBS Intelligence Global FinTech Interviews
Go one-on-one with the innovators, disruptors, leaders, and decision-makers driving change in FinTech and financial services. IBS Intelligence delivers exclusive global interviews that uncover strategies, challenges, and the ideas powering the next wave of financial technology.
Ep956: Where Cloud Meets AI: Redefining the Future of Digital Lending
Hari Padmanabhan, Founder – Chairman, Uncia
The discussion moves beyond buzzwords like AI and cloud-native, arguing that most institutions are still in early transformation stages. True AI-driven lending, he suggests, will emerge only when every transaction continuously trains the system. Uncia’s Zero Implementation and Self-Serve model challenges traditional, time-heavy deployments - shifting control to institutions through configurable, low-code tools like Uncia Studio.
We also explore how the Pay-As-You-Grow model is reshaping cost economics for SME lenders and NBFCs, turning technology from upfront capex into scalable opex.
The broader shift? Lending platforms are evolving from static systems into adaptive ecosystems - where cloud, AI, and embedded learning quietly redefine how credit is designed, deployed, and scaled.
I have a bit of a confession to make to start us off today. Oh boy.
SPEAKER_01What is it?
SPEAKER_00Well, I have this folder in my email inbox and it's labeled AI Announcements. And honestly, at this point, it's basically a spam folder. It is literally a spam folder. I mean, if I see one more press release from some legacy bank claiming they are, you know, an AI first institution.
SPEAKER_01Just because they added a chat bot to their FAQ page.
SPEAKER_00Yes. Exactly. I might actually scream. It feels like the whole industry is just drowning in these buzzwords, but the actual user experience uh and the back-end operations, they haven't really changed since like 2015.
SPEAKER_01Yeah, it's the classic lipstick on a pig scenario. I mean, everyone wants that valuation multiple that comes with being an AI company.
SPEAKER_00Yeah.
SPEAKER_01But nobody actually wants to do the uh the architectural surgery required to truly become one.
SPEAKER_00Right. And that is exactly why I wanted to pull this specific deep dive together for you today. We are looking at a really dense, fascinating interview from the IBSI FinTech Journal. This is the November 2025 issue.
SPEAKER_01It's a great piece.
SPEAKER_00It really is. The subject is Hari Padmanabhan. He's the founder and chairman of Uncia. And what I appreciate so much about his take is that he isn't out here selling a chat bot.
SPEAKER_01No, not at all.
SPEAKER_00He's basically saying that the entire way we think about digital transformation is just fundamentally wrong.
SPEAKER_01It's a very provocative stance, for sure. He argues that we're in this transition phase right now that is much, much messier than people want to admit. He draws a really sharp line between what he calls doing digital, which is the apps and the chat bots we were just venting about, versus being digital. And his definition of being digital is incredibly rigorous. It's not about overlaying technology on top of old workflows, it's about a fundamental behavioral change in the system itself.
SPEAKER_00Yeah, he drops a line early on in the interview that I actually want to use as our anchor for the discussion today. He says a system becomes truly AI-driven only when it cannot function without AI.
SPEAKER_01Which is a massive distinction from where we are currently sitting.
SPEAKER_00Huge. Right now, I mean if the AI server goes down in a major bank, what happens? The loan officers just roll up their sleeves and go back to Excel, or they log into their legacy mainframe.
SPEAKER_01Right. The business just continues. It might be slightly slower, but it continues. He calls this current state agentic AI. These are basically helpers, you know, co-pilots. They retrieve data, they might summarize a 50-page document, maybe they flag a suspicious transaction for review, but they are entirely additive.
SPEAKER_00You can strip them away and the engine still runs.
SPEAKER_01Precisely. If you strip them away, the core ledger, the core decisioning engine, it all still works perfectly fine. Padmanabhan is talking about something else entirely, which he calls embedded intelligence. This is where the AI isn't a helper sitting on the side, it is the engine itself. If the AI stops, lending stops entirely.
SPEAKER_00To get to that point, though, you need way more than just a software update. And he touches on the hardware shift, which honestly I think gets overlooked a lot in these fintech discussions. We usually just wave our hands and talk about the cloud as this nebulous magical thing, but he's pointing toward highly specialized compute.
SPEAKER_01It's a necessary evolution. I mean, general-purpose CPUs, which are the chips running the vast majority of bank servers today, they just aren't built for the kind of heavy matrix math required for real-time AI embedding.
SPEAKER_00They're built for traditional logic, not neural processing.
SPEAKER_01Right. Padmanabhan points out that we are moving towards specialized AI chips and completely distinct architectures. You just can't have a system that learns from every single transaction in real time if you're trying to run it on infrastructure that was designed for batch processing in the 1990s.
SPEAKER_00Which brings us nicely to the first major technical pillar of this interview, the small learning model or SLM. And I want to pause here for a second because you know the hype train right now is all about large language models.
SPEAKER_01Everybody wants an LLM.
SPEAKER_00Everyone wants to plug ChatGPT or Llama into their tech stack and call it a day. But Uncia is placing a huge bet on going small. Why are they swimming against the current here?
SPEAKER_01Well, it really comes down to the fundamental difference between generative and predictive AI, especially in a highly regulated environment like banking.
SPEAKER_00Yeah.
SPEAKER_01LLMs are probabilistic by design.
SPEAKER_00Meaning they're just guessing the next word.
SPEAKER_01Exactly. They are designed to guess the next likely word or pixel in a sequence. That is amazing if you want to write a catchy marketing email or summarize a client meeting. But it is absolutely terrible if you're trying to calculate credit risk or determine loan eligibility.
SPEAKER_00Because LLMs hallucinate.
SPEAKER_01Yeah.
SPEAKER_00And you really, really cannot have a banking system that decides to get creative and just invents a credit score out of thin air.
SPEAKER_01Right. The regulators would have a field day with that. You need auditability. You need determinism. An SLM, a small learning model, is architecturally very different. It isn't trained on the entire public internet.
SPEAKER_00Which means it doesn't know about, say, 18th century poetry.
SPEAKER_01Exactly. It doesn't need to. It is trained exclusively on the institution's own proprietary data, its specific risk parameters, and its historical outcomes. It's a completely closed loop.
SPEAKER_00So it's not trying to know everything in the world, it's just trying to know this specific bank perfectly.
SPEAKER_01Precisely. And this directly addresses the data governance nightmare that keeps bank executives up at night. One of the biggest blockers for banks adopting public LLMs is the sheer terror of data leakage.
SPEAKER_00Putting PII, personally identifiable information, into a public model.
SPEAKER_01Yes. With an SLM, that data never leaves the institution's secure perimeter. But the magic isn't just that it's secure, it's that the model is dynamic.
SPEAKER_00Walk me through that actually. He talks about how every transaction feeds continuous learning. Now that sounds great on a slide deck, but how does that actually work mechanically?
SPEAKER_01Okay, think about the traditional feedback loop in lending. A bank issues a loan today. Two years later, unfortunately, the borrower defaults. That default data sits in a database somewhere.
SPEAKER_00Gathering dust.
SPEAKER_01Pretty much. Then maybe once a quarter or twice a year, a risk analyst runs a massive report, spots a trend, and manually updates the credit policy for the whole bank. That feedback loop is months, sometimes years long.
SPEAKER_00Right. It's completely retroactive. You're always looking in the rearview mirror.
SPEAKER_01In the SLM model that Padmanabhan describes, that feedback is immediate and highly granular. Let's say a borrower with a very specific financial profile delays a payment by just three days.
SPEAKER_00The SLM catches that right away.
SPEAKER_01Instantly. It detects that signal and effectively asks itself, did I miscalculate the risk weight here? It then automatically adjusts the parameters for the very next application that looks similar to that profile. It's not waiting for a quarterly committee review.
SPEAKER_00It's adapting on the fly.
SPEAKER_01The system is literally evolving its own logic with every single interaction it processes.
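The continuous-learning loop described here can be sketched in a few lines. This is a hypothetical illustration, not Uncia's actual model: the profile keys, the learning rate, and the weight-update rule are all invented for the example. The point is just the mechanics of folding each repayment outcome straight back into the risk parameters instead of waiting for a quarterly review.

```python
from dataclasses import dataclass, field

@dataclass
class RiskModel:
    # risk weight per borrower profile segment (1.0 = baseline)
    weights: dict = field(default_factory=dict)
    learning_rate: float = 0.05

    def weight_for(self, profile: str) -> float:
        return self.weights.get(profile, 1.0)

    def observe(self, profile: str, days_late: int) -> None:
        """Fold one repayment outcome directly back into the model."""
        current = self.weight_for(profile)
        # A delay tightens the weight; an on-time payment relaxes it slightly.
        error = days_late / 30.0 if days_late > 0 else -0.1
        self.weights[profile] = max(0.5, current + self.learning_rate * error)

model = RiskModel()
model.observe("sme-retail-tier2", days_late=3)    # a three-day delay...
tightened = model.weight_for("sme-retail-tier2")  # ...raises the weight for the next similar applicant
```

The next application matching that profile is scored against the adjusted weight immediately, which is the "transaction is the lesson" idea in miniature.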
SPEAKER_00That is the embedded intelligence he was talking about. The software isn't just mindlessly executing a set of static rules, it's constantly refining them. But this brings me back to the infrastructure problem. Because you cannot run a self-evolving real-time model like that on a 30-year-old mainframe sitting in some basement.
SPEAKER_01No, you absolutely cannot. And that brings us to what he calls the cloud conundrum.
SPEAKER_00Padmanabhan seems pretty critical of the current hybrid cloud setup that most banks are so proud of. He basically argues that private clouds are just legacy infrastructure with a better marketing budget.
SPEAKER_01It's a harsh assessment, I'll admit, but it's highly accurate. Most banks are frankly terrified of the public cloud. It's purely a security perception issue. So what do they do? They spend millions building private clouds.
SPEAKER_00They own the servers, they build the walls, they control the perimeter. It feels very safe.
SPEAKER_01It feels safe, but the massive trade-off there is isolation. In a private cloud, you are a single tenant. That means you are solely responsible for every single upgrade, every security patch, every new integration.
SPEAKER_00So you completely miss out on the network effects of the broader tech ecosystem.
SPEAKER_01Completely. You're on an island. Padmanabhan is pushing hard for true multi-tenant SaaS. Think about it this way: in a multi-tenant architecture, the software vendor maintains a single, massive centralized code base.
SPEAKER_00Okay.
SPEAKER_01So when they develop a new security patch or brilliant new AI optimization, it rolls out to all 50 or 100 banks on the platform simultaneously. Overnight.
SPEAKER_00But hold on a second. If I am a bank CISO, a chief information security officer, the phrase multi-tenant sounds an awful lot like shared data. If I'm sitting on the same infrastructure as my biggest competitor, how do I know for sure my data isn't bleeding over into their models?
SPEAKER_01That is exactly the trust gap he mentions in the interview. But the reality of modern multi-tenant architectures is that they use incredibly strict logical separation. We're talking database sharding, advanced encryption at rest, encryption in transit.
SPEAKER_00So the data is technically commingled on the physical hardware, but mathematically isolated at the software layer?
SPEAKER_01Exactly. And Padmanabhan's core argument here is that the security of these massive SaaS platforms is actually significantly better than what any single bank could ever build on its own.
SPEAKER_00Because the vendor is amortizing the cost of top-tier security across hundreds of wealthy clients.
SPEAKER_01Right. They can afford the best cybersecurity talent in the world. You might feel safer in your own private bunker, but you're actually way more vulnerable because you just can't afford the same level of sophisticated defense systems that a cloud native giant can.
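The "commingled on hardware, isolated in software" idea can be shown with a toy in-memory store. This is a deliberately minimal sketch with invented tenant IDs and schema, not a description of any real platform: the key design point is that the tenant filter lives inside the store itself, so a caller can never reach another tenant's rows.

```python
class SharedLedgerStore:
    """Toy multi-tenant store: one physical collection, strict logical separation."""

    def __init__(self):
        self._rows = []  # physically commingled storage for all tenants

    def insert(self, tenant_id: str, record: dict) -> None:
        self._rows.append({"tenant_id": tenant_id, **record})

    def query(self, tenant_id: str) -> list:
        # The scope is enforced inside the store, not left to the caller:
        # the software layer, not trust, provides the isolation.
        return [r for r in self._rows if r["tenant_id"] == tenant_id]

store = SharedLedgerStore()
store.insert("bank_a", {"loan_id": 1, "amount": 500_000})
store.insert("bank_b", {"loan_id": 2, "amount": 750_000})

bank_a_view = store.query("bank_a")  # bank_a can only ever see its own rows
```

Real platforms layer sharding, per-tenant encryption keys, and encryption in transit on top of this, but the enforcement principle is the same.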
SPEAKER_00And you definitely cannot afford the speed. That seems to be the real killer here. Private clouds are rigid. If you want to deploy that fancy new SLM capability we were just talking about in a private cloud environment, that is a six-month integration project, minimum.
SPEAKER_01Easily. Whereas in multi-tenant SaaS, it's just a feature flag. The vendor flips a switch and it gets turned on overnight.
SPEAKER_00Which perfectly leads us to the part of the interview that made me, frankly, the most skeptical. The concept of zero implementation.
SPEAKER_01Yeah. It definitely sounds like a glossy sales pitch, doesn't it?
SPEAKER_00It sounds like pure vaporware.
SPEAKER_01Right.
SPEAKER_00I have been through major enterprise software rollouts. I know what they look like. They are never zero implementation. They are two years of absolute misery, scope creep, and budget overruns. How can Uncia possibly claim zero implementation when they are dealing with highly complex financial products? Are they just bypassing the core banking system entirely?
SPEAKER_01They aren't bypassing the core, but they are dramatically abstracting the complexity. Padmanabhan makes a very deliberate distinction here between implementation and go live.
SPEAKER_00Okay, what's the difference?
SPEAKER_01He says implementation implies a massive construction project where you are building custom code from scratch, writing new logic, testing it, fixing bugs. Go live, on the other hand, implies configuration.
SPEAKER_00Okay, but configuration in enterprise banking usually means limited customization. If I'm a bank and I have a very specific, highly nuanced, tiered-interest supply chain finance product with weird bespoke repayment terms, can a no-code platform actually handle that? Or am I just stuck with a generic vanilla template?
SPEAKER_01That is historically the trade-off, yes. Customization meant coding. But Padmanabhan argues that they have fundamentally productized the granular logic blocks themselves, not just the user interface. They use a proprietary tool called Uncia Studio.
SPEAKER_00So how does that work?
SPEAKER_01Instead of writing custom code to say, you know, if X happens, calculate interest like Y, you are actually dragging and dropping pre-validated, pre-audited logic modules on a canvas.
SPEAKER_00So the complex financial math is already baked into the block itself.
SPEAKER_01Exactly. The block is a hardened, compliant financial component. You are just assembling them in a specific order. This entirely shifts the workload from the IT department, who historically had to code, test, and debug everything over to the business product team.
SPEAKER_00The people who actually understand the business logic and just need to configure the flow.
SPEAKER_01Right. So zero implementation doesn't mean zero work.
SPEAKER_00It means zero coding.
SPEAKER_01Exactly. He completely removes the traditional software development lifecycle from the critical path. You aren't writing code, you're just tuning parameters. And this allows for what he calls the self-serve model. The bank literally does not need to call the vendor to launch a new product.
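The "pre-validated blocks on a canvas" idea maps neatly onto function composition. Uncia Studio's actual block catalogue isn't public, so everything below is invented for illustration: each block is a small, already-tested rule, and "launching a product" is just choosing an ordering of blocks rather than writing new code.

```python
from typing import Callable

LoanApp = dict
Block = Callable[[LoanApp], LoanApp]

def kyc_check(app: LoanApp) -> LoanApp:
    # Hypothetical block: verification logic is baked in and pre-audited.
    app["kyc_passed"] = bool(app.get("pan_verified"))
    return app

def tiered_interest(app: LoanApp) -> LoanApp:
    # Hypothetical hardened rule: the rate tier depends on loan size.
    app["rate"] = 9.5 if app["amount"] > 1_000_000 else 11.0
    return app

def compliance_guardrail(app: LoanApp) -> LoanApp:
    # The guardrail is part of the block, so a non-compliant product
    # configuration cannot reach approval at all.
    app["approved"] = app["kyc_passed"] and app["amount"] <= 5_000_000
    return app

def launch_product(blocks: list) -> Block:
    """'Configuration, not code': a product is just an ordering of blocks."""
    def run(app: LoanApp) -> LoanApp:
        for block in blocks:
            app = block(app)
        return app
    return run

supply_chain_finance = launch_product([kyc_check, tiered_interest, compliance_guardrail])
result = supply_chain_finance({"amount": 2_000_000, "pan_verified": True})
```

Because only the arrangement is new, testing a launch means testing the ordering, not the internals of each block, which is what compresses the go-live cycle.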
SPEAKER_00All right, I wanted to see the receipts on this, and he brings up Unity Bank as the concrete proof point. And the numbers here are frankly aggressive. He claims that they booked 1,000 crores in supply chain finance in just six months. Now, for our global listeners, 1,000 crores is roughly 120 million US dollars. For a brand new program launch, that is incredible velocity.
SPEAKER_01It's massive. But honestly, the number that really matters in that case study isn't the monetary volume, it's the timeline. He states that Unity Bank can configure and launch a complete new finance program within 24 hours.
SPEAKER_00Let's drill into that for a second. 24 hours, does that actually include UAT, like user acceptance testing? Does it include compliance and regulatory checks? How do you compress a standard three-month launch cycle into a single day?
SPEAKER_01This circles right back to those pre-validated blocks we just discussed. If you are coding a product from scratch, you absolutely need weeks of testing to ensure you didn't accidentally break the core math.
SPEAKER_00Right.
SPEAKER_01But if you're using pretested certified modules, the UAT phase is significantly faster because you are only testing the arrangement of the blocks, not the internal logic of the components themselves.
SPEAKER_00So the compliance rules and the regulatory guardrails are already embedded in the blocks.
SPEAKER_01Yes, exactly. The guardrails are part of the core configuration. So a business user practically cannot accidentally build a non-compliant product; the system won't let them. This is what allows the bank to move from a signed agreement to a full market launch in under a day.
SPEAKER_00That changes the strategic landscape completely. Right. I mean, if you can launch a product in 24 hours, you can run actual micro experiments in the market. You can test a specific dealer finance program for just one small region, see if it actually works, and then scale it up or kill it without having burned a million dollars in IT setup costs.
SPEAKER_01And that is exactly what Unity Bank did. They leveraged supply chain finance, which, as we know, is a very high volume, revolving credit product to deepen their relationships with existing corporate clients.
SPEAKER_00Instead of spending millions on new customer acquisition.
SPEAKER_01Exactly. They just monetize their current ecosystem much, much better. And they could only do that because the technology allowed them to customize the financing offer for entirely different supply chains almost instantly.
SPEAKER_00There's one final piece of this puzzle that we need to hit, and it's the money. Not the money the bank is lending out, but how the bank actually pays for this underlying technology. We are seeing a major shift from capex (capital expenditure) to opex (operating expenditure).
SPEAKER_01This is the democratization angle of the whole Uncia philosophy. Historically, buying a core lending system was like buying a physical office building. You pay tens of millions up front for a perpetual license.
SPEAKER_00And if the software didn't work as promised, or if your business didn't grow, you were just out that money. It was a sunk cost.
SPEAKER_01It was a massive barrier to entry. It basically meant that smaller lenders, NBFCs, or scrappy fintechs, couldn't compete with the tier one banks on technology.
SPEAKER_00But Uncia is pushing this pay as you grow model.
SPEAKER_01Right. The cost of the software is directly pegged to your actual business performance. So your software bill is tied to your disbursement volume, the number of transactions processed, or the overall size of your loan book.
SPEAKER_00Now that sounds fantastic for the bank, obviously, but it sounds incredibly risky for the vendor. I mean, if the bank fails to sell loans, the vendor simply doesn't get paid.
SPEAKER_01It does put the onus entirely on the vendor to ensure their system actually works and actively drives business growth. It perfectly aligns the incentives.
SPEAKER_00They win when the bank wins.
SPEAKER_01Exactly. But more importantly, it radically changes the unit economics for the lender. You don't have this massive depreciation anchor sitting on your balance sheet anymore. Your technology cost becomes a variable cost that scales perfectly in line with your actual revenue.
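As a back-of-the-envelope illustration of why this changes the unit economics, the bill can be modeled as a fraction of disbursement volume. The 10-basis-point rate below is invented for the example and is not Uncia's actual pricing; the structural point is just that cost scales with volume instead of sitting as a fixed upfront license.

```python
def opex_bill(disbursed: float, rate_bps: float = 10.0) -> float:
    """Monthly software cost as basis points of the volume disbursed.

    rate_bps is hypothetical: 10 bps = 0.1% of disbursements.
    """
    return disbursed * rate_bps / 10_000

# A small NBFC disbursing $2M a month and a large bank disbursing $200M
# run on the same platform; each simply pays in proportion to its volume.
small_lender_bill = opex_bill(2_000_000)
large_bank_bill = opex_bill(200_000_000)
```

If the lender disburses nothing, the bill is near zero, which is exactly the vendor-side risk and incentive alignment discussed above.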
SPEAKER_00It really levels the playing field. A small SME lender can basically access the exact same Ferrari engine that a massive global bank uses because they are only paying for the gas they actually use.
SPEAKER_01That's a great way to put it. You have to stop investing capital in static software licenses and depreciating servers. You start investing in people who can deeply understand the data and configure the business strategy.
SPEAKER_00So bringing this all together for everyone listening, we have three main shifts happening here. We have the intelligence layer, moving away from bolted-on chatbots towards small learning models that are deterministic and don't hallucinate. We have the architecture layer, moving from isolated private bunkers to multi-tenant SaaS clouds. And finally, we have the agility layer, moving away from hard coding toward drag and drop configuration.
SPEAKER_01Those are the three pillars, absolutely. And if you look at them all together, they describe a completely different kind of financial institution than what we have today.
SPEAKER_00It really does make the current industry debate about, you know, will AI replace bankers feel a bit shallow. It seems like the real question we should be asking is: will AI replace the traditional process of banking?
SPEAKER_01That's the right framing. Padmanabhan isn't talking about replacing the human relationship aspect of lending. He's talking about replacing the static, dumb workflow that currently sits between the banker and the customer. He explicitly mentions blending AI with human judgment.
SPEAKER_00Let the machine do the math, let the human do the relationship.
SPEAKER_01Exactly. The system handles the context-aware intelligence, the heavy mathematical lifting, the historical analysis, the pattern recognition, so the human banker is freed up to handle the nuance and the strategy.
SPEAKER_00But it does require a serious leap of faith from the institution. You have to trust the black box to some degree. If an SLM suddenly changes a risk weighting because of a subtle pattern it saw just yesterday, the human banker has to trust that the system is actually smarter than the manual policy they wrote six months ago.
SPEAKER_01True. And that trust only comes from extreme transparency and auditability, which is exactly why the shift to deterministic SLMs is so critical for this industry. You can audit the internal logic of an SLM. You can prove why it made a decision.
SPEAKER_00Whereas you can't easily audit the logic of a massive neural net that's just guessing probabilities based on the whole internet.
SPEAKER_01No, no, you can't.
SPEAKER_00So for the listeners out there, whether you're working in a traditional bank, a scaling fintech, or you're just keeping an eye on this space, what's the ultimate litmus test here? How do they know if their own organization is actually being digital or if they're just doing digital?
SPEAKER_01I would tell them to look closely at their feedback loop. That is the single most important metric to evaluate. When your organization makes a decision, whether that's approving a loan, rejecting a customer, or setting a new interest rate, how long does it take for the real-world outcome of that decision to reprogram your core system?
SPEAKER_00And if the answer is, well, we wait for the quarterly strategy meeting to review the data, you're dead in the water.
SPEAKER_01You're obsolete. The ultimate goal is to shrink that feedback loop to near zero. The transaction itself should be the lesson.
SPEAKER_00The transaction is the lesson. I really like that. It moves us fundamentally away from just data storage and towards actual data intelligence.
SPEAKER_01We are definitely entering an era where your competitive advantage isn't just who has hoarded the most data, but whose data actually teaches their system the fastest.
SPEAKER_00It's a fascinating, slightly terrifying, and incredibly exhilarating time to be watching this industry. It really feels like the training wheels are finally coming off.
SPEAKER_01I think the race is genuinely just beginning.
SPEAKER_00I want to leave you all with a final thought to chew on as we wrap up today. We talked a lot about Padmanabhan's core philosophy that true intelligence in lending will emerge only when every single transaction teaches the system to make the very next decision smarter.
SPEAKER_01It's a very high bar to set.
SPEAKER_00It is. So I want you to look at your own workflow today, not just your software stack, but your actual daily grind. Is the work you are doing right now actually teaching your system to be better tomorrow? Or are you just feeding a graveyard of data that no one will even look at until the next compliance audit? Because the difference between those two things is the difference between building a legacy and becoming one.
SPEAKER_01I couldn't have said it better myself.
SPEAKER_00Thanks for joining us on this deep dive. We will catch you on the next one.