How Oracle Became the Backbone of Enterprise AI with Chris Chelliah
Fresh out of Oracle AI World 2025, Chris Chelliah, Senior Vice President of Technology and Customer Strategy for Japan and Asia Pacific at Oracle, joins us to unpack how Oracle is positioning itself as the definitive enterprise AI platform across the region. He shares his career journey from a computer science geek working on distributed databases to leading technology strategy across a market representing two-thirds of the world's population. Chris explains Oracle's comprehensive four-tier AI stack—infrastructure, data platform, applications, and agentic orchestration—emphasizing how this unique full-stack ownership enables enterprises to consume AI out of the box and extend seamlessly without ripping and replacing existing systems. He highlights compelling use cases from financial fraud detection and healthcare automation to precision agriculture and energy grid optimization. Closing the conversation, Chris shares his vision for what great will look like for Oracle in Asia Pacific, continuing its 50-year legacy as the behind-the-scenes platform provider powering everything from OpenAI and TikTok to global banking infrastructure.
What's been consistent for Oracle is to be a platform provider that helps organizations unlock the full value of their data. Today it is all about AI and unlocking the value of your data in AI, and cloud is a mandatory enabler. With AI and agentic AI, an agent is effectively an employee—it's an automated employee, a process, a workflow. You want your employees to be within your ecosystem, within your firewall. AI thrives at the edge because that's where inference happens. - Chris Chelliah
Profile: Chris Chelliah, Senior Vice President of Technology and Customer Strategy for Japan and Asia Pacific at Oracle (LinkedIn)
Here is the edited transcript of our conversation:
Bernard Leong: Welcome to the Analyse podcast, the premier podcast dedicated to dissecting the pulse of business, technology and media globally. I'm Bernard Leong, and Oracle has been in the headlines on AI and data centers, but today we are diving into how Oracle is shaping the next generation of enterprise AI across the region of Asia Pacific and Japan. With me today is Chris Chelliah, Senior Vice President, Technology and Customer Strategy, Japan and Asia Pacific at Oracle. So Chris, welcome to the show and thank you for joining us post Oracle AI World 2025. I bet it must have been exciting in Las Vegas.
Chris Chelliah: Well, thank you, Bernard. Yes, recovering from the jet lag. Extremely excited to be here. Thank you for having me.
Bernard Leong: Yes, we also had that pre-launch chat discussing what was coming. Of course, as with all guests on the show, I always want to find out about the origin story. So how did your career journey begin and what led you into the world of enterprise technology?
Chris Chelliah: Well, let's see. I'm a bit of a geek, so computer science and math was my background. My research thesis post-grad was on distributed databases. Java was emerging when I was just coming out of university. So the world was moving from client server to the internet, right? The problem of the day in that era was surfacing data that used to be inside an enterprise to the internet. Doing that securely across the big wide internet. At that point in time, the Oracle database was portable across platforms. Java as a language was portable across platforms, and both were central to the internet era, right? So, as an engineer, which engineer doesn't love a problem like that to solve? We see a lot of parallels today in the AI world, but that's how I started.
Bernard Leong: So over the years you've held various leadership roles across Oracle globally. What drew you to your current position leading technology and customer strategy in Japan and Asia Pacific?
Chris Chelliah: Well, technology is definitely in my blood. I think I'm a bit of a tinkerer by nature. That would be the best way to describe me. My home has been running Linux since the late nineties. So I've always got pretty much at least a half a dozen projects on the go that I'm tinkering with and working on. For me, Oracle was a bit of a natural place. It encouraged innovation. We have a mantra of always being customer zero. When I joined Oracle it was a very early nascent database company. Using this technology and being able to be customer zero of all of this emerging technology and then taking that to enterprise scale, that was definitely interesting.
I joined Oracle back in Perth, Western Australia but I've worked with Oracle across the UK and France and Sweden, in the Bay Area for at least three or four years and then back in Singapore for a good 15 odd years. Probably three things drew me back to this role. Number one, obviously it's family. I had to come back closer to home. The second is really the potential in this market, right? The potential in this market. We have two thirds of the world's population in Asia Pacific and Japan, and this population is digital native. It's a digital native generation. Huge opportunity there. The third thing, like I said, is technology just gets me out of bed every day. I'm super excited. So, it was a good combination to bring me back to do technology in this part of the market.
Bernard Leong: So from your diverse experience, what are the key lessons you can share with my audience on your career journey?
Chris Chelliah: Listen, I think there's a couple of things and I say this to new hires and when I'm interviewing into the organization as well. Find your passion wherever you're going to go. Like I said, I happen to be a tinkerer and Oracle happened to have areas that I could play with. So certainly find your passion. In Oracle we are in FinTech and construction and biotech. Find something that would excite you to give more than your best every day.
The second thing I always tell folks is you own your own career. You own your own brand. Yes, a company has a culture and a company has tools and processes and certain segments that they work in. When I speak to my team or my new hires, I let them know that all of those things—culture, tools, processes, segments—are just colors in a color palette. But you have the job of converting that color palette into a painting or the persona you become in the organization. There's a few truisms I always say: if you keep the customer in focus, you keep Oracle as the second in focus, and then you keep your line of business as a third in focus, you can paint where you fit in all of that if you have the right passion.
The last thing I always tell my team is use what you have. Oracle's a big company. If you're in a startup, you might have a different set of tools that you work with. Use what you have. So grab every opportunity. I sometimes say I've worked for like four different startup companies, right? Because they all happen to be called Oracle. I was in a startup database business. I was in a startup applications business when Oracle got into that. I was in the startup hardware business when Oracle got into that. Then I was in the startup cloud and AI business when Oracle got into that. So four startup businesses in one company called Oracle.
Bernard Leong: Amazing. So we are going to get to the main subject of the day. We're going to talk about Oracle's AI and multi-cloud strategy post AI World 2025. As a start, I've always known Oracle works in the background on all things related to enterprise. Can you provide an overview of Oracle's global vision and mission, and the business footprint across Asia Pacific and Japan?
Chris Chelliah: Yeah. Listen, one thing that's been very consistent with Oracle over the last 48, almost 50 years that we've been in business is really about helping organizations unlock the full value of data. We've been through different genres. There was the client server genre, there's the internet genre, there's the cloud, and now we're in the AI genre. But what's been consistent for Oracle is to be a platform provider that helps organizations unlock the full value of their data. Today it is all about AI and unlocking the value of your data in AI, and cloud is a mandatory enabler. That's the overall mission and vision for Oracle. We want to provide our customers with everything that they need to unlock this potential.
We've been in this market, JPAC, since 1988, so we're not a new kid on the block. We power critical infrastructure in the region. Pretty much every bank, telco, utility, power grid is powered by Oracle underneath. We're continuously innovating and growing. If you think about it, I think we've got close to 47,000 customers in JPAC, and we're still growing: 44% year-on-year consumption growth last year. 12 cloud regions, another 14 dedicated cloud regions. We're here in that mission and vision, continuously growing, and I think we already play a big part today in Japan and Asia Pacific. With AI and the inference opportunity that's there at the edge, I think that's only going to get bigger and we're really looking forward to that.
Bernard Leong: I want to cut to the chase. What are the key takeaways and major announcements that came up from the recent Oracle AI World 2025? Specifically on agentic AI, the data platform and things related to multi-cloud or even other news that may be of interest to everyone here listening to the podcast.
Chris Chelliah: Yeah. Well listen, I think let's start with the headline: AI changes everything. We get that. We hear that. That's why Oracle delivers AI everywhere. As you noticed, we rebranded Oracle Cloud World to Oracle AI World to reflect our commitment across every layer of the stack.
I think there's a number of components that we bring together. Let me stack this out for you. First thing is let's start at the bottom of this: the infrastructure. AI is underpinned by infrastructure and AI thrives at the edge, especially when you get to inference. So there's some significant announcements that we made around infrastructure, around what we're doing with networking and making that just hugely scalable and reliable. That is a big part of this platform.
The second layer is the data: AI, we know, is fueled by data. You mentioned it earlier, the AI data platform. We'll talk about that. But we've made some significant announcements about how we're able to basically reach into all of your data without disrupting what you have in place today.
The third layer on top of this is a rich set of applications that are going to have AI embedded in them. AI doesn't always have to be something you build. You should be able to use something out of the box as well. So there's a rich set of applications and then there's the fourth layer on top of that: the orchestration, which is our agent framework and agent marketplace. We made some significant announcements there as well.
What we will surface out of the box through our applications, you can identify and we can have partners identify, but you can also build using the other components, the data layer and the infrastructure layer. I think what came out and resonated really loudly is we're very unique in having all of those tiers of the stack, yet not mandating that you need all to start. You could start at any one of those tiers or you could use some or all of those tiers. I think that's unique and that came out loud and clear in AI World last week.
Bernard Leong: As a former customer, I have probably encountered at least two applications: definitely the Oracle database for infrastructure when I was a Chief Digital Officer, and then Aconex in the construction space when I was working as a CIO. So you now cover almost every layer of the stack. With the recent launch of the AI data platform, can you break down what it does and how you help customers move from raw data to production-grade AI?
Chris Chelliah: Yeah, you talked earlier about Oracle being in the enterprise space. Enterprise doesn't have to be big enterprise, by the way. It could be small and emerging enterprises as well. So I want to be very clear on that. Oracle's not just for big organizations, it's across the board.
When you are engaging an enterprise, I think it is different from engaging a consumer. When you're engaging an enterprise, you're going to ask, what problem are you trying to solve? When people talk about data platforms for AI, go and do a search today on any of your favorite AI chat engines: why should I look for a new data platform for AI? You'll find that the pain points are very common. They are about addressing real time, addressing flexibility, addressing security. Organizations, enterprises have got multiple types of data: structured, unstructured, document, et cetera. They're trying to do different types of analytics. They've got a lakehouse, they want to do some graph, they want to do some streaming. They've got different types of workloads: AI workloads, IoT, blockchain, distributed data. They've got all of these things in place today.
What we set out to do is ask, how can we address that? Because you need to address the problem and help our customers. The AI data platform is about that. It's about addressing the direct need that customers have for a platform that brings all of this together. It focuses on the outcome. So it's all about accelerating analytics and AI, not about a particular product. It's all about using as much as you can of what you already have in house, rather than saying you must bring in something new to aggregate it all.
Regardless of where your data is—on premises, in Oracle applications, third party SaaS applications, on my cloud, or in a third party cloud—what we offer with the AI data platform is the ability to ingest, process and serve this data using a single security and a single governance model, so you don't have to rip and replace what you have in house: your tooling, your skill set. Seamlessly across applications, across infrastructure. I think that's unique because none of the other hyperscalers provide something that is as wide-reaching as that. But it's not just wide-reaching in terms of data types; it also reaches north-south, up into the application stack that we have. I think that takes away a big fear that customers have around why they need a new data platform today, and I think we address a lot of those areas.
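To make the single-governance idea concrete, here is a minimal sketch in generic Python of one policy layer applied to data from any source. The names (GovernedCatalog, Policy, the sample reader) are hypothetical illustrations of the pattern Chris describes, not Oracle AI Data Platform APIs.

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    allowed_roles: set
    row_filter: dict = field(default_factory=dict)   # e.g. {"region": "APAC"}

@dataclass
class GovernedCatalog:
    """One policy layer, regardless of where the data physically lives."""
    policies: dict = field(default_factory=dict)
    sources: dict = field(default_factory=dict)

    def register(self, name, reader, policy):
        self.sources[name] = reader            # reader: callable returning rows
        self.policies[name] = policy

    def read(self, name, user_roles):
        policy = self.policies[name]
        if not (user_roles & policy.allowed_roles):
            raise PermissionError(f"{name}: role not permitted")
        rows = self.sources[name]()            # on-prem, SaaS export, any cloud
        # The same row-level filter is applied identically for every source.
        return [r for r in rows
                if all(r.get(k) == v for k, v in policy.row_filter.items())]

# Usage: the policy check and row filter run the same way whether the reader
# pulls from an on-premises database, an object store, or a SaaS application.
catalog = GovernedCatalog()
catalog.register(
    "sales",
    reader=lambda: [{"region": "APAC", "amt": 10}, {"region": "EMEA", "amt": 7}],
    policy=Policy(allowed_roles={"analyst"}, row_filter={"region": "APAC"}),
)
print(catalog.read("sales", user_roles={"analyst"}))   # APAC rows only
```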
Bernard Leong: That also ties in very much into the artificial intelligence when we talk about AI agents. The expansion of Oracle AI Agent Studio, and also the launch of the AI agent marketplace—I think there's a pretty big step in terms of enterprise AI adoption. Maybe the more fundamental question here is what problems is Oracle trying to solve with these new offerings?
Chris Chelliah: Yeah. Listen, we think AI is so big and so pervasive that solving a problem today is going to open up five more problems to solve tomorrow. Because the nature of us as human beings is that we're never satisfied. We solve a problem and go, oh, cool, but there are five other things I can solve and I want to solve those five. Then those five are going to give you another five. So I think the key is being able to collect those solutions and not reinvent the wheel each time.
With the AI Agent Studio, it's the ability to give you some endpoints that are already created, some agents that are already created out of the box, and you can then extend naturally. The marketplace—we extend that to our partners as well. So our partners who are implementing solutions, who may have specific domain knowledge, can bring that IP to the platform and add into the agent ecosystem that you have.
The more agents you have—think about it in the old days, as my kids would sometimes say, in the dinosaur age when we used to do data integration, point to point integrations. An agentic integration with autonomous agents is just a higher level of abstraction of that. When you used to do point to point integration, what was really important was you needed to have an adapter. You needed an adapter to connect from app A to app B. The more adapters you had, the better the integration platform was. It's the same with agents. The bigger your agent marketplace, the more successful this is. Like I said, I think this is just going to feed a wave of new applications being built with AI.
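As a rough illustration of the adapter analogy, the generic Python sketch below contrasts point-to-point integration, where the number of adapters grows with every pair of systems, against a marketplace-style registry that each system or agent joins once. The names are illustrative only and do not reflect Oracle's agent marketplace APIs.

```python
from itertools import combinations

systems = ["ERP", "CRM", "HCM", "SCM", "Billing"]

# Point-to-point: one adapter per pair of systems (roughly N*(N-1)/2 adapters).
p2p_adapters = list(combinations(systems, 2))
print(f"point-to-point adapters needed: {len(p2p_adapters)}")   # 10 for 5 systems

# Marketplace-style: each system or agent registers one capability with a
# shared registry, and an orchestrator routes requests through it (N entries).
registry = {}

def register(name, handler):
    registry[name] = handler

def orchestrate(task, request):
    return registry[task](request)

register("ERP", lambda req: f"ERP handled: {req}")
register("CRM", lambda req: f"CRM handled: {req}")

print(f"registry entries: {len(registry)}")
print(orchestrate("CRM", "qualify new lead"))
```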
Bernard Leong: That was a very clear exposition of how you distinguished agentic AI systems from traditional enterprise automation tooling, from the point to point integration viewpoint. We've seen Oracle AI agents now being deployed across finance and supply chain. Can you walk us through some real world use cases that may have measurable business impact currently, probably at pilot stage or maybe even some of them in production?
Chris Chelliah: Yeah, the good news is they are working in pilot and they are working in production. I can't mention all of the names because customers want to go through their validation, et cetera. You already mentioned the finance use cases, whether that's looking at sophisticated fraud schemes or making sure you're meeting all of the regulatory scrutiny that you need. We have capabilities like Investigation Hub to analyze alerts and matches between customer data, sanctions lists, et cetera. So FinTech and financial—a massive number of use cases there.
Supply chain, you already talked about that as well. Whether it's about a maintenance advisor, whether it's about using real-time image recognition for quality control or for branching of workloads. We've also seen it being used in areas as diverse as agriculture: optimizing planting and harvesting decisions, integrating satellite imagery and IoT sensors to make sure that we can best use fertilizer to increase the yield and also reduce costs. You don't want fertilizer going where it doesn't need to go, so precision spraying.
In healthcare, we know we've seen reduction in times for documentation. Physicians being able to talk to you without having to take notes or without having to then organize that. Just the ability to not just listen, but also to annotate and also make recommendations in healthcare. Reducing documentation time, annotation, but also in detection—whether that's lung nodules or breast cancer detection.
We've seen agents being used, like I said, it is the art of the possible. We've seen it being used in environmental applications to monitor things like micro climate change. I think for every problem we solve, we're going to open five more to solve and the pace at which problems surface and we resolve it is just going to increase. It's going to be a massive flywheel. I think for all of us as human beings, the quality of life can also increase as we solve some of these problems that just make things easier for us.
Bernard Leong: With the AI data platform now supporting multi-cloud and hybrid orchestration and Oracle database also living within the other hyperscalers, how do you view the role of multi-cloud in accelerating enterprise AI adoption?
Chris Chelliah: Absolutely. Just like we talked about, the AI data platform removes some of the problems that customers have to adopt AI. Multi-cloud, deploying the AI data platform on multi-cloud, further enhances that. I sometimes say to gaze into the future, it pays to look at the past.
Multi-cloud is definitely here to stay. We at Oracle are very proud that we've broken down these walled gardens between the hyperscalers and we've leveled the playing field. But if you go back to the origins, the dinosaur days again, when you were building database or data-driven applications, you had to first pick the hardware platform. Then you picked your programming paradigm, then you picked the workload, whether it's analytical or OLTP, and then finally you picked your database that ran on that platform and supported that workload. That was the way it used to be.
We changed that paradigm many years ago. We built the Oracle database as the same database in which you could do transaction processing as well as analytics. We also made that same database able to run on 140 different platforms. That's what we did about 30 years ago. So what happened is you could pick your programming language, you could pick your hardware platform, and you'd have a consistent experience regardless. Giving customers choice really let them accelerate development: they didn't have to worry about whether it was going to run or whether that hardware platform had longevity. They just built applications and deployed them, and that's what got Oracle to where we are today.
Now we saw a massive gap going forward. We saw that what was happening with the hyperscalers in the cloud was they were building these siloed walls of keeping the data in each cloud, and we could see that that's just going to be the barrier to AI adoption because it's a natural barrier. So that's exactly what we sought to solve again, because we figured customers want to have multiple clouds, they want to diversify risk. What about if we could give them the same experience regardless of which cloud, the same commercial model regardless of which cloud it runs on, and the same open standards? Everything that we do is based on open standards that is completely portable on a cloud or onto on premises as well.
Bernard Leong: From your vantage point in the Asia Pacific and Japan, how are you activating partner ecosystems and what are the roles they play in helping you to scale AI for customers?
Chris Chelliah: Listen, our partners are definitely a key enabler. We need them to move fast. I already alluded to the fact that we're going to move fast and it's going to solve multiple problems. So we need our partners to be there to help us scale really quickly. As a technology provider, I think partners typically don't take a product, a new product to market per se. They take solutions to market, and they certainly know how to accelerate a market and drive scale and reach.
I'm very proud, especially this year. If you look at the last three years, the number of partner logos sponsoring—you know, Cloud World, which became AI World—has just multiplied, which is great. It tells me that the partners see value in building a practice. They've had to invest to build a practice around our stack. The fact that they're building a practice tells me that we're heading in the right direction.
Partners are definitely there and going to be a big part of taking us and scaling this. Massive reaction: in less than 24 hours from when we made that marketplace announcement, I think we had about a hundred solutions already submitted by partners. Not all of them were built yet, but partners were submitting intent to build in different specific domains. It's amazing when partners jump on board that quickly. It tells me that we're heading in the right direction and it tells me that the ecosystem believes that, which then gives our customers comfort because now they have scale. They have choice. They can pick a partner of their choice and deploy on a cloud of their choice, but get the outcome that they need. That's what enterprises want. They don't want to be painted into a corner.
Bernard Leong: What's the one thing you know about Oracle's AI platforms that very few people do, but should?
Chris Chelliah: Well listen, AI is multifaceted. Customers are now looking at this. When I speak to enterprises, they're either looking at consuming or they're looking at building, and somehow they see these as two divergent tracks. You either consume it or you build. From an Oracle perspective, I think you shouldn't see that as divergent. We think that your AI platform should be convergent.
What does that mean? You consume as much as you can out of the applications. We're doing that through our rich vertical industry applications, horizontal applications, and AI enablement. Then you just extend. You're not building from scratch, and you can extend because we use the same platform to build the agents and the applications as you would to extend them. That same platform is available across all the hyperscalers. It means I can change the perception: customers think they have to pick a fork in the road. Do I go the build way? Do I go the consume way? I'm saying to them, actually have both. It's not divergent. It's convergent. We can help you consume very quickly and then extend and build from that.
Bernard Leong: Now, how important is openness—by openness, I mean open data formats, agent interoperability—to Oracle's strategy? I mean, especially in this region where legacy systems are common. I also would grant the fact that Oracle owns MySQL, which is a pretty well known open source database.
Chris Chelliah: Yes, we do. I mean, listen, openness is absolutely critical. Our AI and data strategy is completely open, completely ubiquitous. As you said, way back when we acquired Sun and, with it, MySQL, I think there was a lot of conversation in the market that Oracle was actually not going to continue with MySQL because we had the Oracle database. We've always said that's not our intention. There's a right place for MySQL. There's a right place for the Oracle database, and we've continued that investment for a decade and a half now.
Openness is absolutely critical. Everything you see Oracle do will be portable. Like I said, everything that we are doing with the autonomous AI Lakehouse, everything that we're doing with MySQL and HeatWave, is based on an open strategy. It runs on my cloud, it runs on premises, it runs on my competitors' clouds as well. Our goal is to give customers a ubiquitous experience, and we do that without any lock-in. Whether that's using Iceberg formats, the MCP standard or the ONNX standard, we're very standards-based.
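To make the portability point concrete, here is a minimal sketch using the open ONNX format Chris mentions: the same onnxruntime code loads and runs a model wherever it is deployed. The model path is a placeholder, and this is generic open-source usage rather than anything Oracle-specific.

```python
import numpy as np
import onnxruntime as ort

# Load a model exported to the open ONNX format; "model.onnx" is a placeholder.
session = ort.InferenceSession("model.onnx")

# Introspect the graph rather than hard-coding names, so the same code works
# for any ONNX model, wherever it was trained or wherever it is deployed.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # resolve dynamic dims
dummy = np.random.rand(*shape).astype(np.float32)            # assumes float32 input

outputs = session.run(None, {inp.name: dummy})                # None = all outputs
print([o.shape for o in outputs])
```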
We typically don't fork open source projects and make our own version of them. We use them natively as they are, so you don't have lock-in in that sense. If you look at it, you don't stay 50 years in business by locking in customers. Our customers have complete choice today on where they want to run, how they want to run, and how they want to access their data—structured, unstructured, document, streams, time series. We support all of those things in a completely seamless experience across all of the cloud offerings.
Bernard Leong: How do you now help CIOs balance the need for what I call vendor choice flexibility, which I think you already pointed out when I asked the previous question, with the need for enterprise-grade governance and reliability? This seems to be on every CIO's mind: data governance, AI governance, et cetera.
Chris Chelliah: Yeah, listen, I think this is a huge thing. I think this is still a remaining thought barrier, at least for the enterprise customers that we speak with. Just like we think multi-cloud is here to stay, I think we would all agree that multiple models are also here to stay. Not just multiple frontier models—and we know all the big frontier models—but there's going to be lots and lots of smaller models and specific industry and niche models.
So there's a huge emphasis on the ability to bring a model of choice onto the platform without compromising on governance and security. Those are some of the things that we've built as native into the Oracle platform that I think are very differentiated. For example, let's start at the infrastructure layer. At the infrastructure layer, we are giving our tenancies a level of isolation that they don't see in the other hyperscalers. It's a very clear level of isolation, down to where you can declaratively define your network packet routing, and we will enforce it for you. You cannot bypass those things through what we call the zipper protocol. We're rolling that out. So you've now got an isolation level in the tenancy.
Then on top of that, at the data level, regardless of the models that you bring in—and you can pick any of the frontier models or anything from Hugging Face or something you've developed in-house—when you roll that into our Gen AI PaaS service, again, we declaratively define the security and the governance models. That's enforced at that level. Then you can swap models in and out, but the governance, the security and the access mechanism are still maintained. They're still consistent.
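A minimal sketch of that idea, assuming hypothetical names: the access policy and output rule are declared once, and models can be swapped behind them without touching the governance. Nothing here is an OCI Generative AI API.

```python
from typing import Callable

def redact_card_numbers(text: str) -> str:
    # Stand-in governance rule; a real platform would enforce far richer policy.
    return text.replace("4111-1111-1111-1111", "[REDACTED CARD]")

class GovernedEndpoint:
    """Access control and output policy declared once; the model is swappable."""
    def __init__(self, allowed_groups: set, model: Callable[[str], str]):
        self.allowed_groups = allowed_groups
        self.model = model                      # frontier, Hugging Face, in-house

    def swap_model(self, model: Callable[[str], str]) -> None:
        self.model = model                      # governance below is untouched

    def generate(self, prompt: str, user_groups: set) -> str:
        if not (user_groups & self.allowed_groups):
            raise PermissionError("user is not entitled to this endpoint")
        return redact_card_numbers(self.model(prompt))  # same policy for any model

endpoint = GovernedEndpoint({"finance"}, model=lambda p: f"model-A: {p}")
print(endpoint.generate("Check card 4111-1111-1111-1111", user_groups={"finance"}))

endpoint.swap_model(lambda p: f"model-B: {p}")   # swap models in and out
print(endpoint.generate("Check card 4111-1111-1111-1111", user_groups={"finance"}))
```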
Then you look at the next level, what we do with the data in the AI data platform—same thing. Regardless of where your data is or where you shuffle your data, we retain the security and the governance. I think when you have a full and consistent stack of security and governance at the infrastructure level, the data layer, the model layer, and the application layer, then you can have a better trust index as a CIO. We show that to CIOs: how you can build trust, how you have traceability in this, and how it's all declarative.
Then, out of the box, we provide governance reporting around where things came from and where they went. That level of transparency is important, and we're able to do that because we own the four levels of the stack. I don't see that being as pervasive anywhere else.
Bernard Leong: I thought that was a very interesting point you made about those four levels and how Oracle holds the whole stack together. I'm thinking a lot about what happens in practice as the marketplace evolves. What do you think the adoption challenges are now? Is it mainly talent, localization, or compliance?
Chris Chelliah: Yeah. I think more and more, if you think about it, we used to talk about data residency and data sovereignty. In the cloud days, only a decade ago, we used to talk about how my data needs to reside in my country. But with AI and then agentic AI, think about it: an agent is effectively an employee. It's an automated employee. It's a process, it's a workflow. Well, you want your employees to be within your ecosystem, within your firewall.
So I think more and more with AI you're going to see a need to bring it closer and closer to home. Your data residency is no longer going to be the country; it might just be within the enterprise. That's why I said at the very opening that, on infrastructure, AI thrives at the edge, because that's where inference happens. So the edge is going to get smaller and smaller and closer and closer to within each organization, within each enterprise.
I think the ability to bring the power of all of that as close as possible is important. That's what we do: the ability to bring the cloud and shrink it down, with all of the inference and all four tiers of the stack that I talked about, inside an enterprise. That becomes a big thing.
The second thing you talked about was talent and skills. I think AI is going to help us with that. I think this is one of those areas where we're going to use AI. It's like priming the pump. For example, I was talking to a large SI at AI World last week and they were telling me they had a big consulting practice that was already based on one of my competitor hyperscalers: we've got a whole bunch of skill sets there, and we want to grow our skill sets. I said, listen, AI can help you fix that, because you take the reference architectures that you already have and the best practices for that deployment, and I can actually use AI to flip that and generate infrastructure as code that would meet your best practices in OCI.
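A rough sketch of that prime-the-pump workflow, under stated assumptions: call_llm is a hypothetical stand-in for whichever code-generation model an SI would actually use, and the architecture text, best practices, and returned snippet are illustrative placeholders that would need review before any real use.

```python
# Hypothetical helper: stands in for whichever model client you actually use.
def call_llm(prompt: str) -> str:
    # Returning a canned placeholder so the sketch runs end to end; in practice
    # this would call a code-generation model and the output would be reviewed.
    return 'resource "oci_core_vcn" "app_vcn" { cidr_block = "10.0.0.0/16" }'

REFERENCE_ARCHITECTURE = """
Three-tier web app: load balancer, two app VMs in separate fault domains,
a managed database, private subnets for the app and DB tiers, TLS everywhere.
"""

BEST_PRACTICES = """
- least-privilege network security groups
- no public IP addresses on the app or DB tiers
- every resource tagged with cost centre and owner
"""

prompt = (
    "Translate this reference architecture into Terraform for OCI, following "
    "the listed best practices. Return only HCL.\n\n"
    f"Architecture:\n{REFERENCE_ARCHITECTURE}\nBest practices:\n{BEST_PRACTICES}"
)

generated_hcl = call_llm(prompt)   # always review and test before applying
print(generated_hcl)
```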
So I think we can flip that whole skills paradigm really quickly by using AI to prime the pump. The last thing you talked about was operational maturity. With AI and agents, there's automation that we can provide out of the box as a cloud provider, and I talked about the lineage and the governance. We need to surface more of that to make the customers, the business users, the CIOs more comfortable that they actually have control over this. Because I think there's always this fear of the unknown. We need to make sure that they can see that they've got control over this.
Bernard Leong: For large enterprises within the Asia Pacific and Japan, they have heterogeneous legacy estates. One question I really have—and actually it is two questions, but I want to see how you're going to answer that. How do you guide them to get AI-ready data without disrupting operations? So that's the first part. The second part is what metrics do you believe matter most for these companies starting their AI journey?
Chris Chelliah: Okay. Let me go with the first one. You talk about without disrupting operations. I think that's the first test. The first test would be when I go in and engage with a customer, we talk about AI, we do some design thinking workshops around what do you want to get out of this? Because I think AI is all about the art of the possible. I don't go in there with something in mind. I go in there and say, what do you think we could solve together?
Then the second test would be how quickly would it take for me to do that? Now, if I go in and say, listen, I need to build a whole bunch of what I call scaffolding and I need to change the data, I need to change a security policy, I need to move stuff around and build scaffolding, and I'll come back in three months with the answer—that's the wrong answer. I need to say, how about if once we've done this design thinking workshop, within a matter of days, maybe a couple of weeks at top end, without disrupting your data policies, I can show you this working?
I think that's important because, again, that's what we've designed the AI data platform for. The word I use is respect: the AI data platform respects the security and governance models that you already have in-house, because the security and governance models you've built in a large enterprise have been there for maybe 10, 20 years. I can't say, you need to break this before I can give you an AI outcome. I need to say, what can I do working with and respecting what you have in place?
We do that. We've been doing these engagements from design thinking to a vibe-built application and then deploying that and being able to enable endpoint APIs to plug into that. The customers can see a real working system in a matter of a week, two weeks. That really opens up eyes when we do that.
Bernard Leong: Yeah, and in full respect of confidentiality, I heard you telling me that you have done it for a customer, and within two or three days you could actually show them something that is a workable, viable product, just to see how the scaffolding really looks. I think that's really the great use of AI in this age.
Chris Chelliah: Absolutely. The example I shared with you was the energy company. We've done another one with a very large conglomerate that's got retail, agriculture, logistics and the ability to take a whiteboarding design thinking session and generate that into some sort of working model without ripping apart what they have in data. When they asked questions of this environment, it respected the data. So in other words, if I asked a question, I saw data that only I could see. When you asked the question, you saw data that only you could see because we respected the policies in place.
The second question you asked was around the metrics that matter. The way I'm being a little bit cheeky sometimes when I talk to the CIOs about this, I sort of say, listen, how do you benchmark this? Well, it's like when you go to the gym and you're running next to somebody on the treadmill next to you. You're not competing with them running on the treadmill next to them. You're competing with yourself. You're trying to better your personal best from yesterday or last week, depending on how good you are and how often you go to the gym. But really it's about going to the gym and bettering yourself.
So I think organizations are really looking at—you already know the benchmark. You already know the metric of how you do that job today. So you know how well you ran yesterday and you've got to say, I'm going to try to beat how well I ran yesterday, today. That's how I look at measures for success. When we talk to customers after we've done that design thinking workshop, we sort of said, well, why are we even doing this? Is it worth your time and my time? How much time does it take you to do this?
Then we work out, well, what would be a reasonable—listen, now that I've brought the vibe experience down to a week or two, I'm not saying a year or two to test this hypothesis out. If I'm testing the hypothesis out within a week or two, then we can say, listen, okay, it takes me five days to do this task today. Can we make it half to two and a half days? For example, I said, well, is it worth us investing two weeks together?
I think it used to be when you had a hypothesis to do transformation in an organization to prove that hypothesis used to take 6, 9, 12 months and a lot of money. We're able to now get that hypothesis spinning really quickly. Then you can see how you want to measure, and they go like, oh, this makes sense. You did make it half the time. Now let's go about what it takes.
I think we're looking at a different way of quantifying success and benefits. The time horizon is a lot shorter and so you're happy to invest. Most customers are happy—they typically have a board mandate to go and look at AI projects. So they're happy when we come in with some tooling, some methodology and some capability, and we're not saying it's going to take you six months, we're saying it's going to take you a week, maybe two weeks. They're more than happy to work with us. Then together we define whether or not it was a successful outcome, and how we scale that effectively.
Bernard Leong: That's interesting because I think that will also make it faster for enterprises to build some trust, at least on the interconnecting data platform layer. I think that's where a lot of the plumbing—people do not appreciate that a lot of the plumbing actually happens in that part of the stack. If that process is sped up, then it also speeds up the process of adopting all the agent layer.
Chris Chelliah: That's exactly right. I usually say to customers: my job for you is to try to make the plumbing as transparent as possible and really just give you a power outlet that you plug yourself into. So really that's it. If we can make the plumbing transparent—and again, we have a 50-year pedigree, DNA in data management. The problem statement today is data management. That is the problem statement if you think about it, because AI is fueled by data. We have the capability, and we understand data replication, data synchronization. We understand that.
Bernard Leong: That goes to the roots of Oracle from the beginning to this day: data management. Very good. What is the one question you wish more people would ask you about Oracle's AI strategy, but they don't?
Chris Chelliah: I sometimes wish people would—we are in a world today where you can't walk down the street and bump into somebody without there being an AI conversation by the second sentence. Whether it's a consumer or an enterprise person you're talking to, it's always about AI. I think sometimes it's easy to say that everything's AI. The one question I want people to ask me is, why are you different?
Bernard Leong: Why are you different?
Chris Chelliah: Why are you different? Why is Oracle different? I think my answer to that is two things. My answer to that is we need to go from the superficial down to the depth. We need to get into the specifics to work out why you're different. Two reasons why we're different.
Number one, we're the only player that plays in all four tiers of the stack today. High-powered infrastructure that pretty much all the frontier models train on. We have a data pedigree to bring all of the data into the fold of AI. We have a rich set of applications that basically power industries today: banking, telco, utilities, construction, healthcare. We have an agent framework that ties all of these things together seamlessly, whether you want to just consume AI or extend and build on what you consume.
So I think we're fundamentally different in each of those. In each one of those things, it's not a lowest common denominator. Within each layer we're fundamentally differentiated as well. For example, at my infrastructure layer, I'm fundamentally differentiated with the way I do network switching and the way I do declarative network security. At the data layer, I'm fundamentally differentiated because I'm the only one who can run on any other hyperscaler plus on premises and give you the same experience without disruption. Nobody else can do that.
At the applications tier, I'm the only one that plays in all of these verticals. Then the agentic marketplace and the framework build on the other three. So instead of saying everything's AI—what I call AI washing, as opposed to whitewashing—I wish people would get into the specifics. Hey, why are you guys different? Show me. If they asked us that, I think we'd have some very productive conversations.
Bernard Leong: I really want to sit down in one of those design thinking workshops you have, just to be a fly on the wall to see exactly what's going on.
Chris Chelliah: Well, you know what, it's customer proprietary, so what we should—
Bernard Leong: Well, I'm willing to sign an NDA.
Chris Chelliah: Yes, we should find a customer that you already know because I know you've worked in the market and in the industry as well. We'll find a customer that you already know and more than happy to bring you. Obviously it will need their buy-in as well, but happy for you to be a part of that.
Bernard Leong: Of course, it wouldn't be complete without saying that Oracle is actually responsible for very big, large scale infrastructure. If you think of the likes of OpenAI and TikTok, for example, you guys are running the actual infrastructure behind them. So, my traditional closing question: what does great look like for Oracle in Asia Pacific and Japan as it builds out the next generation of AI platforms and services?
Chris Chelliah: Yeah. Listen, my view of what great looks like is that we will become the number one AI platform for enterprises in the market, across Japan and Asia Pacific. How do you rate number one? How I rate number one would be that we're powering the agents and the inference across the largest and broadest spectrum of enterprises in Asia Pacific.
I gave you some examples. We're already working on energy grids, for example. How do we regulate power between renewables and non-renewable using AI and agentic AI for that? We're already working in agriculture. We are working in healthcare, we're working in construction, we're working in FinTech. I think you're going to see a lot of those things. Like we have been for the last 50 years, Oracle tends to be the behind the scenes player. Our customers shine and we tend to be the platform player. Our customers benefit from that.
For example, people have been using OpenAI and TikTok, as you mentioned, for the longest time. They may or may not have known that it's powered by Oracle. If you're using utilities, if you're putting your ATM card in, you may or may not know you're actually being powered by Oracle. That's just the way we've been for the last 50 years. We're the underlying platform provider, and that's where we'll be. We'll be providing the AI infrastructure and the agent processes across all of the industries, and we will be number one across Asia Pacific. I firmly believe that because we have a value proposition for all of these organizations.
Bernard Leong: Chris, many thanks for coming on the show. This is probably one of the most interesting conversations I've had on enterprise AI, given Oracle's breadth and depth of services across the entire stack. So, in closing, any recommendations that have inspired you recently?
Chris Chelliah: Well, a book—I think it's called "First, Break All The Rules". It reminds me of where we are today. We're in an era where it's time to break everything, the norms of what we thought was possible, because I think AI is going to change everything. Almost every problem that we encounter, you want to take a contrarian view to it because let's go back to first principles. Don't worry about why we couldn't solve that or why we do it the way we do it today. First break everything, break the norms and the perception of why we are doing it that way, or why we can't do it this way. Let's see how we can do it completely differently. So I think I could see a lot of applicability of that in what we are doing every day.
Bernard Leong: So, lastly, how can my audience connect with you and follow Oracle's innovations in the Asia Pacific?
Chris Chelliah: Well, stay tuned. Firstly, you can connect with me on LinkedIn, and I'm relatively active on LinkedIn. Also stay tuned because we're about to launch our developer series. Just like we talk to the enterprises, we also want to connect some of these capabilities into the developer community. Michael Smith, Bernard, who you know very well, is going to spearhead that for us and launch a developer series, making sure that there's a huge level of awareness of what we want to do and can do with the audience in this market here. So stay tuned for that. I'll post it on LinkedIn. So will Michael.
Bernard Leong: Okay. You can definitely find this podcast everywhere, and of course subscribe to us on Spotify or YouTube, as well as our LinkedIn newsletter. So, Chris, many thanks for coming on the show and let's chat again sometime soon.
Chris Chelliah: Thank you very much for having me here. I really appreciate it.
Podcast Information: Bernard Leong (@bernardleong, LinkedIn) hosts and produces the show. Proper credits for the intro and end music: "Energetic Sports Drive". The episode is mixed & edited in both video and audio format by G. Thomas Craig (@gthomascraig, LinkedIn). Here are the links to watch or listen to our podcast.