
Globe Telecom’s AI strategy focuses on the kitchen, not the dish

Anton Reynaldo Bonifacio, Globe Group’s CAIO and CISO, discusses how the company is laying AI foundations for the long term.

Joanne Taaffe, TM Forum
24 Apr 2025

A telling reflection of a company’s AI strategy is where responsibility for it lies. When Globe Telecom created the post of Chief AI Officer (CAIO), the company made it a centralized role reporting directly to the CEO and entrusted the responsibility to its Chief Information Security Officer (CISO).

Anton Reynaldo Bonifacio, Globe Group’s CAIO and CISO, caught up with Inform to discuss why the telecommunications company has chosen to separate AI strategy and deployment from its technology functions.

After all, as Bonifacio points out, often “the AI hat is either tucked under the CIO or maybe tucked under a Chief Data Officer. But [our CEO] felt that it needed to be independent.” That led to the creation of what Bonifacio describes as “a mini-CTO” function.

This affords Bonifacio the relative luxury of laying long-term foundations rather than delivering short-term return on investment. “There’s more space to achieve mid- to long-term goals rather than trying to hit something immediately just to prove value,” he explains.

A view from above

Bonifacio already has experience working independently in an umbrella role as Globe’s CISO. “We sit outside of IT and network and report directly to the CEO with our own CapEx, our own OpEx,” says Bonifacio. “It’s fully end-to-end. The infrastructure operations and engineering for all the security technologies is with my team.”

“We’re in the trenches together,” he adds, explaining that the security division avoids friction with IT by working side by side. “We say, ‘Here’s our technology—we need to implement this together.’ It’s more of a partnership.”

He also sees internal partnership and the avoidance of silos as vital to AI development. For this reason, Bonifacio recruited his direct reports from other business and technology units, where they continue to perform their original roles while also taking on AI responsibilities. Each is supported in their AI role by a dedicated headcount to ensure that they can focus on the AI agenda.

As Bonifacio explains in a LinkedIn post, the role of CISO gives him a keen understanding of the company’s regulatory environment and risk posture. So, when ChatGPT erupted on the world stage, for example, he knew where to set guardrails and when to give employees the space to innovate.

“This knowledge, confidence, and accountability allowed us to get on the GenAI adoption track differently.” In particular, “the board, management, and our teams felt comfortable to have ‘fun and play’ with AI,” he notes.

Around 20% of Globe’s employees are now members of its ‘AI Advocates Guild’ and have access to Gemini for Workspace, ChatGPT Enterprise, and the company’s in-house Retrieval-Augmented Generation (RAG) toolkit. The result is more than 400 co-pilots and bots in production for internal use cases, “built by our people to address issues that matter to them, bottoms up,” he states.
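The article does not describe how Globe’s in-house toolkit works, but for readers unfamiliar with the RAG pattern such bots rely on, a minimal sketch is shown below: retrieve the most relevant internal documents for a question, then ground the model’s answer in that context. The documents, function names, and query here are hypothetical illustrations, not Globe’s actual tooling.

```python
# Minimal, illustrative sketch of the RAG pattern: retrieve relevant internal
# documents, then assemble them into a grounded prompt for a language model.
# All document contents and names below are hypothetical placeholders.

from collections import Counter

DOCS = {
    "leave-policy.md": "Employees accrue 1.25 vacation days per month of service.",
    "vpn-setup.md": "Install the corporate VPN client and sign in with your SSO account.",
    "expense-rules.md": "Meal expenses above PHP 2,000 require manager approval.",
}

def score(query: str, text: str) -> int:
    """Crude relevance score: number of distinct query words found in the document."""
    query_words = Counter(query.lower().split())
    return sum(1 for word in query_words if word in text.lower())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k most relevant documents for the query."""
    ranked = sorted(DOCS.items(), key=lambda item: score(query, item[1]), reverse=True)
    return [f"{name}: {text}" for name, text in ranked[:k]]

def build_prompt(query: str) -> str:
    """Assemble a prompt that grounds the answer in retrieved internal context."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    # A real internal bot would send this prompt to an LLM via a gateway;
    # here we simply print the grounded prompt.
    print(build_prompt("How many vacation days do employees get?"))
```

In production toolkits the keyword scorer above is typically replaced by vector search over embedded documents, but the overall flow of retrieve-then-generate is the same.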

Focusing on the long term

Like other telcos, Globe is already using AI to address customer pain points and improve internal productivity, including through partnerships with hyperscalers.

“A lot of our flagship projects that we are building or co-building with Amazon Web Services (AWS) and Google Cloud Platform (GCP) are really business enabling,” Bonifacio explains.

He further illustrates his point with a kitchen analogy: “Most companies are focused on the dish — they want to serve the fastest fast food because everyone’s hungry and having this fear of missing out on AI,” he explains. “So they end up building a kitchen for Chinese food, another for American cuisine, and so on.”

“In our case, while we do want to serve appetizers now, we’re more focused on building a kitchen that can support whatever we decide to cook tomorrow,” he explains. “If you’re too focused on specific use cases, scaling becomes a problem two or three years down the line. Any expected savings or revenue gains can easily be wiped out by how messy and fragmented the infrastructure becomes.”

Bonifacio believes that having a long-term view helps ease the pressure to perfect data right away. In his kitchen analogy, data is an important base ingredient, but he points out that some appetizers can be served without an extremely robust data set. Just like a chef sourcing ingredients, he sees his relationship with Globe’s Chief Data Officer as that of a customer, relying on them to provide what’s needed to bring each dish to life.

“The data lake will be a key ingredient,” Bonifacio shares. “But we’re also exploring which data sources we’ll need, depending on the kind of menu we want to build.”

Collaborating with partners

"It’s not so much about building the data first, it’s about building the infrastructure first,” Bonifacio explains. When it comes to that foundation, Globe relies heavily on several key partners.

“For most of our workloads we are a huge AWS customer. At the same time, we are gradually starting to move some storage workloads over to GCP,” he states.

Bonifacio is careful to build the AI foundation on top of the company’s existing cloud partners, thereby avoiding the creation of new technology silos or additional overhead.

He points out that AWS and Google are not only able to provide the infrastructure to build a kitchen, “they also sell appliances and provide ingredients.” Globe, however, doesn’t treat either hyperscaler as a one-stop shop. Instead, it works with each one selectively, depending on specific needs.

One of the bigger challenges Globe faces is integrating network data: “There are some things being ingested into the lake house from the network side of the fence. But what we are realizing is that it’s not enough,” Bonifacio says. “So, it’s part of that foundational thing.”

"If we look at our 12-course Michelin star menu there are certainly a lot of network-based elements or network-based use cases that we want to have,” he adds. But network systems aren’t as easily integrated as modern cloud-based IT, and Bonifacio believes this is where vendors need to step up.

“On the network side, almost every vendor is trying to sell their very unique closed AI or data solution,” he claims. “We want to be able to integrate your stuff into the kitchen, but they keep on selling me their own kitchen. That is part of the overall challenge.”

Challenges aside, by carefully integrating the right elements and working with trusted partners, Globe is preparing to serve up a flexible infrastructure that can adapt to whatever’s on the menu.

Join Anton Reynaldo Bonifacio at DTW25 in Copenhagen, 17-19 June, where he will discuss what is involved in becoming AI and data native.