In the latest Energy Transition Talks podcast, CGI’s Peter Warren sits down with Frédéric Miskawi, Vice-President of Global AI Enablement at CGI, to discuss how energy and utilities organizations can move beyond experimentation to achieve measurable value, exploring practical ways to accelerate AI adoption and build operational resilience.
Accelerating software delivery for rapid AI ROI
AI adoption has shifted from experimentation to execution, and organizations now ask how to turn pilots into measurable performance. Fred suggests one of the fastest paths to value is AI-led software acceleration, which can deliver three to five percent productivity gains within 90 days with “the right partner, the right guidance and when employees are empowered to experiment and learn.”
This hands-on approach enables teams to build confidence with the tools while generating tangible business value early in their AI journey.
Busting the myth of perfect data
For years, organizations have hesitated to scale AI due to concerns about data quality. Fred challenges this outdated assumption, explaining that modern algorithms have dramatically reduced the need for pristine datasets:
“With new algorithms, the dependency on highly clean data is being reduced. We’re entering an era where data quality will matter less, because solutions are increasingly able to clean, prepare and layer heuristics on data automatically.”—Fred Miskawi
Instead of waiting for large-scale data transformation projects, utilities can deploy AI tools that intelligently interpret, cleanse and enrich data in real time, accelerating innovation and reducing friction in implementation.
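A heuristic layer of this kind can be sketched in a few lines. The example below is purely illustrative (the names and rules are hypothetical, not a specific CGI or vendor tool): it applies two simple heuristics of the sort Fred describes, discarding stale documents and preferring the newest version of each, before the data reaches an AI pipeline.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Document:
    name: str
    version: str          # e.g. "5.3"
    last_updated: datetime

def keep_freshest(docs: list[Document], max_age_days: int = 365) -> list[Document]:
    """Two illustrative heuristics: drop documents older than a cutoff,
    and when several versions of the same document exist, keep the newest."""
    cutoff = datetime.now() - timedelta(days=max_age_days)
    fresh = [d for d in docs if d.last_updated >= cutoff]
    newest: dict[str, Document] = {}
    for d in fresh:
        # Compare versions numerically, e.g. (5, 3) < (6, 0)
        key = tuple(int(p) for p in d.version.split("."))
        current = newest.get(d.name)
        if current is None or key > tuple(int(p) for p in current.version.split(".")):
            newest[d.name] = d
    return list(newest.values())
```

In practice, rules like these sit as a layer above the deployed solution, so imperfect source data can still feed useful results without a prior enterprise-wide cleanup.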
Using the right algorithm for the right job
Fred’s approach to AI is rooted in pragmatism: choosing what works rather than what’s trendy:
“It’s always about the best algorithm for the job,” he explains. “If you’re doing math, don’t ask a large language model to multiply numbers—use a procedural algorithm. Our role is to pick the right tool for the value we need to deliver.”
This approach ensures that governance, cost and performance stay aligned with business objectives, leading to sustainable AI deployment rather than one-off experiments.
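The "best algorithm for the job" principle is easy to sketch. The router below is a hypothetical illustration, not a pattern taken from the discussion: simple arithmetic queries are answered by an exact procedural calculation, while everything else falls through to a (stubbed) language model call.

```python
import operator

# Map of supported arithmetic operators to exact, deterministic procedures.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def call_llm(prompt: str) -> str:
    # Placeholder for a real model call; hypothetical, for illustration only.
    return f"[LLM answer to: {prompt}]"

def answer(query: str) -> str:
    """Route 'a <op> b' queries to exact arithmetic; send the rest to the model."""
    parts = query.split()
    if len(parts) == 3 and parts[1] in OPS:
        try:
            a, b = float(parts[0]), float(parts[2])
        except ValueError:
            return call_llm(query)
        return str(OPS[parts[1]](a, b))
    return call_llm(query)
```

The design choice mirrors Fred's point: a deterministic procedure is cheaper, faster and always correct for arithmetic, so the expensive probabilistic model is reserved for queries that actually need it.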
Building operational resilience through AI insights
AI is reshaping operations for energy and utility organizations facing climate-driven disruptions, asset degradation and unpredictable demand. According to Fred, AI now enables a “digital triplet”—a near real-time digital reflection of enterprise operations that enhances visibility and resilience.
“With new weather data integration and predictive algorithms, you can see which parts of the network might be at risk under certain conditions,” he says. “It’s not a revolution, but an evolution moving on an exponential curve.”
By combining predictive analytics with geospatial and weather data, utilities can anticipate maintenance needs, reduce downtime and optimize asset allocation.
Measuring AI success with business-aligned KPIs
As organizations mature their AI capabilities, Fred advises leaders to define KPIs that reflect team-level outcomes and long-term value, not just individual productivity metrics.
“When we work with clients, we fine-tune KPIs to their business outcomes,” he explains. “For software acceleration, we look at developer productivity, trust in outputs, and value delivered at the team level. For chatbots or knowledge engines, we look at usage patterns, satisfaction, and interaction quality.”
This outcome-based measurement framework helps organizations measure AI’s true impact: how it drives trust, quality and strategic advantage across teams.
AI transformation: A steady evolution, not a sudden revolution
Fred remains confident that the technology’s trajectory is solid and transformative:
“I laugh when I hear about AI bubbles, because I see the value every day,” he says. “The technology is evolving exponentially, but it’s our human ability to absorb and apply it that really matters.”
The shift from pilot projects to scalable, integrated AI systems marks a turning point for the energy sector. As utilities continue digital transformation efforts, the emphasis will increasingly fall on building capabilities that evolve continuously, creating a “flywheel effect” of learning, optimization and growth.
Looking ahead: From algorithms to ecosystems
In part two of the discussion, Peter and Fred will explore how organizations can balance large and small language models, and how advances in hardware—from cloud hyperscalers to edge devices—will shape deployment strategies in the years ahead. As Fred hints, every iteration of AI solutions feeds the next, propelling the industry toward smarter, more autonomous systems.
Listen to other podcasts in this series to learn more about the energy transition
Read the transcript
- Introduction
Peter Warren
Hello everyone and welcome to another edition of our podcast series Energy Transition Talks. Today we have a very interesting one—we're going to be diving into how do you move from AI to ROI. A lot of people have been playing with AI, but now it's time to move beyond that and get results that actually improve the bottom line, improve efficiency. To that point, I have a great guest, Fred Miskawi. Do you want to introduce yourself?
Fred Miskawi
Hi everyone, I'm Fred Miskawi. I'm part of our global AI enablement team at CGI, and I lead our AI innovation expert services, which luckily has me working across nine SBUs and across geographies. I've been involved in one way or another with artificial intelligence since the 1990s, and I love what I do every day.
Peter Warren
Just to clarify, SBUs are our strategic business units, so that includes countries like Canada and the United States, all of Europe, Australia, and nine different groups in the UK as well. So, let's kick it off here and talk about AIOps and data maturity. Where do people start? A lot of things have been moving forward, and people are looking for results. How do they get something in the next 90 days that brings a benefit to them?
- Accelerating AI adoption for real business value
Fred Miskawi
The fastest way that I've seen, and this is what we do on a day-to-day basis both with our internal teams and with client teams, is AI-led software acceleration. The number one value for these types of tools, this technology, is that we can deploy licenses fairly quickly.
But even with those licenses, what we're learning is that it's just not enough. So we bring in a lot of expertise and layering above that, but you can start getting value very quickly. It's going to start in that three to five percent productivity improvement range; it's going to start a little slow, but you can gain that expertise very quickly. You just need to have the right partner and the right guidance. A lot of online learning can help as well, and the biggest thing is to work with the employees who are leveraging these tools, to learn how they use them, to cater to them, and to let them know that it's okay to experiment and learn. Through these approaches, what we've been able to do is get people accelerated quickly, and you're going to get value very quickly because you're accelerating the process of value delivery, and software is what's powering most of our businesses today.
- Overcoming the data quality myth
Peter Warren
I think there's a belief out there that "I can't get started until my data is pristine," or "This is going to be a major overhaul," and there's certainly a bit of change management involved, as you just alluded to. How do people approach those things?
Fred Miskawi
That used to be true. In artificial intelligence, you've got different types of algorithms: structured approaches and unstructured approaches. With those algorithms, what we got used to over decades is that you needed very clean data sets, key-value pairs. You needed large sets of this clean data to be able to find the patterns, to enshrine the patterns, to tweak the parameters, to get where you wanted to go. What we're finding out with these new algorithms is that you're now getting into a realm where the dependency on high-quality data sets is being reduced, and the more this technology evolves, the less that dependency exists, to the point where what we're hearing from our partners, hyperscalers and other labs that live on the edge of this technology, is that we're getting into a new era where data quality will not matter anymore. We're already seeing in the labs approaches that automatically clean the data, collect the data, and get the data ready for what's needed.
What we're seeing today in production is the ability to add layering above the solutions that we deploy. That layering enables us to apply certain heuristics to the data that comes in, even if your data is not clean. We could say, for example: if this document is older than X, or if this is version 5.3 and there's a newer version, maybe let's take a look at that. There are very simple heuristics like that you can apply in these types of solutions.
But because of this layering and now with agentic approaches, you're getting even less dependency on high quality datasets, and what we're going to see is, over time, these solutions, these agents, are collecting more and more data. They're collecting it with the level of quality that they need for the next generation of fine-tuning or training, and you see this flywheel effect that's happening today. So, no, you don't need to have high-quality data sets to get started. It certainly helps for certain types of algorithms and it will always be there for very specific types of algorithms, but what we're seeing is a reduction in dependency on those data sets.
- Choosing the best algorithm for the job
Peter Warren
Maybe let's talk about algorithms. You've mentioned them a bit, and you've got a bit of a famous saying there. I enjoyed it the last time we chatted, so I'll set you up to start off with that. But you know, people are asking: do I build, do I buy, do I partner? How do I do this? How do I put in governance? Maybe kick off with the thought about the algorithms.
Fred Miskawi
Yeah. So, I think about life in terms of patterns, in terms of data, in terms of algorithms, and for me it's the best algorithm for the job. We saw that very early on with this technology in the last two years, where we were asking these models to calculate 5,333 times 55, and the next thing you know, you've got thousands of multiplications happening behind the scenes to get you an answer that may or may not be right. The labs quickly realized that, hold on, maybe we can just kick off a very simple little algorithm that's procedural, so that we get an answer in the way that we know and love. So that's what we're seeing: the best algorithm for the job. And our job in this business is to figure out what that best algorithm is for the need that we have, for the value that we need to deliver.
- Using AI to enhance operations and asset management
Peter Warren
So, building upon the concept of the best algorithm for the job: one of the big concerns in our industry is asset maintenance. It's heavily impacted by the weather, both storms and, recently, fires. There are a lot of things going forward. How do you see AI coming in and helping operations improve?
Fred Miskawi
Yeah, and that gets connected to that concept of the enterprise neural mesh, or digital triplet, where you get that near real-time view. We're seeing an evolution in that space. You're getting new solutions: weather maps, weather data being fed into some of the systems we're working on. Partners like Microsoft, for example, are introducing that kind of capability, and with new layering on top of that and new maturity in how to absorb that data, you can start working on that next generation of predictive algorithms, leveraging the data to navigate it and understand which areas of the network might be at risk under certain weather conditions.

That level of visibility and transparency comes together with that growth of algorithms in a multi-agent type of ecosystem, and we're seeing that evolve. It's not a revolution, necessarily. It's an evolution, but that evolution is moving on an exponential curve. So, as you're evolving and improving your digital solutions, going through digital transformations, using the technology to accelerate the migration of legacy systems (your long tail of digital technical debt), you're building this new capability that enables you to absorb these new data sets, this new insight, these new patterns. Now you've got companies like us that come in and build this layering to give you that transparency, visibility and understanding, and from there you can start feeding that into your planning cycles, and your planning cycles start accelerating a little bit. Now you're empowered with a new generation of solutions and pattern recognition engines that enable you to fine-tune what's happening across the enterprise and make sure that, for limited assets, you're deploying them in the best places possible. We're seeing that evolve.
It's an evolution, not a revolution, but I think it's an important evolution of the technology. So when we hear about AI bubbles, for example, I laugh, because I see the value every day. I understand it. I see it evolve very quickly, and at the end of the day, it's really our human ability to absorb it and put it into practice. That's what we're seeing. A lot of that data, a lot of this empowering of the planning process, you're not necessarily going to see in revenue. You're not going to see it necessarily in margins, at least not yet, but it's there, it's happening. It's accelerating your near real-time understanding of what's happening in the enterprise. And that evolution, even though it's moving at an exponential rate, I think is incredibly important. You've got to understand it, the digital push, the wind that's causing these things to evolve, so that you can start planning for the next generation of solutions, of digital transformations, of legacy realignment. These are the patterns that we see every day.
- Aligning KPIs and budgets for AI success
Peter Warren
That's interesting. It's not a big bang; it's a slow evolution, as you said, or a steady evolution, if not slow. Thank you very much. And to your earlier point: don't use a large language model just to do calculations. In other words, use something that's dedicated for it. Before we wrap up this part (we'll catch up in part two, where we'll talk about large language models versus small language models and the impact of hardware, both large and small), how do you see the KPIs and budgets? Where are people going in the first part of this coming year, or even the end of this year? Where do you think things are going, and what should people be putting in place as leaders?
Fred Miskawi
So, the KPIs: we tend to work with clients in a very fine-grained, fine-tuned way. Depending on what business outcomes they're looking to get, we're going to fine-tune the nature of the KPIs that are being used. If we talk about software acceleration, for example, a lot of what we're being asked to look at is developer productivity, the quality of what comes out, the trust factor that comes with what gets produced, and understanding how to leverage this new technology in a way that can cut your time in half or by two thirds. That requires a certain set of data points that you've got to collect. So we work with hyperscalers, and we've built our own data collection engines and dashboards to be able to get a feel for what the trends are.
Personally, when I look at that, I don't look at individual productivity levels. I think you're not getting the value for the money when you do that. What we tend to look at, from a granularity level, is the team level: the value that's delivered by the team, value delivered over time, and the quality that comes with that. So we have a set of KPIs that come with it, and depending on the particular business goal and the nature of the solution that you're deploying, there will be a different set of KPIs. When you're looking at chatbots or knowledge engines, where you've got the need to unlock the power and the knowledge of the enterprise, of the industry within, and put it in the hands of your employees, you've got to look at certain things like the nature of the interaction, how often these requests are coming in, and the nature of the results you're getting from surveys. There are a lot of different data points that you bring in to make sure you're getting the answer you're looking for.
- Closing thoughts
Peter Warren
Well, that's excellent. Well, thank you, Fred, and thank you everybody else for listening. We'll pick this up in part two and have a great day. Bye-bye.
Fred Miskawi
Thank you, Peter.