Transcript: CSPS Data Demo Week: Financial Risk Discovery, with MindBridge
[The animated white Canada School of Public Service logo appears on a purple background. Its pages turn, opening it like a book. A maple leaf appears in the middle of the book that also resembles a flag with curvy lines beneath. Text beside it reads: Webcast | Webdiffusion.]
[It fades out, replaced by two title screens side by side in English and French. At the top, it shows three green maple leaves, each made of different textures. Text on screen reads:
CSPS Data Demo Week
Financial Risk Discovery with MindBridge
GC data community]
[It fades out, replaced by a Zoom video call featuring five panelists. In the top left panel, Taki Sarantakis, a man with glasses and a neat beard, sits in a home library. In the top middle panel, John Colthart, a man with sweeping bangs and a short beard, sits in front of a MindBridge banner. On the top right, a woman with light blond hair, Nadine Roy, sits in front of a blank wall. On the bottom left, a man with short, dark hair, Solon Angel, sits in front of two windows. On the bottom right, a woman with blond highlighted hair and a red shirt sits in front of a hanging vine plant.]
Taki Sarantakis: Good morning, good afternoon, good evening, depending on where you are joining us from around Canada and around the world. I am Taki Sarantakis. I'm the president of the Canada School of Public Service.
[Taki's panel fills the screen. A purple text box in the bottom left corner identifies him: "Taki Sarantakis, Canada School of Public Service."]
Taki Sarantakis: And welcome to our last instalment of the CSPS Data Week. We started our week with artificial intelligence and labour relations, and we're ending our week with artificial intelligence and financial statements, and spending, and controls. One of the things I like to tell audiences when we're talking about the world and we're talking about technology and we're talking about changes, is that today is the slowest day of the rest of your life. And what I mean by that is broadband will never be slower, Wi-Fi will never be slower, and artificial intelligence will never be dumber.
These things are just getting better and faster, and better and faster, and better and faster. So don't be kind of fooled by what things can do today, because tomorrow, their limitations will just kind of strip away. And as public servants in the Government of Canada, we have an obligation to use the best tools possible going forward to serve the government, and Parliament, and Canadians. So today, as I said, our final CSPS data demonstration event. So, we're going to have some people that are really, really cool. We're going to have somebody from the Office of the Auditor General. And yes, I said "cool" and "the Office of the Auditor General" in the same sentence.
[All five panelists rejoin.]
Taki Sarantakis: We're going to have somebody from DND, and we're going to have a Canadian, couple of representatives from a Canadian company called MindBridge, which is one of Canada's top AI leaders. So, let's kick off—Nadine from the Office of the Auditor General. Surely Nadine, what on earth are you doing here on a topic called AI?
Nadine Roy: Well, I like the fact that you introduced me as cool. That's probably the first, so thank you for that. Yes, I am from the Office of the Auditor General. And my first question is—and I think it's related to the topic—will technology replace humans?
[Nadine's panel fills the screen. As she speaks, panels pop in and out.]
Nadine Roy: As Stewart Butterfield stated, there's a lot of automation that can happen that isn't a replacement of humans, but of mind-numbing behaviours. So, let's talk about artificial intelligence, automation and financial audit. And that's why I'm here to talk about this. So, I'm convinced that modern and user-friendly tools will provide you with great benefits. AI and automation allow people to devote their attention to their work rather than spending hours acquiring data, running analytics and generating visualizations of their data. I have three points for you today.
The first point is that AI and automation can be easy to use, increase efficiency, and save time. To illustrate this: prior to using these types of tools at the Office of the Auditor General, to assess the possibility of fraudulent reporting in the audit of financial statements, financial auditors used to look through physical binders of data and transactions. They ingested all of the financial data manually. They deciphered every transaction to see if it could be high risk or not, and hopefully selected those for further investigation for possibility of fraud. This manual process was not completely accurate or reliable. AI and automation provide processes to increase the accuracy and consistency of analysis, and even provide a visual interactive dashboard, allowing auditors to rapidly and easily consume the results of the analysis and look for insights in the data.
[A purple text box in the bottom left corner identifies her: Nadine Roy, Office of the Auditor General of Canada.]
Nadine Roy: The technology does save a lot of time, and it can also be user-friendly. You do not need to be a data scientist to operate it, and you'll see this demonstrated in the demo today. My second point is that AI and automation are now part of the audit profession. To keep up and to remain relevant, these tools are now viewed as a member of the team. So, the audit profession and its standards have been catching up to the use of technology in audit at both the firm and engagement level.
Most recently, technology resources were added to the scope of the system of quality management under the international quality management standards. IT applications are now formally part of the firm's IT environment and part of the audit team. The firm's use of technology is also a significant component of audit quality, and its governance is critical. It is not only about having tools to read, explore and report on data; it is also, and more importantly, about the governance of these tools and the surrounding activities. Firm methodologies are now constantly being updated, not only for how and when to use technology, but also for how that technology must be controlled and overseen.
My third and last point of the day is that data needs to be fit for purpose. Standard systems and standardized data formats will increase the usefulness of an AI or automation tool. It is really important to remember that data needs to be fit for purpose, and standardized. Unstandardized data really reduces audit opportunities, like applying AI and automation to perform audit procedures, like reconciling interdepartmental settlements, or quickly finding out the amount of money the government is spending with a specific vendor. It also reduces opportunities and creates inefficiencies within government. Duplication of information, creating data redundancy and possibly inconsistency, prevents the sharing of information held by one department that could support the delivery of a program in another department.

So, for example, in financial audit, if every department and agency has a different financial system, it is very difficult for the auditors to analyze the complete population of transactions from the different financial systems and make a global assessment of all of the data for the whole of government. Because the data is not standardized, the AI tool cannot be used to its full capacity. So, standardizing your sources of data will ensure that the power of AI and automation comes to life. You can see that I believe these modern and user-friendly tools bring great benefits. And my call to action for you is this: I challenge you to start exploring these tools with some of your data, even starting next Monday. You can start small, but please do start. Thank you.
[All five panelists return to the screen.]
Taki Sarantakis: Thanks, Nadine. And now we'll go to Valerie. Valerie, where are you from?
[Valerie speaks, but the call remains silent.]
Taki Sarantakis: Valerie, are you on mute maybe?
[Valerie adjusts something to the side of her panel.]
Valerie Brobbel: Hello, can you hear me?
Taki Sarantakis: Yep.
Valerie Brobbel: There we go. Hello! I'm a member of the National Defence Internal Audit and Evaluation Team. Thank you, Nadine, for a wonderful presentation. So, what does an internal audit team do? In a nutshell, we help other branches of our department do their work better. And we also provide independent assurance to our deputy minister that defence resources are managed well.
[Valerie's panel fills the screen. A purple text box in the bottom left corner identifies her: Valerie Brobbel, National Defence.]
Valerie Brobbel: So, considering the size of National Defence, you can imagine how many financial transactions we process on an annual basis already. On top of that, defence is currently in the middle of implementing what is referred to as the Strong, Secure, Engaged vision, which means renewing our aging fleets and also building an agile military. Now, that also means that corporate support services, such as internal audit, for example, have to also become agile. And just like everywhere else, as you can imagine, human resources are limited. So, what do you do when you can hire only a limited number of auditors? We teamed up with the Build in Canada Innovation Program to see how we can leverage modern technologies such as artificial intelligence.
Now, just a bit of a disclosure, at this time, we're in very, very early stages of our pilot with MindBridge. So today I will talk about what we're hoping to achieve with this pilot. In defence, right now, automation and digital transformation is the name of the game. So specifically, I can summarize what we're hoping to achieve with the MindBridge pilot as the five Cs—five Cs being capacity, coverage, internal controls, continuous risk monitoring and closing procedures. Now, let me unpack each C a little bit.
The first C is capacity, which very much echoes the message that you heard from Nadine. Currently, a lot of our time is spent on wrangling data, trying to find it, organizing it, cleaning it, which are very tedious and mundane tasks. What we would like is for a machine to do that and free up our auditors' time to do what humans do best.

The second C is coverage. Historically, a typical audit involves what's called sampling, which is a technique where you take some of the data, which is a sample, and then you assume that it represents all of it. So, if you remember, our role is to provide assurance to our deputy minister. Now, wouldn't it be great if we could look at 100% of the data instead of a sample to provide that assurance?

Now, the third C is increasing our ability to monitor the performance of our existing internal controls. So, for example, imagine going to work in the morning as an internal auditor, coming into your office and saying, "Alexa" or "Siri, show me the top ten highest-risk transactions that happened yesterday," and then looking at them, taking a deeper dive into what happened, and trying to understand what internal controls worked, or maybe what internal controls need a little bit of improvement, or maybe we need new ones.

The fourth C is standing up a continuous risk monitoring capability. Continuous risk monitoring is what contributes to continuous improvement. In other words, improving DND programs is our end goal, while monitoring is the means.
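[Editor's note: the coverage point above, sampling versus looking at 100% of the data, can be sketched in a few lines of code. This is a toy illustration only, not DND's or MindBridge's actual process; the ledger, the risk test and all figures are invented for the example.]

```python
import random

random.seed(7)

# Hypothetical ledger: 10,000 transactions, three of which carry an
# unusually high amount that a reviewer would want to inspect.
ledger = [{"id": i, "amount": random.uniform(10, 1_000)} for i in range(10_000)]
for i in (42, 4242, 9001):  # plant three risky entries
    ledger[i]["amount"] = 250_000.0

def risky(txn, threshold=100_000):
    """Toy risk test: flag any transaction above the threshold."""
    return txn["amount"] > threshold

# Traditional approach: sample 2% of the population and hope it is
# representative. The planted entries will usually be missed.
sample = random.sample(ledger, k=200)
flagged_in_sample = [t["id"] for t in sample if risky(t)]

# Full-population approach: score every transaction. The planted
# entries are always found.
flagged_in_full = sorted(t["id"] for t in ledger if risky(t))

print(f"sample caught {len(flagged_in_sample)} of 3 risky entries")
print(f"full scan caught {len(flagged_in_full)} of 3 risky entries")
```

The full scan always surfaces all three planted entries, while a 2% sample catches any of them only a few percent of the time, which is the assurance gap Valerie describes.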
Without AI, so far, what we could do is periodic and compliance audits. We can look at transactions from the historical perspective, but if you think about it, to drive a car by looking at-
[Valerie freezes momentarily and her audio cuts out.]
Valerie Brobbel: Continuous risk monitoring was almost impossible 10 to 20 years ago. Why? Because the technology wasn't available back then. Well, guess what? It is now. The goal of AI is to shift that emphasis from historical to proactive financial risk analysis. If we can look at historical trends in combination with risk assessment scores and then leverage AI technology, then we can try to identify future financial risks. With AI, what we can do is continuously update our visibility. It's kind of similar to how a GPS works. We can flag any kind of risky events that may lead to financial challenges down the road for defence. And you can imagine driving a car is a lot easier if you have a GPS. And so, the fifth C is for closing procedures. The fiscal year-end closing process is very time-consuming and laborious for our CFO—and really all CFOs in general. So, what if we could use AI or advanced analytics to run a financial health check right before the fiscal year-end and identify areas of concern? It could help us reduce the amount of work and time that our CFO team spends on the fiscal year-end process. So, these five Cs represent our use case for applying AI in internal audit. This is what we're hoping for. If I may, I'll turn it over to Solon, who is the founder of MindBridge, to continue the conversation. Thank you.
[All five panelists return to the screen.]
Taki Sarantakis: Thank you, Valerie. So, Solon, are you an auditor? Are you going to be the third auditor, or do you have a different profession?
Solon Angel: I think we're going to make auditors turn into super auditors. That's what we always say. So, if you think about the amount of financial data and problems out there, there's a bit of an imbalance in the force, right, where you have a lot more Sith Lords and a lot more bad things happening and very few Jedis. If you look at the number of auditors there are in the world—accountants are already a very small group. There's a shortage of accountants. And auditors—there's even fewer of them. And auditors that understand data analytics—not even that know how to code, just that understand data analytics—they're a tiny group. There's very few of them to do that right.
[Solon's panel fills the screen. A purple text box in the bottom left corner identifies him: Solon Angel, MindBridge.]
Solon Angel: So, what we're trying to do here is to give lightsabers to the guys that know the force—and girls. Because right now it's mostly women, quite frankly. But that's what we're trying to do. The change agents that are the first ones who are going to look at the lightsaber and understand how to wield it are going to change everything, right? It's always the same when you have a new tool; the first person that knew how to use a wheel was a far more effective farmer. The first person to use a tractor was a far more effective farmer. There's always those things where the first person that uses the tool gets a lot better outcome.
So, you know, building on the thought Valerie just expressed—which is that it's better to navigate now; nobody takes a map to go somewhere, we just take, you know, Google Maps, or follow a GPS, or something like that—it reminded me of a discussion we had about the function of audit, if you draw a parallel between AI in audit and AI in financial analytics. So, if you think about level 5 autonomy, which is when you have a car and you don't do anything anymore—it just drives itself. You say, "I want to go there," and it just drives you there. There's a lot of processes today that should be level 5 autonomy in the government. There's a lot of things happening manually with people in Excel spreadsheets that could just be automated, and should be. But there's also quite a few that should never be fully automated, where you should keep a human in the loop, because humans trust humans. And quite frankly, it's the law in some conditions. You know, you always need a human to review what the AI has done. And so, if I can give a piece of advice to everyone, what we do as entrepreneurs is to think of the future.
Imagine a future where everything that should not need a human in the loop is fully automated through an API. Right? And then you define very quickly where you should never have automation, just because maybe the processes are bad, or it's just the law that a human needs to review it. So, you draw that future and then you map that backwards to five years from now, because—you know, let's be honest, any large organization, not just the government, by the way, takes time to adopt change. So, you give yourself five to seven years. And you say, "Five years from now, when I do my job, this is how it should look," and then work back four, three, two, one year, one quarter, one month, one week from now, and you basically start seeing the wheels in motion you need to activate to become the change agent. And finally, before we go into more detail, I think we need—you know, Jedis don't become less scary when they have lightsabers. If you look at Yoda—and he's cute, baby Yoda even more—if you look at Yoda, he's not going to be wielding his lightsaber every day. And it's the same thing here when you think of governance and compliance. As a profession, we have to be humans first.
One of the things that made me extremely proud as an entrepreneur in AI is when I hear, in the US in particular, some of our users say, "We used AI in that audit, we used AI in the analytics, and our clients did not notice." Right? They just did not notice. What they noticed is that we ask more relevant questions. And I was like, "Bingo!" When the auditors arrive, instead of just going through a checklist, they're valued as interrogators, as people challenging and asking what's going on—if you look at the president of the IIA, he just released a book called Agents of Change, which is what auditors can be. Right? This is a very noble profession. And so, when the clients of our clients said, "You guys just ask more relevant questions," that was a huge sense of pride for us, because it makes the profession more relevant and more human than ever. And that's what AI should do, right?
Leave the technically difficult parts to the machines and let humans use their higher cognitive functions to really understand what's going on and be able to be true partners in the process. So those are the three things I would say after listening to Nadine and Valerie. By the way, I think it's the first time I'm on a panel where there's, like, actual, you know, natural representation without being forced—I don't know, John, is that a first for you? It's quite nice to see. Yeah, it's a first for me, too. Usually when we talk AI, it's like dudes everywhere. But anyhow, just a side note here. All right. Well, so what should we do next? Shall we go to some slides to explain?
Taki Sarantakis: Yeah, show us a few slides, but then we also want to see the thing because it's a demonstration. So, we want to see the power of the Jedi saber that you've been talking about.
[Solon shares his desktop screen. On it, iMessage is open to an involved chat. The desktop background shows a candid photo of a young girl smiling at a blond man.]
Solon Angel: All right. So I'll leave that to JC, and I'll talk quickly about MindBridge. So, I do have to pause right now because there's something on the deck that is not there.
[A slide deck loads. The title slide shows the MindBridge logo, an "M" styled to look like a bridge inside a blue circle, next to the company name. The title slide reads "Canada School of Public Service — Transform your data analytics with MindBridge: AI for your Data." A 3D graphic shows floating, distorted rings mimicking a Wi-Fi symbol.]
Solon Angel: MindBridge, this week, became very proud because we're the first Canadian company that was selected by Forbes as a top 50 AI company in the world—emerging AI companies. And I can tell you the Government of Canada needs to be proud, because if it wasn't for IRAP, if it wasn't for SR&ED, if it wasn't for some people in the Canadian government that cheered with us all along the way to succeeding, I doubt MindBridge would have been in Canada today, and I doubt that we would have gotten all that recognition. So, this is a tool that, in the early days, many people in the government told us they needed. And on top of that, the Canadian government, through the Strategic Innovation Fund, IRAP and SR&ED, really supported its creation in the early days. And I think this is something that—you know, I've said it in the Forbes article, you can go look at it, I'm not shying away from saying it—when people criticize that the government doesn't innovate, well, MindBridge is the proof that it's not true. The Government of Canada does innovate, and we're very proud of that.
[The slide changes. The next slide features a grey background and is split into three sections. It's entitled "MindBridge Use Cases." Section one reads "Governance — Payment fraud (electronic transactions/SWIFT), liquidity & outage risk, bankruptcy fraud compliance, market trading oversight, synthetic ID (starting)." The second section reads "Banking — Branch risk & performance — product performance, people (HR), loan origination — loan process and health." The third section reads "Government — Financial ledger analysis for audit, financial ledger analysis for the CFO — expenses, invoices & more, general anomaly detection tools for claims analysis."]
Solon Angel: So, if you look at the use cases of MindBridge today, a lot of people use us for essentially looking at financial data. Right? So, it's around governance—in the banking sector in particular, there's a lot of operational risks around the low-recognition risk performance. From a governance standpoint, if you look at the Bank of Canada, and also the Bank of England and also all the financial transaction agencies which we will not name, there's a lot of issues around payment fraud, liquidity risk, market trading oversight and other things like that. But in the departments of governments, anything around ledger analysis, financial audits, statutory audits, expenses analytics, invoices and more, and general anomaly detection for claims, for example, are issues that we know how to address very well. So those are the three pillars of the use cases of the core AI engine of MindBridge.
[The slide changes. The next slide is entitled "MindBridge overview," and features a series of facts in rings: "Founded in 2015, HQ located in Canada, 2 Patents, +20,000 financial professionals leverage MindBridge, +20BN transactions analyzed." Below the rings, a subhead reads "Applications for the financial services market." Icons accompany "Financial risks, Operation Risks and Efficiency, Fraud" and "Regulation and Compliance" below it. A small graphic details their funding activity; a large area of the graphic is labelled "Gov't SIF R&D."]
Solon Angel: If you want to understand what happened with MindBridge: we were founded in 2015, in Ottawa, where we're still headquartered—we're going to reopen the office in Ottawa. We have two patents, and we already have twenty thousand financial professionals using MindBridge to do their work, with over 20 billion journal entries processed. The funding we received—actually, a huge chunk of that came from the Government of Canada. We have received twenty-eight million dollars US to date, fourteen million dollars Canadian of which came from the Government of Canada.
[The slide changes, showing awards, such as "Best Machine Learning Solution for Regulatory Compliance Central Banking," logos of accounting associations they are recognized by, such as CalCPA, and MACPA, and the logos of key customers and partners, including Microsoft, KPMG, Bank of Canada and National Bank.]
Solon Angel: The last five years, we've been very busy winning big logos and also winning awards around the world. So, for example, MindBridge is the only solution today that was awarded Technology Pioneer status by the World Economic Forum for financial data analysis. And I think Canada should be proud. I mean, I'm proud of my baby, of course, like every parent. But if you pause here for a second and you think of the size of Canada, and the size of the tech sector in Canada, getting those awards really asserts the role that Canada has as an innovator in AI in the world. When we go to Singapore and other places, people know that Canada really values governance. And even in the tech sector, that's reflected in having the global leader in that space, right? The key customers and partners we have are Microsoft, KPMG, and National Bank—as a customer, but also an investor in MindBridge. So, we have strong roots in Canada, but we've been recognized by more than fifteen accounting associations worldwide, from CalCPA to the ICAEW in the UK and others.
[The slide changes. The new slide is entitled "Current financial data vs. traditional methods." A large organic shape sits on the left-hand side of the screen, labelled "Financial loss — due to human error or by intent." A small, light blue circle sits inside the shape. Its label reads "Only 1-5% is seen and known about. CFEs estimate that organizations lose..." Bubbles with figures slide on screen. They read:
- "$1.5M average loss per case"
- "70% claim their organization has made a significant business decision based on inaccurate financial data"
- "5% of revenue to fraud each year"
- "Financial experts spend 80% of their time manually gathering, verifying and consolidating data."]
Solon Angel: So today, I'll go quickly here and JC, John, show the product a bit more. But if you think of the current financial data analysis tool versus what traditionally people do, when you do a modern approach to it, you can really reduce the risk of errors and anomalies, whether they are due to fraud or just mistakes in this space. And we're looking at both, right? We're not going to look at just fraud; the majority of problems are not necessarily fraud-related.
[The slide changes. The new slide is labelled "The technology." It features a diagram of a square with a Venn diagram in the middle. The circles within are labelled "Machine learning," "Statistical methods" and "Traditional tools." The bottom of the square is labelled "Secure Platform." At the top of the square, one arrow labelled "Smart Ingestion" leads down into the square. Beside it, an arrow pointing up and away is labelled "Transparent Findings." Between the two arrows, an eye shape is labelled "Customization & Standardization."]
Solon Angel: And the technology we've built to do this is a combination of things. The first one is a smart ingestion engine, combined with a very secure platform that we call Ensemble AI, which combines machine learning, statistical methods and traditional rules to produce high-fidelity reports. So, you know, the goal is, with the least amount of work possible, to pass the data in and just review what the AI has flagged as high risk, medium risk and low risk in the books. To do that, we've created something quite unique, which is something we call control points.
[The slide changes. The new slide is entitled "40+ Control points work together to analyze all financial data." Half of the slide shows a dull cube with sharp grey shapes sticking out and circled blue and green shapes. Three lists show types of control points. They read:
Traditional rules control points:
- Suspicious keyword
- Complex Instrument
- Sequence Gap
- High Monetary Value
Statistical method control points:
- 2-digit Benford
- Complex Structure
AI/ML control points:
- Flow Analysis
- Outlier Detection
- Unusual Amount
- Expert Score
Below, text reads "More likely to identify errors and fraud in your financial data compared with rules-only data analytics methods by over 10x."]
Solon Angel: So, if you think of what a control point does at MindBridge, think of them as mini apps—dedicated analytical apps within the platform that look at different things. One of them is a pure AI expert system. Another one is a simple sequence gap analysis. And some others are very complex flow analyses that look at the flow of money throughout all the accounts. All of those work together and aggregate a risk score down to each transaction and each entry. And we can find issues more than 10 times—I mean, marketing lowered the reality here. When we say it's more than 10 times—sometimes we're talking about 1,000% more issues found than with another method, but we're trying to be humble and Canadian about it.
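[Editor's note: to make the control-point idea concrete, here is a toy sketch of two of the control points named on the slide—sequence gap (a traditional rule) and a first-digit Benford check (a statistical method). This is illustrative only, not MindBridge's implementation; the journal entries and function names are invented for the example.]

```python
from collections import Counter

# Hypothetical journal entries: (entry_number, amount)
entries = [(100, 4821.50), (101, 1913.07), (103, 7250.00),
           (104, 1500.00), (105, 999.99), (107, 3141.59)]

def sequence_gap(entries):
    """Traditional-rule control point: missing entry numbers can
    indicate deleted or suppressed records."""
    numbers = sorted(n for n, _ in entries)
    present = set(numbers)
    return [m for m in range(numbers[0], numbers[-1] + 1) if m not in present]

def leading_digit_counts(entries):
    """Statistical control point (first-digit Benford sketch): in many
    naturally occurring ledgers, roughly 30% of amounts start with a 1;
    large deviations from that pattern are worth a closer look."""
    return Counter(str(abs(amount))[0] for _, amount in entries)

print(sequence_gap(entries))          # [102, 106] — two entries missing
print(leading_digit_counts(entries))  # first-digit distribution
```

In the product described here, dozens of such control points run together and their individual findings are aggregated into a per-transaction risk score, which is what the reviewer then triages as high, medium or low risk.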
[The slide changes. The next slide is simply a blue background and text reading "Let's see it in action."]
Solon Angel: So, this is MindBridge, but now let's just pass the floor to—except if you have questions, Taki and others. Let's just see it—it's great to have slides, but I think people would like to see what it looks like on a real-life basis and John—
[The screen share switches to a browser with MindBridge loaded. The website's interface shows a series of tabs across the top row. "Financial Statements," the currently selected tab, shows tables with variable dropdown options.]
Taki Sarantakis: Yeah, no, let's go to the demo, because we'll be asking questions throughout the last half hour. And also, for the audience: ask your questions through the ask-a-question feature of the software, and somebody will be analyzing them and firing them over to us. So please proceed.
John Colthart: Thanks, Taki. Thanks, Solon. And obviously huge thanks to both Valerie and Nadine. I think it's wonderful to see that this isn't just about an internal audit, it's not just about an external audit, but it's about the family of auditors. And I love how Solon relates to that. I'm going to give you a bit of a whistle-stop tour. I'm not going to be able to go into everything that MindBridge does, but I'm going to try to highlight some really salient points that match to what Nadine and Valerie are excited about as it relates to AI.
So, when MindBridge brings data in—and we can bring data from any type of ledger system that you have, obviously SAP and others that may be in use, as well as subsystems—we can essentially rebuild your entire financial ecosystem. Everything from flows of financial records, bubbling all the way up to your financial statements, in a plan-versus-actuals sort of view of the world. What's great about that is because we can encompass every bit of data that's in there, we're doing all that hard work. So, we're doing checks and balances, and integrity checks, and all these things, so that your resources are more aptly put on things that are of concern or of matter. And what you can see is just the very simple fluidity of using the solution—
[John scrolls through the table. Below it, graphs show "Monetary flows by account." He manipulates the data as he speaks, changing the graphs.]
John Colthart: I can look at my cash and cash equivalents and see where the money's come from and where it's going to. I can go down and look at other types of capital assets or my payables, and I can very quickly see what's happening with that data and where it's coming from, where it's going. And this is really exciting for the business, because most times you're just looking at the plan and you're trying to understand where you are. And if you get into an audit situation, all you're really worried about is, "Am I capturing enough of the most interesting things?" And you really have no concept of whether you are, so you get to see the flow. You get to not only see the flow, but you can then go in and see where does that risk actually happen?
[John selects the "Risk Overview" tab, and numerous charts and statistics load.]
John Colthart: And as Solon was talking about, when we think about how the solution needs to present itself—and I think one of Nadine's first points was "easy to use"; AI isn't this really scary thing, and we should all be starting to use it as early as Monday—it's because AI can actually just bring things to the forefront. We can raise the bar on what's interesting and what's potentially risky, and we can really meaningfully improve people's work life, which is really the whole point of automation. When Solon was talking about the autonomous vehicles, we're thinking, "Yeah, wouldn't it be great? We just hop in the car, and we say, destination, you know, Stittsville, and the car just drives us there." That would be amazing. AI should be that way even for your financials. And so being able to very quickly dig in, see where some of those items are, see which users are causing or creating some of this interesting data or causing some of these potential risk areas. Being able to see it by job number, department code, fund, level of restriction in terms of where those funds get applied, different types of interdepartmental transfers—all of this is possible with MindBridge. And when we bring the data in, it's like snapping your fingers and all of a sudden seeing these types of visualizations.
So we can very quickly see who's doing what, which jobs are causing us more grief, which types of programs we're deploying our funds to, and where they are. And something that Nadine's team would probably love to have in a solution in the future is also just knowing—if we were doing this on the basis of what the audit standards look like—that, generally speaking, there are different classifications of risk that they need to look at.
[John selects the "Audit Assertions Risk" tab. It loads, showing a complex, colour-coded table entitled "Account risk by assertion." Below the table, more boxes show further charts.]
John Colthart: They need to look at valuation. They need to look at presentation and classification, existence, rights and obligations, all these things. And both crown corporations and government in general will have different expressions of what these look like. The beauty of MindBridge is that they can all be tailored exactly to what you need to look at. So that's a big piece of—you know, when Solon was saying it's amazing that our clients' clients are not seeing us, it's because of all this configurability and usability that just enables the auditor to really home in. Oh, this looks a little bit off. Why is it off? Where's the risk over time? Which control points are causing that particular thing to go off the radar? And it just delivers a very significantly improved way for people to see and think about data.
[John switches to the "Data Table" tab, which features a large table of data. He selects a row of data, marked under the "account name" column as "accounts payable." A new page loads, showing more details, and an expandable list of scores related to the data.]
John Colthart: And so, I'm just going to go into one quick payables here, see what's going on. I can see the types of interaction of monetary flow. In this case, I'm ordering some things and I'm building something out of raw materials, but it looks a little bit funny. Right?
[He selects the "MindBridge Score" from the list of scores and expands it. Many uniform boxes pop up with text blocks and little red and yellow shapes in their top left corners.]
And the scoring looks a little bit off. And I can see exactly which lines are causing which things to score higher or lower.
[He selects a different row of data.]
John Colthart: So I switched from the payables view to the cash view of this particular transaction. And I could see there's a big swing in a couple of things; some of the machine-learning algorithms are really kicking in, and they're now high and medium. So MindBridge's whole goal as a tool—thinking about that lightsaber—is you've got someone with the Force, and now it's enhanced with all these midi-chlorians, I think is what they're called. Gosh, you're stretching my brain, Solon, for what I need to use in terms of analogies. But instead of it just being within me, I've got the entire Force coming around me with the data, with all these different types of assets.
[He switches to another open browser tab, also with MindBridge loaded. It shows a risk overview of the accounts payable.]
And being able to, again, really quickly go in and see. So, I've switched over to payables specifically for a second, which looks at a totally different set of potential problem areas—vendors you're spending more money with than normal, or at an unusual frequency; looking for duplicate vendors, duplicate payments, duplicate this, that, the other; vendors that are not part of the usual vendor list. All of these things are made possible by MindBridge. So, I'll probably pause there and Taki, if you have any questions for me, for sure.
Taki Sarantakis: Yeah, John.
John Colthart: But it's like so big.
Taki Sarantakis: For sure, John. So, I'm going to ask you to go back to the screen before and just kind of do it a little more slowly for people like me who aren't experts.
[John switches back to the "Data Table" tab with the expanded score view.]
Taki Sarantakis: And very sadly, I apologize. I'm not an accountant either, let alone an auditor. So please forgive me. But what I think I'm seeing here, and please correct me when I'm wrong, is you've kind of pulled out a transaction from a myriad of transactions—it could be one in ten, could be one in a million, one in a thousand—and then I think the algorithm has started taking that transaction and comparing it with other transactions. And it's comparing it in different categories.
John Colthart: Yup.
Taki Sarantakis: And it's actually giving you some kind of at-a-glance information, which I think is what we're seeing. And even though I'm not an accountant or an auditor, I know that a big red X is probably not a good thing. So, my untrained eye will go to those and I'll see things like, "Oh, this is a credit to a cash account." It's like, "Oh, this entry was made on a weekend," or "Oh, this entry is higher in value than a typical entry." "Oh, this entry came in on accounts that do not usually interact in this ledger." So basically, what you've done is you've kind of started telling me, "Look here."
John Colthart: Yup.
Taki Sarantakis: "This is where you want to pay attention." And then again, as a non-expert, I'm looking at the green stuff—especially the green stuff with a checkmark—and I'm like, you know what? I don't really need to pay any attention to that, because that's kind of telling me it's normal. It's within kind of the expected distribution. There's a lot of green there, which is good. And there's red there, too. And the red could be a problem, but there could be explanations behind the problems, too. So, this is where we come back to what Nadine and Valerie were talking about earlier and what Solon was talking about, which is how the human can come in. So, you've done a lot of, let's call it, the grunt work. You haven't had to compare it manually to a thousand other transactions, and you haven't had to kind of go, "Let's call Mary. Mary deals with this type of transaction a lot and she can tell us if this is roughly normal or if this is roughly anomalous." So, has that stuff happened automatically in the background, or how does that work?
John Colthart: Yeah, it's a great way to phrase it, actually. And I love that you paused me and allowed me to kind of come back to that. As your data is filtering in—and I think this is a little bit of what Valerie was saying about part of the five Cs being continuous—bringing data in during your close process, after your close process, all of this data is automatically being sort of rank-scored, if you will, on all of these dimensions. Right? We call them control points. Some are more important than others. Some have leading indicators of much bigger problems, a much, much higher degree of confidence that there's an issue. Right? Flow analysis, rare flows, outlier anomaly, expert score, Benford's; those would be ones where if any of them turned yellow, amber, or red, you're going to want to look at this. And so, you're absolutely right. It's all happening automatically in the background. Data comes in, these reports get built, the auditor or accounting professional comes in and says, "All right, what do I have to deal with?" Right? And right from the start, you can go high to low, you can go low to high, and you can filter any data to say, "I need to look at this particular issue because it's bubbling up highest."
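The control points John lists include a Benford's-law check. As a rough illustration of how such a check operates—a toy sketch, not MindBridge's actual implementation; the sample amounts and the 0.10 flag threshold are hypothetical—a ledger's observed first-digit distribution can be compared against the Benford expectation:

```python
from collections import Counter
from math import log10

# Benford's law: in many naturally occurring ledgers the leading digit d
# appears with probability log10(1 + 1/d), so ~30.1% of amounts start
# with 1 and only ~4.6% with 9. A control point can compare a ledger's
# observed first-digit frequencies against this expectation.
BENFORD = {d: log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(amount: float) -> int:
    """Leading non-zero digit of a monetary amount."""
    return int(str(abs(amount)).lstrip("0.")[0])

def benford_deviation(amounts):
    """Observed-minus-expected frequency for each leading digit 1-9."""
    digits = [first_digit(a) for a in amounts if a != 0]
    counts = Counter(digits)
    return {d: counts[d] / len(digits) - BENFORD[d] for d in range(1, 10)}

# Hypothetical entries: a cluster of amounts just under a 10,000-dollar
# approval threshold pushes digit 9 far above its expected ~4.6% share.
entries = [123.45, 1890.00, 9900.00, 9950.00, 9875.00, 210.10, 1402.33, 9990.00]
dev = benford_deviation(entries)
flagged = dev[9] > 0.10  # crude control point: digit 9 heavily over-represented
```

A real engine would apply a proper statistical test over far more entries; the point is only that the check runs continuously with no human in the loop.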
Taki Sarantakis: So, John, is this actually doing this on every single transaction?
John Colthart: Every single entry, millions upon millions.
Taki Sarantakis: Can you go back and just click on some of the entries so that people can see like that—so basically what it's doing is it's taking every transaction and some of the transactions—you know, there's one for like thirty-one thousand dollars. There's one for one hundred twelve thousand dollars. Let's just look at one.
[John navigates back to the "Data Table" tab and selects another transaction row as Taki speaks. Another page with extra data and a score list loads.]
Taki Sarantakis: And so, I guess you get like a kind of a snapshot overview, is that the page you're looking at now?
John Colthart: Yup, so now I'm at the details. I see it has three entries. It's hitting taxes, cash and returns. I've got a particular type. So, SAP data typically, or other big ERPs, have different document types. I might know that this is a manually created journal versus part of a batch versus some other scenario. I get a bit of a bird's-eye view. And in the scoring, I also get a bird's-eye view across the dimensions that might matter for my department or my entity, and I can dig in further. So, it's this concept of progressive disclosure: you start from a high, medium, low. You look at a list, you say, "Oh my, this looks really off." The more you dig in, the more MindBridge tries to tell you. And we just keep trying to give the auditor or the accountant more and more evidence to say, "I need to look at this, create a follow-up item, assign it to Sally or Bill or Bob, and off it goes into the process of planning my audit." Then they can work on it and everyone's on the same view. That's the other benefit of this; it's not passing spreadsheets or passing other things. They log in, they see exactly what their peers are doing within the audit.
Taki Sarantakis: So for those of you that work in this area, you would know more than others how labour-intensive this would be to pull out every transaction and start categorizing it. And not only is it labour-intensive, in some cases, it would give you very little value because if you have a transaction of, say, a thousand dollars—why would you spend two thousand dollars of transaction time going over a transaction of a thousand dollars? But also, quite frankly, you'd be making a lot of mistakes as a human being. Like, we're pretty smart, but we get hungry, we get tired, we get sore eyes, we get irritated, we have to go to lunch, etc. And so, I think another benefit of this is it takes away from human beings the stuff where our minds start to wander. It takes away the stuff that doesn't interest us because it starts to become routine and kind of monotonous. Is that a fair thing to say?
John Colthart: I believe so. I was kind of hoping to get a head nod from either Val or Nadine. Oh, we are. Perfect.
[Val and Nadine nod.]
Taki Sarantakis: Yeah. Now I'm going to bring Valerie into this. So, Val, you work at DND. And DND spends like a gazillion dollars a year on jets and planes and high-tech toilets and stuff. So, tell us what the capacity to have this kind of instant insight into how DND is spending that gazillion dollars every year. Tell us what this does vis-à-vis your job.
Valerie Brobbel: Frankly, it's a game changer for us. It's the ability to step away from sampling and assuming that the sample we select actually represents the reality, to giving 100% assurance to the deputy minister. Right? It's this idea of having massive leverage with this kind of tool, because as you can see, it sifts through all of the transactions and pinpoints only the highest-risk transactions that we would be looking at. So, imagine—I'll just give you an idea—for example, the BSEG table out of SAP, just one table, contains thirty million records on an annual basis. I mean, we can't hire that many humans to look through them all. So, a machine does it for you, and then—it's that Pareto principle, the 80-20 rule. In the past, we used 80% of our time sifting through data. Now the machine does that, and we can use 80% of our time to analyze the data and take a look at what's actually happening. So, it's 100% visibility. One more thing, Taki, I want to point out: we do have tools right now which we can use to sift through the data. The difference between the existing tools and what AI, for example, brings to the table is that right now there is still a lot of human interaction in the process. Think of these control points—the squares you see with the red and green and yellow—as filters. Imagine a simple analogy with Excel, how you filter data in Excel. A human auditor would go in, apply one filter, and a filtered set of data would pop up. And if that person wants to know more, they'll apply another one, and another one. Well, what this thing does is apply twenty-nine of them at the same time—all automatically, without a human being in the picture. So, to answer your question, what this does is allow us to find that proverbial needle in the haystack very quickly.
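Valerie's Excel analogy can be sketched in a few lines: rather than a human applying one filter at a time, the whole battery of control points is evaluated against every entry in a single pass. This is a toy illustration with three made-up control points (weekend posting, high value, unusual account pairing) rather than the twenty-nine she mentions, and the entries and thresholds are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Entry:
    entry_id: str
    amount: float
    posted: datetime
    account_pair: tuple  # (debit account, credit account)

# Each control point is a name plus a predicate; a real tool applies
# dozens of these to every entry simultaneously.
USUAL_PAIRS = {("expenses", "payables"), ("payables", "cash")}
CONTROL_POINTS = [
    ("weekend_entry",   lambda e: e.posted.weekday() >= 5),
    ("high_value",      lambda e: e.amount > 100_000),
    ("unusual_pairing", lambda e: e.account_pair not in USUAL_PAIRS),
]

def score(entries):
    """Apply every control point to every entry in a single pass."""
    return {
        e.entry_id: [name for name, check in CONTROL_POINTS if check(e)]
        for e in entries
    }

entries = [
    Entry("JE-001", 31_000, datetime(2021, 3, 1), ("expenses", "payables")),  # a Monday
    Entry("JE-002", 112_000, datetime(2021, 3, 6), ("cash", "revenue")),      # a Saturday
]
flags = score(entries)
```

Each entry comes back tagged with every control point it trips, so the auditor starts from the flags rather than the raw table.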
Taki Sarantakis: Yeah, and not only find it; in some respects, it actually finds it for you. Like it gives it to you.
Valerie Brobbel: Right!
Taki Sarantakis: So, Nadine, we've talked about sampling a little bit, and sampling is just kind of taking a piece of a data set, looking at it, and hoping, as Valerie said, that it's representative. What's a good sample? Like 5%, 10%, 50%, 20%?
[The screenshare ends, and all five panelists fill the screen.]
Nadine Roy: Well, I think the use of this tool or any other AI tool is we don't have to do sampling, right? You can look at all of the data. So that's the benefit as well.
Taki Sarantakis: Right, but I'm trying to get to —if you don't have a tool like this, what are you looking at, typically? 5%, 10%, 20%?
Nadine Roy: Yeah, at most, probably. It depends on the audit, but it would be around 15%, 10%, maybe.
Taki Sarantakis: Right. So, you're looking at like one out of every ten entries or one out of every seven entries. And again, if you multiply that by thousands, if not millions, of transactions in the Government of Canada, you can really quickly see how much more coverage you have, because you've gone from, again, one in ten or one in seven to a hundred out of a hundred. You're looking at every single transaction and there's nowhere to hide. Now, this was a little bit of an anomalous year in the Government of Canada because of COVID and income support measures and the like, but this year we spent six hundred and fifty billion dollars in the Government of Canada. On one of your slides, Solon, I think you said something like a 5% fraud ratio typically. And I know it varies by industry—in restaurants it's this, in retail it's that, etc. But let's even forget about fraud. Let's start thinking about efficiency. Let's even assume there's no fraud. Talk to us a little bit, Solon, about how you can now start to use some of this data to see operations that are efficient versus operations that are inefficient.
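Taki's coverage point can be made concrete with a quick back-of-envelope simulation—the entry counts, anomaly rate, and sample fraction below are illustrative assumptions, not figures from the discussion:

```python
import random

def expected_catch_rate(sample_fraction: float) -> float:
    """With random sampling, a given anomalous entry is examined
    with probability equal to the sample fraction."""
    return sample_fraction

def simulate(n_entries=100_000, n_anomalies=50, sample_fraction=0.10, seed=7):
    """Fraction of anomalous entries that land inside a random sample."""
    rng = random.Random(seed)
    anomalies = set(rng.sample(range(n_entries), n_anomalies))
    sample = set(rng.sample(range(n_entries), int(n_entries * sample_fraction)))
    return len(anomalies & sample) / n_anomalies

# Sampling one entry in ten catches roughly one anomaly in ten;
# scoring every entry, as described above, sees all of them.
caught = simulate()
```

Run repeatedly with different seeds, `caught` hovers around 0.10: under a 10% sample, roughly nine out of ten anomalous entries are simply never looked at.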
Solon Angel: For sure. So, before we go there, if you don't mind, I'd like to sit on something, because if I was a student auditor specializing in analytics, I'd be nervous hearing Valerie and Nadine saying, "Hey, we don't need those people anymore," right? And there's quite a few, not that many, not enough, but it's quite a few. And then this is what I want to share so that those people don't get nervous here. Here's what's really going to happen; if you look at a team of one hundred people or fifty people, Taki, that today does auditing, you have maybe one or a few that know how to use CAATs, that know how to use analytics, [indistinct], all those things, in those departments.
[Solon's panel fills the screen. A purple text box in the bottom left corner identifies him: Solon Angel, MindBridge.]
Solon Angel: There's a handful—the dual competency of being an accountant and an auditor, already that's small. And then on top of that, knowing how to use data science, that's even smaller. So, what happens when you put in something like MindBridge is you democratize access to insights for the whole team. Do you think the one or two people on that team who know how to use analytics are going to be less in demand? It's the opposite. We didn't know when we started; we thought, "Well, you know, maybe they'll become the administrator of the system, or maybe they'll be reassigned to other departments." But that's not what happened. In the last five years of using MindBridge, when you give the power of MindBridge to a team of 50 internal auditors, what happens is they find more stuff because of the capacity created. They do more missions; they turn things around faster. And then when they run into something very strange that MindBridge found, guess who they turn to, to decompose it and do what I call the black-box crash-test decomposition. You know, when the plane crashes, everyone looks for the box. Right? And this team becomes more and more useful for that.
[All five panelists return to the screen.]
Taki Sarantakis: Yeah, I get what you're saying, Solon. Another way of saying it is that the auditors actually do the work that they were trained in university and other higher institutions to do, which is the value-added stuff.
Solon Angel: Yes.
Taki Sarantakis: And like all the stuff that's less specialized and repetitive, that stuff is done for them. And a lot of auditors would be very grateful for the fact that they can actually use their brains to look at the things that require their brains.
Solon Angel: Yeah, absolutely. Absolutely. So going back to your question about inefficiencies—you know, if you look at the OSFI report, government is actually the number one industry as a source of corruption, fraud, inefficiencies. Those are just large organizations with a lot of rules, so it becomes harder to manage everything. But if you step away from the consideration of fraud and you look at data—actually, the government does not have reliable data on inefficiency. But the private sector, with publicly traded companies, does, because they are accountable—very, very much—to public shareholders. And the data is staggering. 67% of CFOs of large corporations in the Fortune 500 say that they do not trust their financial reporting data.
Taki Sarantakis: OK so stop—stop for a second. Two out of three CFOs say they don't trust the data in their own companies?
Solon Angel: The data that your RSPs rely on, and your 401(k)s in the US—because it's just a promise too big, right? The business of an entity—the business of a government or of a publicly traded company—is not to run the finance department; it's to run the operation and serve constituents. So, it's normal that the finance department is always slightly underfunded compared to the task. Right? But recently, in the last two decades, because of a lot of SAP deployments, a lot of Oracle Financials, there's just been a growth of data, and the budgets of teams did not grow. Right? They just lost the ability to have that confidence. Right? So, we're not even thinking of efficiency yet, because that's a lot of enhancement—just on the most important thing, financial reporting, trust is gone. It's just calling it the way it is, right? And that's what's driving a lot of interest in MindBridge today. What Valerie was saying today—to be able to tell a minister you can stand in front of Parliament and sleep in peace knowing that the number is the number. There's not going to be someone who argues and finds something with an opposing view on it. It's really powerful, right? Not just from a political standpoint, but from an operational efficiency one. The other thing is that confidence is key for public constituents. There's nothing worse—I mean, you've seen it: I think a couple of years ago, under the Harper government, there was a million dollars of expenses miscalculated on Parliament. They had to pay Deloitte eleven million dollars to basically redo the whole audit. Right? And it caused a whole distraction for the public about how bad it is, right? This type of event traumatizes the public mindset. Right? And people remember that.
Taki Sarantakis: Yeah. And it turns out there was no issue there. It was just like a mis-accounting, mis-coding and the like. I spent most of my career, or a good chunk of my career, on contribution agreements. And contribution agreements are kind of where the Government of Canada gives money to other entities, provinces, territories, etc. And we would get claims from the provinces and territories, and we would literally get them on paper and stacks of them. And it would be like, "We spent this much on cement. We spent this much on steel. We spent this much on engineers. We spent this much on labour." And we didn't have the capacity to be able to say, "You know what? Province X spends four times the amount of labour for a highway than province Y does." We just couldn't do that because it would cost a gazillion dollars to actually hire all these people to go through the claims. It sounds like this is something that you can do pretty easily.
Solon Angel: Yes, that's benchmarking. Right? So, this is the Holy Grail that I think very few leaders are aware of—the thing is, if there are any leaders and managers on this call—leaders don't know what they don't know. So, they can't ask for what they don't know exists. Right? But if they discovered what it means for them, they would be demanding it every day, right? The ability to benchmark yourself, to compare yourself to others in that sphere, is very, very important to know how well you're performing as an entity and as a department. And the other thing that you bring up here that everyone needs to be aware of—I mean, we're all remote right now. Being able to go on site, that's gone. So, you need to start from the data. You need to start remote, and you need to start with a scientific method to do that. Which is why right now there's all of this interest in MindBridge: you can send the bots, deploy them, run them on the GL, right? And then before you start asking questions, you get a scientific approach as to what's going on exactly.
Taki Sarantakis: Yeah. Now, Valerie and Nadine, thankfully, we have you here because I have a very technical question that's just come aboard. So, I'm going to turn it over to you guys. Is this solution designed to replace existing ERP solutions, i.e. SAP, for example? Why don't I try Valerie and then Nadine?
Valerie Brobbel: From my perspective, no. It sits on top of it.
Taki Sarantakis: Nadine?
Nadine Roy: I agree. Yes.
Taki Sarantakis: So, this is—
Nadine Roy: It's information from the SAP that goes into the tool.
Taki Sarantakis: Exactly. So, this is something on top that kind of takes the data. Let's go back now. We talked a little bit about efficiency. We talked a little bit about fraud. Talk to us a little bit about anomalous things. Valerie, you work at DND and you're going to see things like buying a missile. I don't work at DND. I work at the Canada School.
[A dog yaps.]
Taki Sarantakis: If I were to try and buy a missile, would this kind of software show people, like really quickly, that, "Hey, he usually buys pens and pencils, but he bought a missile. Like what's going on here?"
[Valerie's panel fills the screen. A purple text box in the bottom left corner identifies her: Valerie Brobbel, National Defence.]
Valerie Brobbel: Yes, exactly. That's exactly the point. And to be able to flag it pretty much in real time, as often as you upload your data. And this is a cloud-based solution, so not only can you do it remotely, but also as often as you wish. With the majority of audits so far, the reality is that it takes a long time—sometimes several months. By the time the results come out, we could be in the next fiscal year. So, the value of this would be flagging things in flight, while they can still be really looked at and fixed. A very proactive way of managing resources.
[Taki's panel fills the screen, then all panelists return to the screen.]
Taki Sarantakis: Exactly. Because there is—Nadine I'm going to tell a joke that you're not gonna like, but there's an old saying that auditors—or actually Valerie might not like it either. Auditors show up at the end of every emergency and shoot the survivors. And what Valerie I think has just said is we don't have to wait for the auditors at the end. You can actually use tools like this to course adjust because audits, in some ways—audits are a little bit like autopsies. You kind of go back and the program is dead because it's finished, or the program is dead because you've spent all the money. And it's nice to be able to say, "OK, well, next time we design a program, we're going to do this, this and this. We're going to learn from the autopsy." But wouldn't it be amazing to actually have the power to learn in real time, to make adjustments in program delivery for governments as you're actually delivering the program? Maybe I'll ask our two auditors who kind of deal in autopsies to kind of comment on that thought.
Nadine Roy: Well, I'd like to react to that. The dream is not to come at the end and kill everybody off, right? So, it's to be able to support and help, and be able to change Canadian lives at the end of the day. So, yes, these types of tools, like I said in the very beginning, it's automation. It's a way of using AI to help the auditors analyze and to help the auditors complete their audits and to focus on the right things. So, I do contest what you just said. I want us to change that mindset—
Taki Sarantakis: I know. That's why—that's why it was a joke.
Nadine Roy: That mindset about auditors, right?
Taki Sarantakis: Well, that was a joke. I've got like 80 jokes on auditors, if you're interested. Anyways, now, we have another question from the audience. Is there an opportunity to use the software outside of the financial management sphere? Solon and John.
Solon Angel: Oh, we have so many stories on that. So which ones can we share publicly, JC?
John Colthart: I think it's fewer than we would like, but it's kind of an interesting scenario, because everything about us is configurable, extensible, putting power into people's hands. So, think of a data set as being—in simplified terms—a spreadsheet. Right? You've got rows and columns. If there's anything numeric and fundamentally financial in it, that's our bread and butter. We can look at those types of things. So, it could be security events; it could be down at the level of, let's say, actual transactions happening at missions around the world. These are the types of things where it doesn't matter to us. It can be anything that you want to look at, and the configurability is really the key, because auditors, generally—and I'm sure I might get a little bit of backlash for this—auditors, generally, are some of the most creative folks, because they're trying to figure out what happened, to your point about going in after the fact. How about we look at it differently? How about we put the creativity first? So, the answer from our side should always be an emphatic yes. If you have another data set that you can imagine as a rectangle, you can imagine it as a spreadsheet. That's our jam. Right? And we can help you really refine that. And we'd love to be participating in these things. As Solon said, we're a bunch of Canadians building an amazing set of technology that's being looked at worldwide. So why not bolster our own stuff at home? It just seems to make more sense.
Taki Sarantakis: Yeah, because at the end of the day, this is just analytics. Right? And you've taken analytics and applied it to financial data, but you can apply it to the number of people visiting hospitals. You can apply it to how many knee replacements happen in different jurisdictions. You're basically just looking at data, and you're looking for patterns. You're looking for anomalies. You're looking for things that look normal, and you're looking for things that—I'm not sure, that doesn't look normal. So, it's really the power of analytics. And it would seem that finance is kind of the perfect field for the power of analytics, because you have homogenized data. Right? You have dollars. As we close, I'd like to maybe—are there any issues of comparability here? If this is such a great tool—and I don't just mean MindBridge, but this type of tool per se—and it's here right now, there must be reasons why we're not using it. Does it cost a gazillion dollars? Is it too esoteric? Is it risky? Is it that only left-handed people know how to use this stuff? How come—and I don't just mean the Government of Canada; there are other governments, there are other large companies that don't use this stuff. Is it just a matter of time? Maybe I'll ask each of you for your thoughts on this, and then we'll have to wrap up, sadly.
Solon Angel: I think it's a matter of awareness. Quite frankly, we didn't have a marketing department until last year. We were so surprised by how many accountants and finance professionals needed this—I repeat, until last year, we didn't have a marketing department. What we've seen is that as soon as we share—the two words to remember are "risk discovery"; that's what we're all about. You know, the data of the National Bank is very different from another department's, which is very different from the Bank of England's. Right? All we enable is risk discovery that is automated at level 5. Quite frankly, the number one thing is awareness and the fear factor. Sometimes it's augmented analytics, so the traditional analytics team might be concerned: "Well, what am I going to do if this thing automates that much?" The reality is there's always a lot of other work, and there's always a lot of work in that space. But awareness is number one, right? It's the first time we've come to an audience in the federal government like this, I think, aside from FMI. And I think the other thing is people thinking that it takes more work than it does. Right? We've had clients up and running in days. And from a costing perspective, it's less than one full-time employee, to put it this way, to get started. Right?
Taki Sarantakis: So, Nadine and Val, your thoughts on this?
Valerie Brobbel: Can I jump in as well? I just want to team up with Nadine and talk a little bit about your auditor joke. We're not saying that auditors don't add value. They do. What this tool can open up is an opportunity for almost a new business line. If you think about it, currently we provide assurance based on historical data. Tools like this and others that use AI can allow us to become a strategic partner, a strategic advisor, somebody who is helping steer the big ship that Defence is. Right? So that's really a whole other way of looking at things. So, my answer to your last question would be the mindset. It takes time to shift people's mindsets, and a lot of information needs to be exchanged, and a lot of personal, face-to-face communication, [indistinct, audio cuts out] showing the benefits. So that just takes time.
Taki Sarantakis: Nadine, as is often the case, the Office of the Auditor General gets the last word. Talk to us about how to—why these tools aren't used more and maybe whether they should be used more or not.
Nadine Roy: I think it's a question of time as well as mindset—of being able to explore new ways of doing things, which does take time. But as I said, it may be slower in some areas than others, but it is taking place, because in the audit world we have standards that are saying that AI automation is actually a team member now. So it may take some time to change the mindset, but it is going to happen.
Taki Sarantakis: Terrific. So, Nadine, Valerie, John, Solon, thank you so much for spending this hour with us to demystify a little bit not only artificial intelligence as a concept, but an actual, real-world, live, practical application of artificial intelligence that certain arms within the Government of Canada are already starting to utilize. And seeing things like this, you can project that, going forward, this will soon be the norm rather than the exception it is today. Thank you so much for your time. Thank you so much for your energy. And thank you, more than anything, for being friends of Canada's public servants. All the best. Take care.
Nadine Roy: Thank you.
Taki Sarantakis: Bye!
[Participants smile and wave and the Zoom call fades out. The animated white Canada School of Public Service logo appears on a purple background. Its pages turn, closing it like a book. A maple leaf appears in the middle of the book that also resembles a flag with curvy lines beneath. The Government of Canada wordmark appears: the word "Canada" with a small Canadian flag waving over the final "a." The screen fades to black.]