

GC Data Conference 2024: GC Data Stories

February 21-22, 2024 | Virtual


Transport Canada: EBIDA – Discover the power of your data

Get ready for a demystifying exploration of cutting-edge AI technology in the transportation industry.

February 6, 2024


Debbie: Hello Government of Canada Data Conference 2024 Data Enthusiasts and Transportation Trail Blazers. I am Debbie and I am thrilled to be one of your artificial intelligence generated interviewers.

Lee: I am Lee. I am also an Artificial Intelligence generated interviewer. In addition to being cool avatars, we are powered by massive neural networks with billions of parameters. In this video, we’ll be exploring the exciting intersection of artificial intelligence, data and cloud computing within the transportation industry.

Debbie: To help us do this, we are joined today by Doctor Rida Al Osman. Rida is the Executive Director of the Data Analytics Platforms and Operations Division at Transport Canada.

Lee: Rida, what have you and your team been up to lately?

Rida: Hey, Debbie and Lee, thanks for the kind introduction. Over the last few years, we’ve been busy developing a massive cloud data analytics ecosystem that we call eBIDA. I’m pleased to announce that eBIDA is now live and operational, and can help us tackle and manage Government of Canada priorities, including the Oceans Protection Plan, the supply chain, strengthening cybersecurity, and combating future contagions, in addition to streamlining operations and finding efficiencies. So think of eBIDA as the Cadillac of AI platforms: it can ingest, store, analyze and visualize big data sets across different transportation modes.

Lee: Interesting. Have these cutting-edge analytical technologies been beneficial to the department?

Rida: Key capabilities like advanced data analytics using Databricks, geospatial development using ArcGIS and dynamic dashboarding using Power BI have been a lifesaver, helping us track nationwide events, for example, the wildfires we experienced last summer, as well as monitor and optimize supply chain operations.

Lee: Artificial intelligence is currently the biggest emerging trend within advanced analytics. Does your platform incorporate AI, and what potential benefits does AI offer to Transport Canada’s operations?

Rida: Absolutely. eBIDA features artificial intelligence, including generative AI. At Transport Canada, we’re already seeing many of the AI benefits. For example, we developed an AI model that helps in air cargo threat detection. The model identifies false positives and minimizes manual intervention. There’s an opportunity to apply the AI model across different transportation modes, including marine, wherever threat detection is required. The AI benefits to the Government of Canada are endless. At Transport Canada, we’re actively working with the program areas and business partners to identify use cases where AI and cloud data analytics can help us provide better services to Canadians across air, rail, marine and roads. We’re also looking into creating a cloud digital workspace where we can exchange data, share information and experiment with different emerging technologies to advance the Government of Canada Digital Agenda. In that space, we created a proof of concept where we exchange data with Statistics Canada in a seamless and secure manner using cloud-to-cloud technologies. In addition, we’re actively working on identifying AI use cases that can help us address the challenges we’re having with the supply chain. For example, there could be an opportunity to explore AI-driven models for reducing congestion at different transportation nodes, including ports, railways, and highways. The AI models can, for example, examine historical data as well as incorporate external factors such as strikes, disruptions and weather conditions. They can also provide insights into improving inventory management and optimizing resource utilization. So, AI’s impact will be felt throughout our department, by industry partners and across the Government of Canada.

Lee: That is a lot to think about. I’m beginning to see the potentially endless benefits that this technology brings.

Debbie: I heard through the grapevine that TC is also using generative artificial intelligence. How did you use generative AI to deliver better services at Transport Canada?

Rida: Yeah, that’s correct. We recently completed a generative AI chatbot in collaboration with our finance client. That gen AI chatbot responds to travel-related questions and serves as a copilot, or decision support system, for finance department inquiries. Over time, we anticipate growing its underlying documentation set so it can respond to other types of questions. So far, this gen AI chatbot has generated a lot of interest within our department as well as across the Government of Canada. I anticipate that over the next few years, gen AI chatbots will be the next big thing, as they can help the Government of Canada provide better services.

Debbie: Now, Rida, are you concerned about the rapid rate of change that artificial intelligence is bringing to the department?

Rida: Great question, Debbie. While AI brings a lot of benefits, we must consider its ethical implications: potential biases, the spread of misinformation, deepfakes, manipulation, hallucinations, and privacy concerns. Developing a clear understanding of the societal impacts and addressing diverse needs is crucial for the responsible use of AI. I would also say that adhering to TBS guidelines and adopting a user-centric approach are key starting points. There are, of course, some principles that can help guide us. For example, we need to design transparent solutions, ensure proper governance and data privacy standards for those who are using our services, consider the social impacts of our solutions and, of course, maintain an unwavering commitment to accountability.

Debbie: One last question. How should the Government of Canada workforce prepare for the emergence of artificial intelligence?

Rida: There are some practical steps that you could be taking today to get ahead of the changes that are coming. First, improve AI and data literacy within your group. This can be done through research, skill development, working groups, hackathons and training programs. Next, foster a culture of innovation and experimentation within your group. Then, I would say, keep up to date on the policies and guidelines for the ethical use of AI within the government. This includes protocols for data handling, privacy protection and decision-making processes. We’re entering an AI-enhanced era. Never stop learning about how AI works and how it can help you improve service delivery.

Debbie: That makes sense. Is there anything more that people could be doing?

Rida: Yeah, of course. Consider the following: reach out to our industry partners, to different departments and to other countries to understand how they’re using AI. Start small with pilot projects and proofs of concept to develop some experience and acquire confidence before tackling larger projects. Finally, and this is very important, develop a change management strategy. And of course, be prepared to share your experiences with others. The knowledge that you gain working with AI and cloud data analytics can help others as they begin their AI journey.

Lee: That sounds like good advice. You have given us a lot to think about. But that is all the time we have today. I encourage everyone to check out all the innovative artificial intelligence work that Transport Canada does. I also encourage everyone to check out the other great presentations that are part of the 2024 Data Conference. And on that note, I am Lee.

Debbie: And I am Debbie. We are your AI generated interviewers signing off. Thank you very much and enjoy the conference.

Agriculture and Agri-Food Canada: AgPal Chat

Agriculture and Agri-Food Canada discusses its new generative AI chatbot, which won the first Public Service Data Challenge.

February 6, 2024


Hello, I’m Elise Legendre, Chief Data Officer at Agriculture and Agri-Food Canada. Today, I’m excited to introduce our new generative AI chatbot, AgGPT. It’s set to launch this month and will be featured on our AgPal website. AgPal is a wonderful resource for folks in Canada’s agricultural and agri-food sector. It provides information on over 400 federal, provincial, and territorial government programs and services. The 32 AAFC programs represent nearly $13 billion in funding that is available to the agriculture and agri-food sector. So it’s an important resource, but it was built years before the current generative AI boom. So when ChatGPT came out in late 2022, my team began exploring ways to use generative AI to improve our department’s service delivery to Canadians. AAFC entered a team in the first Public Service Data Challenge. It was made up of folks from policy and programs, and AI experts from my team in the Office of the Chief Data Officer. It started as a general pitch to make it easier for Canadians to find agricultural information, but it eventually morphed into a working prototype of a generative AI chatbot for AgPal. One of the reasons we chose AgPal was because it has well-curated data and metadata. This made our job easier since we didn’t have to spend time cleaning and organizing the data. We were of course thrilled when the Public Service Data Challenge judges chose AAFC’s entry as the winner last June. That’s when the real work began to bring our chatbot into production. I’m proud to say that our new chatbot is ready to go live this month. But before it does, here’s a sneak peek at how it’ll transform your AgPal experience.

[Demo Video Plays]

AgPal Chat is a generative AI chatbot that provides helpful, conversational advice to Canadians on government programs and services in the agricultural sector. It will be available on the AgPal website as a new, complementary way to connect users with relevant Canadian agricultural information. In this demo, we’ll show you how to use AgPal Chat, and talk briefly about our design philosophy for it. To begin, AgPal Chat can be accessed on the website by clicking on the chat icon. An introduction to AgPal Chat will appear, and you can click on “Begin Conversation” to start the chat. A disclaimer will appear, and you’ll need to accept the terms before being able to proceed. You’ll then select a region. This will help the chatbot better connect you with geographically relevant resources. For this demo, we’ll choose Alberta. Now, you can ask it a question by typing in the chat box. Since we chose Alberta, we’ll take on the persona of a cattle farmer and ask what types of funding are generally available to us, in plain language. Voilà! AgPal Chat has connected us to several options based on our question, with reference links to both the programs and local research centre information that it mentioned. We can click on the links to read on, or directly ask AgPal Chat for more information.

We’ve built AgPal Chat around several core design philosophy principles:
• Security: AgPal Chat is built and hosted in a secure environment with enterprise-grade security controls.
• Privacy: AgPal Chat leverages state-of-the-art safeguards to protect the information that it processes.
• Accessibility: AgPal Chat is WCAG compliant, and users are able to ask the chatbot to modify its response format for personal accessibility purposes. For example, returning information in bullet points, or modifying the level of language used.
• Multi-language support: AgPal Chat is available in both official languages.
• Collaboration: AgPal Chat is built in a modular fashion. Our aim is to make the source code readily available to departments who would like to springboard their own generative AI use cases.


So there you have it, a glimpse of the future of service delivery with our AgPal chatbot. As far as we know, this is the Government of Canada’s first outward-facing generative AI chatbot, and we’re proud of it. We have been overwhelmed by the interest and support these past few months, and we appreciate you joining us on this journey.

Thank you.

Canadian Food Inspection Agency: Nachet: Revolutionizing seed identification with AI

Providing lab analysts with the ability to quickly identify seed species with a single click.

February 6, 2024


Discover Nachet, our web application developed in-house at the CFIA by our Artificial Intelligence Laboratory in collaboration with the Saskatoon Seed Science Laboratory.

The CFIA works diligently to ensure that invasive weed species do not enter or leave Canada through the import and export of seed and grain.

Nachet is a web application utilizing open code, public data, artificial intelligence and computer vision. It provides our lab analysts with the ability to quickly identify seed species with a single click while remaining intuitive and simple to use. This system, having been trained on many seed images captured through microscopes, uses computer vision and AI to identify various seed species with great accuracy.

Nachet’s interface is focused on simplicity and ease of use. Analysts can quickly capture or upload seed images to begin identifying them with minimal training required on how to use the AI application.

Once the image is uploaded for appraisal, Nachet’s AI quickly recognizes each seed independently, outlines it with a red box, assigns it a species name and offers a reliability or accuracy score.

For example, in this sample, all seeds are accurately tagged as Ambrosia psilostachya with an accuracy rating above 90%.
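
The detection output described above can be sketched in a few lines of code. This is a hypothetical illustration of filtering results by confidence; the field names and the 90% threshold are assumptions, not Nachet's actual implementation:

```python
# Hypothetical sketch of handling seed-detection results. Each detection
# carries a bounding box, a predicted species, and a confidence score.
# Field names and threshold are illustrative assumptions.

def filter_detections(detections, min_score=0.90):
    """Keep only detections whose confidence meets the threshold."""
    return [d for d in detections if d["score"] >= min_score]

results = [
    {"box": (12, 30, 54, 70), "species": "Ambrosia psilostachya", "score": 0.97},
    {"box": (60, 22, 98, 61), "species": "Ambrosia psilostachya", "score": 0.93},
    {"box": (15, 80, 40, 110), "species": "unknown", "score": 0.42},
]

confident = filter_detections(results)
for d in confident:
    print(d["species"], d["score"])
```

A gate like this is one simple way low-confidence detections could be routed to an analyst for manual review rather than being reported automatically.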

The results are clear, and the system is user-friendly. Analysts can navigate between identifications or review saved images.

The “General” folder is the default folder, but creating custom folders allows you to manage, file and store images as you wish.

An image, or the entire contents of the cache file, can be saved.

Our goal is to make Nachet available to all our seed scientists, who can use it as another tool in their efforts to restrict or contain the spread of invasive weed species. The scope of this technology could evolve into other areas such as identifying invasive insects or identifying pure seeds. It is a versatile infrastructure that seamlessly integrates digital hardware and AI technologies.

Nachet is one example of how CFIA will be applying emerging technologies and blending them with traditional inspection activities to be more capable and effective in achieving results.

Thank you for exploring Nachet with us. For more information, visit the CFIA Artificial Intelligence Laboratory’s public GitHub repository.

First Nations Information Governance Centre: First Nation Data Governance Strategy

This video explains the background, context, and rationale behind the development of the First Nations Data Governance Strategy (FNDGS or the Strategy) and the need for First Nations to direct the establishment of a new network of First Nations-led, non-political Information Governance Centres across Canada.

February 6, 2024


For millennia, First Nations’ ways of living, knowing, and being were protected.
Our languages, our identities, and our sovereignty thrived; we were safe; we were strong.

But with colonization, our sovereignty and the right to determine our own futures came under threat…
Control and ownership over our ‘ways of knowing and being’ began to break down – creating a void between who we are today and who the Creator intended us to be.
Yet, despite these hardships, we are a strong and resilient peoples with a proud history and an even brighter future.
In this era of reconciliation, a new story is being written.
This story sees us taking back what is ours. It sees us exercising our rights to
● self-determination and
● self-government.

But to do this, we must also exercise our rights over our information, over our knowledge… over our data!
In a rapidly evolving digital economy, Data – in the form of information, statistics, and research – is key.
To date, Canada has been — and is still — collecting our data, comparing our quality of life, AND planning our futures without our input or consent. We have been forced to fit into a mould that is not of our making – or of our choosing.
So, we’re at a loss for reliable, relevant data that speaks for us from our worldview.
Our data must belong to us.
It comes from us. It connects us. It defines who we are, and it has the power to guide us in building our future – in our own image.
It is our asset — our legacy.
Like all governments in the world, our own governments must also be empowered by reliable data to do their jobs – to make informed decisions and to plan for the future.
For this reason, our leaders have been advocating for rights over our data…

However, the recognition of our rights over data is not enough…
We need to build capacities to meaningfully preserve and leverage our data in impactful ways.
And although we have been collecting data, significant gaps remain, leaving many of our communities behind.
That’s why – as directed by Chiefs in Assembly – the First Nations Information Governance Centre and its regional partners developed a strategy – from the ground up – with the goal of building data capacities that our communities and leaders need.
This strategy, our plan, is all about establishing:
● Fully functional
● Non-political, and
● Expert-driven
Regional Information Governance Centres (one in every region) – led by First Nations.
These regional centres (or regional hubs) will provide a suite of shared data and statistical services to all communities, their governments, advocacy organizations, and service delivery agencies.
Leveraging economies of scale in every region, communities and their organizations will have access to the digital infrastructure and professional expertise they need to govern, access, and repatriate their data – leveraging its power.

So, what’s next?
Implementation will be community-driven and nation-based – rolling out in phases.
Phase 1 starts now, with the recruitment of a dedicated data champion team in every region that will engage leadership and help them define:
● First, how these Regional Centres will be governed, and
● Second, what priority data capacities and services they will offer.
To build a solid foundation, this first phase needs to focus on the blueprints before we start to invest in the brick and mortar.
This work is expected to take about 3 years, with each region advancing at its own pace.
Once completed, we will move to Phase 2.
And that’s when we will bring our blueprints to life!

Never has the timing been so right and the opportunity so within our reach.
Our leaders have been trail-blazing this path for years…. And the Canadian government has responded with significant funding to implement our plan.
With this strategy, we will build our own data governance and statistical institutions, and they will be equipped with the modern capacities we need to hold and use all our data.
It will be our legacy for the next seven generations to come…

[Text on screen]
Who is FNIGC?
The First Nations Information Governance Centre is an independent, apolitical, and technical non-profit organization operating with a special mandate from the Assembly of First Nations’ Chiefs-in-Assembly (Resolution #48, December 2009)
FNIGC envisions that every First Nation will achieve data sovereignty in alignment with its distinct worldview.
With First Nations, we assert data sovereignty and support the development of information governance and management at the community level through regional and national partnerships.
We adhere to free, prior and informed consent, respect nation-to-nation relationships, and recognize the distinct customs of nations.

Global Affairs Canada: Developing the GC Data Reference Standard on Current and Past Countries, Territories, and Geographic Areas

Global Affairs Canada’s journey in developing a Government of Canada-wide data reference standard of current and historical countries, territories, and geographic areas.

February 6, 2024


After three years of hard work and extensive inter-departmental collaboration, Global Affairs Canada is excited to announce the approval of the data reference standard on codes for current and past official names of countries, territories, and geographic areas recognized by the Government of Canada. This standardized list is available across the entire government, making it simpler to identify and use official names.

So, what is a data reference standard? In simple terms, data standards are like a common language everyone agrees on. They make sure information is organized in a way that everyone can understand, like using the same rules for grammar or spelling. Why does this matter? Well, think about it: a common vocabulary creates less room for confusion and errors. It leads to better understanding.
These standards are the backbone of order in our digital realm. They ensure that computers, systems, and applications can easily share and understand information.

As a result, the development and application of such standards was confirmed to be a top priority of the 2023-2026 Data Strategy for the Federal Public Service.

The journey to create the GC data reference standard started in 2019. Our Data Management team collaborated with the Office of Protocol and geographic bureaus within Global Affairs Canada, kicking off the development of this data reference list. Simultaneously, consultations with other government departments were initiated.
In 2023, GAC’s Corporate Communications Division was identified as the business data steward. Why? Well, it’s simple. They are responsible for the department’s style guide, which lists the officially recognized names of countries. Assuming stewardship of the new data reference list was a natural progression.

The existing list also went through a thorough review against international standards, refining and expanding it. Throughout 2023, consultations were held with various departments and agencies to understand historical naming needs. For example, a department or agency that handles foreign travel documents might need to know that a document issued by a country that no longer exists was valid if issued between certain years, or that documents may have been issued by an administrative power for a country that was once a dependent territory.
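
The historical-validity case described above can be sketched as a simple lookup against dated records. The record shape, codes, and years below are hypothetical examples for illustration, not entries from the published GC standard:

```python
# Hypothetical sketch of checking whether a document issuer existed under a
# given name in a given year. Record fields and sample entries are
# illustrative assumptions, not the actual GC data reference standard.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CountryRecord:
    code: str
    name: str
    valid_from: int           # first year the name was in official use
    valid_to: Optional[int]   # None means the record is still current

RECORDS = [
    CountryRecord("CSK", "Czechoslovakia", 1918, 1992),
    CountryRecord("CZE", "Czechia", 1993, None),
]

def issuer_was_valid(code: str, year: int) -> bool:
    """True if the coded entity existed under that code in the given year."""
    for r in RECORDS:
        # An open-ended record (valid_to is None) covers any later year.
        if r.code == code and r.valid_from <= year <= (r.valid_to or year):
            return True
    return False
```

With records like these, a department handling foreign travel documents could confirm that a document issued by a now-defunct country was valid for its issue year.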

The process faced challenges like:
• extracting data from historical records,
• determining the lineage of countries,
• confirming administrative powers for territories,
• addressing issues related to countries in conflict, and
• uncovering anomalies in international standards.

There were also challenges in content, notably, arriving at a consensus for politically sensitive territories and geographic areas. Despite these challenges, by working together to achieve our common goal of establishing a list, we were able to finalize an organization-wide set of information for all departments and agencies.

The goal of the GC Data Reference Standard on current and historical countries, territories, and geographic areas is simple: to enable government departments to easily access accurate and up-to-date country names from a standardized list. This ensures a unique identifier for each active and historical record. Our tools, like the data management and governance portal, make creating and revising country names automatic and efficient.
What’s next? We will continue to refine the historical data. Our immediate priority was to deliver a list of current and recent historical names. The next steps will be to expand upon this list, reviewing archival materials to include more historical names from the 20th Century.

In essence, our focus on data governance and proper data management sets the stage for common standards, promoting data interoperability across the Government of Canada.

For more information, please contact us.

Innovation, Science and Economic Development Canada: Transforming service management reporting to enable data-driven decision making

This video highlights the transformation of the Innovation, Science and Economic Development Canada Service Inventory process to innovative systems that foster data-driven decision making.

February 6, 2024


Many departments are wrestling with the challenge of digital transformation and process automation to enable data-driven decision making while their legacy manual processes and tools struggle to keep up.   

At Innovation, Science and Economic Development Canada (ISED), one example of this was the Treasury Board requirement for departments to provide a Service Inventory update, which is a list of their external-facing and enterprise-internal services and accompanying service standards.   

Before 2023, the Service Inventory update was completed entirely through email correspondence and Excel spreadsheets. The process was inefficient, and there were issues with data quality and document version control, leaving room for improvement, including on the user experience side.

While the end product met the TBS reporting requirement, it didn’t fulfill its potential in terms of added value for internal planning and reporting.    

In early 2023, we embraced the agile, iterative mindset and began work to transform the process and to leverage technology to improve the data quality and the user experience for all involved.   

Our first step in introducing a more streamlined, digital process was to work with our IT colleagues to develop a Service Inventory application using the Power App tool.   

As with all digital service delivery, taking a user-centric design approach is critical so we made sure to collaborate with our service owners early and often to ensure that their requirements were met and to build buy-in as the future end-users.   

Through the application, service owners can now easily add or edit a service as well as the associated service standards.    

  • For example, when we select Edit Service, it takes us to the service review page where the service owner can quickly review all of the associated data and only update what needs to be changed.     
  • It also requires that both English and French content be added where applicable, such as the service description.   
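
A requirement like the bilingual-content check above can be sketched as simple validation logic. The field names here are hypothetical, not the actual Power App schema:

```python
# Hypothetical sketch of validating that required bilingual fields are
# completed before a service record is saved. Field names are illustrative
# assumptions, not the ISED Power App's actual schema.

REQUIRED_BILINGUAL_FIELDS = ["service_description"]

def missing_translations(record):
    """Return the names of required fields lacking an English or French value."""
    problems = []
    for field in REQUIRED_BILINGUAL_FIELDS:
        for lang in ("en", "fr"):
            if not record.get(f"{field}_{lang}", "").strip():
                problems.append(f"{field}_{lang}")
    return problems

record = {"service_description_en": "Patent filing service",
          "service_description_fr": ""}
print(missing_translations(record))  # flags the missing French description
```

Surfacing gaps like this at data-entry time, rather than during an annual reporting scramble, is one way an application can improve data quality over a spreadsheet-based process.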

 As we worked on the Inventory application, we also began work on a more robust way of reporting.   

Our goal was to enable self-serve access to more dynamic data for senior executives and staff working on planning and reporting.    

We started by collaborating with our corporate planning team and our Application Portfolio lead to align ISED’s Programs, services and enabling applications. Through a focus on linking data sources, standardizing fields and defining information requirements, the dashboard meets end users’ data and information needs. These data linkages were critical as we worked closely with an ISED data team over several months to develop the first, foundational iteration of our new Service Oversight Power BI dashboard. 

By linking to the Inventory Power App, the service data reflects the updates being made by service owners to their services and service standards. We have also identified additional data points, such as which services are flagged as “critical services” and which are collecting fees or using the CRA Business Number.

We have already received a lot of positive feedback from all levels, including from our service owners.   

We are now planning to develop a Power App for the Application Portfolio to digitally transform that process as well, and we would also like to incorporate additional data sets into the dashboard.

Both products were completed in a short amount of time and are a testament to the value of taking an agile, iterative approach and collaborating to find digital solutions for process improvement where efficiencies and improved user experience are needed.  

Treasury Board of Canada Secretariat: Introduction to the Office of the Chief Human Resources Officer’s Human Resources Data Model

A brief description of the human resources data model and benefits along with an example of its application to help make more connections with human resources data.

February 6, 2024


Hello and welcome to the introduction of the Human Resource Data Model, where we will explore the significance of data in an increasingly integrated world.

Imagine the data model as a blueprint or a roadmap guiding us through the complexities of information.

Its value lies in fostering interoperability with other processes, breaking away from the isolated silos of data and adopting a more integrated workforce approach.

Security and protection of data is paramount in a world where workforce information accumulates at an exponential rate. The model gives you confidence in knowing what data is of value, sensitive and in need of protection.

As your organization changes, so do your data needs. Scalability and flexibility are crucial for the organized integration of data. The model scales with you, adapting and accommodating without losing its structure.

Think of data navigation as a journey; the data model acts as your map, efficiently guiding you through the universe of Human Resources to your desired destination.

When applied, the model helps ensure effective use of information, much like managing an inventory or a catalogue.

Above all, the data model facilitates efficient and automated decision-making, a key driver of positive business outcomes and mature data management practices.

The current state of managing data is very much like having your office littered with countless Lego pieces, necessitating cumbersome searches or reliance on others for information.

By embracing the data model together, we aim to connect information and gain insights, empowering us to act swiftly and proactively shape change.

Fundamentally, the model can optimize data quality management helping you to empower your organization with better workforce intelligence.

Now, let’s dive into the data model itself.

This simplified core workforce page illustrates the key components. For example, a person who becomes an employee is linked to a position (and in some cases is not), which is categorized by a job that fits within an organization.

Surrounding the person are various data points associated with their profile, which expand upon becoming an employee.

The interplay between these core areas forms the basis of HR, where everything revolves around the comparison of data between work and people.

This page truly focuses on how to answer the question of “who is where, doing what, and for what reason?”.
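
As an illustrative reading of that core page, the relationships might be sketched as linked records. The field names and sample data below are assumptions for illustration, not the official HR data model schema:

```python
# Illustrative sketch of the core workforce relationships: a person who
# becomes an employee is linked to a position, which is categorized by a
# job within an organization. Field names and records are assumptions,
# not the official OCHRO data model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Job:
    title: str
    organization: str

@dataclass
class Position:
    number: str
    job: Job

@dataclass
class Employee:
    name: str
    position: Optional[Position]  # some employees are not linked to a position

def who_is_where(e: Employee) -> str:
    """Answer 'who is where, doing what' for one employee."""
    if e.position is None:
        return f"{e.name}: no position assigned"
    return f"{e.name}: {e.position.job.title} at {e.position.job.organization}"

analyst = Employee("A. Smith", Position("POS-001", Job("Data Analyst", "TBS")))
print(who_is_where(analyst))
```

Modelling the links explicitly, instead of repeating job and organization details on every employee record, is what makes this kind of question answerable with a single traversal.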

To access the data model and its corresponding data dictionary, visit our GCXchange site for a downloadable copy.

We recommend navigating the Model’s PDF file for a better view of all the details and connections, including the nine supporting business lines.

Remember, using a data model isn’t complicated, but not using one is.

To learn more about the model and how to apply it, we encourage you to connect with us to explore your HR data management challenges and questions.

Contact the email address included on this slide to set up a session where we can collaborate, set expectations, and implement common HR data practices together.

Fisheries and Oceans Canada: AI innovations – Modernizing fisheries electronic monitoring

Fisheries and Oceans Canada demonstrates one of the ways they are using AI to streamline operations and promote the sustainable management of fisheries.

February 6, 2024


AI innovations at DFO: A closer look at how DFO uses AI to modernize Electronic Monitoring of Fisheries.

Canada is home to the world’s longest coastline and one-fifth of the world’s fresh water.

The Department of Fisheries and Oceans Canada is dedicated to managing and protecting these invaluable resources.

The task is daunting.

The ocean is vast, and much of what happens on or beneath its surface remains a mystery.

The only way to get insight is by collecting data using sensors and satellite technologies, and the amount of data can be overwhelming.

Many processes to analyze this data are manual and time-consuming.

That’s where Artificial Intelligence comes in.

The potential of AI for DFO is immense.

We can utilize AI to streamline scientific research, detect catch events, and ensure quality control of ocean data.

AI can also help us identify ghost gear from underwater imagery.

Imagine an AI-powered fisheries atlas that predicts catches per fishing region, a chatbot that promptly answers questions about fishing licenses, rules, and regulations, or predictive maintenance solutions for vessels.

These are just a few ways AI can be a powerful tool in the preservation of our waters.

As an example, we’re testing an AI project within the fishing industry in the Pacific Ocean, specifically those using large nets to catch fish.

This project aims to accurately calculate the weight of the catch and the distribution of different fish species.

This will help to reduce mistakes that can occur when humans handle this process.

Let’s take a look at how this works.

Electronic monitoring reviewers can enter the review screen and select an electronic monitoring video to load to their browser from their computer. Reviewers can play back the video to find sections where fishing activity is occurring and select frames that are of interest for catch weight and distribution estimation.

When a frame is chosen, the reviewer has the ability to input the tow number, camera details, and the vessel’s name, and then forward these image frames for AI analysis.

The goal of the analysis is to predict catch weight and species distribution.

Each image can be selected to examine the catch weight estimate and the species distribution provided by the AI model.

The reviewer then has the option to either accept or reject the image based on the confidence level of the AI’s analysis.
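
The accept/reject decision described above could be sketched as a simple confidence gate. The field name and the 0.8 threshold are illustrative assumptions, not DFO's actual implementation:

```python
# Illustrative confidence threshold; a real system would tune this value.
CONFIDENCE_THRESHOLD = 0.8

def triage_frame(prediction: dict) -> str:
    """Accept a frame when model confidence is high enough;
    otherwise flag it for the reviewer to reject or re-examine."""
    if prediction["confidence"] >= CONFIDENCE_THRESHOLD:
        return "accept"
    return "reject"
```

A reviewer tool built this way would surface the predicted catch weight and species distribution alongside this decision so the reviewer retains final say.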

This solution aligns with DFO’s goal to improve fisheries monitoring for timely access to catch information.

What’s more, it provides the potential for accurate catch weight estimates on every tow in every trip while maintaining the same reviewer time investment.

This increases review efficiency and effectiveness.

This solution also provides a comprehensive analysis at various levels: the individual vessel, the specific fishery, and the broader region.

This is particularly important for ensuring compliance with total catch quotas.

This project is just one demonstration of how we’re using AI to streamline operations and promote the sustainable management of our fisheries.

While we are excited about the potential of AI, we are equally committed to using it in a responsible, ethical, and transparent manner.

Our work aligns with the Treasury Board Secretariat’s Directive on Automated Decision-Making, ensuring that our AI applications respect the principles of privacy, fairness and accountability.

Fisheries and Oceans Canada: Data modernization for the Pacific Salmon Strategy Initiative

At Fisheries and Oceans Canada we are innovating to better connect with, discover, share and use data. Learn how this is benefiting the Pacific Salmon Strategy Initiative.  

February 6, 2024


Hello. My name is Dominique Lalonde and I am the Director of the Enterprise Data Hub with the Chief Digital Office Sector at Fisheries and Oceans Canada. 

As is not uncommon elsewhere, at DFO, we have data that is scattered and in silos.  There is no easy way to discover and access it when needed.  As a result, data may be duplicated and not used efficiently to its full potential. 

Through our Enterprise Data Hub program, we are innovating to better connect with, discover, share and use data.  Our goal is to take data from its multiple sources and turn it into strategic digital assets for the department.

Learn how this is benefiting the Pacific Salmon Strategy Initiative.

And I’m Joey Seto, a Senior Data Advisor with Strategic Data Policy & Analytics. 

Data Modernization is a big part of the Pacific Salmon Strategy Initiative, a $686 million federal endeavor aimed at conserving and protecting Pacific salmon, their habitats, and ecosystems across British Columbia and Yukon.

Salmon stocks are facing a steep decline due to a range of stressors, including climate change, habitat degradation, illegal or unregulated fishing and others.  

Data collection is highly specialized, regionally focused, and much of it happens in the field – or stream, rather.

An area biologist might survey during spawning season by walking in a stream, doing a snorkel swim or fence counts. Measures include the number of fish, by age and species, water levels, temperatures and weather conditions.

This data combines with other subject areas to help answer the simple question: “How are the salmon doing?”

The data may be gathered on paper or digitally, in many different formats. The same data may be collected, stored and used by different interest groups. Copies are sometimes made, and eventually it becomes hard to tell whether the data is fit for purpose, which ultimately undermines confidence in the data.

The vision for modernizing Pacific Salmon data is to provide trusted data, data sharing and collaboration mechanisms, and modern analytic tools to derive insights and answer business questions that shape policy, programs and investments.

DFO recently launched the Enterprise Data Hub (or EDH) to connect to data and make it discoverable, shareable and usable across the department. It’s also the way to share DFO data externally. The EDH is a broker, connecting those who have data with those who need it, in a secure, Protected environment.

For the modernization of Pacific Salmon data, we are working to build a web-based data portal that will showcase key data, information, and data products that support conservation and restoration activities.   

The data portal will feature interactive dashboards, story maps, and modern data visualization tools in a manner that can be easily used by specialists as well as the general public.

How are we doing this? 

Step 1 was to address storage.  We set up the technical components for a Data Lakehouse.

This solution uses Cloud computing, with a Databricks medallion architecture to support raw, standardized and curated data sets.

Step 2 was to bring different types of data from multiple sources, including structured, semi-structured and unstructured data such as spatial data and sonar images, into one place.

Step 3 will involve standardizing, incorporating and merging data for centralized analytics and reporting for the Pacific region.  
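
The three steps above map onto the medallion layers. As a minimal sketch of that raw-to-standardized-to-curated flow (the record fields and cleaning rules are invented for illustration, not DFO's actual pipeline):

```python
def to_silver(bronze_records):
    """Standardize raw (bronze) records: drop rows missing a count,
    normalize species names, and cast counts to integers."""
    silver = []
    for r in bronze_records:
        if r.get("count") is None:
            continue
        silver.append({"species": r["species"].strip().lower(),
                       "count": int(r["count"])})
    return silver

def to_gold(silver_records):
    """Curate: aggregate counts per species for reporting."""
    totals = {}
    for r in silver_records:
        totals[r["species"]] = totals.get(r["species"], 0) + r["count"]
    return totals

# Example raw records as they might arrive from the field.
bronze = [{"species": " Chinook ", "count": "12"},
          {"species": "coho", "count": None},
          {"species": "chinook", "count": "3"}]
```

In a Databricks lakehouse, each layer would be a persisted table rather than an in-memory list, but the shape of the transformation is the same.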

What were some of our challenges?

First of all, with dozens of different data subject areas spread over 100 different data sources, where do we even start?

We were very lucky to have team members who had a good understanding of the salmon data ecosystem, including what data was appropriate, where the data was located, and how it was structured and managed. This points to the importance of data governance and data stewardship.

Another challenge was dealing with protected data, such as commercial fisheries licensing data. We looked at using masked data, with random data generators and other tools, each with pros and cons.

Ultimately we developed multiple production environments to meet architectural as well as security requirements.  

A significant and ongoing challenge is building a data culture, digitalizing the way we do things, and changing the view of data from something used for a specific activity to a strategic, enterprise and Crown asset.

Success is a 3-way partnership between Business, Data Governance and Technology.

In summary, we are connecting to data and allowing users to discover, share and use it to derive insights whose impacts will help conserve and protect salmon.

We are excited to leverage the same abilities and turn data into strategic enterprise digital assets for other purposes and interest groups. This is just the beginning!

Thank you for watching.

Global Affairs Canada: Leveraging OpenGov, ChatGPT and Power BI to visualize ministerial travel 


This video introduces a dashboard that was created to better understand international travel patterns by members of Cabinet, focusing on the hurdles we faced in its creation, and how we leveraged data available on OpenGov as well as ChatGPT to overcome them. 

February 6, 2024


Hi everyone, my name is Ewelina, and I’m a Data Analyst at Global Affairs Canada and work within the Indo-Pacific branch Data Team.

In this video, I will be introducing a dashboard we created to better understand international travel patterns by members of Cabinet, focusing on the hurdles we faced in its creation, and how we leveraged data available on OpenGov as well as ChatGPT to overcome them. 

I’ll start with insight into how this project came about. Last November, Global Affairs Canada launched Canada’s Indo-Pacific Strategy. One component of this strategy is increasing engagement with the region to demonstrate its ongoing importance to Canada. One way to report on this engagement is to look at travel to the region by Ministers and high-level officials, which is what we were tasked to do. Some key questions we were looking to answer to help us tell this story of increasing engagement included: For one, where in the Indo-Pacific are Ministers and high-level officials traveling to? Where else in the world are they traveling to? How does one region of the world compare to another in terms of number of trips? What does this look like over time?

The next step of the process was locating the data we needed. We were able to locate data for our own ministers, leveraging a Microsoft List maintained by our briefing unit, but we knew of no readily available data on Ministerial travel outside of Global Affairs Canada. This was a major gap in assessing Canada’s engagement, and presented us with a challenge.

Do we ask other departments to share with us when their ministers are traveling to the region? Do we coordinate with the missions in the Indo-Pacific, who would likely know when a minister was coming? Can we scrape government news releases for what we need? We explored these options and others, and concluded that they were either impractical or unreliable. But, we also learned of a central repository of information available not only to public servants, but to all Canadians and that’s the Proactive Disclosure for Travel Expenses maintained by the Treasury Board Secretariat, where senior officials are mandated to disclose all expenses incurred for work-related travel, and this includes the location and timeframe information we were after. A subset of it will appear on your screen momentarily. While this data source is not necessarily complete or always timely, it does provide the most comprehensive repository of ministerial travel data that is publicly accessible.

The location information that was crucial for us was in this data source, but the way in which it appears is incredibly inconsistent, and definitely not in the specific format the program we used (PowerBI) requires. For example: we required the correct 3-digit internationally recognized code to identify the country of travel. However, the location data provided was in a wide variety of formats. Sometimes we saw country names, sometimes it was abbreviations, sometimes city names or city abbreviations, sometimes we had multiple countries in a field or multiple cities in one field, and combinations thereof.

Manually going through and cleaning and coding nearly 90,000 rows of data was also not feasible. But, given we were working with publicly available data, we called on ChatGPT to comb through the locations and generate the 3-digit code we needed to determine the countries corresponding with each trip.
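
Once a mapping like the one ChatGPT helped generate exists, applying it across the rows is straightforward. The sketch below uses a few invented entries to show the idea; the real table covered every location variant in the disclosure data:

```python
# Illustrative location -> country-code lookup; these entries are examples,
# not the project's actual table.
LOCATION_TO_CODE = {
    "japan": "JPN",
    "tokyo": "JPN",
    "tokyo, japan": "JPN",
    "republic of korea": "KOR",
    "seoul": "KOR",
}

def code_for(location: str) -> str:
    """Normalize a free-text location to a country code, or flag it."""
    return LOCATION_TO_CODE.get(location.strip().lower(), "UNKNOWN")
```

Rows that come back "UNKNOWN" can be routed for manual review, keeping the human check on the small residue rather than all 90,000 rows.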

The result of this work is the dashboard page you see here. In the map, each bubble represents ministerial travel to that country since 2018 with the colours aligning with the region. Hovering over each bubble provides information about each trip that was taken. To the right, we have a chart showing the distribution of trips across regions, as well as one with the distribution of trips across the regions for each minister. Finally, a number of filters provide the ability to refine what is displayed, and using the year filter, we can see trends over time.

So, in summary, by using data from OpenGov we were able to responsibly use ChatGPT to automate parts of the creation and updating of a dashboard made to explore ministerial travel to the Indo-Pacific region and beyond. This dashboard continues to be a work in progress, but has already sparked several discussions on how it can be leveraged for reporting and further refined. We think it’s a great example of using novel approaches to take data collected for one purpose and using it for another, in this case to help answer how we are advancing an important priority for Canada. I’d encourage you to think about what other sources of data there are out there that could, unexpectedly, help you answer a question of your own.

Thank you for watching!

Global Affairs Canada: SGA – Data Centre of Expertise

This video is a representation of one of the projects set up to support the CFO Focal Point team for partners. It involves gathering data from different teams and creating a data model in Power BI to represent the partners’ portfolio of projects in an easy-to-use and up-to-date dashboard.

February 6, 2024


Jane: Hi there! Do you think your team can create something automated to help us with our Partner Portfolio Dashboard? It is used for the Partner Focal Point initiative.

John: Yes, sure! Tell me more about this initiative.

Jane: The CFO Partner Focal Point initiative was created to manage, from a holistic perspective, the CFO-to-CFO relationship with strategic partner organizations that implement several international development projects funded by Global Affairs Canada, for increased efficiencies both for the organizations and the Department.

Under the initiative, a Grants and Contributions management expert is named as the CFO Partner Focal Point for each participating organization. The Focal Point and the CFO of the organization then meet one-on-one on a regular basis to discuss and address systemic issues and challenges related to the financial matters of their projects. This greater openness and transparency between Global Affairs Canada and the partners allows both sides to gain knowledge about persistent or recurring problems, which then informs continuous improvement of financial policy development, due diligence, and oversight.

John: What information does the Focal Point need?

Jane: The role of a CFO Partner Focal Point requires a good understanding of the organization’s overall portfolio of ongoing projects and those under evaluation, its organizational mandate, fiduciary and financial risks, past audits, etcetera. However, the department’s grants and contributions management system did not make it easy to gather this information, which was sometimes stored in various databases owned by multiple teams. When the initiative started a few years ago, and up until fall 2023, an Excel dashboard was created and updated manually to compile all of this portfolio-level data about the participating partners and their operational projects. This was a time-consuming task, making the tool irrelevant most of the time since the data was rarely up to date.

The CFO Partner Focal Points also use individual documents in various formats like Excel, Word and even OneNote to document their exchanges with partners. This makes it very challenging to extract any summary information from the initiative, such as the most recurrent issues or any related efficiencies achieved.

John: Interesting! Let me see what we can do.

John: Hello! I have great news for you! We collaborated with the different teams and successfully gained access to the different databases managed by each team. We then used Power BI to construct a comprehensive data model, which served as the foundation for our dashboard. We meticulously designed and developed a dynamic dashboard that presents key insights and information derived from the amalgamation of data, successfully presenting the consolidated information in a visually compelling and easily understandable format.
You can see here in the home page the list of partners that are part of the initiative. You can select a partner from here and click on Partner details. It will take you to the partner portfolio view. In this view you will find general information about the organization at the top of the page. You can use the buttons in the middle of the page to see different visuals related to the partner’s latest risks or to the key issues and efficiency gains discussed with the partner. You can easily drill through from this page to a specific project, by selecting a project from the table and clicking on project details.

In the project details view you will be able to see general information at the top of the page. The page opens in the overview by default, but if you are trying to have more information about the financial standing of the project you can click on the “project progress” button. There’s also a “Fret” button to view the project’s risk assessment with the risk response measures in place.

The second tab of the Power BI app is a summary of all key issues and efficiency gains addressed during meetings with partners. Since you didn’t have a standardized note-taking method, we created an Excel file that Partner Focal Points can use. The Excel table is clear and easy to use, and it can be found on your SharePoint page. We retroactively added the previous notes we were able to find, to seed it with some data.

Jane: Thank you so much for this! I am sure it will be helpful to us!

Jane: Hello! After testing the new dashboard, we noticed great improvements. In particular, we can now easily access accurate and up-to-date data related to partners, resulting in time savings. In addition, the dashboard now allows us to perform analysis on the key issues raised by partners and the actions taken or to be addressed. We are now able to share this information with colleagues and management to support a more cohesive engagement with external partners and inform improvements in grants and contributions management.

John: Thank you for the update! I am happy to hear this!

Public Health Agency of Canada: Data modernization and interoperability initiatives

This video will demonstrate the various data modernization and interoperability initiatives at the Public Health Agency of Canada.

February 6, 2024


We would like to begin by acknowledging that the land on which we gather to capture this video is the traditional unceded territory of the Algonquin Anishnaabeg People.

At the Data, Surveillance, and Foresight Branch, we have a mandate to work collaboratively with our partners to drive change in the underlying systems at the Public Health Agency of Canada, to develop smarter, faster, more precise, and cost-effective public health action. Today, I will walk you through two initiatives that are driving this change at PHAC, the Data Modernization and Interoperability Initiatives, also referred to as DMII.
Let’s break down and discuss the two coordinating pillars of DMII and how they intersect, starting with Data Modernization.

The Data Modernization Initiative describes the data management models of the Public Health Agency of Canada’s surveillance systems. Understanding the diversity of core data management activities will play a key role in preparing and supporting programs with the tools they need to meet future challenges. The initiative is identifying concrete opportunities to enhance and modernize data capabilities, interoperability, and programmatic efficiencies for surveillance systems.
The DMI ensures that data management models remain efficient, agile, and responsive to evolving public health needs.

Given the heterogeneity of data management within the Public Health Agency of Canada’s surveillance systems, a survey of open- and closed-ended questions was distributed to surveillance systems to gain an understanding of the current data management. This survey comprehensively examines data management at all stages of the data life cycle – from collection to dissemination – and identifies challenges, solutions, and areas for efficiencies throughout.

These results will generate evidence-informed recommendations to improve how data moves through surveillance systems and the agency, while recognizing the diversity of data, challenges, and surveillance systems within the agency.
From there, strategic enhancements will aim to modernize data management by improving integration and interoperability throughout the data life cycle.

The need for a people-centred approach, adequate resources, and the integration of data management and epidemiology was identified as a key consideration in preliminary consultations.
Recommendations will include forward-thinking strategies and evidence-informed interventions to enhance the underlying surveillance systems, leading the Public Health Agency of Canada towards smarter, faster, more precise, and cost-effective public health actions that drive the Agency forward.

The next piece of DMII is the Interoperability initiative.

Interoperability is a key factor for Data Modernization, and refers to the capacity for seamless sharing of health data and information between health sector stakeholders by means of policy, governance, workflow and technical alignment. With increased interoperability, programs become more resilient during public health responses and are better able to withstand staff turnover, budget fluctuations, and evolving data needs.

Through collaboration and shared learning with partners and stakeholders, the Public Health Agency of Canada’s interoperability initiative aims to assess and enhance connectivity – both at the agency and at the individual surveillance system-levels.
There are four components of interoperability:

  1. Data Governance & Data Policy
  2. Data Standards
  3. Technology, and
  4. Interoperability Resources & Supports

Interoperability is a continuous path rather than a destination. So, to measure where the surveillance systems are currently in their interoperability journeys, maturity assessments have been developed. These assessments are used to evaluate the interoperability of each surveillance system, as well as what structures are in place at an Agency level to support the systems. Each component is evaluated on a five-point Likert scale as part of a good, better, best model to promote a culture of continuous growth. Additionally, an Interoperability Council was created to leverage stakeholder expertise, align agency and program priorities, coordinate work, and amplify efforts to advance interoperability.
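
A scoring scheme of this shape could be sketched as follows; the component keys and the tier thresholds are illustrative assumptions, not the Agency's actual rubric:

```python
# The four interoperability components described above.
COMPONENTS = ["governance_policy", "data_standards",
              "technology", "resources_supports"]

def maturity_tier(scores: dict) -> str:
    """Average four 1-5 Likert scores and bin into the
    good/better/best model (thresholds are illustrative)."""
    avg = sum(scores[c] for c in COMPONENTS) / len(COMPONENTS)
    if avg >= 4.0:
        return "best"
    if avg >= 2.5:
        return "better"
    return "good"
```

Because the output is a tier rather than a pass/fail grade, a system can land anywhere on the scale and still have a clear next step, which supports the culture of continuous growth the model is meant to promote.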

So, what does this tell us?

The maturity assessment results and surveillance system needs are used to develop interoperability roadmaps to guide activities at both agency- and system-levels.

The Interoperability Council will coordinate the development of tools and standards as part of an interoperability toolkit used to support individual surveillance systems as they evolve to increasing degrees of interoperability.

Specific recommendations will include lessons learned from implementation of the Interoperability framework and will guide progress at the agency- and surveillance system-level, with the goal that the Public Health Agency has smarter, faster, more precise, and cost-effective public health actions.

As we enter this time of forward progress, we are excited to lead the implementation of strategic enhancements for the modernization of data management and interoperability at the Public Health Agency of Canada. This project and overall collaboration will ensure that the Agency has smarter, faster, more precise, and cost-effective public health actions.

Statistics Canada: Simplify information extraction with AI powered solutions

Study these three financial documents. In the next five seconds, can you find and identify the name of store, telephone number and total price? Ready? GO!

February 6, 2024


(Three financial documents appear on screen followed by a timer with “5” seconds on it)

Study these three financial documents. In the next 5 seconds, can you find and identify the name of the,

(The words “Store Name, Phone Number and Total Price” appear on screen)

store, telephone number and total price? Ready? GO!

(The time ticks down to 0)

Daunting, isn’t it?

(Seven more financial documents fill the screen)

Would you like to try now? Didn’t think so.

(Various words, symbols and numbers are decoded on screen revealing the name of the business, email address, type of service and total price)

When it comes to extracting meaningful information, you’re tired of parsing through stacks of complex, unstructured information in file formats that are trickier than most.

(A human icon appears on screen with a symbol of artificial intelligence)

(The AI loading bar reaches full before the human one)

You want your data faster.

(The numbers “$999” appear on the human side. The numbers “$17” appear on the AI side)

You want to keep expenses down.

(The word “Errors” appears on both sides; the human side counts to 6 while the AI side stays at 0)

You want to reduce human error.

(The AI symbol pushes the human section out of the way. Four icons representing disaggregated data and technology appear from the AI symbol)

The answer? Look no further: solutions developed by experts at Statistics Canada automate your information extraction needs.

(The text “Information Extraction” appears on screen)

(Several blank PDFs and images fly into the screen)

Information extraction is the process of parsing through various types of unstructured information, like PDFs and images,

(A magnifying glass comes into view and reveals hidden information inside the image)

and extracting essential information into more editable and structured data formats.

(All the PDFs and images clear from the frame except for one PDF file that has lines and a graph on it)

Our experts use cutting-edge technology to simplify these processes.

(Two icons roll into the frame, one of a man with a gear in his head and the other, a brain with a microchip)

(The words “Machine Learning” and “Deep Learning”  appear on screen)

This technology uses various Machine Learning and Deep learning algorithms to extract information

(The PDF icon spins and turns into an image. It then flies off the screen, revealing an icon of a lady)

from not only tables and text-based documents, but images as well.

(Wifi bar symbols emerge from Stella, followed by a photo of her passport which appears next to her)

Take Stella for example. She uploads a photo of her passport to the ArriveCAN app.

(The passport moves over, revealing the AI icon)

With the use of Optical Character Recognition,

(A blue bar scans over the passport and outlines Stella’s key information on her passport)

 her information is quickly captured and disaggregated,

(Relevant information is transferred to a phone displaying the ArriveCAN app)

promptly being applied to her profile, saving her time and removing human error.

(Everything is removed from the screen except for Stella’s passport)

(Two more passports emerge from behind Stella’s and get plugged into a computer. Three phones displaying the ArriveCAN app get plugged into the other side of the computer)

We even employ automated batch processing, which can help extract information and data from several PDFs and images at a time.
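
Batch processing of this kind boils down to running the same extraction step over a list of documents; `extract_fields` below is a placeholder for the OCR and machine-learning step, not StatCan's actual solution:

```python
def extract_fields(document: dict) -> dict:
    """Placeholder for the OCR + machine-learning extraction step,
    pulling the fields of interest out of one document."""
    return {"store": document.get("store"), "total": document.get("total")}

def batch_extract(documents: list) -> list:
    """Run the extraction step over every document in the batch."""
    return [extract_fields(d) for d in documents]
```

Swapping the placeholder for a real model turns a stack of receipts, passports, or reports into one structured table in a single pass.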

(Everything flies off screen. The AI icon returns followed by four icons which spin around each other)

Our information extraction solutions are applicable to so many domains, ranging from…

(The icons move out of the way of an icon displaying a financial document that shifts to the center of the screen. Four more financial icons emerge from the original)

financial reporting…

(The icons move out of the way of an icon displaying a medical bag that shifts to the center of the screen. Four more medical icons emerge from the original)

Healthcare systems.

(The icons move out of the way of an icon displaying an opened letter that shifts to the center of the screen. Four more HR icons emerge from the original)

Human Resources

(The icons move out of the way of an icon displaying a medical instrument that shifts to the center of the screen. Four more medical icons emerge from the original)

Even Scientific and pharma research!

(The icons move out of the way revealing a person)

And hopefully you as well.

(The words “For more information about our information extraction solutions, please contact:” appear on screen)

(The Canada wordmark appears on screen)
