Hope Lompe
Local Journalism Initiative Reporter, Gabriola Sounder
Fire District content from Derek Kilbourn, Editor, Gabriola Sounder
AI is rapidly changing how governments work, and it carries risks that must be considered, such as personal and private information being entered into data servers in countries outside Canada.
After the Sounder reported in the April 1 edition that the Islands Trust does not fully know how AI is used by consultants and staff, we asked Gabriola’s other local governments what their AI policy is, and how it is being used internally.
Regional District of Nanaimo (RDN)
The RDN’s Chief Technology Officer, Jason Birch, told the Sounder that while they are not bound by B.C.’s Artificial Intelligence (AI) policy, they do share an obligation to protect personal information under the B.C. Freedom of Information and Protection of Privacy Act.
“Our internal AI governance policy allows staff to explore the potential of AI while protecting personal information and managing the risks inherent in these tools, such as bias, incompleteness and ‘hallucinations,’” writes Birch.
It does this by:
• Clarifying that staff remain accountable for the work they produce with assistance from AI tools: reviewing output for accuracy, bias and tone; ensuring that sensitive information is not unintentionally disclosed; and sharing new uses of AI with their supervisors.
• Requiring an additional review before approved AI tools are incorporated into workflows dealing with sensitive personal information (information that could cause significant personal harm to an individual if disclosed).
• Defining a process for the review and approval of new AI tools by a cross-departmental governance committee.
Like the province, all RDN staff with computer access are provided a version of Microsoft Copilot Chat that is included in the RDN’s existing licensing and has been approved for uses consistent with policy, writes Birch.
They are also running a limited pilot of more advanced Copilot features, with representative staff learning from real world use to identify benefits and risks and ensure any future adoption is thoughtful, measured, and cost effective, he says.
“Our experience in this pilot has so far been positive, with some day-to-day uses such as supporting structured research, summarizing internal meetings and assisting with data analysis providing value, and it has also shown that gains are sometimes offset by the careful review required to correct errors, ensure completeness,” he writes, adding that careful review also helps avoid the confirmation bias commonly found in AI tools.
“There is a risk when using AI because while it may feel like it is providing independent or impartial information, the interactions with AI can create conditions where it is telling us what we want to hear. Staff need to be aware of this when using AI.”
Nanaimo-Ladysmith Public Schools (NLPS)
A specific AI policy is currently in development, an NLPS spokesperson tells the Sounder in an email.
However, while that is being developed, they say NLPS Policy 305.3 (Personal Information) and Policy 401.13 (Appropriate Use of School District Information Technology) would cover use of AI technology.
The personal information policy refers to the B.C. Freedom of Information and Protection of Privacy Act, to which NLPS is also subject, and allows personal information to be handled only as outlined in the Act, which does not address AI.
The second policy covers use of school district technology, which must be consistent with the Freedom of Information and Protection of Privacy Act, and acknowledges that technology can often move faster than policies can keep up with.
“The entire range of possible uses that currently exist, or those that may emerge due to technological development cannot be anticipated at the time of establishing the appropriate use policy,” it reads.
Gabriola Fire Protection Improvement District
The Gabriola Fire Protection Improvement District does not currently have any policies in place to guide Trustees or staff on the use of generative AI.
The Board was asked about this at the April 8, 2026 general meeting.
Chair Erik Johnson said he personally has a real problem with AI. “I think it’s gonna, I think it’s gonna screw us badly. It’s a great tool for research, but you have to check your research at all times. And I think to a large extent, we are allowing AI to make decisions for us that they shouldn’t make, including moral decisions.
“So I’m really in favor of not using it…then again, I know that we do have an AI program that’s being used in house to a certain extent.”
Trustee David Chorneyko confirmed the Fire District does not have any AI policy, saying, “it’s something that we need to work on but we have priorities.”
He pointed to work being done with policies on meeting procedures, updating bylaws, and the code of conduct, saying that yes, an AI policy is needed, “but it needs to rise…to the top before we get working on it.”
Trustee Diana Moher said she’s been seeing plenty of coverage on the use of AI, and where organizations are clamping down on the use of it, such as in schools.
Moher said this could be something that could be added to the work of the policy committee to look at, “especially if the government is starting to put stuff out as to what we could use. And that would give us some time to start doing some research on what the recommendations are going to be.”
The Fire Board was also asked if generative AI software was used in crafting communications, either internal or external, for the Fire District or Gabriola Volunteer Fire Department.
Trustee Wayne Mercier said, “my understanding is that it’s not used in external communications, but the Improvement District does use Microsoft 365, which includes a suite of tools. Those tools include Copilot, which is a generative AI tool, and that inserts itself into the composition of emails, suggesting words. So short of not using the email system that we pay for and use, we can’t completely dissociate ourselves from it, because it’s built in, like the grammar suggestions.
“So it would be strictly inaccurate to say we never use it for anything…to my knowledge, we don’t use it to write bylaws or policies or that sort of thing.”
Trustee Moher said the practice is not to prompt AI directly by saying, “write me an email that has something to do with this.”
The Local Journalism Initiative (LJI) is a federally funded program to add coverage in under-covered areas or on under-covered issues. This content is created and submitted by participating publishers and is not edited. Access can also be gained by registering and logging in at: https://lji-ijl.ca