The guideline most firms have not read yet
At engineering conferences, in firm principals' offices, and across LinkedIn threads, the same question keeps surfacing: what does PEO say about using AI tools for sealed work? The question is often asked as though the answer is still pending.
It is not. On 22 November 2024, Engineers and Geoscientists BC published a Practice Advisory titled Use of Artificial Intelligence (AI) in Professional Practice. A year later, Professional Engineers Ontario formally adopted that advisory for use as a guideline in Ontario. The cover sheet on PEO's hosted copy reads: "The following Practice Advisory was developed by Engineers and Geoscientists BC and has been adopted by PEO for use as a guideline in Ontario." The adopted text is specific about what AI use looks like in regulated practice, what quality management it requires, and what it does not require.
Most Ontario firms have not yet read it. The advisory sits at peo.on.ca under the 2025-11/ path and runs ten pages. It is not a press release and did not generate much coverage. But it is the thing the profession has been asking for.
The advisory does not stand alone. It rests on a regulatory framework that was already rewritten in 2021 to express the responsibility question in tool-agnostic terms. Read together, the two documents describe a regulatory environment that is permissive of AI-assisted drafting within a responsibility-based model, and they are quite specific about where the engineer's obligations begin and end.
What section 53 actually says in 2026
Ontario Regulation 941, made under the Professional Engineers Act, defines professional misconduct for licensed engineers in Ontario. The load-bearing provision for AI-assisted drafting is section 53, which was rewritten by O. Reg. 837/21 and has been in force in its current form since 2021. Section 53(2) reads:
"A practitioner shall, subject to subsections (7) and (8), sign, date and affix their seal to an engineering document if, (a) the document's engineering content is prepared by the practitioner; or (b) the practitioner otherwise assumes responsibility for any part of the document's engineering content."
The word "otherwise" in clause (b) is the provision that makes AI-assisted drafting workable under Ontario regulation. The regulator has written a sealing rule with two alternative pathways: preparation, or assumption of responsibility for content that someone or something else prepared. That second pathway is not new to the profession. It is the rule that has always let a Professional Engineer seal work drafted by an EIT, a technologist, or an external consultant. The 2021 rewrite simply stated the principle directly, rather than leaving it inferential.
Section 53(11) adds a presumption: the presence of a practitioner's seal on an engineering document "shall, for the purposes of a proceeding before the Discipline Committee, be presumed to indicate that the practitioner assumed responsibility for engineering content in the document." The seal is evidence of the assumption. How the draft was produced is not the regulator's question. Whether the engineer took responsibility is.
Section 72(2)(e) of Regulation 941, which lists the acts constituting professional misconduct, now simply cross-references section 53: an engineer commits professional misconduct by "signing, dating or sealing an engineering document, or failing to do one or more of them, in contravention of section 53." The misconduct test and the sealing rule are the same rule. The substantive language about authorship and checking lives in section 53.
Other provisions of section 72(2) remain directly relevant to AI-assisted work:
72(2)(a) Negligence, defined in 72(1) as "an act or an omission in the carrying out of the work of a practitioner that constitutes a failure to maintain the standards that a reasonable and prudent practitioner would maintain in the circumstances." The standard of care applies to AI-assisted work product exactly as it does to any other. An AI-drafted report that contains an unchecked hallucinated citation is not excused because a tool introduced the error.
72(2)(d) Failing to make responsible provision for complying with applicable statutes, regulations, standards, codes, by-laws and rules. When an AI-drafted report cites OBC 1.2.2.2 or CSA A23.3, the engineer remains responsible for the accuracy of those citations. This responsibility does not shift to the tool.
72(2)(h) Undertaking work the practitioner is not competent to perform. Competence bounds apply to AI-assisted work. An engineer using an AI tool to draft in an area outside their competence does not become competent by virtue of the tool.
The through-line across all of these is consistent: the regulation governs the engineer's responsibility for the output, not the method of producing the first draft. An AI-drafted report reviewed and sealed by a licensed professional is governed by exactly the same rules as a junior-drafted report reviewed and sealed by the same professional. The standard of care is identical. The accountability is identical. The seal means the same thing.
What the EGBC advisory adds on top
The advisory PEO adopted is directly about AI. It does not replace Regulation 941; it describes what responsible AI use inside the regulation looks like.
The core principle is simple and repeated: "engineering/geoscience professionals remain professionally responsible for their work even when it is generated by or includes AI output." Professional responsibility is the anchor; AI use is the variable.
The advisory organises operational expectations into four quality management pillars. Each is a direct quotation from the adopted text:
- Documented checking. For AI-based tools, documented checks may include "Noting the make and version of the AI-based system or tool used", developing test cases, recording input data and outputs for validation, and, for dynamic AI systems, validating outputs with each use.
- Direct supervision. "The engineering/geoscience professional taking professional responsibility for the work must apply the same standard of care as if they were using the AI-based system or tool themselves."
- Document retention. "Records must be retained and preserved for a minimum of 10 years after the end of a project or 10 years after a document used in continuing work is no longer in use." This is stricter than Regulation 941's general retention posture and applies specifically to AI-involved work.
- Independent review for high-risk professional activities and work.
The advisory explicitly addresses hallucinations, the failure mode that caused sanctions in the legal profession when Mata v. Avianca (2023) exposed ChatGPT-fabricated case citations. "Engineering/geoscience professionals remain responsible for their work product and can face professional consequences for work product containing AI hallucinations." This is one of the few places where the advisory speaks about a specific AI failure mode rather than about responsibility in the abstract.
The advisory is also specific about what it does not require. On disclosure: "Engineers and Geoscientists BC does not have explicit requirements on the disclosure of the use of AI-based systems and tools." There is no mandatory disclosure. Voluntary disclosure, as part of the report's description of work or as part of the firm's quality management documentation, is compatible with the advisory.
On the threshold between light and heavy documentation, the advisory draws a useful line: "if AI is used to assist with written content for a report (e.g., grammar or paragraph structure) and this content is then reviewed by an engineering/geoscience professional, this may not need documentation. However, if AI is used to generate results for a report or to make design decisions, then its use must be documented to the same extent as if an engineering/geoscience professional had researched the sources or completed the calculations themselves."
This threshold matters for firms evaluating AI drafting tools. A tool that organises the engineer's own field observations into report format sits closer to the first example. A tool that generates engineering conclusions from raw data sits closer to the second. The documentation burden follows the role of the AI in producing the output, not the mere fact of AI involvement.
The historical precedent still holds
Even without the advisory, PEO's prior pattern on tooling transitions would have predicted the outcome. The profession has absorbed several generational tooling shifts while keeping the same professional-responsibility framework.
When digital seals replaced wet-stamp seals, PEO updated the Seal Guideline to address electronic seals and signatures. The guideline did not change the standard of care. It confirmed that the same professional responsibility applies regardless of whether the seal is applied with ink or with a digital certificate. The 2021 amendment to Regulation 941 likewise added electronic seal provisions to section 53(3) without changing the substance.
When Building Information Modelling changed how engineering documents were produced and shared, PEO did not publish BIM-specific practice guidance. The existing framework - the engineer seals what the engineer supervises - covered the transition. BIM changed the production method. It did not change the professional obligation.
When professional engineers began collaborating through cloud platforms and remote review tools, PEO's position remained the same: the professional who seals the document is responsible for its contents, regardless of how the collaboration occurred.
PEO regulates the drafter, not the drafting tool. This pattern is durable because it is tool-agnostic by design. A regulatory framework that specifies permissible tools becomes obsolete every time the tooling changes. A framework that specifies the professional's obligations remains valid across every tooling transition. The EGBC advisory the province adopted in 2025 is consistent with this pattern. It describes what the professional must do when AI is involved. It does not restrict which tools the professional may use.
Where the remaining gaps are
The combined framework is more developed than it was a year ago, but not complete. Three genuine gaps remain.
AI-specific audit trails are not yet standardised. The advisory sets a 10-year retention floor, but it does not prescribe what the audit trail must contain. A firm that retains the original AI draft, the edits the engineer made, the quality flags that were raised, and the timestamp of attestation has a defensible record. A firm that retains only the final sealed version has a weaker evidentiary position if the record is ever examined. The regulator has not yet specified the minimum contents of the audit trail, and individual firms are making their own decisions about what to keep.
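Since the regulator has not specified the minimum contents of the audit trail, firms designing their own are free to choose the shape of the record. One minimal sketch of such a record follows, capturing the four elements named above (original AI draft, engineer's edits, quality flags, attestation timestamp) plus a retention date computed from the advisory's 10-year floor. Every field name here is an assumption for illustration, not anything the advisory or PEO prescribes, and the retention calculation approximates "end of project" with the attestation date.

```python
from dataclasses import dataclass, asdict
from datetime import date
import json

@dataclass
class AuditRecord:
    """Illustrative audit-trail entry for one AI-assisted sealed document.

    Field names are hypothetical; the advisory sets a retention floor
    but does not prescribe a record format.
    """
    project_id: str
    tool_name: str            # make of the AI-based tool, per the documented-checking pillar
    tool_version: str         # version of the tool used for this draft
    ai_draft: str             # the original AI-generated draft, verbatim
    edit_diff: str            # the reviewing engineer's edits to that draft
    quality_flags: list[str]  # flags raised and resolved during review
    attested_on: date         # date the reviewing engineer sealed the document

    def retain_until(self) -> date:
        # 10-year retention floor; "end of project" approximated
        # here by the attestation date (an assumption).
        return self.attested_on.replace(year=self.attested_on.year + 10)

    def to_json(self) -> str:
        d = asdict(self)
        d["attested_on"] = self.attested_on.isoformat()
        d["retain_until"] = self.retain_until().isoformat()
        return json.dumps(d, indent=2)
```

A firm keeping records like this, alongside the final sealed version, would hold exactly the evidentiary material described above; a firm keeping only the sealed PDF would not.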
Training-data and confidentiality norms are contractual, not regulatory. The advisory acknowledges privacy and intellectual property concerns. It recommends that firms "be familiar with the contract or end-user license agreement governing their use of an AI product." It does not specify the terms firms should demand. Whether project data becomes training data for the AI vendor is a contract question between the firm and the vendor. PEO has not, so far, taken a position on acceptable vendor terms.
Jurisdictional alignment beyond Ontario is uneven. The advisory is EGBC's text adopted by PEO. APEGA has not yet adopted a comparable advisory. Engineers Canada has not issued a national guideline as of mid-2026. Firms operating across provinces face a patchwork until national alignment catches up.
None of these gaps creates a regulatory barrier to AI-assisted drafting. Each is a reason for firms to maintain their own internal discipline rather than to wait for further guidance.
How firms are actually operating
Under the combined framework, Ontario structural and building-science firms are converging on a similar pattern:
The AI tool is a drafting instrument, not a professional service provider. The firm treats the AI tool the same way it treats a Word template or a CAD package: as a production tool that assists the engineer's work. The firm does not delegate professional judgment to the tool.
The reviewing engineer is responsible for every element of the output. Every observation, finding classification, code citation, photograph caption, and recommendation in the AI-drafted report is reviewed by the engineer before sealing. The review is not a skim; it is the same review the engineer would perform on a draft produced by a junior staff member.
The firm's quality management process documents the AI drafting step. Firms are noting in their internal quality management documentation that the firm uses AI-assisted drafting, that all output is reviewed by a licensed professional before sealing, and that the firm's standard of care is unchanged. This satisfies the EGBC advisory's documented-checking pillar.
Client data is not shared for model training. Firms are evaluating vendor terms of service to confirm that project data - site photographs, observation notes, sealed report content - is not used to train AI models. This is a contractual matter, not a regulatory one, but firms are treating it as a PEO-adjacent concern.
The audit trail is kept for at least ten years. Firms retaining original AI drafts, edit diffs, quality flags, and attestation timestamps for the ten-year minimum the advisory sets have evidence they can produce if the work is ever examined.
These are conservative practices, and conservatism is appropriate. Firms following this pattern are on solid ground under both Regulation 941 section 53 and the adopted EGBC advisory.
The question is not whether
The question is not whether PEO permits AI-assisted drafting of sealed work. PEO has permitted it, in writing, since adopting the EGBC advisory. The question is whether your firm's workflow treats the AI tool as what it is: a drafting instrument that produces output requiring the same professional review as any other draft, retained in an audit trail that would stand up to examination, inside a quality management process your firm can describe to its insurer, to its clients, and if ever asked, to the regulator.
If your firm can describe those things, you are already operating inside the framework Ontario has in place.