Introduction
National and global public policy is one of the most critical tools for advancing responsible technology practices and outcomes. These policies exist within a complex network of laws, policies, and regulations — in Canada and globally. This document paints a picture of that network, providing context on how technology is regulated in Canada.
While policy and law struggle to keep pace with rapid technological development, a range of new bills and laws have recently been introduced in Canada and around the world, with some already coming into effect. Even in regions or areas of law where minimal progress has been made — for example, privacy laws in most jurisdictions apart from the EU and California — existing legal frameworks may ensure some degree of accountability. The following sections speak to Charter considerations, including hate speech, consumer protection laws, and labor rights, as well as privacy considerations.
This document is structured as follows. The first section discusses technology policy frameworks in Canada. The second section summarizes Canadian laws, directives, and administrative agencies related to technology — especially those primarily concerned with privacy. The third section summarizes technology-related laws enacted by Canadian provinces. The fourth section covers an array of international laws, principles, and norms regarding technology. The items covered in the fourth section are not a comprehensive summary; they highlight some of the most notable and well-known technology laws and norms among international organizations and civil society that have been adopted by, or are pertinent to, Canada.
Charter Considerations
The Canadian Charter of Rights and Freedoms constitutes the set of fundamental rights recognized by Canada’s constitution. It sets a normative framework guiding legislation and policy and shapes all public and private entities’ conduct. Even if sections of the Charter do not mention technology-specific risks and harms directly, it is understood that the technology sector should respect and protect those rights. Arguably, every element of the Charter can be interpreted as having relevance for technology regulation; this section presents an overview of some of the more directly relevant considerations.
A selection of Charter provisions relevant to the use or regulation of technology is highlighted below. Many of these frameworks were established before the creation of modern technologies, but recent amendments to laws and policies attempt to expand their scope to include current technologies and broadcasting platforms.
Freedom of Expression (Section 2(b))
Emerging technologies raise questions about setting legitimate limits on speech. Democracies have been grappling with the boundaries of hate speech, incitement to violence, and disinformation/misinformation. In a world of democratic backsliding, the fear that authoritarian and semi-authoritarian regimes will resort to censoring speech they deem undesirable adds to concerns around freedom of expression. While none of these considerations is inherently about technology, social media companies, having made maximum user engagement a core component of their business model and built powerful recommender systems to achieve it, receive considerable criticism for their content moderation practices. The extremely limited liability model embodied by Section 230 of the Communications Decency Act (1996) has shaped companies’ approach to content moderation in the United States, where self-regulation remains the norm for the most part. Other legal systems have become far more proactive since the mid-2010s: for example, the European Union passed the Digital Services Act in 2022 to make illegal online what is illegal offline.
Canada’s Criminal Code bans “public incitement of hatred” (section 319), a ban on hate speech that the Supreme Court has upheld. Section 13 of the Canadian Human Rights Act (1977) contained a more specific provision on hate speech through telecommunications media, extended to the internet in 2001, until its repeal by Parliament, effective 2014. Bill C-36 of 2021, which sought to reinstate the provisions of Section 13, did not progress beyond first reading in the House of Commons on June 23, 2021.
Another piece of relevant legislation is the Broadcasting Act of 1991, whose main goal is to protect Canada’s economic and cultural interests in the broadcasting industry. The Act itself does not cover digital media, but the Online Streaming Act, which received royal assent on April 27, 2023, expands the Act’s jurisdiction and grants the Canadian Radio-television and Telecommunications Commission the power to regulate online content. Specifically, it sets rules on content created and uploaded on Canadian platforms and seeks to enhance diversity, equity, and inclusion in online representation.
Open Courts Principle (Section 2(b))
The Open Courts Principle is a fundamental principle of the Canadian legal system, enshrined in Section 2(b) of the Canadian Charter of Rights and Freedoms. This principle holds that courts should be open and transparent and that the public has a right to access court proceedings and documents. Privacy and data protection sit in tension with the Open Courts Principle: if court records are handled irresponsibly, the consequences for individuals can be serious, including identity theft and reputational damage. Therefore, as courts increasingly rely on technology to manage and store records containing sensitive information, including health and financial records, there is a need to strike a delicate balance between two competing values: 1) safeguarding an individual’s rights to a fair trial and privacy, and 2) upholding the public’s right to freedom of expression and access to information. Persons seeking to limit the public’s access to, and the publication of, court proceedings and records must establish a legal basis to do so through a process called the Dagenais/Mentuck test.
Courts and policymakers should adopt responsible technology practices that prioritize the protection of personal information while still maintaining the Open Courts Principle.
It is essential that courts continue to prioritize the rights of individuals to a fair and impartial hearing, while implementing strong data protection policies, ensuring that court records are stored securely and accessed only by authorized individuals, and using technology in a manner that is consistent with these principles. Additionally, providing clear and transparent information to individuals about the collection and use of their personal information is of utmost importance.
In conclusion, the Open Courts Principle is a cornerstone of the Canadian legal system, but it must be balanced with the right to privacy and protection of personal information in the context of responsible technology. To achieve this balance, courts and policymakers must adopt responsible technology practices that prioritize data protection and transparency, while still ensuring that the fundamental principles of due process and fair trial are upheld.
Right to Liberty (Section 7)
The right to liberty is a fundamental human right enshrined in Section 7 of the Canadian Charter of Rights and Freedoms. The Supreme Court of Canada has established that individuals have a reasonable expectation of privacy in their personal information and that any interference with this right must be based on clear legal criteria and subject to procedural protections. In the context of responsible technology, the right to liberty is closely linked to the issue of privacy and data protection, but it also includes the right to be free from arbitrary government surveillance or interference in personal communications. This has become an increasingly important issue in light of revelations about government surveillance programs and the potential misuse of data by law enforcement agencies.

Accordingly, companies and governments must take steps to ensure that the collection, use, and storage of personal data are done in a manner that respects individual privacy and autonomy. This may include implementing data protection policies, providing clear and transparent information about data collection and use, and allowing individuals to control and manage their own personal information. It is also essential that policymakers, companies, and individuals work together to develop clear guidelines and standards for the collection and use of personal data. This may involve greater transparency and accountability in the use of algorithms and artificial intelligence, as well as stronger protections for vulnerable groups such as children and marginalized communities.
Although there are existing guidelines for the use of surveillance technologies in Canada, the government has been shown not to comply with its own policies. In 2020, reports surfaced that various Canadian law enforcement agencies had been using Clearview AI, a facial recognition tool, in a manner not in compliance with Canadian privacy legislation. As a result, an investigation was launched, leading to findings on the use of biometric technology, particularly facial recognition.
In addition to privacy rights, the right to liberty is often discussed in the context of the implications of technological innovation for policy and ethics. The fundamental principle of liberty grants individuals the freedom to act as they deem fit, as long as their actions do not violate the rights of others. In the context of technology, this liberty is embodied in the freedom to use technology in ways that do not affect the rights of others or cause them harm. For example, the City of Hamilton in Ontario has a by-law that strictly limits the use of video surveillance cameras for monitoring and recording to within a homeowner’s own property lines. In other words, the by-law ensures that security cameras are installed and used in a manner that respects the privacy of residents and neighbors.
National Laws, Directives, and Programs in Effect
National Privacy Laws
As of May 2023, two main pieces of legislation that regulate technology and data are in effect in Canada. It is worth noting that independent administrative agencies play a major role in implementing these laws; therefore, laws should be seen as a combination of legal texts and policies that interpret and apply them in concrete situations.
Personal Information Protection & Electronic Documents Act (PIPEDA, 2000)
The Personal Information Protection and Electronic Documents Act is a Canadian law relating to data privacy. It governs how private sector organizations collect, use, and disclose personal information in the course of commercial activity. A breach of security safeguards is defined in PIPEDA as the loss of, unauthorized access to, or unauthorized disclosure of personal information resulting from a breach of an organization’s security safeguards, or from a failure to establish those safeguards. The law requires organizations to report any breach of security safeguards involving personal information if it is reasonable to believe that the breach creates a risk of significant harm to an individual. The Office of the Privacy Commissioner (OPC) released a set of scenarios in 2019 grounded in analysis of breach records to date.
Digital Privacy Act (Bill S-4, 2015)
This amendment to PIPEDA applies to organizations in Canada that collect, use, or disclose personal information produced by an individual in the course of their employment, business, or profession. The Act requires these organizations to record and report breaches of security safeguards involving the personal information they have collected, and prohibits obstructing the Commissioner’s investigation of a complaint or audit. All records of personal information breaches must be maintained indefinitely, and there are no thresholds regarding the severity or scope of breaches: organizations must keep records of every breach of security safeguards that occurs. The Digital Privacy Act also enables the disclosure of personal information without the individual’s knowledge or consent. It does so by permitting organizations to disclose personal information to non-law-enforcement organizations in order to investigate a breach of business contracts or of federal or provincial laws, where notifying those individuals could compromise the investigation. The Act cites the prevention, detection, or suppression of fraud, and the protection of victims of financial abuse, as grounds for this non-consensual information sharing.
Privacy of Children Considerations
Children’s rights advocacy groups have long fought to carve out special protections for children in the context of online harms and privacy violations. Children may be exposed to harmful online content, including content featuring self-harm or suicide and terrorism propaganda, whose circulation is often amplified by recommender systems. Online sexual abuse, cyberbullying, and extortion have also been sources of concern for parents and children’s rights advocates. In addition to risks from online content, corners of the Internet host online drug markets that target teenagers. The content of video games has received much attention from the press, but another serious problem is the monetization of children’s data through these games. For example, the U.S. Federal Trade Commission fined Epic Games, the maker of Fortnite, over half a billion U.S. dollars in 2022 for violating children’s privacy, and for using manipulative tactics to keep children online longer.
Self-regulation by online platforms constitutes part of the online safety framework, but it has failed to address major concerns. For example, despite TikTok’s own policy disallowing children under 13, it is well known that many children below that threshold continue to use the application — the U.K. Information Commissioner’s Office estimates that 1.4 million children under 13 used it in that country in 2020. Accusations that content on social media platforms may jeopardize minors’ emotional and physical well-being have thus prompted legislative efforts around the world. In the United States, the Children’s Online Privacy Protection Act (1998) lays out measures to protect children’s data privacy. In 2021, the U.K.’s Age Appropriate Design Code (“Children’s Code”) came into force, setting standards for online platforms hosting minor users — a law with the same name was adopted in the U.S. state of California in late 2022.
The death by suicide of 15-year-old Amanda Todd in 2012 brought cyberbullying and sexual extortion to the Canadian public’s attention; her victimizer was sentenced to 13 years in prison in October 2022, but the underlying threat has not been fully addressed, as witnessed by the suicide of 17-year-old Daniel Lints in June 2022. The absence of a children’s online privacy law remains a source of criticism as of June 2023. Elizabeth Denham, former U.K. Information Commissioner and former Information and Privacy Commissioner for British Columbia, suggests that Canada could emulate the U.K. Age Appropriate Design Code. Taylor Owen, founder of The Center for Media, Technology and Democracy and McGill public policy professor, and Frances Haugen, Facebook whistleblower and social media accountability advocate, recommend a “child-centric approach” to online safety legislation that includes “requiring privacy-by-default for children’s accounts, putting additional limitations on children’s data collection, halting the algorithmic targeting of kids for advertising purposes and mandating the takedown of child sexual abuse material.” Haugen also makes the case for transparency: businesses could be required to publish statistics on the problematic content seen by children on their platforms.
The first major legislative step is the prospective online safety bill, expected to be introduced in fall 2023. Heritage Minister Pablo Rodriguez convened an expert panel on online harms, including harm to children, in early 2023. One child-specific measure under discussion is requiring pornography sites to verify that users are 18 years of age or older. Another idea debated by the expert panel is limiting children’s access to content that promotes eating disorders and/or suicide.
In Canada, the legal framework for the protection of privacy rights does not distinguish between minors and adults, unlike in the United States where the Children’s Online Privacy Protection Act (COPPA) specifically applies to any entity that directs websites or apps to children. The Canadian Charter of Rights and Freedoms, as well as the Personal Information Protection and Electronic Documents Act (PIPEDA), provide a comprehensive framework for the protection of privacy rights that applies equally to all individuals.
However, the Office of the Privacy Commissioner of Canada (see below) acknowledges that there are unique considerations when it comes to the online privacy of children. As such, the Office has issued guidelines for organizations that collect information from children, which address specific challenges related to the collection, use, and disclosure of personal data of minors. These guidelines are in addition to the general principles of privacy protection that apply to all individuals.
The proposed Consumer Privacy Protection Act (CPPA) under Bill C-27 aims to provide enhanced protections for minors by mandating a higher standard of care for the collection and processing of their personal information. The proposed legislation aligns with international trends in the protection of children’s personal information, and organizations operating in sectors that involve handling children’s personal information will need to be vigilant in complying with the proposed regulations.
There are also several provincial laws that govern personal information about “children in school” (i.e., students below the age of 18) that schools may share with online educational tools.
For example, in Ontario, the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) establishes guidelines that school boards and other municipal institutions must adhere to when gathering, using, retaining, and sharing any individual’s personal information. Although the primary emphasis here is on access to, and the privacy of, students’ personal information, the rights and responsibilities outlined in MFIPPA pertain to the personal information of any individual.
Under MFIPPA, institutions are generally required to seek consent for the use or disclosure of an individual’s information. However, section 32(c) of MFIPPA provides that schools have the discretion to share student information without parental consent only when the information is used “for the purpose for which it was obtained or compiled or for a consistent purpose.” A consistent purpose is one the parent or student would reasonably expect, such as using the information to improve the student’s instruction — for example, through online educational tools.
Regarding the use of online educational tools, the Information and Privacy Commissioner of Ontario recommends that schools and school boards considering the use of online educational services take the following steps:
1. Assign responsibility for making decisions on the use of online educational tools.
2. Develop and implement policies and procedures to evaluate, approve, and support the use of online educational services for use in the classroom.
3. Develop a list of approved apps and services.
4. Provide privacy and security training and ongoing support for teachers and staff.
Laws Beyond Privacy
Copyright Act
Canada’s Copyright Act, consolidated in its current form in 1985, establishes a legal framework for the protection of creative works such as literary, dramatic, musical, and artistic works. It gives authors and creators exclusive rights to their works, including the right to reproduce, distribute, and perform them. The Act balances the rights of creators with the interests of users, promoting creativity and innovation while also allowing for reasonable use of copyrighted materials.
The Act has been amended several times to keep pace with changing technologies and the digital age. Notable changes include the extension of the copyright term, the introduction of digital locks provisions, and the legalization of format shifting for personal use. The Act has also been updated to address the needs of education, libraries, and museums. The most recent major amendment came in 2012 with the Copyright Modernization Act, which brought the legislation up to date with international standards, including the U.S. Digital Millennium Copyright Act (DMCA), and addressed the challenges posed by the internet.
Canada Antitrust Law: The Competition Act
Canada’s antitrust law, the Competition Act, is the primary legislation governing competition law in Canada. It has undergone roughly 18 amendments since its creation in 1986, most recently in June 2022. The Act is designed to promote and protect competition in the Canadian marketplace; its primary purpose is to prevent anti-competitive behavior by businesses, including mergers and acquisitions that may reduce competition or create a monopoly in a particular market. The law is enforced by the Competition Bureau, an independent law enforcement agency that reports to the Minister of Industry.
The Act is composed of two main parts: the civil provisions and the criminal provisions. The civil provisions deal with reviewable anti-competitive practices, such as non-criminal agreements between competitors, abuse of a dominant position, and restrictive trade practices, while the criminal provisions deal with more serious offenses, such as bid-rigging and price-fixing.
Among the regulations made under the Act are the Regulations Respecting Anti-Competitive Acts of Persons Operating a Domestic Service (SOR/2000-324). These regulations address anti-competitive acts and essential facilities and services in the Canadian air transportation industry, aiming to prevent unfair practices and promote fair competition among air service providers.
The Competition Bureau has the power to investigate and prosecute violations of the Competition Act. This can include reviewing mergers and acquisitions to ensure they do not create a substantial lessening of competition in a market, as well as investigating and prosecuting anti-competitive conduct by businesses, such as price-fixing, abuse of dominance, and misleading advertising.
One key aspect of Canadian antitrust law is the concept of “abuse of dominance.” This refers to a situation where a business with a dominant market position engages in anti-competitive conduct, such as charging excessively high prices or refusing to deal with competitors (for example, a dominant firm refusing to supply essential goods or services to rivals in an attempt to drive them out of the market). Such conduct, known as predatory exclusion, aims to eliminate competition and maintain or strengthen the dominant firm’s market position. To establish an abuse of dominance under section 79 of the Act, three essential elements must be demonstrated:
1. Control: One or more individuals or entities must exercise substantial or complete control over a specific class or type of business across Canada or within a particular area.
2. Anti-competitive behavior: The entity or entities in control must have engaged in anti-competitive practices, either currently or within the past three years. These practices can take various forms and are intended to impede or distort competition in the market.
3. Impact on competition: The anti-competitive practices must have had, currently have, or be likely to have the effect of preventing or substantially lessening competition in the market.
If a business is found to have abused its dominance, the Competition Bureau can take enforcement action to protect competition and consumers. The Bureau can also challenge a transaction before the Competition Tribunal when it determines the transaction will likely result in a substantial lessening or prevention of competition, which it has done nine times since 2009.
Another important feature of the Canadian antitrust law is the ability of private parties to bring civil actions for damages resulting from anti-competitive conduct. This allows individuals and businesses that have suffered harm due to anti-competitive behavior to seek compensation from the offending parties.
Overall, the Canadian antitrust law is designed to promote and protect competition in the Canadian marketplace. By preventing anti-competitive behavior and ensuring a level playing field for all businesses, the law aims to promote innovation, reduce prices, and improve consumer choice. The Competition Bureau plays a vital role in enforcing the law and promoting competition and continues to evolve its practices and policies to ensure that the Canadian marketplace remains competitive and fair.
In conclusion, while the exact nature and timing of subsequent rounds of reform remain uncertain, further amendments may be forthcoming. Meanwhile, the Bureau has adopted an increasingly litigious stance toward merger review, all while remaining an active enforcer in the other domains within its jurisdiction.
Online Streaming Act
An Act to amend the Broadcasting Act and to make related and consequential amendments to other Acts (“Online Streaming Act”) received royal assent on April 27, 2023, after rounds of debates in the House of Commons and the Senate. As its name implies, it amends the Broadcasting Act (1991). The Act aims to regulate platforms providing “online undertakings” using a framework similar to the regulation of radio and television services. In concrete terms, supporting Canadian content stands out as the leading goal. Unlike conventional broadcasters, online platforms are not expected to be Canadian-owned, but nonetheless, they fall under the regulatory jurisdiction of the Canadian Radio-television and Telecommunications Commission (see below). One interesting and perhaps controversial element of the Act is that broadcasters are not exempt from the stated obligations even when the content published on their platform is user-generated.
Online News Act
An Act respecting online communications platforms that make news content available to persons in Canada, commonly known as the Online News Act or Bill C-18, received royal assent on June 22, 2023, and will come into effect within six months of that date. The stated goal of the Act is “fair revenue sharing” between digital platforms and news outlets. More concretely, it sets up a voluntary framework whereby online search engines and social media platforms compensate media companies for the news they monetize; if the platforms and news outlets cannot reach a voluntary agreement, a “mandatory arbitration framework” kicks in as a last resort. Canada’s policy follows the model of Australia’s News Media Bargaining Code, passed in 2021. Just as in Australia, the new law prompted Google and Facebook to announce that they would no longer carry Canadian online news content, and the federal government retaliated by halting all its advertising on Facebook and Instagram; Google was not on that list as of late July 2023, having expressed openness to finding a solution. Online platforms claim that they provide news outlets with free marketing opportunities. Proponents, however, point out that these platforms fail to acknowledge the costs of high-quality journalism. What is more, the rise of online search engines and social media has resulted in dwindling ad revenues for small and local newspapers, which have closed in record numbers in the past decade. An important critique of the law in its current form is that it will compensate media conglomerates while doing little or nothing to help small and independent journalism.
Directives & Administrative Guidelines in Effect
Directive on Automated Decision-Making (2019)
The Government of Canada is increasingly looking to use artificial intelligence (AI) to make, or assist in making, administrative decisions and thereby improve service delivery in a manner compatible with core administrative law principles such as transparency, accountability, legality, and procedural fairness. The Directive will continue to evolve to remain relevant as technology and technology practices evolve.
Directive on Open Government (2014)
The Directive on Open Government is Canada’s “open by default” policy, setting clear and mandatory requirements for departments to ensure that Canadians get the greatest possible access to government information and data.
Government Investments, Commitments & Programs
In addition to laws and regulations, many government programs support the advancement of technology that benefits society (for a longer list and additional detail, see the appendix).
Canada’s Digital Charter
Canada’s Digital Charter is a set of 10 principles guiding the Canadian government’s work to strengthen privacy protections for Canadians as they engage in commercial activities.
Universal Broadband Fund
The $2.75 billion Universal Broadband Fund supports high-speed Internet projects in rural and remote communities, primarily by providing funding to businesses (largely major telecommunications companies) to build broadband infrastructure.
Broadband Fund
The Broadband Fund is a CRTC fund that will provide $750 million over 5 years to support fixed and mobile broadband internet access in underserved parts of the country.
CanCODE Program
The CanCODE Program proposes to deliver $80 million over three years (starting in 2021–22) to provide K-12 students with opportunities to learn digital skills, with a focus on underrepresented groups.
Digital Citizen Initiative
The Initiative develops a strategy to build “citizen resilience against online disinformation and building partnerships to support a healthy information ecosystem.”
One project under the Initiative focused on online hate and disinformation, in three phases between 2019 and 2023: 1) Media Training and Coordination, 2) Research and Analysis of the Canadian Media Ecosystem, and 3) Policy Development.
Canada First Research Excellence Fund
The Fund aims to strengthen research institutions’ work in areas with long-term social and economic potential. The fund provided Université de Montréal with $124 million in 2023 for “R3AI: Shifting Paradigms for a Robust, Reasoning, and Responsible Artificial Intelligence and its Adoption” and York University with $105 million for “Connected Minds: Neural and Machine Systems for a Healthy, Just Society.”
Standards Council of Canada
The Standards Council of Canada, a Crown corporation that reports to the Canadian Parliament, leads a number of standardization efforts across a range of technology areas. These include the AI and Data Governance (AIDG) Standardization Collaborative, which develops standardization solutions for AI and data governance; a cybersecurity certification program; a standardization program for digital credentials; and an initiative to provide standardization for innovations.
Administrative Agencies
Office of the Privacy Commissioner of Canada (OPC)
The Office of the Privacy Commissioner of Canada (OPC) is an independent agency responsible for overseeing compliance with the federal private sector privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), as well as the Privacy Act, which covers the personal information handling practices of federal government institutions. The OPC has the authority to investigate complaints related to the handling of personal information by federal government institutions and private sector organizations. It can initiate investigations on its own or in response to complaints from individuals. During investigations, the OPC can compel organizations to provide access to records, examine witnesses, and gather evidence.
The OPC’s powers include investigating complaints, conducting audits, pursuing court action under PIPEDA and the Privacy Act, writing public reports and research, and communicating with the public. The OPC does not, however, have direct enforcement powers such as the ability to issue binding orders or fines. For example, the OPC investigated complaints that Home Depot had disclosed personal customer information to Meta without consent, and then made recommendations directly to Home Depot to change its practices around the use of Meta business tools.
In recent years, the OPC has focused on several key areas in its work to protect the privacy rights of Canadians. Some of its recent major contributions include addressing privacy concerns related to the COVID-19 pandemic, advocating for stronger privacy laws and protections, investigating major privacy and data breaches, addressing privacy concerns related to emerging technologies, and educating Canadians on privacy issues.
In 2020, the OPC published a report on the learnings from its first breach record inspection exercise, which examined seven organizations’ breach records to assess compliance and to get a better sense of the plans, tools, and approaches organizations are using to meet their breach recording and reporting responsibilities. The telecommunications industry was selected because it was one of the top sectors that reported breaches to the OPC in 2019, alongside the financial, retail, and insurance sectors. Where a business experiences a breach of its security safeguards, mandatory breach reporting and notification requirements under PIPEDA apply if it is reasonable in the circumstances to believe that the breach creates a Real Risk of Significant Harm (RROSH) to an individual.
One major contribution of the OPC has been its advocacy for stronger privacy protections for Canadians. For example, the OPC has called for greater transparency and control over how companies collect and use personal information, including the right for individuals to request that their personal information be deleted. The OPC has also pushed for stronger enforcement powers, including the ability to impose fines and other penalties on organizations that violate privacy laws. Another major contribution of the OPC has been its focus on emerging privacy issues, such as the use of artificial intelligence and the Internet of Things. The OPC has conducted research and engaged with stakeholders to better understand these issues and has provided guidance and recommendations to help ensure that privacy is taken into account in the development and deployment of these technologies.
The OPC does not have the power to impose financial penalties or sanctions on organizations for privacy violations; it can, however, make non-binding recommendations for corrective action to an organization that has violated privacy laws. These recommendations serve as guidance for the organization to rectify the privacy issue and bring its practices into compliance with privacy laws. If an organization refuses to implement the OPC’s recommendations, the OPC can apply to the Federal Court of Canada for a hearing. The court has the authority to issue orders requiring compliance with the recommendations, though it does not impose financial penalties or sanctions. In this way, the OPC can take legal action to enforce privacy laws and protect individuals’ privacy rights, and its reports contribute to public discourse and inform decision-making on privacy-related matters.
Canadian Radio-Television and Telecommunications Commission (CRTC)
The CRTC, established in 1968 to regulate broadcasting and given an additional mandate over telecommunications in 1976, reports to the Parliament of Canada through the Minister of Canadian Heritage. The Canadian Radio-television and Telecommunications Commission Act, the Broadcasting Act, and the Telecommunications Act set the legal parameters of its mandate. In addition to radio and television, the CRTC has limited authority over Internet regulation, but it has not historically regulated digital platform content. This may change, as the Online Streaming Act of 2023 gives the CRTC authority to regulate broadcasting over social media and online streaming services. The language of the law suggests that the CRTC will be tasked with ensuring that the online broadcasting industry serves the diversity of Canadians’ needs and interests and provides opportunities to publish content in English, French, and Indigenous languages.
Provincial Laws in Effect
Canada has a range of privacy laws at the provincial level that impact Canadians in their capacities as consumers, business owners, and medical patients, among others. A link to privacy legislation at the provincial or territorial level can be found here. A summary of some of these laws can be found below.
General Provincial Privacy Laws
As a general rule, the private sector privacy laws of Alberta, British Columbia, and Québec have been deemed substantially similar to the federal PIPEDA, so PIPEDA does not apply to intra-provincial activities in these provinces. However, federally regulated organizations (for example, banks, telecommunications companies, and transportation companies) and transactions involving cross-border transfers of personal information remain covered by PIPEDA in these provinces.
Alberta Personal Information Protection Act (2004)
The Personal Information Protection Act of Alberta, which came into force in 2004, is a privacy law that governs the collection, use, and disclosure of personal information by private sector organizations in the province of Alberta, Canada. It gives individuals the right to request access to their own personal information and provides private sector organizations with a framework for the collection, use, and disclosure of personal information. The act requires organizations to obtain consent for the collection, use, and disclosure of personal information and to protect the privacy rights of individuals. Government of Alberta.
British Columbia Personal Information Protection Act (2003)
The Personal Information Protection Act of British Columbia (PIPA), which came into force in 2003, is a privacy law that governs the collection, use, and disclosure of personal information by organizations in British Columbia, Canada. It requires organizations to obtain consent and protect the privacy rights of individuals, and it gives individuals the right to access and control their personal information. The Office of the Information and Privacy Commissioner (OIPC) of BC recently published guidance on mandatory breach notification in the public sector, which applies when a privacy breach “could reasonably be expected to result in significant harm”; the OIPC expects to receive a significant number of reports in the coming period. Government of BC.
Québec Protection of Personal Information in the Private Sector (1996)
The object of this Act is to establish, for the exercise of the rights conferred by articles 35 to 40 of the Civil Code concerning the protection of personal information, particular rules with respect to personal information relating to other persons which a person collects, holds, uses or communicates to third persons in the course of carrying on an enterprise within the meaning of article 1525 of the Civil Code. There are two exceptions to this law: when public organizations hold, use, store, or collect personal information, and when personal information is used for journalistic purposes.
Legal Framework for Information Technology (2001)
The Act to Establish a Legal Framework for Information Technology is provincial legislation in the province of Quebec that aims to provide a comprehensive legal framework for the use of information technology. The act defines the legal framework for electronic transactions, digital signatures, and the protection of electronic information. It establishes the legal validity of electronic records and signatures, allowing electronic documents to be used in place of paper-based documents in many cases. The act also provides for the protection of privacy and confidentiality of electronic information and creates a framework for the resolution of disputes arising from electronic transactions. This act plays a significant role in promoting the use of electronic transactions and electronic commerce in Quebec.
Modernizing the Protection of Personal Information (2021)
This Québec Act requires enterprises to appoint officers responsible for the protection of personal information, conduct privacy impact assessments, and explain the purposes for which personal information is used, among other things.
Health Sector Provincial Privacy Laws
Because health care is governed largely at the provincial level, several provinces have province-specific health privacy laws that apply to health information custodians operating within and between provinces and territories, as well as to service providers that fall outside the formal healthcare system. These rules govern the collection, use, storage, and disclosure of data and focus on matters such as the privacy and confidentiality of personal health information. They also provide guidance on the information needed to provide health services to those in need and to monitor, evaluate, and improve the health system.
In Ontario, the Personal Health Information Protection Act (PHIPA) was introduced in October 2004. Newfoundland and Labrador enacted the Personal Health Information Act (PHIA) in June 2008, and New Brunswick implemented the Personal Health Information Privacy and Access Act (PHIPAA) in June 2009. Nova Scotia followed with its own Personal Health Information Act (PHIA) in June 2013. These acts safeguard the privacy of, and access to, personal health information within each respective province, ensuring that sensitive medical data is handled with the necessary confidentiality and security.
Provincial Labour Laws
Ontario — Employment Standards Act — Amendment on ‘Right to Disconnect’ (2021)
In 2021, Ontario amended the Employment Standards Act to include a right to disconnect from workplace communications outside working hours, as part of the Working for Workers Act. Government of Ontario
The law, which is the first of its kind in Canada, only requires companies to have a policy on the topic rather than conferring any substantive change in employee rights. It also excludes the roughly 39% of private sector workers who work in companies with fewer than 25 people or that are federally regulated. Lawyers accordingly believe that this amendment could have little or no impact on workers’ actual ability to disconnect.
Municipal Bylaws
City of Mississauga
Bylaw 0097 — This bylaw provides guidelines for the City of Mississauga on maintaining and safeguarding records in a secure and easily accessible manner, regardless of whether they are in printed or electronic format.
City of Hamilton
Bylaw 10–122 — Bylaw to Prohibit and Regulate Fortification and Protective Elements of Land. Section 10(c) prohibits the application of visual surveillance equipment, including video cameras, ‘night vision’ systems, or electronic listening devices capable of permitting either stationary or scanned viewing or listening, designed or operated so as to listen to or view persons or land beyond the perimeter of the land actually owned, leased, or rented by the occupant, as well as the use of visual surveillance equipment whose exterior lenses are obstructed from view or which is employed so as to prevent observation of the direction in which it is aimed.
City of Burlington
Bylaw 108–2002 — This Bylaw aims to control the excessive “fortification of land” and establishes that “excessive protective elements” include the use of surveillance equipment, such as video cameras, ‘night vision’ systems, or electronic listening devices capable of allowing the operator or viewer to observe or listen beyond the boundaries of the land owned, leased, or rented by the occupant.
Town of Halton
Bylaw 2003–0079 — This Bylaw is designed to govern the excessive “fortification of land,” and it specifically defines “excessive protective elements” as the use of visual surveillance equipment, such as video cameras, ‘night vision’ systems, or electronic listening devices that allow the operator or viewer to observe or listen beyond the boundaries of the Land actually owned, leased, or rented by the occupant.
Town of Milton
Bylaw 30 (2003) — This Bylaw aims to control the excessive “fortification of land” and specifically outlines “excessive protective elements” as the use of visual surveillance equipment, such as video cameras, ‘night vision’ systems, or electronic listening devices, which allow the operator or viewer to observe or listen beyond the boundaries of the Land actually owned, leased, or rented by the occupant.
City of London
Fortification of Land Bylaw — PW-8 — Prohibits the use of visual surveillance equipment, including video cameras, night vision systems, or electronic surveillance devices capable of permitting either stationary or scanned viewing or listening, beyond the perimeter of the land.
City of Calgary
Bylaw 1H2021 — Online Advertising of Public Works Notices and Tax Recovery Sales Charter Bylaw.
Reports and Investigations
Protecting Privacy in a Pandemic
The Privacy Commissioner of Canada investigated and presented a report on the federal government’s privacy practices in relation to measures adopted during the COVID-19 pandemic.
The OPC reported that its investigations revealed that the measures implemented by the government during the pandemic, with a few exceptions, adhered to applicable privacy laws and were necessary and proportionate responses to an extraordinary public health crisis.
International Principles in Effect
United Nations Guiding Principles on Business and Human Rights
The UN Guiding Principles on Business and Human Rights, endorsed in 2011, are a set of 31 principles to guide businesses in respecting human rights in their operations and throughout their supply chains. Based on the principles of protecting and respecting human rights and remedying business-related abuses, they emphasize the responsibility of companies to avoid harm and to ensure effective remedies for any human rights abuses that occur. While the Principles were not developed explicitly in response to technological development, they remain highly relevant to tech businesses operating today.
Internet Governance Forum
The Internet Governance Forum (IGF) is a global platform that brings together stakeholders to discuss and address issues related to the governance and use of the Internet. It provides a space for multi-stakeholder dialogue and cooperation on issues such as access, security, human rights, and innovation.
OECD Principles on Artificial Intelligence
The OECD’s Recommendation on Artificial Intelligence provides guidelines for the responsible development and deployment of AI, promoting human rights, privacy, transparency, and human-centered values. The principles aim to build trust and confidence in AI.
OECD AI Policy Observatory
The OECD AI Policy Observatory is a comprehensive platform for information and guidance on AI policies globally. It offers tools, resources, and best practices for developing effective AI policies that promote innovation, growth, and well-being.
OECD Working Party on Artificial Intelligence Governance
The OECD’s Committee on Digital Economy Policy (CDEP) has a Working Party on Artificial Intelligence Governance (AIGO) that oversees its work on Artificial Intelligence (AI) policy. Working party members are nominated by OECD member governments and are primarily national officials responsible for AI policy in their respective countries.
OECD Database on Trustworthy AI
The OECD Database on Trustworthy AI provides information and guidance to promote responsible and ethical development and deployment of AI. It offers resources such as ethical frameworks, guidelines, case studies, and tools.
Global Partnership on AI
The Global Partnership on AI is an international initiative aimed at promoting responsible and ethical AI practices. It brings together governments, industry, civil society, and academia to collaborate on AI ethics, security, privacy, and more.
Technology Ethics Frameworks
AccessNow Principles on Platform Governance
The AccessNow Principles are a set of guidelines for the protection and promotion of human rights online. They outline the obligations of governments and other stakeholders to respect freedom of expression, privacy, and other human rights in the digital realm in times of crisis. The principles aim to ensure that the internet remains a space for free expression and creativity.
Montreal Declaration on Responsible AI
The Montreal Declaration on Responsible AI is a call to action for the responsible development and use of artificial intelligence (AI). It emphasizes the importance of respecting human rights, promoting ethical and transparent AI practices, and ensuring AI benefits all people. The declaration encourages governments, businesses, and other stakeholders to work towards a human-centered AI.
Concluding Reflections
This report provides a holistic perspective on the legal landscape surrounding technology regulation in Canada. It describes constitutional considerations, national laws, policy directives and administrative agencies, provincial laws, and a number of ethics guidelines proposed by international organizations or non-profits to regulate technology. Canada’s technology landscape has developed in the presence of laws and policies, especially concerning privacy, but it is safe to argue that major gaps exist as of 2023. Even though this report does not address stakeholder engagement, one of the recurring themes in the journalistic coverage reviewed for this report is the importance of developing effective mechanisms for engaging and consulting with Canadians, including civil society stakeholders. This entails transparent communication that explains the reasoning behind legal and policy decisions and allows the public to actively participate. Thus, this report serves as an essential resource, highlighting the existing policies and regulations that govern technology in Canada and beyond.
Below are some of the key takeaways from the research that guides this report:
The Government of Canada’s strategy for engaging the public concerned about the effects of technology should itself reflect responsible technology principles. This means communicating the reasoning behind legal and policy decisions in a transparent manner and promoting public engagement and reflection.
Responsible and transparent decision-making requires communication with civil society stakeholders in particular.
A good starting place would include developing proactive community engagement mechanisms to inform the development of bills that aim to advance the public interest.
Technology development is likely to outpace legal and policy development. As of this writing, debates around the potential benefits and risks of generative AI, particularly of Large Language Models, have been dominating the news cycle around the globe. Thus, future law and policy should take into account the importance of establishing guardrails in the face of new technology in a speedy and effective manner.
This part of the report addresses the laws, policies, directives, and agencies already in place in Canada. Part 3 of this report will identify and describe proposed bills currently under consideration in Parliament, while Part 4 will summarize the key litigation that has surfaced in recent years around emerging technology issues.
Appendix 1: Summary of the Policy Landscape in Canada
Canada’s Digital Charter
This is a 10-item charter to guide the Canadian government’s work to strengthen privacy protections for Canadians as they engage in commercial activities.
Telecommunications Act
Regulates telecommunications by ensuring reliable services, protecting privacy, and protecting and encouraging Canadian media.
Bill C-10: Act to Amend Broadcasting Act
The bill would update the Broadcasting Act to include online broadcasters, who would then have to invest in Canadian programming.
Online Streaming Act
The Online Streaming Act requires streaming services like Netflix, Crave, and Spotify to follow Canadian content rules, ensuring the companies pay into cultural funds and display a certain amount of Canadian content. This law places online streaming services under the Broadcasting Act and provides the CRTC with the tools to put in place a modern and flexible regulatory framework for broadcasting. These tools include the ability to make rules, gather information, and assign penalties for non-compliance.
Bill C-36: Act to amend the Criminal Code and Canadian Human Rights Act
Bill C-36, an Act to amend the Criminal Code and Canadian Human Rights Act and to make related amendments to another Act (hate propaganda, hate crimes, and hate speech) “proposes to amend the Canadian Human Rights Act to define a new discriminatory practice of communicating hate speech online.” This bill features amendments to 1) the Youth Criminal Justice Act, 2) the Criminal Code, and 3) the Canadian Human Rights Act.
Universal Broadband Fund
The $2.75 billion Universal Broadband Fund supports high-speed Internet projects in rural and remote communities, primarily by providing funding to businesses (largely major telecommunications companies) to build broadband infrastructure.
Broadband Fund
The CRTC’s fund will provide up to $750 million over 5 years. These funds will support projects to build or upgrade access and transport infrastructure to provide fixed and mobile wireless broadband Internet access services in eligible underserved areas of Canada.
CanCODE Program
The purpose of the $80 million CanCode program is to equip Canadian students (K-12), particularly those from underrepresented groups, with the skills needed to prepare for advanced digital skills and STEM courses, over three years starting in 2021–22.
Police Use of Facial Recognition Technology in Canada and the Way Forward
Presents research findings and draft policy on facial recognition use among police in Canada.
Algorithmic Impact Assessment
Intended to support the Treasury Board’s Directive on Automated Decision-Making, this mandatory risk assessment questionnaire aims to determine the impact level of automated decision systems.
Privacy Guidance on the Internet of Things Devices
Guidance to help ensure that IoT device manufacturers are PIPEDA compliant, protecting privacy.
Directive on Automated Decision-Making
The objective of this Directive is to ensure that Automated Decision Systems are deployed in a manner that reduces risks to Canadians and federal institutions and leads to more efficient, accurate, consistent, and interpretable decisions made pursuant to Canadian law.
Directive on Open Government
Canada’s “open by default” policy provides clear and mandatory requirements to departments to ensure that Canadians get access to as much government information and data as possible.
Montreal Declaration on Responsible AI
The Montréal Declaration is a collective endeavor that aims to steer the development of AI to support the common good and guide social change by developing recommendations with a strong democratic legitimacy.
Global Partnership on Artificial Intelligence
A multi-stakeholder initiative that aims to bridge the gap between theory and practice on AI by supporting cutting-edge research and applied activities on AI-related priorities.
Digital Citizen Initiative
A multi-component strategy that aims to support democracy and social cohesion in Canada by building citizen resilience against online disinformation and building partnerships to support a healthy information ecosystem.
Digital Democracy Project
The project aims at better understanding and ultimately developing policy options to address the challenges posed by disinformation and hate online. The project has three distinct phases: 1) Media Training and Coordination, 2) Research and Analysis of the Canadian Media Ecosystem, and 3) Policy Development.
Policy on Service and Digital
Serves as an integrated set of rules that articulate how Government of Canada organizations manage service delivery, information and data, information technology, and cyber security in the digital era.
National Cyber Security Strategy
This document outlines the key elements of the global cybersecurity environment and articulates some of the ways that the Government of Canada will respond to an array of new challenges and opportunities in cyberspace.
Pan-Canadian AI Strategy
AI strategy to advance Canadian AI largely focused on supporting academic research.
National Strategy on Countering Radicalization to Violence
Contains some considerations relevant to online radicalization.
National Strategy for the Protection of Children from Sexual Exploitation on the Internet
Four pillars: 1) Prevention and Awareness, 2) Pursuit, Disruption, and Prosecution, 3) Protection, and 4) Partnerships, Research, and Strategic Support.
British Columbia Technology and Innovation Policy Framework
A long-term roadmap that will guide government investment in technology and innovation throughout B.C. and across all sectors.
Ontario’s Digital and Data Strategy
A strategy for how the Ontario government approaches digital services and user data.
References
The State of Global AI Regulations in 2023. Holistic.ai. January 2023. Link
To grant a publication ban or sealing order, the court must conclude that the ban or order is essential to prevent a significant threat to the administration of justice, including the possibility of a compromised trial, and that it is the least intrusive approach to achieving this goal. The final Supreme Court of Canada decision in the case of Toronto Star Newspapers Ltd. v. Ontario, 2005 SCC 41 (CanLII), [2005] 2 SCR 188 can be found here. Also see: R. v. Mentuck, 2001 SCC 76 (CanLII), [2001] 3 SCR 442.
R v Cole, Supreme Court of Canada. Oct 19, 2012. R v. Cole
Personal Information Protection and Electronic Documents Act (S.C. 2000, c. 5). Government of Canada. Link
What you need to know about mandatory reporting of breaches of security safeguards. Office of the Privacy Commissioner. Government of Canada. August 13, 2021. Link
2019 Breach Records Inspection. Office of the Privacy Commissioner. Government of Canada. Sept 2020. Link
The Digital Privacy Act and PIPEDA. Government of Canada. November 2015. Link
“UK watchdog fines TikTok $16 mln for 'misusing children's data',” Reuters, April 4, 2023. Link
Omar Mosleh, “How Fortnite maker’s $520M settlement in the U.S. shows Canada has fallen behind in protecting kids’ privacy,” Toronto Star, December 19, 2022. Link; Supriya Dwivedi, “Stop prioritizing the profits of Big Tech over the mental and physical health of our kids,” Toronto Star, December 30, 2022. Link
Elizabeth Denham, “As lawmakers consider how to keep children safe online, they should look to Britain and California,” The Globe and Mail, October 13, 2022. Link.
Taylor Owen and Frances Owen, “Don’t Ban TikTok. Make it Safer,” The Globe and Mail, May 26, 2023. Link
Raisa Patel, “Would the 'Freedom Convoy' have been different if Ottawa had an online harms law? A Facebook whistleblower says yes,” Toronto Star, October 30, 2022. Link
Marie Woolf, “Suicide and anorexia promotion sites must be addressed in online safety bill, parents and child-protection experts say,” The Globe and Mail, March 30, 2023. Link
Municipal Freedom of Information and Protection of Privacy Act, R.S.O. 1990, c. M.56. Link
Copyright Act (R.S.C., 1985, c. C-42) Government of Canada.
Directive on Automated Decision-Making. Treasury Board Secretariat. (2017) Link
Directive on Open Government. Treasury Board Secretariat. (2017) Link
Investigation into Home Depot of Canada Inc.’s compliance with PIPEDA. Office of the Privacy Commissioner of Canada. (PIPEDA Findings # 2023-001) Jan 26, 2023. Link
Act Respecting The Protection Of Personal Information In The Private Sector. Government of Quebec. Updated to 22 September 2022. Link
Act to Establish a Legal Framework for Information Technology. Government of Quebec. 2001.
Written policy on disconnecting from work, Government of Ontario - Link
Working for Workers Act, 2021, S.O. 2021, c. 35 - Bill 27, Government of Ontario, December 2, 2021 - Link
Ho, Solarina. Ontario's 'right to disconnect' law: Who qualifies and what are the loopholes? June 7, 2022. CTV News. Link
UN Guiding Principles on Business and Human Rights. Link
Global Partnership on Artificial Intelligence. Link
Declaration of principles for content and platform governance in times of crisis. AccessNow. Link
Montreal Declaration on Responsible AI. Nov 2, 2017. Link
— — —
June 2023 — Version 1.0
Legal Contributors: Barakat Abdulmumini and Femi Gbolahan
Policy Contributors: Onur Bakiner, Renee Black, Khiran O’Neill