Introduction
In the fall of 2022, the GoodBot team - a group of socially minded professionals working in law, technology, and policy - came together to develop a snapshot of Canada's current Responsible Technology landscape, a space that to date has been heavily defined by voices from the United States. Our goal is to understand the Canadian ecosystem's current composition, capacity, and direction: in particular, how technology impacts Canadians, which issues research and programming focus on, and how Canada's policy landscape is evolving at the provincial, national, and international levels.
Technology is revolutionizing everything from healthcare and education to law, climate science, and government, but it is also associated with a wide range of risks, making responsible technology governance a critical priority for governments, nonprofits, and markets.
Navigating this landscape requires new, clear policy frameworks, effective implementation methods, and multi-stakeholder oversight bodies. It also requires strategic, public-interest-focused collaboration that includes: 1) long-term research on harms; 2) greater transparency and collaborative efforts with technology companies to strengthen safety; 3) effective mechanisms to hold companies accountable when they fail to act in response to harm; 4) investment in public interest technology and philosophies that prioritize healthy technology ecosystems; 5) understanding of the leverage points and incentive systems that contribute to these outcomes; and 6) development of the critical capacities needed to meet this moment.
When technology tools and platforms become deeply embedded in institutions like media and education, governing them becomes more challenging, leading to escalating and lasting harm. Therefore, the pressing question is how to forge a new direction, recognizing the immediate need to take action.
This is the introduction to a research project on Canada’s Responsible Technology landscape. GoodBot’s goal with this research is three-fold:
To understand the current landscape and priorities of Canada’s Responsible Technology ecosystem including the who, what, where, and how;
To highlight gaps and opportunities within this ecosystem that can help Canada develop a more robust, impactful, and collaborative approach and agenda; and
To understand what role GoodBot can play in advancing Responsible Technology at home and around the world.
What is Responsible Technology?
Responsible Technology acts as an umbrella for a range of approaches and terms that focus on different issues or specific intervention points in technology and business life cycles. It includes concepts such as ethical tech, humane tech, tech stewardship, and public interest tech, all of which are connected and overlap, but each of which centers on a different locus of influence. These and other terms will be explored in Part 1 of our series.
Responsible Technology is a relatively new framing that encompasses a wide range of technology-related issues. Some - like privacy and freedom of expression - are long-standing cornerstones of the most established technology nonprofits in the country, while conversations about Generative AI risks are relatively new. Yet technology and AI ethics have been around for decades, prominent in academic labs, among human rights defenders and peacebuilders, and in other convening spaces. What has changed is the scale and pace of technology, along with the amplification of new risks and narratives surrounding technology harms. These have created growing awareness of the need for safety-focused design and research from the outset, meaningful oversight of technology companies, and effective accountability mechanisms when companies fail to act in the public interest.
Many technology tools and businesses currently fail to meet even a loose standard of Responsible Technology. This is especially true for small and medium technology companies, which focus on survival and treat proactive assessment and harm mitigation as an afterthought, if those concerns receive any attention at all.
Few companies start with the goal of causing harm, but unintended consequences can arise from a lack of intentional consideration, limited capacity, and unanticipated or conflicting priorities. Additionally, as products scale, seemingly harmless design choices can lead to real harm as user bases grow, use cases expand, and incentive structures change. The explosive emergence of generative AI has made it even clearer that, left unaddressed, these structural factors risk widening the gap between privatized profit and socialized risk.
Recently, several Big Tech companies have laid off large segments of their Trust and Safety teams, seeing them as cost centers that add undesirable complexity. Even when companies have Trust and Safety teams in place, those teams are often pitted against product teams. The result is that companies increasingly seek to automate these decisions, with errors that frequently and disproportionately impact minority communities.
Some companies have begun sharing Transparency Reports, which is a move in the right direction. However, there are no agreed standards and metrics against which to assess companies' commitment to social sustainability, nor is there any external oversight. These factors create the risk of 'tech washing': cherry-picking data that gives the appearance of action but lacks substantive effect, or even causes new harms.
Moreover, even companies that want to act responsibly can lose sight of their original goals when they face demands for outsized returns from investors - including venture capitalists and private equity firms - demands that can place them at odds with decisions in the public interest.
In this context, a wide range of social harms and externalities have arisen - many of which are unintentional - and which include:
Unaccountable & Untransparent Automation
Bias in untransparent algorithms that discriminate against marginalized groups
Disruption of the workforce by generative AI in almost all professional sectors
Big Tech Domination
Big Tech's domination of markets and control of value chains through predatory pricing and terms
Non-consensual selling of personal data to and from third-party data brokers
Addiction & Mental Health
The use of dark patterns to drive engagement and addiction in gaming and social media
The decline of attention spans at a population level over the last 20 years
The decline in mental health and body image, especially among youth
Harassment, Violence & Extremism
Incitement to radicalization, extremism, and even genocide
Trolling, doxing, and harassment, including targeting of women, trans, and BIPOC people
Bad Actors
Targeted and opportunistic disinformation and microtargeting to undermine democracies
Scams that use generative AI or crypto hacks to steal millions from people
Trafficking of women and girls on the dark web and on mainstream platforms
This is by no means a comprehensive list. In response, Responsible Technology advocates have advanced efforts in recent years to understand the wide array of externalities impacting different levels of society. These initiatives variously aim to understand harms and causes, increase public awareness and engagement, incentivize governments to enact new laws and enforce existing ones, and create solutions that lead to safer and more responsible technology-enabled environments.
Additionally, a new wave of government and non-profit investigations and litigation aims to clarify technology companies’ responsibilities and identify leverage points to incentivize responsibility. These efforts have had some successes but are often too incremental to keep pace.
The scale and complexity of issues arising from technology are unprecedented. We need a clear Responsible Technology agenda and sufficient investment to move us toward a technological future defined by healthy people, businesses, markets, societies, and democracies.
The Asymmetry & Sustainability Gap
A key barrier facing Responsible Technology advocates - including journalists, academics, tech nonprofits, technologists, tech ethics experts, policymakers, and citizens - is a growing set of asymmetrical disadvantages relative to the companies and sectors responsible for harm.
These disadvantages take many forms, including limited access to talent, data sets, algorithms, infrastructure, information, internal research and audits, knowledge, and resources. They show up in access to capital, in restrictions on how capital can be used, and in the comparatively stringent ethical requirements advocates must meet. They are further exacerbated by companies' anti-competitive tactics: buying out, underpricing, feature-bundling, and otherwise aggressively quashing any disruptors who might offer healthier alternatives.
Additionally, asymmetries show up when comparing the inputs and outcomes around harm. It is, for example, far less resource-intensive to create campaigns that spread disinformation about vaccines than it is to undo the damage that disinformation causes. This reality places a societal premium on considering what effective oversight, governance, and accountability of technology look like, but it also raises the need to balance corrective actions with Freedom of Expression (FoE) norms. The global nature of current technology effectively imposes US FoE norms on Canada and other countries, yet Canada has its own unique and well-established interpretations of fundamental freedoms that should be considered and protected in the face of technological change.
Canada's Responsible Technology ecosystem is small and under-resourced compared to its US counterparts, and at this time there are no prominent ecosystem-level organizations well positioned to guide a Responsible Technology community and strategy. Yet to effectively influence outcomes, Responsible Technology advocates need to work together, including by developing new governance innovations.
A critical factor in the Canadian context is that much of the funding to date has focused on understanding symptoms and immediate causes rather than the underlying structural issues and incentive systems at play. Some of this is a product of new and emerging organizations with limited track records; other limitations arise from inadequate and restrictive funding and opportunities. Still other organizations - such as those focused on disinformation - work at a project level rather than a mission level. These factors are important to understanding the limits on Canada's capacity and sustainability. Our third report will explore Canada's nonprofit capacity in greater depth.
An additional barrier is that there are no obvious market solutions to many of the problems that arise in technology. This reality is exacerbated by the fact that markets recently rewarded technology companies for cuts that significantly depleted their Trust and Safety teams, even though those cuts came at a time of heightened awareness of the risks posed by technology platforms.
We also face the problem of changing incentives once tools become entrenched in society. Some companies argue that if they do not employ harmful tactics - such as promoting polarizing content to capture attention - they will lose out to rivals who will. These realities point to an increasing need for sectors to collaborate on reducing harm and promoting the public interest.
Indeed, in areas where companies have invested material resources - including addressing extremism and protecting children - even Big Tech lacks the bandwidth to manage the complexity of these issues alone, making collaboration a necessity. Within the tech sector, platforms have launched new multi-stakeholder initiatives - such as the Global Internet Forum to Counter Terrorism and the Tech Coalition - that bring together human rights advocates, governments, and researchers to collectively reduce extremism and child sexual abuse material online, respectively. Such challenges are even more significant for smaller platforms and startups that lack the internal resources to respond to current and emerging issues.
Ultimately, while technology companies contribute many benefits to society, they also create significant problems that neither society, the markets, nor the companies themselves are presently able to solve. Companies often lack incentives to prevent and address problems up front, and even when they want to do the right thing, investor incentives can derail their decisions. Market incentives to address the challenges created by businesses and competition are limited. Moreover, our governance institutions operate at a pace incompatible with fast-moving technology, and they presently tend to focus on harm to individuals rather than harm to society. Together, these factors leave us in a fundamentally unsustainable situation. We need collective action to address these risks, including adequate resources, capacities, frameworks, budgets, and policies.
The Policy Landscape
While several technology-focused bills are in development, Canada's current policy landscape lags far behind the technological advances of the last 20 years. The same is true of many other countries, which currently lack the policies, capacities, institutions, and enforcement mechanisms needed to govern a rapidly evolving technology environment. Furthermore, research shows that even where effective policies are in place, under-resourced enforcement mechanisms - such as antitrust authorities - hamper governments' ability to enforce existing laws. In this context, the role of civil society organizations is particularly important.
While the policy landscape is expected to evolve rapidly, we do not know what effect these policies will have or how coherent they will be across jurisdictions and issues. Canada has introduced many bills for consideration, but few have passed, and all leave much to be desired. In the meantime, law firms and the Privacy Commissioner have advanced litigation - with a particular focus on privacy and antitrust - to hold technology companies accountable.
Civil society organizations often lack consensus on how to address key issues. For instance, Online Harms advocates support stricter content moderation to tackle harassment, extremism, and child exploitation. In Canada, some lean towards Freedom of Expression, while others recognize the need to address such harms but fear that well-intentioned policies could inadvertently suppress the voices of vulnerable communities. Balancing these competing rights is complex. Responding to emerging issues and harms that affect us all requires better public engagement: open and inclusive dialogue, transparent consultation processes, and effective accountability mechanisms that help navigate these complexities and uphold and balance a range of fundamental Canadian rights.
Indeed, there are no 'right' answers, only different trade-offs. Moreover, there are ways to escape such polarity traps, which can easily become politicized and result in deadlock. Even nonprofit organizations that are seemingly at odds on online harms, for example, largely agree that passing a comprehensive privacy framework would mark an important victory and achievement.
Yet even if we achieve effective regulation and enforcement, addressing entrenched asymmetries - especially those caused by Big Tech - requires a collective agenda and roadmap. What is clear is that our current institutions lack the capacity and resilience needed to address the challenge we face.
As Canadians, we have an opportunity to draw upon the values that make us strong as we reimagine our relationship with technology. Ideas such as Indigenous approaches to data sovereignty, collaboration, and multiculturalism have much to teach us about how to navigate these complex issues. This moment presents an opportunity to rethink and localize the who, how, and why of technology governance. This is both a daunting and an exciting challenge.
Open-Sourcing GoodBot’s Research
For GoodBot, our first step is to practice our open principles by sharing what we have learned. We hope that this research can lay the groundwork for building a Canadian coalition to support the country's nascent and necessary Responsible Technology movement.
Getting to impact requires understanding Canada’s existing capacity, understanding the systemic issues at play, exploring moral and policy considerations, surfacing current and emerging asymmetries of power, and exploring how AI is upending companies and industries. It also requires a collective strategy and targeted action focused on moving toward responsibility.
These documents are intended to act as a primer for anyone seeking to make an impact on critical priorities facing Canada and the world. While our early research may initially be of most value to nonprofits and academics - and should be considered a work in progress - we aspire to a future where solutions-oriented multi-stakeholder collaboration is the norm. Our research is broken into two parts:
Part #1 maps the Canadian Responsible Technology landscape, exploring common terms, civil society stakeholders, current and emerging policies and litigation, and how asymmetries of power manifest.
Part #2 reviews the results of a survey we conducted, shares key observations on the current and emerging landscape, and offers critical reflections on how to strengthen interdisciplinary collaboration among nonprofit organizations, academia, the tech sector, and government.
In an ideal world, this work will lead to high-level consultations and strategies developed in collaboration with other ecosystem organizations motivated to move this conversation forward.
Canada as a Global Leader in Responsible Tech
Despite - and perhaps because of - the wide array of challenges, Canada has an opportunity to become a global leader in developing, deploying, and governing technology in socially sustainable ways. Getting there requires an urgent focus on strengthening national capabilities by investing in strategic, systems-focused, multi-stakeholder mechanisms.
Indeed, organized effectively, Canadian civil society represents a critical and untapped asset for meeting this moment. There is also a need to strengthen citizen education, advance responsible policy and oversight, create technical solutions that advance the public interest, introduce responsible technology certifiers, and respond to the systemic factors that lead to harmful outcomes. Canadians can no longer afford to wait. The time to engage is now.
Version 1.0. Written by Renee Black. August 2023.