
One Step Forward, Two Steps Back in Online Harms Bill – Part 1


March 7, 2024


What do pornography and hate speech have in common? Well, the federal government says they are both harmful. That’s why it has wrapped these issues up together in its recently announced Online Harms Act, otherwise known as Bill C-63. As the government’s news release stated, “Online harms have real world impact with tragic, even fatal, consequences.” The government is therefore of the mind that the responsibility for regulating all sorts of online harm falls to it. But the government’s approach in Bill C-63, though it contains some good elements, is inadequate.

Background

In June 2021, the federal government introduced legislation targeting hate propaganda, hate crime, and hate speech. The bill was widely criticized, including in ARPA Canada’s analysis, and failed to advance prior to the fall 2021 election. Nonetheless, the Liberal Party campaigned in part on a promise to bring forward similar legislation within 100 days of re-election.

More than two years have passed since the last federal election. In the meantime, the government pursued a public consultation and convened an expert panel on online harms. Informed by these and by feedback from stakeholders, the government has now tabled legislation combatting online harm more broadly.

Bill C-63 defines seven types of ‘harmful content’:

(a) intimate content communicated without consent;

(b) content that sexually victimizes a child or revictimizes a survivor;

(c) content that induces a child to harm themselves;

(d) content used to bully a child;

(e) content that foments hatred;

(f) content that incites violence; and

(g) content that incites violent extremism or terrorism.

The hate speech elements of Bill C-63 are problematic for Canadians’ freedom of expression. We will address those further in a subsequent article. But though the bill could be improved, it is a step in the right direction on the issue of child sexual exploitation. 

Digital Safety Oversight

If passed, part 1 of the Online Harms Act will create a new Digital Safety Commission to help develop online safety standards, promote online safety, and administer and enforce the Online Harms Act. A Digital Safety Ombudsperson will also be appointed to advocate for and support online users. The Commission will hold online providers accountable and, along with the Ombudsperson, provide an avenue for victims of online harm to bring forward complaints. Finally, a Digital Safety Office will be established to support the Commission and Ombudsperson.

The Commission and Ombudsperson will have a mandate to address any of the seven categories of harm listed above. But their primary focus, according to the bill, will be “content that sexually victimizes a child or revictimizes a survivor” and “intimate content communicated without consent.” Users can submit complaints or make other submissions about harmful content online, and the Commission is given power to investigate and issue compliance orders where necessary.

Social media services are the primary target of the Online Harms Act. The Act defines ‘social media service’ as “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.” Further clarification is provided to include

“(a) an adult content service, namely a social media service that is focused on enabling its users to access and share pornographic content; and

(b) a live streaming service, namely a social media service that is focused on enabling its users to access and share content by live stream.”

Oversight will depend on the size of a social media service, measured by factors such as its number of users. So, at the very least, the Digital Safety Commission will regulate online harm not only on major social media sites such as Facebook, X, and Instagram, but also on pornography sites and live streaming services.

Some specifics are provided in Bill C-63, but the bill would also grant the government broad powers to enact regulations to supplement the Act. The bill itself is unclear about the extent to which the Commission will address online harm other than pornography, such as hate speech. What we do know is that the Digital Safety Commission and Ombudsperson will oversee the removal of “online harms” but will not punish individuals who post or share harmful content.

Duties of Operators

Bill C-63 lays out three overarching duties that apply to any operator of a regulated social media service – for example, Facebook or Pornhub.

Duty to Act Responsibly

The duty to act responsibly includes mitigating risks of exposure to harmful content, implementing tools that allow users to flag harmful content, designating an employee as a resource for users of the service, and ensuring that a digital safety plan is prepared. This duty relates to harmful content broadly. Although each category of ‘harmful content’ is defined further in the Act, the operator is responsible for determining whether particular content is harmful (subject to possible direct intervention from the Commission). While it’s important for the Commission to ensure that illegal pornography is removed, challenges may arise when the Commission seeks to remove speech simply because a user has flagged it as harmful.

Duty to Protect Children

The duty to protect children is not clearly defined. The bill notes that “an operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age-appropriate design, that are provided for by regulations.” This could mean age-appropriate design in the sense that children are not drawn into harmful content, it could mean warning labels on pornography sites, or it could potentially require some level of age-verification before children can access harmful content. These regulations, however, will be established by the Commission only after the Online Harms Act is passed.

The Liberal government says that its Online Harms Act makes Bill S-210 unnecessary. Bill S-210 would require age-verification for access to online pornography. In its current form, however, the Online Harms Act does nothing to directly restrict minors’ access to pornography. It would allow minors to flag content as harmful and would require ‘age-appropriate design’, but it would not require pornography sites to refuse access to youth. As such, ARPA will continue to advocate for the passage of Bill S-210 to restrict access to pornography and hold pornography sites accountable.

Duty to Make Certain Content Inaccessible

Finally, Bill C-63 would make social media companies responsible for making certain content inaccessible on their platforms. This duty is primarily focused on content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. ARPA has lauded provincial efforts in British Columbia and Manitoba to crack down on such content in the past year. If such content is flagged on a site and deemed to be harmful, the operator must make it inaccessible within 24 hours and keep it inaccessible.

In 2020, Pornhub was credibly accused of hosting videos featuring minors. Additionally, many women reported that they had asked Pornhub to remove non-consensual videos of themselves and that Pornhub had failed to do so. At the time, ARPA Canada submitted a brief to the parliamentary committee studying sexual exploitation on Pornhub. Our first recommendation was that pornography platforms be required to verify age and consent before content is uploaded. Second, we recommended that victims have a means of immediate legal recourse to get content removed from the internet. The duty to make content inaccessible will provide some recourse for victims to flag content and have it removed quickly. Further, the Commission will provide accountability, ensuring that certain content is removed and remains inaccessible.

The Act creates a new bureaucratic agency for this purpose rather than holding companies accountable through the Criminal Code. The Criminal Code is arguably a stronger deterrent. For example, Bill C-270, scheduled for second reading in the House of Commons in March 2024, would make it a criminal offence to create or distribute pornographic material without first confirming that any person depicted was over 18 years of age and gave express consent to the content. Bill C-270 would amend the Criminal Code to further protect vulnerable people. Instead of criminal penalties, the Online Harms Act would institute financial penalties for failure to comply with the legislation.

Of course, given the sheer volume of online traffic and social media content and the procedural demands of enforcing criminal laws, a strong argument can be made that criminal prohibitions alone are insufficient to deal with the problem. But if new government agencies with oversight powers are to be established, it is crucial that the limits of their powers be clearly and carefully defined and that the agencies be held to those limits.

Conclusion

This first part of the Online Harms Act contains some important attempts to combat online pornography and child sexual exploitation. As Reformed Christians, we understand that many people use online platforms to promote things that directly violate God’s intention for flourishing in human relationships. This bill certainly doesn’t correct all those wrongs, but it at least recognizes that the way these platforms are used must improve if vulnerable Canadians are to be protected. Most Canadians support requiring social media companies to remove child pornography or non-consensual pornography. In a largely unregulated internet, many Canadians also support holding social media companies accountable for such content, especially companies that profit from pornography and sexual exploitation. Bill C-63 is the government’s attempt to bring some regulation to this area.

But Bill C-63 also raises serious questions about freedom of expression. Some of the content the bill addresses is objectively harmful. But how do we avoid subjective definitions of harm? We’ll look at the topic of hate speech as it relates to this bill in Part 2.


