New “Digital Charter” Hints at Data Portability, Digital Identity, and Penalties

The federal government announced a new “digital charter” today, emphasizing Canadians’ control over their own personal information and hinting at a “strong enforcement” regime aimed at global internet companies that violate privacy laws.

The digital charter does not have the power of law, but is rather a “set of principles that all government policy and legislation will be measured against.” There is no time left in the current federal government’s mandate to reform existing privacy laws; the charter is a halfway measure, signalling to Canadians, and to social media and internet companies especially, that change is coming and what that change might look like.
The digital charter includes a background paper on reforms to the federal privacy law that suggests such reforms will come only after the October election. As currently contemplated, such changes would represent the most significant reworking of Canadian privacy law since its inception.

Notably, the charter proposes data portability (essential for consumer-directed open banking and other consumer-directed innovation), hints at increased support for digital identity tools, and suggests that privacy and security certifications, codes and standards may supplement existing laws.

Finally, the government noted that the Standards Council of Canada will launch the new Data Governance Standardization Collaborative, to coordinate the development and compatibility of Canadian data governance standards, suggesting that these standards may (intentionally or otherwise) become a benchmark for privacy practices and, possibly, liability.

The Digital Charter

The charter consists of 10 data principles:

  • Universal access
  • Safety and security
  • Control and consent
  • Transparency, portability and interoperability
  • Open and modern digital government
  • Level playing field
  • Data and digital for good
  • Strong democracy
  • Free from hate and violent extremism
  • Strong enforcement and real accountability

According to Minister Bains, Minister of Innovation, Science and Economic Development, the first principle reflects the government’s commitment that all Canadians have an equal opportunity to access the digital world.

The second principle focuses on the “integrity, authenticity, and security of services”, recognizing that trust is key to consumer adoption and a robust digital economy.

The third principle would give Canadians “control over what data they are sharing, who is using their personal data and for what purposes.” This suggests a form of “data portability”, a positive right currently absent from existing Canadian privacy laws, and one essential for frameworks such as open banking.

The fifth principle emphasizes the ability of Canadians to access digital services provided by the government, and suggests that open data initiatives, digital identity, and other approaches that facilitate faster, more frictionless interaction will be a priority.

The sixth principle focuses on “fair competition in the online marketplace”, aimed at seeing Canadian businesses grow. Minister Bains also announced that he is working with the Competition Bureau to ensure it has the tools necessary to promote competition while creating a healthy environment for small and medium-sized enterprises. This reflects international developments that have seen competition authorities impose restrictions on companies and, in the case of the financial sector, impose open banking rules. The Canadian Competition Bureau has also been more active in the area of data recently, publishing its Big Data and Innovation paper last year.

The seventh principle responds to increasing concern that many new technologies and data may not benefit individuals and may, in fact, be put to uses which disadvantage them. This principle focuses on ensuring the “ethical use of data to create value, promote openness, and improve the lives of people.” This could include algorithmic transparency requirements.

The eighth principle focuses on ensuring the transparency of political discourse, defending freedom of expression, and protecting against online threats. This principle reflects the fallout from the Cambridge Analytica incident, and calls for online platforms in particular to be accountable for assisting in the maintenance of democratic traditions and principles.

The ninth principle aims to ensure “that digital platforms will not foster or disseminate hate, violent extremism or illegal content”, and is timely in light of the Christchurch, New Zealand shootings which led to criminal charges against individuals who spread the video of the incident, and generated global debate on whether social media platforms themselves could be criminally charged.

Finally, the tenth principle aims to ensure “there will be clear, meaningful penalties for violations of the laws and regulations that support these principles.” Businesses will want to pay close attention to the development of the enforcement regime.

What does the Digital Charter mean for business?

While the charter itself will have no immediate binding impact, it will likely heighten the consumer demand for, and regulatory interest in, privacy rights – which means the bar will be set higher for businesses. The maturation of privacy rights from a defensive shield into something that looks more akin to a sword means companies will need to begin assessing their handling of personal information as a legal and compliance issue, and not merely a marketing matter.

The digital charter, coupled with guidance on meaningful consent and proposed changes to transborder data flows, means companies can expect an increased compliance burden.

What should business be doing now?

There are steps companies can take now to allow them to be among the first to benefit from increased market opportunities created by enhanced consumer trust, and to ensure that they do not find themselves the focus of increased regulatory scrutiny:

Data mapping: As businesses grow, expand, and innovate, new data is collected, used in new ways and shared across a broader ecosystem. Most companies do not have a good understanding of their data flows; without this understanding, companies face significant compliance and regulatory risk, and may foreclose the effectiveness of due diligence defences where such defences are available.

Review third party contracts: As part of the accountability and transparency principles already embedded in existing privacy laws, companies should be able to tell consumers with whom they are sharing their information. As companies increasingly outsource various services, consumer information also follows – but the accountability for that information remains with the transferring organization. Businesses must understand their obligations to third parties, as well as the obligations they have imposed (or failed to impose) on third parties.

Re-evaluate the role and duties of the privacy officer: Companies handling personal information have been required to have a privacy officer for some time now. However, the role was often a second thought, and simply tacked on to someone’s already existing duties and responsibilities. Companies should review the privacy officer position and ensure that the right person, at the right level, and with the right budget, is charged with those responsibilities. Companies should anticipate a talent shortage and make decisions about recruiting and/or training appropriate people.

Review privacy policies, consent language, and CASL compliance processes: Plain language policies that explain what is being done with consumers’ personal information are a first step towards obtaining meaningful consent, the new standard by which consent is measured by the Office of the Privacy Commissioner of Canada. An inability to demonstrate that consent is meaningful will render a business unable to use the consumer information it has gathered, significantly impacting its market reach and potentially its bottom line. Anti-spam compliance measures should be checked, especially in light of the recent decision by the CRTC that held a director personally liable for the acts of his corporation.

Privacy impact assessments: Businesses that are evolving and innovating, partnering with others in their sector ecosystem, and launching new digital tools should be implementing privacy impact assessments as a matter of course. This helps ensure that no new technology gets launched or data use occurs without being vetted against a rigorous set of privacy criteria. Rapid turnaround on privacy impact assessments is critical to get buy-in from the business, but ultimately such assessments can avoid consumer backlash for undisclosed privacy practices, loss of trust, reputational impacts, and post-launch abandonment of projects.


For more information about Dentons’ data expertise and how we can help, please see our unique Dentons Data suite of data solutions for every business, including data mapping, contractual review, and consent benchmarking. Our Transformative Technologies and Data Strategy page has more information about our sophisticated tech practice, which focuses on data-driven technologies such as artificial intelligence, data analytics, and digital identity.


About Kirsten Thompson

Kirsten Thompson is a partner in our Corporate group in Toronto and is the national lead of the Transformative Technologies and Data Strategy group. She is also a key member of the Privacy and Cybersecurity group. She has both an advisory and advocacy practice, and provides privacy, data security and data management advice to clients in a wide variety of industries.
