Investors must be aware of human rights risks posed by technology

At RightsCon last month, I advocated for strong regulation and better investor engagement with big tech

Lauren Compere, Managing Director/Director of Shareowner Engagement at Boston Common Asset Management

As investors, we must understand the risks and opportunities posed by new technology, which, when deployed without proper governance, ethics, or attention to human rights aligned with the UN Guiding Principles on Business and Human Rights (UNGPs), can cause real harm.

I represented this perspective in person at the RightsCon conference in Costa Rica in June, speaking on panels about responsible business conduct in conflict-affected regions and about US shareholder resolutions filed with big tech companies.

Along with many other stakeholders, I advocated for supporting strong global regulatory systems, demanding company accountability and transparency, and aligning investments and engagement with convictions.

Technology has expanded far beyond its own sector, reaching into core business functions and daily life. AI and algorithmic decision-making are used by telecom, healthcare and finance companies, as well as by state actors, for surveillance, the delivery of social services, education, public housing, immigration and criminal proceedings.

Meanwhile, there is growing evidence that biased data sets perpetuate discriminatory practices in healthcare delivery, creditworthiness assessment and surveillance, making a comprehensive approach to ethical AI advocacy, supported by robust engagement, increasingly vital.

Engagement can deepen understanding of how human rights and ethics are implemented across the technology value chain, from pre-design and R&D through monitoring and sales to the ability to modify products and their dual-use capacity. Investor focus areas for engaging technology companies include:

  • Advocating for systematic adoption of public ethical AI frameworks.
  • Requesting robust disclosure on oversight and implementation.
  • Understanding the human rights due diligence (HRDD) process.
  • Encouraging adaptation to keep pace with changing regulations across sectors and the use of AI by state and non-state actors. 

Enhanced due diligence

The adoption of enhanced due diligence for products and services sold in conflict-affected areas is essential. Even when products are not directly linked to military use, requests for internet shutdowns, the collection of user information, the spread of disinformation and incitement to violence are palpable risks in conflict regions.

While the Myanmar coup and the Russia-Ukraine crisis have put a spotlight on the need for responsible divestment, we must also focus on responsible entry into conflict-affected areas and on access-to-remedy options for reporting and responding to individual user concerns.

During the 2023 US AGM season, 15 proposals at companies such as Alphabet, Amazon and Meta raised a variety of human rights concerns:

  • Inadequate content moderation.
  • Proliferation of hate speech.
  • Lack of transparency and accountability in use of opaque algorithms and AI.
  • Violations of privacy rights.
  • Risks of the targeted advertising business model.
  • Dual class share structures that limit shareholder voting rights.

Boston Common was the lead filer of a proposal asking Alphabet to issue a report on its plans to minimize legislative risk by aligning YouTube's policies and procedures with online safety regulations worldwide. This, coupled with ongoing concerns over the platform's role in the child sexual abuse and exploitation ecosystem, and its implication in human trafficking, underpinned our ask that Alphabet provide the transparency investors need to evaluate the efficacy of its policies. While our proposal received only 18% of the overall vote, that translates into 46.1% of the outside vote at Alphabet, a promising signal.

Shareholder proposals this season asked companies to address investors' human rights concerns by conducting independent assessments, reviewing policies to protect vulnerable populations, and strengthening oversight to address prevailing risks as regulatory supervision increases.

RightsCon provided a valuable forum to voice these investor concerns and discuss human rights in the digital age with communities of rights defenders. In its aftermath, investors have an opportunity to leverage robust stakeholder engagement to press companies on board-level oversight and training, enhanced HRDD, and human rights impact assessments.

So, let's move forward and build on the knowledge-sharing that RightsCon enabled by examining the risks new technologies pose, advocating for robust human rights data from research providers, demanding company transparency on risk assessment and management, and supporting enhanced regulatory oversight.
