30 Mar 2021 3 min read

Engagement: big tech, human rights, and extreme content

By Clare Payn

As the technology sector has grown in importance and value throughout the pandemic, we have continued to push its leaders to embed respect for human rights in their businesses.

 


Tech companies are integral to global society. The internet, mobile phones, and social media are part of the fabric of our everyday lives. Their platforms are used every day by billions of people, and they have brought many social benefits, including better access to information.

However, these platforms also bring new challenges linked to complex issues such as the gathering, use and commercialisation of personal data, alongside content moderation, extremism and terrorism, electoral manipulation, and the effects on vulnerable and at-risk groups. In our view, risks to human rights are often embedded in tech giants’ business models, corporate governance, and incentive structures.

To counteract such risks, we recommend that human-rights considerations be integrated into tech companies’ business strategies, policies and planning. To help put this into practice, in June last year the Council on Ethics of the Swedish National Pension Funds and the Danish Institute for Human Rights – with the support and engagement of several global investors including LGIM – developed a set of investor expectations for global tech companies on human rights. The expectations demand that technology firms reinforce measures to respect human rights and fully align their work with the UN Guiding Principles on Business and Human Rights.

We do not have all the answers to these issues – in many ways this is a new and fast-changing playing field for investors – but our experience of engaging with other sectors over the years has shown that difficult questions can be addressed by working through the problems in a structured way.

Our goal is for these published investor expectations to be a platform for that work. The document was sent to a number of the largest tech companies, and a series of engagement meetings followed. As a group, we continue to have constructive dialogue with the tech sector regarding the companies' responsibility for safeguarding users’ human rights.

Facing up to responsibilities

As a major shareholder in many of these stocks, we are also willing to engage these companies directly or in partnership with other firms and organisations on our specific concerns. One particular example from the past year illustrates our determination, and how we can have an influence.

In March 2019, 51 people were murdered in two mosques in Christchurch, New Zealand while they worshipped. It was an appalling crime that was livestreamed and disseminated across platforms run by Facebook*, Alphabet* and Twitter*.

In late 2019, alongside more than 100 other investors representing £7 trillion of assets under management, we joined a global collaboration to encourage these three groups to strengthen their controls to prevent the spread of such horrific content.

We requested:

• Clear lines of governance and accountability for senior executives and board members to ensure the platforms cannot be used to promote content of this nature;

• Sufficient resources to be dedicated to combating the livestreaming and spread of objectionable material across their platforms.

The group also made clear that we encourage and expect legislation to be modernised to protect the public from exposure to similar content in the future. Policy must be built on robust evidence, so each company needs to be open about how its platforms are built and operated.

Meetings with the companies took place and, in December 2020, Facebook updated the charter of its audit and risk oversight committee to explicitly include the review of content-related risks that violate its policies. The company also committed to move beyond monitoring and mitigating such abuse to preventing it.

Facebook – additional engagement outcomes

• All employees must complete an annual privacy training course that reinforces their obligations to protect privacy and treat data responsibly.

• The company formed a Privacy Committee with independent board members to monitor privacy compliance. An independent, third-party assessor will also review Facebook’s data practices and report on them to the Privacy Committee and the Federal Trade Commission on a quarterly basis.

• Facebook invested $3.7 billion in safety and security in 2019 (around 5% of revenues).

We will be publishing our 2020 Active Ownership report on 31 March, which will contain further detail on our approach to responsible investing and additional case studies.

*For illustrative purposes only. Reference to a particular security is on a historical basis and does not mean that the security is currently held or will be held within an LGIM portfolio. The above information does not constitute a recommendation to buy or sell any security.

Clare Payn

Senior Global ESG & Diversity Manager

Clare is responsible for the team’s stewardship activities for the technology, media and utilities sectors. She communicates with companies, investors and other market participants on various ESG issues, with a specific focus on diversity and other social issues, and she chaired the UK’s 30% Club Investor Group for three years. Clare sits on several internal and external committees focused on diversity and inclusion. With over 20 years’ ESG experience, you could consider ESG to be her life, but Clare is also a committed runner and has a passion for fashion.
