United States Securities and Exchange Commission

Washington, D.C. 20549

 

NOTICE OF EXEMPT SOLICITATION

Pursuant to Rule 14a-103

 

Name of the Registrant: Alphabet Inc.

Name of persons relying on exemption: Boston Common Asset Management

Address of persons relying on exemption: 200 State St. 7th Floor, Boston, MA 02109

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934.

                                                                              

 

 

May 8, 2023

 

To Alphabet Inc. Stockholders:

 

 

Boston Common Asset Management seeks your support for Proposal 13 on the Company’s 2023 Proxy Statement. The Proposal asks Alphabet to issue a report on whether and how the Company intends to minimize legislative risk by aligning YouTube policies and procedures worldwide with the most comprehensive and rigorous online safety regulations.

 

The resolved clause of the proposal states:

 

Shareholders request that Alphabet issue a report at reasonable cost and omitting proprietary information, disclosing whether and how the Company intends to minimize legislative risk by aligning YouTube policies and procedures worldwide with the most comprehensive and rigorous online safety regulations, such as the European Union’s Digital Services Act and the UK Online Safety Bill.

 

 

 

Rationale to vote FOR the Proposal

 

 

Summary of Rationale

 

·Despite apparent effort and leadership at YouTube, the platform remains a meaningful part of the child sexual abuse exploitation ecosystem, serving as a point of contact for grooming and coercion, a venue for live-streaming, and a host of child sexual abuse material.

 

·Numerous online safety regulations and pieces of legislation have been enacted in the United States and in many other countries in which YouTube operates.

 

·Further transparency from the Company is necessary for investors to understand YouTube’s and Alphabet’s risk assessment and preparedness for upcoming regulation, and to assess whether the Company’s policies and practices align with best practice and keep ahead of emerging online safety regulation.

 

Introduction

 

As shareholders, we seek to understand YouTube’s and Alphabet’s risk assessment and preparedness for upcoming regulation, and the alignment of our company’s policies and practices with emerging online safety regulation.

 

YouTube and its parent company Alphabet have faced numerous problems associated with their content moderation and platform design, including the site serving as a central repository for, and viral propagator of, conspiracy theories, propaganda, fake news, and extremist, hateful, inciting, and violent content, and facilitating the sexual exploitation of women and children and other crimes impacting the most vulnerable, including trafficking, sextortion, and harassment.

 

All of these problems have surged with the pandemic, the US elections, and the violent insurrection at our nation’s Capitol, in the absence of sound regulation of social media. This maelstrom of events has resulted in a series of YouTube advertising boycotts, lawsuits, and greater regulatory scrutiny. Publicity on these issues damages the company’s brand and poses threats to public safety, public health, social cohesion, and democracy itself.

 

Negative Public Impact

 

A failure by Alphabet to minimize legislative risk by properly aligning its policies with the most comprehensive and rigorous online safety regulations, and to report to its shareholders on such action, will result in continued reputational risk to the Company. Alphabet, and particularly YouTube, has already received significant negative media attention for apparently failing to properly mitigate its role in the spheres of child sexual abuse, trafficking, hate, incitement, extremism, division, unrest, violence, and harassment. Shareholders require further disclosure from Alphabet to ensure that legislative risk is being properly mitigated.

 

Child Sexual Abuse

Despite the apparent effort and leadership at YouTube, the platform remains an important part of the child sexual abuse (CSA) exploitation ecosystem, serving as a point of contact for grooming and coercion, a venue for live-streaming, and a host of CSA material.

 

Importantly, because video watching is a top online activity of children, YouTube is a place where abusers interact with unsupervised children.1 In Tanzania, for example, while YouTube represented a smaller share of online child sexual exploitation and abuse cases relative to other platforms, the number of cases involving YouTube increased by 50% between 2017 and 2019.2 In Thailand, of the 43 children aged 12-17 who reported in 2022 having “most recently” been offered money or gifts in return for sexual images or videos, 60% identified YouTube as the platform on which it occurred3 (the figure was 24% in Kenya4 and 12% in Uganda5).

 

Human Trafficking

YouTube has also been implicated in human trafficking. Traffickers in industries such as Agriculture and Animal Husbandry; Bars, Strip Clubs, and Cantinas; Pornography; and Traveling Sales Crews have used YouTube to recruit and interact with those they eventually trafficked.6

 

Hate, Incitement, Extremism

While YouTube has reduced online extremist content and disinformation, popular channels, including those of Tim Pool, The Young Turks, and Mike Cernovich, continue to monetize their content on YouTube,7 even while being repeatedly flagged for hateful content, disinformation, and incitement to violence.

 

Division, Unrest, and Violence

The intended consequences of online hateful and inciting content are division, extremism, factionalism, and violence. Recent examples include India and Sri Lanka, where studies have found that “hateful messages about minority groups spread through Facebook, YouTube, Twitter and WhatsApp have led to targeted violence against them.”8

 

Harassment

An Anti-Defamation League survey, “Online Hate and Harassment: The American Experience 2021,” found that 21% of those who experienced online harassment or hate reported that at least some of that harassment occurred on YouTube.9

 

Alphabet’s Current Insufficient Disclosures

 

The crux of the Proposal is to understand how the inherent risks of content moderation, particularly as they pertain to the most vulnerable groups, are being addressed by Alphabet and specifically by YouTube, which the Proponent views as the highest-risk platform due to its user-generated content. Alphabet’s Human Rights Policy explicitly states the company’s support for internationally recognized human rights standards and clarifies that human rights governance and due diligence are overseen by Alphabet’s Audit and Compliance Committee.

 

_____________________________

1 Gaming and video watching are among the key online activities of children reported by Disrupting Harm, a project of ECPAT International, INTERPOL, and UNICEF, in its first four country reports. Disrupting Harm, https://www.end-violence.org/disrupting-harm.

2 https://www.end-violence.org/sites/default/files/2022-03/DH_Tanzania_ONLINE_final_revise%20020322.pdf

3 https://www.end-violence.org/sites/default/files/2022-02/DH_Thailand_ONLINE_final.pdf

4 https://www.end-violence.org/sites/default/files/2021-10/DH%20Kenya%20Report.pdf

5 https://www.end-violence.org/sites/default/files/2021-11/DH_Uganda_ONLINE_final%20Report.pdf

6 https://polarisproject.org/wp-content/uploads/2018/08/A-Roadmap-for-Systems-and-Industries-to-Prevent-and-Disrupt-Human-Trafficking-Social-Media.pdf

7 https://bhr.stern.nyu.edu/youtube-report

8 https://issuu.com/cigi/docs/saferinternet_paper_no_1 (citing Laub, Zachary. 2019. “Hate Speech on Social Media: Global Comparisons.” Council on Foreign Relations, June 7. www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons)

9 https://www.adl.org/online-hate-2021

 

However, despite these policies and governance standards, we continue to see room for improvement in two areas that are directly aligned with the Proposal:

 

The Google Civil Rights Audit,10 published in early March 2023, recommended improvements to the Company’s content moderation policies regarding hate speech and harassment and specifically highlighted the risks posed by YouTube. We believe that many of WilmerHale’s Audit recommendations regarding hate speech align with the request in the Proposal, since many of the recommended policies also work to combat racism, xenophobia, and hate speech. Despite our dialogue with the Company on this issue, we are unsure whether or how Alphabet intends to adopt the recommendations from the Audit. Therefore, in addition to lowering reputational and legal risk, implementing this Proposal would also strengthen Alphabet’s human rights risk oversight policies.

 

Further, while the findings from the Civil Rights Audit were helpful, we remain concerned about the specific legislative and reputational risk related to children’s rights. For example, we think it is important for investors to understand how Alphabet is adapting its policies and practices to comply with legislation such as the California Age-Appropriate Design Code Act and other international regulations.

 

Online Safety Regulations and Legislation

 

As stated in the Proposal, online safety legislation is emerging domestically and internationally. Failure to adequately prepare for the implementation of these comprehensive and rigorous regulations will have a material financial impact on the Company through regulatory fines and penalties. The Proposal requests a report on whether and how Alphabet intends to mitigate such risk.

 

United States Initiatives

In September 2022, the White House convened a Listening Session on Tech Platform Accountability, announcing core principles for forthcoming reform: provide robust federal protections for Americans’ privacy; provide stronger privacy and online protections for minors, including prioritizing safety-by-design standards and practices for online platforms, products, and services; remove special legal protections for large tech platforms when they host or disseminate illegal or violent conduct or materials; increase transparency about platforms’ algorithms and content moderation decisions; and stop discriminatory algorithmic decision-making.11

 

In March 2022, the US State Department announced the Roadmap for the Global Partnership for Action on Gender-Based Online Harassment and Abuse, a roadmap that includes “remedying the insufficient incentives and responsibility for technology platforms to monitor, prevent, and address the problem; strengthening laws and other frameworks to deter perpetrators and hold them accountable.”12

 

_____________________________

10 https://kstatic.googleusercontent.com/files/01269107bcc8c970d023ff5aababe405b1e463aa777d7d0a767f783be99876c043d100c7c2f2555eda6b89547ae2c49bb11f22feba7930993852f0a82658d3ae

11 https://www.whitehouse.gov/briefing-room/statements-releases/2022/09/08/readout-of-white-house-listening-session-on-tech-platform-accountability/

12 https://www.state.gov/2022-roadmap-for-the-global-partnership-for-action-on-gender-based-online-harassment-and-abuse/

 

The California Age-Appropriate Design Code Act, which goes into effect on July 1, 2024, was intended by legislators to “promote innovation by businesses whose online products, services, or features are likely to be accessed by children by ensuring that those online products, services, or features are designed in a manner that recognizes the distinct needs of children at different age ranges.”13 The Code has been described as “a groundbreaking bill that requires online platforms to proactively consider how their product design impacts the privacy and safety of children and teens in California.”14

 

Australia’s Online Safety Act 2021

The Online Safety Act 2021 strengthened Australia’s existing laws for online safety. The Act has “significant implications for online service providers because it makes them more accountable for the online safety of the people who use their service.”15 Earlier this year, Australia’s eSafety Commissioner, Julie Inman Grant, sent legal notices to Google and other technology companies requiring them to answer “tough questions about how they are tackling online child sexual abuse” under the Online Safety Act 2021.

 

Ms. Inman Grant explained the critical role that tech companies play in curtailing the online exploitation of children: “The creation, dissemination and viewing of online child sexual abuse inflicts incalculable trauma and ruins lives. It is also illegal. It is vital that tech companies take all the steps they reasonably can to remove this material from their platforms and services.”16

 

European Union’s Digital Services Act and Digital Markets Act

The Digital Services Act and Digital Markets Act “form a single set of rules” that have two main goals: (1) “to create a safer digital space in which the fundamental rights of all users of digital services are protected;” and (2) “to establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.”17

 

United Kingdom’s Online Safety Bill

The Online Safety Bill is pending in the UK Parliament; if passed, it will “make social media companies more responsible for their users’ safety on their platforms.”18 The Bill would protect both children and adults by, in part, preventing children from accessing harmful and age-inappropriate content, enforcing age limits and age-checking measures, and providing adults greater control over the content they see and whom they engage with online.19

 

_____________________________

13 https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=202120220AB2273&showamends=false

14 https://www.humanetech.com/insights/why-the-california-age-appropriate-design-code-is-groundbreaking

15 https://www.esafety.gov.au/sites/default/files/2021-07/Online%20Safety%20Act%20-%20Fact%20sheet.pdf

16 https://www.esafety.gov.au/newsroom/media-releases/twitter-tiktok-and-google-forced-answer-tough-questions-about-online-child-abuse

17 https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package

18 https://www.gov.uk/guidance/a-guide-to-the-online-safety-bill

19 https://www.gov.uk/guidance/a-guide-to-the-online-safety-bill

 

Rebuttal to Company’s Opposition Statement

 

In its opposition statement to this Proposal, Alphabet explains, first, that it believes the Company provides sufficient disclosures about YouTube’s policies and procedures and, second, that the Company completes “extensive regulatory compliance work.”

 

First, considering the examples cited above, it appears that Alphabet is not appropriately mitigating concerns regarding child sexual abuse, trafficking, hate, incitement, extremism, division, unrest, violence, and harassment. The Proponent disagrees with Alphabet’s assertion that it is currently providing sufficient information about YouTube’s policies and procedures to “further [Alphabet’s] commitment to online safety.” These major reported risks, and the potential failure to align policies with regulatory requirements, pose continued reputational and legal risk to the Company.

 

Second, to the extent that Alphabet believes it is appropriately overseeing risks and monitoring regulatory compliance, as alleged in the opposition statement, the Proposal requests that Alphabet provide sufficient transparency for investors to evaluate the efficacy of such policies. This requested disclosure goes beyond any mandatory disclosures under various regulatory frameworks. Instead, the purpose of this disclosure is to enable investor oversight of Alphabet’s efforts to align YouTube policies and procedures with the most rigorous online safety regulations and thereby minimize legislative risk.

 

 

Based on the above Rationale, we believe that Alphabet’s current disclosures are inadequate to protect shareholder interests. We urge you to vote FOR Proposal 13, the stockholder proposal requesting a report on the alignment of the Company’s policies with applicable legislation.

 

Sincerely,

 

 

Lauren Compere

Boston Common Asset Management 

 

 

This is not a solicitation of authority to vote your proxy. Please DO NOT send us your proxy card; Boston Common Asset Management is not able to vote your proxies, nor does this communication contemplate such an event. Boston Common Asset Management urges stockholders to vote for Proposal 13 following the instructions provided on the Company’s proxy mailing.

 

 
