Testimony before the House Financial Services Committee on Citizen and Stakeholder Engagement to Increase the Effectiveness and Legitimacy of Regulations

--

By Beth Simone Noveck

This is an excerpt from the testimony delivered to members of the U.S. House Committee on Financial Services Subcommittee on Oversight and Investigations during a hearing titled “Fake It till They Make It: How Bad Actors Use Astroturfing to Manipulate Regulators, Disenfranchise Consumers, and Subvert the Rulemaking Process,” held on February 6, 2020.


Testimony of Beth Simone Noveck, Professor and Director, The Governance Lab, New York University

Introduction

Chairwoman Waters, Ranking Member McHenry, thank you for the opportunity to participate in today’s House Financial Services Committee Oversight and Investigations Subcommittee hearing: “Fake It till They Make It: How Bad Actors Use Astroturfing to Manipulate Regulators, Disenfranchise Consumers, and Subvert the Rulemaking Process.”

I am a Professor of Technology, Culture and Society at New York University’s Tandon School of Engineering, where I direct the Governance Lab, a nonprofit action research center focusing on the use of new technology to improve governance and strengthen democracy. At the Governance Lab, I direct our work on “CrowdLaw,” where we collaborate with public sector partners to study and design the use of new technology to improve the quality of law and policymaking. I previously served as Deputy Chief Technology Officer and Director of the Open Government Initiative under President Obama, where I led White House policy and projects on citizen engagement. I currently also serve as Chief Innovation Officer of the State of New Jersey and as a Member of Chancellor Angela Merkel’s Digital Council.

In this submission, which reflects only my personal opinions, I set out the crucial importance of citizen and stakeholder engagement to increase the effectiveness and legitimacy of regulations, and to strengthen democracy and trust in policymakers when both are under severe challenge. I examine some of the difficulties associated with public commenting in rulemaking and how they can be overcome using new tools and technology. Finally, I showcase how jurisdictions around the world are turning to crowdlaw, the use of online public engagement to improve the quality of the lawmaking and rulemaking process, and provide examples that the United States could draw on as it seeks to deepen the foundations of its democracy in uncertain times.

Using New Technology to Improve the Quality of Public Participation

The Administrative Procedure Act of 1946 provides the public with an opportunity to participate in the rulemaking process through the submission of data, views, or arguments, which a federal agency is then required to consider prior to promulgation. The right to public participation is not intended to elicit popular opinion about the draft rule or to have people vote on the proposal. It is not an occasion for what constitutional law scholar Alexander Meiklejohn (1872–1964) described as “unregulated talkativeness.” Instead, it is an important opportunity for the public to participate in politics, when “everything worth saying shall be said.” In other words, the goal of public participation in rulemaking is to apprise the relevant agency of the best available information in order to inform how it crafts the rule. As the Senate Permanent Subcommittee on Investigations found in its 2019 report on Abuses of the Rulemaking Process, “agencies depend on relevant, substantive information from a wide variety of parties to assist them in developing and updating federal regulations.” Furthermore, the regulations.gov website states, “public participation is an essential function of good governance. Participation enhances the quality of law and its realization through regulations (e.g. Rules).”

High-quality participation in rulemaking is also vital for Congress in its oversight capacity. Although agencies often promulgate rules without significant oversight, Congress still retains and uses its lawmaking authority after it delegates responsibility for implementing laws to regulatory agencies. In addition to oversight hearings, members frequently communicate with agencies during the rulemaking process through meetings, letters and calls. For example, this Committee’s Democratic members wrote to the Comptroller of the Currency to ensure that the upcoming Community Reinvestment Act regulatory processes would include meaningful engagement with the public, and have suggested extending the period for public commenting from 60 to 120 days to facilitate more diverse participation. Moreover, since the enactment of the Congressional Review Act, Congress has possessed and used sweeping powers to review and overturn rules and policies within 60 days of their submission to Congress; because the 60-day window runs from the date of submission, this technicality enables Congress to overturn rules and policies that have long been in effect.

Thus, the process of public commenting provides a vital opportunity for agencies and Congress to obtain important and relevant information from diverse audiences that will help them to understand whether and how a regulation fulfills its legislative purpose.

However, new technology has also created challenges to successful public participation in rulemaking. The shift from a predominantly paper-based to a digital process has made commenting easier, but it has also inadvertently opened the floodgates to voluminous, duplicative and even “fake” comments — what I call notice-and-spam — thereby lessening the value of public participation.

As I predicted in an article in the Emory Law Journal in 2004, shortly after the launch of regulations.gov: “Automating the comment process might make it easier for interest groups to participate by using bots — small software ‘robots’ — to generate instantly thousands of responses from stored membership lists. Moving from long-standing agency traditions to a rationalized online system levels the playing field and lowers the bar to engagement. Suddenly, anyone (or anything) can participate from anywhere. And that is precisely the potential problem. Increased network effects may not improve the legitimacy of public participation. For without the concomitant processes to coordinate participation, quality input will be lost; malicious, irrelevant material will rise to the surface, and information will not reach those who need it. In short, e-rulemaking will frustrate the goals of citizen participation.”

Although much current attention is focused on the problem of fake comments and astroturfing, where an interest group hides its identity and manufactures the appearance that comments come from the “ordinary public,” the more salient and urgent concern for regulators and overseers is not who signed the comment — if anyone — but the failure to invite and then to use high-quality and diverse participation to inform the rulemaking process.

There is a remedy. In the almost two decades since participation moved online, data science tools and methods have evolved to deal with the problems of voluminous, duplicative and fake comments. Yet neither agencies nor the regulations.gov administrator is using them in a substantial way. The more agencies are deluged by voluminous, duplicative and fake “astroturf” comments, the more this race to the bottom reinforces a disturbing disregard for the potential value of public participation. We are failing to recognize the value of public commenting to enhance the quality of rules and, therefore, have chosen not to solve the real problem, which is not astroturfing but the failure to take the value of public commenting seriously. Failure to address the real challenge will only set us back further behind the growing number of advanced nations that use new technology to tap the collective experience and expertise of their citizens.

I argue that the Committee should direct the agencies it oversees to use — and itself use — easily available tools to:

  1. Mine and summarize relevant comments for information. As we shall explore, machine learning and natural language processing software, namely those subfields of artificial intelligence used for making sense of large quantities of text, have created unprecedented ways to manage information — to sort the informational wheat from the extraneous content chaff. These technologies could enable agencies to process and analyze public comments rapidly and effectively (a minimal sketch of one such approach follows this list).
  2. Adopt complementary mechanisms for public commenting in addition to notice and comment. The technologies of collective intelligence that enable people to communicate and collaborate via the Internet have led to new ways of soliciting information that are a substantial improvement on the traditional, open-ended submission process of notice-and-comment. Around the world, regulatory agencies and the legislative committees that oversee them are turning to “crowdlaw,” namely the use of the Internet to create a meaningful and deliberative two-way conversation with the public, yielding more relevant, timely and diverse information. I explain how we could — how we must — adopt these practices in the United States and reimagine how agencies engage with citizens and stakeholders.
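By way of illustration only, the minimal sketch below (in Python, using the open-source scikit-learn library) shows one way the mining and summarization described in point 1 might work at a toy scale: it clusters a handful of hypothetical comments and prints the most characteristic terms of each cluster so a reviewer can skim themes rather than read every submission. The sample comments, the cluster count and the choice of library are my own illustrative assumptions, not a description of any agency’s existing system.

    # A minimal sketch, not an agency system: cluster public comments with
    # TF-IDF weighting and k-means, then print the top terms per cluster so a
    # reviewer can skim themes instead of reading every submission.
    # The sample comments and the cluster count are illustrative assumptions.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    comments = [
        "The proposed capital requirements will raise costs for community banks.",
        "Please extend the comment period so rural credit unions can respond.",
        "I support stronger consumer protections in overdraft fee disclosures.",
        "Raising capital requirements burdens small community banks unfairly.",
        "Extend the deadline; 60 days is not enough for smaller institutions.",
    ]

    # Convert each comment into a weighted word-frequency vector.
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(comments)

    # Group similar comments; the number of clusters would be tuned in practice.
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

    # Print the most characteristic terms of each cluster as a rough "summary."
    terms = vectorizer.get_feature_names_out()
    for cluster in range(kmeans.n_clusters):
        center = kmeans.cluster_centers_[cluster]
        top_terms = [terms[i] for i in center.argsort()[::-1][:5]]
        print(f"Cluster {cluster}: {', '.join(top_terms)}")

At the scale of real dockets, the same basic pipeline, or more sophisticated language models, could group hundreds of thousands of comments into a manageable number of themes for staff review.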

For additional information on the platforms and processes described herein, please see “CrowdLaw for Congress: Strategies for 21st Century Lawmaking,” a report and short video training materials I authored, available at congress.crowd.law.

[Image: The GovLab’s CrowdLaw for Congress website, with cases and examples of how parliaments around the world are using technology to engage with citizens and stakeholders. Available online at congress.crowd.law.]

Non-Endorsement: The technologies referenced in this document are discussed as examples of platforms supporting public participation practices in lawmaking in legislatures around the world. Their mention does not constitute an endorsement of the companies behind these technologies. I derive no financial benefit from these firms.

Summary of Recommendations

In order to address the challenge of voluminous, duplicative and fake comments:

1. Agencies should use machine learning to summarize voluminous comments.

2. Agencies should use deduplication software to remove identical comments (a minimal sketch of this and the following recommendation appears after this list).

3. Agencies should use filtering software to sift out the real and the relevant.

4. Lawmakers and agencies should use complementary crowdlaw platforms and processes used by other governments and organizations to enable better citizen and stakeholder engagement.

5. Like Brazil and New Jersey, agencies and committees should use Wiki Surveys to reduce volume and duplication.

6. Agencies and committees should use Collaborative Drafting and Annotation, as Germany has done, to engage more experts in the review of rules.

7. Committees should set up UK-style Evidence Checks to crowdsource review of comments and evidence.

8. Committees should democratize oversight and pilot the use of Citizen Juries as they do in Belgium.
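To make recommendations 2 and 3 concrete, the minimal sketch below (in Python, standard library only) illustrates how deduplication and filtering might work at a toy scale: exact duplicates are removed by hashing normalized text, and likely form letters are flagged for human review using a simple similarity ratio. The sample comments, the normalization rules and the 0.9 similarity threshold are my own illustrative assumptions; production tools would use more robust near-duplicate detection.

    # A minimal sketch of deduplication and filtering, not any agency's tool:
    # remove exact duplicates after normalizing case and whitespace, then flag
    # near-duplicates (e.g. mass-mailed form letters with small edits).
    # The sample comments and the 0.9 threshold are illustrative assumptions.
    import hashlib
    from difflib import SequenceMatcher

    comments = [
        "I oppose the proposed rule. It will hurt small businesses.",
        "I oppose the proposed rule.  It will hurt small businesses.",
        "I oppose the proposed rule. It will hurt small businesses!!",
        "The rule should include a longer transition period for lenders.",
    ]

    def normalize(text: str) -> str:
        # Lowercase and collapse whitespace so trivial variations match.
        return " ".join(text.lower().split())

    # Pass 1: drop exact duplicates by hashing the normalized text.
    seen, unique = set(), []
    for comment in comments:
        digest = hashlib.sha256(normalize(comment).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(comment)

    # Pass 2: flag near-duplicates among the survivors for human review.
    THRESHOLD = 0.9
    for i, first in enumerate(unique):
        for second in unique[i + 1:]:
            ratio = SequenceMatcher(None, normalize(first), normalize(second)).ratio()
            if ratio >= THRESHOLD:
                print(f"Possible form letter ({ratio:.2f}): {first!r} ~ {second!r}")

Even this simple two-pass approach would let agency staff see at a glance how many distinct submissions a docket actually contains before turning to the substance of the comments themselves.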

To read the written testimony in its entirety, visit congress.gov.

For more information, visit crowd.law.

--

The Governance Lab: improving people’s lives by changing how we govern. http://www.thegovlab.org @thegovlab #opendata #peopleledinnovation #datacollab