
Can government kick bots out of the comments?


The net neutrality debate is pulling back the curtain on an ugly reality of the internet age -- it's easy for automated bots to submit comments on federal policy changes, and there's no easy fix to the problem.

To engage citizens and keep them apprised of potential changes that stand to impact them, agencies are required to allow at least 30 days after publicizing a proposed rule change for the public to comment.

Since the Federal Communications Commission posted a call for comments on a proposed repeal of net neutrality rules in April, more than 23 million comments have flooded the FCC's electronic docketing system, shattering the agency's previous record of 3.7 million comments received during the last comment period for net neutrality in 2014.

However, large percentages of these comments appear to be duplicative, to have been sent from invalid email addresses or to have come from international IP addresses, according to a Pew Research Center review. The review also found that "on nine different occasions, more than 75,000 comments were submitted at the very same second," suggesting an automated spamming effort.
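That kind of same-second clustering is fairly easy to surface once comment metadata is exported. The sketch below is purely illustrative: it assumes a CSV export with a "date_received" timestamp column and an arbitrary burst threshold, not the FCC's actual data format or Pew's methodology.

    # Illustrative only: flag seconds in which an unusually large number of
    # comments arrived, assuming a CSV export with a "date_received" column.
    # The column name and threshold are assumptions, not the FCC's schema.
    import csv
    from collections import Counter

    BURST_THRESHOLD = 75_000  # same-second volumes at or above this look automated

    def find_bursts(path):
        counts = Counter()
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Truncate timestamps to one-second resolution, e.g. "2017-05-09T13:00:01"
                counts[row["date_received"][:19]] += 1
        return [(ts, n) for ts, n in sorted(counts.items()) if n >= BURST_THRESHOLD]

    for timestamp, n in find_bursts("comments.csv"):
        print(f"{n:>8} comments at {timestamp}")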

Nonhuman commenting and use of false identities can occur at such a large scale, in part, because there's no process to verify commenters on the FCC's electronic docketing system. To comment and upload files up to 25 megabytes in size, only a name and address are required, and these are accepted without any verification.

A cursory search of the public comments reveals that many were left by anonymous and pseudonymous users, as well as by commenters likely posing as other people; commenters using the name Homer Simpson weighed in both for and against keeping the Obama administration's policies, as did commenters using the name Ajit Pai, the current FCC chairman.

In a Nov. 22 Los Angeles Times op-ed, Democratic FCC Commissioner Jessica Rosenworcel lamented the current public feedback process: "If the idea behind the [net neutrality repeal] plan is bad, the process for commenting on it has been even worse."

However, the problem isn't simply that bots are taking over the process; it's that, without any user authentication on the FCC's system, it's hard to tell which comments are legitimate, which aren't and even what constitutes a legitimate comment.

And without fixes, government transparency advocates fret, this sort of obfuscation could be replicated for any high-profile federal rule change with the effect of overshadowing or delegitimizing legitimate public input -- and ultimately discouraging public participation.

"Fake, fraudulent or flawed public comment proceedings erode the trust of citizens in the agency of their participation," said Alex Howard, deputy director of the Sunlight Foundation.

Howard added that these issues "are primarily grounded in design and human choices, rather than any technical challenge" and are compounded by the lack of communication from the FCC in terms of "public information, evidence of investigation or accountability" on these problems and potential solutions. 

Multiple FCC officials did not respond to FCW's requests for comment.

As for technical fixes to better differentiate between a bot and a person, Acting Administrator of the U.S. Digital Service Matt Cutts suggested that adding security layers, such as captchas or email confirmation, "could be a short- to medium-term improvement."
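In practice, a CAPTCHA layer amounts to one extra server-side check before a comment is accepted. The sketch below is a minimal illustration assuming Google's reCAPTCHA "siteverify" endpoint; the save_comment helper and the form field names are hypothetical, not part of any FCC system.

    # Minimal sketch of a CAPTCHA gate in front of a comment endpoint.
    # Assumes Google's reCAPTCHA "siteverify" API; save_comment() and the form
    # field names are hypothetical, not any agency's actual code.
    import requests

    RECAPTCHA_SECRET = "your-secret-key"  # issued when the site is registered
    VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

    def captcha_passed(token):
        """Ask the CAPTCHA service whether this token came from a solved challenge."""
        resp = requests.post(
            VERIFY_URL,
            data={"secret": RECAPTCHA_SECRET, "response": token},
            timeout=5,
        )
        return resp.json().get("success", False)

    def handle_comment(form):
        if not captcha_passed(form.get("g-recaptcha-response", "")):
            return "Comment rejected: CAPTCHA check failed."
        save_comment(form["name"], form["address"], form["comment"])  # hypothetical helper
        return "Comment accepted."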

For further improvement, Cutts said two-step phone number verification, as used by login.gov, would be another way to discourage spam attacks, adding that "in the private sector, this would be common to see with reputable sign-on or identity providers."
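A phone-based second step generally boils down to texting a short-lived one-time code and checking it before the submission goes through. The following sketch is illustrative only; the send_sms helper and the five-minute code lifetime are assumptions rather than login.gov's actual implementation.

    # Illustrative one-time-code check for phone-based verification.
    # send_sms() is a hypothetical stand-in for an SMS provider; this is not
    # login.gov's actual flow.
    import secrets
    import time

    CODE_TTL_SECONDS = 300  # assumed five-minute lifetime for a code
    _pending = {}  # phone number -> (code, time issued)

    def start_verification(phone):
        code = f"{secrets.randbelow(1_000_000):06d}"
        _pending[phone] = (code, time.time())
        send_sms(phone, f"Your verification code is {code}")  # hypothetical helper

    def check_code(phone, submitted):
        code, issued_at = _pending.get(phone, ("", 0.0))
        still_fresh = time.time() - issued_at < CODE_TTL_SECONDS
        return still_fresh and secrets.compare_digest(code, submitted)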

While these solutions would mitigate some of the bot-fueled problems, Cutts acknowledged this kind of friction wouldn't be a panacea for automated interference and, depending on their design, could impede accessibility for some users.

"These additional steps can lead to fewer comments overall," he cautioned. "Designers of commenting/feedback systems need to balance the desire for ease of commenting against the potential hazards of a system being spammed or attacked."

Howard argued that without such friction, "it's unfortunately reasonable to expect that foreign and domestic actors will seek to disrupt, pollute or otherwise create doubt about the legitimacy of that feedback."

About the Author

Chase Gunter is a staff writer covering civilian agencies, workforce issues, health IT, open data and innovation.

Prior to joining FCW, Gunter reported for the C-Ville Weekly in Charlottesville, Va., and served as a college sports beat writer for the South Boston (Va.) News and Record. He started at FCW as an editorial fellow before joining the team full-time as a reporter.

Gunter is a graduate of the University of Virginia, where his emphases were English, history and media studies.

