
Court: Algorithmic bias research doesn't count as hacking


Two researchers have successfully petitioned a U.S. court to rule preemptively that their work studying algorithmic bias on hiring websites does not constitute a violation of the 1986 Computer Fraud and Abuse Act.

Northeastern University professors Christo Wilson and Alan Mislove began creating fictitious user profiles and job postings in 2016 on online hiring services run by companies like Monster, LinkedIn, Glassdoor and Entelo to examine whether their proprietary algorithms discriminate against applicants based on race, gender, age or other characteristics. In each case, they specified in the job listing or user profile that the accounts were fake and paid any applicable fees for using the service.

Such actions violate terms of service prohibiting accounts that use false or misleading information and, the plaintiffs argued, expose them to potential prosecution under the CFAA. The suit was designed to challenge the "access provision" of the law, which prohibits accessing a computer without authorization in order to obtain information from it.

In court documents, lawyers for the American Civil Liberties Union argued that applying the "overly vague" CFAA to the professors' activities would chill or restrict their First Amendment right to freedom of speech and would "unconstitutionally delegat[e] lawmaking authority to private actors in violation of the Fifth Amendment Due Process Clause."

Testimony from Mislove during the case indicates that he feared the government could one day decide to prosecute this type of work, saying "it weighs heavily on my mind…exposing my students to potential criminal prosecution or the risk."

On March 27, the U.S. District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.

"Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.

The Computer Fraud and Abuse Act criminalizes hacking, but it has been applied by federal prosecutors to cover a much broader set of activities. Justice Department lawyers asked the court to throw out the case, arguing that the professors' challenge was too speculative and that they had not demonstrated a credible threat of being prosecuted.

However, during discovery, John T. Lynch, chief of the Computer Crime and Intellectual Property Section in the Justice Department's Criminal Division, testified that while he would not expect the department to prosecute such a case, it would not be "impossible" to do so. Digital rights groups like the Electronic Frontier Foundation have pointed to a number of previous court cases in which the government claimed that violating a private agreement or corporate policy amounts to a CFAA violation.

As companies increasingly outsource credit and hiring decisions to artificial intelligence and machine learning tools, research on biases in those algorithms has become more relevant to policymakers. The House Oversight and Reform Committee held a number of hearings on the topic in 2019. Meanwhile, the Department of Housing and Urban Development brought a case against Facebook last year arguing that its advertising algorithms steered housing ads away from certain minority groups, even as the department itself moved last year to alter regulations in a way that would make it harder to cite algorithmic bias when suing landlords or mortgage lenders.

"Researchers who test online platforms for discriminatory and rights-violating data practices perform a public service. They should not fear federal prosecution for conducting the 21st-century equivalent of anti-discrimination audit testing," said Esha Bhandari, staff attorney with the ACLU's Speech, Privacy, and Technology Project, in a statement.

About the Author

Derek B. Johnson is a senior staff writer at FCW, covering governmentwide IT policy, cybersecurity and a range of other federal technology issues.

Prior to joining FCW, Johnson was a freelance technology journalist. His work has appeared in The Washington Post, GoodCall News, Foreign Policy Journal, Washington Technology, Elevation DC, Connection Newspapers and The Maryland Gazette.

Johnson has a Bachelor's degree in journalism from Hofstra University and a Master's degree in public policy from George Mason University. He can be contacted at [email protected], or follow him on Twitter @derekdoestech.


