Big Data

White House techies explore the intersection of big data and ethics

Big-data applications might seem remote and impersonal, but the software and algorithms they use are coded by humans and therefore can reflect human error and bias. A new White House report warns that the emerging technology poses risks and opportunities.

"Properly harnessed, big data can be a tool for overcoming long-standing bias and rooting out discrimination," U.S. CTO Megan Smith, Deputy CTO DJ Patil and Domestic Policy Council Director Cecilia Muñoz wrote in a blog post announcing the report, which is the second in a series on big data.

Citing case studies on lending, employment, college admissions and criminal justice, the report offers detailed recommendations for advancing the relatively new field of data and ethics. Recommendations include more research into mitigating algorithmic discrimination, building systems that promote fairness and creating strong data ethics frameworks. The Networking and Information Technology Research and Development Program and the National Science Foundation are exploring ways to encourage researchers to delve into those issues.

The report also recommends that designers build transparency and accountability mechanisms into algorithmic systems so people can correct inaccurate data and appeal data-based decisions. And it calls for more research and development into algorithmic auditing and testing.

In an earlier report, the Federal Trade Commission concluded that existing non-discrimination law applies to complaints about bias in lending and credit, college admissions and other activities that use algorithmic models for decision-making.

Although data is often assumed to be neutral, coders must decide how much weight to give to the data inputs in algorithmic systems, and those choices can lead to biased results, according to the report.

For example, if a person is searching for the fastest route via a GPS app, the results "might favor routes for cars, discourage use of public transport and create transit deserts," the report states. Also, if speed data is only collected from people who own smartphones, the system's results might be more accurate in places with the highest concentration of smartphones and "less accurate in poorer areas where smartphone concentrations are lower."

Furthermore, an app deployed in Boston used accelerometer and GPS technology on users' smartphones to locate potholes. A Harvard Business Review report notes, however, that older people and those in lower income groups often don't have smartphones, which meant the app was not recording information from significant parts of the population.
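The sampling problem the report describes can be sketched numerically. The simulation below is illustrative only: the ownership rates, pothole counts and traffic passes are assumptions, not figures from the report or the Boston app. It shows how a system that only hears from smartphone owners detects far more potholes in high-ownership neighborhoods than in low-ownership ones, even when road conditions are identical.

```python
import random

random.seed(0)

def detection_rate(smartphone_share, n_potholes=1000, passes=5):
    """Fraction of potholes detected when each pothole is driven over
    `passes` times and only smartphone-owning drivers generate a report."""
    detected = 0
    for _ in range(n_potholes):
        # A pothole is detected if at least one passing driver owns a smartphone.
        if any(random.random() < smartphone_share for _ in range(passes)):
            detected += 1
    return detected / n_potholes

# Hypothetical ownership rates for two neighborhoods with identical roads.
high = detection_rate(0.80)  # high smartphone ownership
low = detection_rate(0.20)   # lower-income area with fewer smartphones
print(high, low)  # the high-ownership area reports far more of its potholes
```

Under these assumed numbers, the high-ownership neighborhood surfaces nearly all of its potholes while the low-ownership neighborhood misses roughly a third, illustrating why the report calls for auditing where a system's data actually comes from.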

Despite the drawbacks, "big data is here to stay," the White House blog post states. "The question is how it will be used: to advance civil rights and opportunity, or to undermine them."

About the Author

Bianca Spinosa is an Editorial Fellow at FCW.

Spinosa covers a variety of federal technology news for FCW including workforce development, women in tech, and the intersection of start-ups and agencies. Prior to joining FCW, she was a TV journalist for more than six years, reporting local news in Virginia, Kentucky, and North Carolina. Spinosa is currently pursuing her Master’s degree in Writing at George Mason University, where she also teaches composition. She earned her B.A. from the University of Virginia.

