Spring 2023
Feature Story

Decoding the Data

Algorithms can help you search the internet, find a date — even get through a red light. But inequality can seep in. Pitt Cyber’s Beth Schwanke is helping to ward off government bias from an unlikely perpetrator: data.
Photography by
Pitt Visual Services
Illustrated by
Getty Images

If someone in Allegheny County, Pennsylvania, suspects a child is being neglected, they can call the Department of Human Services’ (DHS) child protection hotline. It’s answered 24 hours a day, seven days a week by caseworkers trained to determine if the county’s Office of Children, Youth and Families needs to intervene. 

But the caseworker isn’t the only one assessing the child’s vulnerability. Callers’ reports are referred to a system that factors in hundreds of data points, including housing status and parental incarceration and probation records, to calculate a child’s potential risk. 

The algorithm then assigns a score, meant to offer an impartial, statistical review of the facts — one that can help assess whether to investigate the call.
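
To make the idea concrete, here is a minimal sketch, in Python, of how a screening score of this kind can be computed. The feature names, weights and numbers are entirely hypothetical, chosen only to show how many data points collapse into a single number; they are not Allegheny County's actual model, which draws on far more data.

# Illustrative only: a toy screening score, NOT the model Allegheny County uses.
# The feature names and weights below are hypothetical.
WEIGHTS = {
    "prior_referrals": 0.8,        # count of previous hotline referrals
    "parent_incarceration": 1.5,   # 1 if records show parental incarceration
    "housing_instability": 1.2,    # 1 if records flag unstable housing
    "child_age_under_3": 0.6,      # 1 if the child is under three years old
}

def toy_risk_score(case: dict) -> float:
    """Combine data points into one weighted score; higher means higher flagged risk."""
    return sum(WEIGHTS[name] * case.get(name, 0) for name in WEIGHTS)

example_call = {"prior_referrals": 2, "parent_incarceration": 1, "housing_instability": 1}
print(toy_risk_score(example_call))  # prints 4.3: the single number a screener might see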

Supporters argue that the screening systems used in child welfare agencies across the country, including Allegheny County, augment social workers’ decisions when timing is critical. They can also help reduce workloads and improve efficiency amid staff shortages.

But such systems have also drawn scrutiny. The Associated Press reported that during the first year the child welfare algorithm was used in Allegheny County, it flagged a disproportionate number of Black families for neglect compared with white families. The U.S. Justice Department has raised concerns about a similar algorithmic distortion that may disproportionately flag families with disabilities. The Allegheny County DHS has monitored and adapted the system, pledging to respond to the criticisms.

The danger, as critics see it, is that an algorithm is only as good as the data it’s given. If the data is skewed, the algorithm could lead to unnecessary investigations, over-surveillance and Black and lower-income families being unfairly targeted and separated. 

“How governments use algorithms is a civil rights issue — end of story,” says Beth Schwanke, executive director of the University of Pittsburgh’s Institute for Cyber Law, Policy, and Security (Pitt Cyber).

In 2019, to help address these concerns, Schwanke launched the Pittsburgh Task Force on Public Algorithms, a Pitt Cyber project that examines how public agencies are using algorithms and how those algorithms shape decision-making. The task force seeks to establish effective and fair oversight practices for algorithmic systems and to better inform the public about how those systems are used. Their work is already making a difference, she says.

Schwanke in a black coat at a bus stop

Most people are familiar with algorithms in some form, especially in the private sector. They determine credit scores, what social media feeds look like and even who matches on dating apps. Perhaps the most well-known is Google Search, which uses algorithms to comb the internet and return ranked results to queries.

“A recipe is actually an algorithm,” Schwanke explains. “An algorithm is a set of rules or guidelines that describe how to perform a task.”

In a recipe, if the wrong ingredient is added, it can ruin the whole dish. By the same principle, if the data an algorithm uses is flawed, its ultimate result will be, too — and that’s especially troubling when the result affects how public agencies interact with citizens.  
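
To extend the recipe analogy, the sketch below (again in Python, with made-up numbers) shows how skewed historical records pass straight through an algorithm: a naive model that learns referral rates from past data simply reproduces whatever reporting and surveillance patterns are baked into that data. The neighborhoods and figures are invented for illustration.

# Minimal sketch of "flawed data in, flawed result out." All numbers are made up.
historical_records = {
    # neighborhood: (families referred in past data, families in neighborhood)
    "neighborhood_A": (300, 1000),  # heavily surveilled -> many past referrals
    "neighborhood_B": (60, 1000),   # lightly surveilled -> few past referrals
}

# A naive "algorithm" that learns a base rate per neighborhood from past referrals.
learned_rate = {
    hood: referred / total
    for hood, (referred, total) in historical_records.items()
}

# Otherwise identical families now get very different scores, because the model
# has encoded past reporting patterns, not actual need.
for hood, rate in learned_rate.items():
    print(f"{hood}: predicted risk {rate:.0%}")
# neighborhood_A: predicted risk 30%
# neighborhood_B: predicted risk 6%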

Throughout her career, the human impact of policy, of putting a face to an issue, has driven Schwanke. After earning a law degree at the University of Michigan in 2009, she went directly into human rights work, defending political prisoners. Before even passing the bar, Schwanke found herself on CNN representing the late writer and activist Liu Xiaobo, a Chinese dissident and Nobel Peace Prize laureate.

“It was very stressful,” she admits. But it was an entrée into legal work with direct impact, protecting the core rights of people. Among other legal and legislative work, Schwanke spent subsequent years at the Center for Global Development, a think tank addressing poverty and sustainable development.

Then, shortly after moving to Pittsburgh in 2016, she met David Hickton (LAW ’81), who was then launching Pitt Cyber as its founding director. The institute creates actionable proposals for policymakers and industries to tackle real-world challenges in areas such as networks, data and algorithms.

After realizing the institute’s potential for helping to shape policy that could affect so many people’s lives in a rapidly evolving technological landscape, Schwanke agreed in 2017 to become the institute’s executive director.

In 2019, she came to Hickton with an idea for the Task Force on Public Algorithms, envisioning an independent advisory panel that would give governments input on developing and implementing algorithms.

The inspiration for studying public algorithms came to Schwanke, in part, from Safiya Umoja Noble, author of “Algorithms of Oppression: How Search Engines Reinforce Racism” (NYU Press, 2018). Noble challenged longstanding notions that algorithms were objective.

“The prevailing attitude among most people, even social scientists and humanists, was that computer code is just math, and math can’t discriminate,” Noble told Pitt’s “Information Ecosystems” podcast during a January 2020 campus visit.

In fact, Noble reports that the opposite is often true. Evidence shows that algorithmic systems can lock in and exacerbate historical bias, especially along racial and gender lines, because those developing algorithmic systems don’t constitute a fair representation of society’s diversity.

While Noble and others were pioneers in studying algorithmic bias, Schwanke noticed there wasn’t as much work addressing how prejudice and bias could show up in algorithms used specifically in government and state agencies.

Hickton was instantly on board for the public algorithms task force — it represented to him the “perfect prototype” of Pitt Cyber’s calling: “What we seek to be is a respected local voice, and a recognized national voice, on all things in digital space — particularly where the intersection of technology, law and policy raises more questions than answers.”

When Hickton served as U.S. Attorney for the Western District of Pennsylvania from 2010 to 2016, he witnessed how historical data based on zip codes were being used in policing: specifically, to predict the likelihood of criminals reoffending.

“I’d been concerned about that,” he recalls. He believes that while data and decision-making algorithms can offer vast benefits and improve government efficiency, they also carry significant risks.

“That efficiency ought not be at the cost of reinforcing historical inequality, or worse, fostering inequality,” says Hickton. “That’s what we recognized as the problem.”

The task force focused its case study on Pittsburgh and Allegheny County.

From the beginning, Schwanke says, community feedback was at the heart of the project. The goal was not to assess individual systems or conduct an “algorithmic audit,” she explains, but to examine the life cycle of algorithms used by regional government, from development to implementation, in conjunction with the public and government, and to offer recommendations. To produce standards reflecting the concerns of the community, she knew the task force first had to engage the community, especially those most likely to be impacted by algorithms.

Sherrill stands by a road in a green suit and white shirt

After receiving funding from The Heinz Endowments and the Hillman Foundation, Pitt Cyber sought out task force members who could best help them realize their vision.

“We very much wanted a diversity of views on the task force,” says Schwanke, and with a bit of a smile, she adds, “We got it.” In total, there were 21 members, including Hickton and Schwanke. Whenever there was a difference of opinion, “we got to working it out and figuring out where compromise was,” says Schwanke.

She was surprised to find how many algorithmic systems already operated in the Pittsburgh region and how challenging it was to find information about them. Some algorithms she expected and considered low risk: one system to manage stoplights and better control traffic flow after events at Acrisure Stadium and PNC Park, another to assign risk scores for fires at commercial properties. 

Others were not low risk, such as the system that screens incoming calls to Allegheny County’s child protection hotline.

With examples like that one, it’s no wonder task force member LaTrenda Sherrill (SHRS ’09) says she was immediately struck by the high stakes of the project. Sherrill — a community engagement expert and principal and lead consultant for Common Cause Consultants — realized the task force would need to strike a balance between educating the public and listening to them.

“Community knows what’s best for them,” says Sherrill. “I think we don’t give community enough credit most of the time, because we don’t give them information. So, it was really important for me to inform. And then let’s talk about ways that you can be empowered.”

Against the backdrop of the pandemic, the task force began to plan a series of community meetings, ultimately speaking to nearly 200 organization leaders and community members. When required, they pivoted online. Sherrill even appeared on “What Black Pittsburgh Needs to Know,” a Facebook Live series from 1Hood Media, a popular community education and social activism organization.

Sherrill says she was impressed by the feedback and one perspective in particular early on: “I remember specifically the community member who said, ‘Man, I wish these things could help people and not serve as a mechanism to punish people.’” There is every reason to believe, Sherrill says, that if constructed without bias, algorithms could help all communities.

That theme of “support community” continued to emerge in public comments. Could algorithms be used positively rather than punitively? Why couldn’t algorithms help the community, identify service gaps, and get more resources to people who need them?

“It seems like a lot of algorithms are used from a deficit model of thinking: ‘Here are the problems,’” noted one participant. “But what are the models for seeing our strengths?” Another commenter raised the possibility of the algorithms’ purpose being redefined to help facilitate positive outcomes — from children’s health to affordable housing to air quality.

On the negative side, another community member, who was all too aware of bias, wanted to ensure algorithms didn’t widen current disparities: “An algorithm can be used to determine what jail you go to, what your sentence would be, determine probation and parole. This could be a chain around a Black man’s neck that he can never get off because of the structure.”

No matter the perspective, the need for accountability was emphasized repeatedly. “When the algorithms go wrong, who is at fault?” another community member asked.

Overall, Schwanke, like Sherrill, was heartened by community members’ willingness to accept public algorithms if they were used responsibly, thoughtfully and ethically. 

Schwanke had imagined the task force’s report could have ended up “heavy-handed,” possibly even calling for an outright algorithm ban, as some activist groups have demanded. But instead, Schwanke found the community believed algorithms could do good, as long as there was greater transparency and community involvement at all points in an algorithm’s life cycle — procurement, implementation and ongoing assessment.


Based on the feedback and research, the report of the Pittsburgh Task Force on Public Algorithms, published last year, presented six best-practice recommendations for regional governments that can also be scaled nationally.

Among its recommendations, the task force calls for the public to be involved in plans for algorithmic systems, from the earliest stages of development through any later substantive changes. It also recommends third-party reviews of systems considered higher risk and the integration of algorithmic review into procurement processes. To further transparency, agencies should be required to publish information about their algorithmic systems on a public website. Across all of the best practices, meaningful public participation should be commensurate with a system’s risk level. Case in point: public feedback gathered for the report strongly disapproved of facial recognition systems because of their documented inaccuracy and potential for profiling.

Chris Belasco (A&S ’03, GSPIA ’05, ’13), chief data officer for the City of Pittsburgh, points out that the task force’s recommendations were recently presented at a data governance committee meeting. Belasco, one of the task force’s seven government advisory panel members, says the city is taking the 40-page task force report under advisement as it works to develop an algorithms policy.

And, looking beyond Pittsburgh, the report concluded:

“The task force’s hope is that the Pittsburgh region will become a model for others across the country similarly confronting the proliferation of algorithms in government. The cost of sitting idly by during this transformative era in municipal government could be high, especially for the most marginalized among us.”

Schwanke envisions that, as algorithm usage expands, the task force report will be instrumental whenever and wherever oversight legislation is considered.

“There are lots of ways, hopefully, this work will continue to have resonance,” she says.

 

This story was published on May X, 2023. It is part of Pitt Magazine's Spring 2023 issue.