AI may judge you for writing in ALL CAPS and using !!! marks

ETC Staff · October 8, 2019 · 6 min

Kristen Cussen, News Reporter

Tweeting at 2 a.m. with an excess of capital letters and exclamation marks is just one of the ways “meta-data is used to judge you,” said Frank Pasquale, a recent speaker at the President’s Lecture Series.

Pasquale, a professor of law at the University of Maryland Francis King Carey School of Law and a leading voice on algorithmic accountability, AI and machine learning, said AI governance has two meanings in the 21st century: governance of AI and governance by AI.

In a world where credit histories have been transformed into algorithmic credit scores, his lecture posed the question of whether there are areas of people’s lives that shouldn’t be processed or automated by machines.

Maryland law professor Frank Pasquale spoke at a President’s Lecture Series event about algorithmic accountability and artificial intelligence. (Kristen Cussen)

Pasquale cited credit scores to highlight the good and bad in machine learning. 

In one sense, a credit score algorithm applied equally to everyone carries no individual examiner’s bias and can appear more objective. On the other hand, it is precisely that lack of human intervention that removes due process and the ability to appeal from the equation.

“There are now systems that even use what are called fringe alternative data, like how you use social media,” Pasquale said.

“There are ways in which your meta-data is used to judge you,” he said. “For example, are you a person who is tweeting at one or two in the morning?

“That can be seen as discrediting according to some of these entities,” Pasquale said.

He said we need to ensure these algorithms are not systemically discriminating against people or creating self-fulfilling prophecies. 
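One concrete way analysts probe for the kind of systemic discrimination Pasquale describes is a disparate-impact check, which compares outcome rates across groups. The following is a minimal sketch of that idea in Python; the decision log, the group labels and the four-fifths (0.8) threshold are illustrative assumptions, not details from the lecture.

```python
# Minimal sketch of a disparate-impact ("four-fifths rule") check on
# algorithmic decisions. The data and the 0.8 threshold are illustrative.

from collections import defaultdict

# Hypothetical decision log: (group, approved?) pairs from a scoring model.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

# Tally approvals and totals per group.
totals, approvals = defaultdict(int), defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    approvals[group] += approved

rates = {g: approvals[g] / totals[g] for g in totals}
best = max(rates.values())

# Flag any group whose approval rate falls below 80% of the best-off group's.
for group, rate in rates.items():
    if rate < 0.8 * best:
        print(f"{group}: approval rate {rate:.2f} may indicate disparate impact")
```

A check like this only flags unequal outcomes; deciding whether they are justified still calls for the due process Pasquale says automation tends to strip out.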

“Do we want to be judged in all these dimensions?” Pasquale asked.

Pasquale said tainted training data illustrates both the flaws and the potential of AI.

One example is a dataset of pictures of various lesions and moles that can be used for a first-pass visual diagnosis of skin cancer. But Pasquale questioned whether such a dataset represents all skin types equally, given the sample it was built from.

“How can we ensure equal representation?” he asked.
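At the engineering level, one partial answer is to audit a training set’s composition before any model is built on it. The sketch below assumes a hypothetical collection of lesion images tagged with Fitzpatrick skin-type labels (I through VI); the records and the underrepresentation cutoff are invented for illustration.

```python
# Sketch of a representation audit for a hypothetical skin-lesion dataset.
# Each record carries a "skin_type" tag (Fitzpatrick I-VI); the records
# below are made up for illustration.

from collections import Counter

records = [
    {"image": "lesion_001.png", "skin_type": "I"},
    {"image": "lesion_002.png", "skin_type": "II"},
    {"image": "lesion_003.png", "skin_type": "II"},
    {"image": "lesion_004.png", "skin_type": "V"},
]

counts = Counter(r["skin_type"] for r in records)
n = len(records)
expected = n / 6  # naive uniform target across the six Fitzpatrick types

# Report each skin type's share and flag types far below the uniform target.
for skin_type in ["I", "II", "III", "IV", "V", "VI"]:
    share = counts.get(skin_type, 0)
    flag = "  <- underrepresented" if share < 0.5 * expected else ""
    print(f"type {skin_type}: {share}/{n} images{flag}")
```

An audit like this cannot fix a skewed sample, but it makes the skew visible before a model inherits it.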

In an attempt to remove bias and discrimination from policing, a dataset of criminals’ faces was used to identify a person’s criminal potential. Breaking the method down, Pasquale said datasets like these are prone to showing correlation without causation, creating discrimination rather than eliminating it.
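The failure mode Pasquale describes can be demonstrated with a toy example: if the two classes of photos were collected differently, a model can score perfectly by keying on the collection artifact rather than on anything about the person. The brightness feature and the data below are invented for illustration; they stand in for lighting and framing differences between mugshots and ordinary ID photos.

```python
# Sketch of how a classifier can latch onto a confound instead of the
# target. In this made-up "criminality" dataset, the two classes were
# photographed differently, so a nuisance feature (image brightness)
# separates the labels perfectly -- the model learns the photography,
# not the person.

import random

random.seed(0)

# label 1 = "criminal" set (dim mugshot lighting), 0 = control (bright ID photo).
data = [(random.uniform(0.2, 0.4), 1) for _ in range(50)] + \
       [(random.uniform(0.6, 0.8), 0) for _ in range(50)]

# A "classifier" that only thresholds brightness scores 100% on this data...
accuracy = sum((brightness < 0.5) == bool(label)
               for brightness, label in data) / len(data)
print(f"brightness-only rule accuracy: {accuracy:.0%}")

# ...yet brightness says nothing about criminality: the correlation comes
# entirely from how the two sets of photos were collected.
```

The point of the sketch is that a perfect score on such data proves nothing about causation: change the photography and the “signal” disappears.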

Pasquale also cited red-light cameras, a form of machine governance. 

Going through a red light on camera results in a ticket and a fine, and while the opportunity to appeal still exists, the appeal is made against a photograph rather than against an officer who issued a ticket.

“If a cop were there, is there a chance he would have let you off with a warning or not pull you over at all?” Pasquale said.

Fairness, due process and empathy are key components of decision-making. When algorithmic processing strips those away, Pasquale said, people are no longer governing AI but letting AI govern them.