June 24 · Issue #754
Big Revolution
Welcome to Wednesday’s newsletter, coming to you from a hot-hot-hot Manchester.
— Martin from Big Revolution

Big things you need to know today
  • Twitter has taken action against another Trump tweet. His threat of “serious force” against protesters has been hidden behind a content warning, and sharing of the tweet has been disabled.
  • Twitch is to take action over alleged sexual misconduct by streamers and others. The Amazon-owned company says it wants to create a “community-centred, safe and positive” experience.
  • Apple is dropping ‘force touch’ from the Apple Watch. The interaction, which involved a harder press on the screen, is to be removed in watchOS 7, with developers being, er, pressed to consider alternatives.
The big thought
Credit: Matthew Henry on Unsplash
A.I. straight out of the 1800s
When I started studying A-Level Sociology at the age of 16, the first topic we looked at was crime. To see how society’s understanding of crime had evolved, we first looked at 19th century theories which proposed that criminals had certain physical features that distinguished them from law-abiding folk.
This is obviously rubbish, and we quickly moved on to more advanced studies of how society drives certain people to crime. But the idea that people used to think criminals looked a certain way has stuck with me over the years, because it’s so easy to mock.
Now the idea is back. Motherboard reports that over 1,000 A.I. experts have signed a letter asking a scientific publisher not to release a paper that claims software can predict who will commit a crime based on their appearance alone.
“As numerous scholars have demonstrated, historical court and arrest data reflect the policies and practices of the criminal justice system,” the letter states. “These data reflect who police choose to arrest, how judges choose to rule, and which people are granted longer or more lenient sentences […] Thus, any software built within the existing criminal legal framework will inevitably echo those same prejudices and fundamental inaccuracies when it comes to determining if a person has the ‘face of a criminal.’”
While you can’t (yet) convict someone of a crime there is no evidence at all that they committed, systems like this can certainly lead to harassment of innocent people by law enforcement. And given the data such a system would be trained on, it’s pretty obvious that the people harassed would be the same people the police already hassle needlessly.
So such a system would be pointless, cruel, and based on assumptions that sociologists debunked over 100 years ago. Any intelligence involved in it is certainly artificial.
One big read
'My Little Pony' Fans Are Ready to Admit They Have a Nazi Problem
Today in the wild world of internet culture… adult My Little Pony fans take on the Nazis among them.
That’s all for today...
Back tomorrow with more.
Become a member for $5 per month
If you were forwarded this newsletter and you like it, you can subscribe here.