
futuribile / curating futures - Issue #10 #SeptemberIsTheNewJanuary

September 5 · Issue #10
Aloha! And Happy New Year! September is THE MONTH for beginnings (because unless you live in the Southern hemisphere, what do you want to start in a cold, dark month like January? OK, the fiscal year, yu-uh). Go wild with your own resolutions, but between a workout and a management plan, make some space for empathy. You can start from “Reattaching for convenience: 9 passive-aggressive email phrases that must end now”.
Aficionados ❤️: you will find some changes in the organisation of the content, feedback is welcome. Newcomers: this newsletter is all about rethinking and narrating tech and the evolution of our species, beyond the hype.
1. The Next Generation Internet Awards are open: please share the info if you know anybody who is engaged in making the Internet a more inclusive, opportunity-generating space (through research, products, or activism). If you want to catch up with the NGI vision, there’s nothing better than this interview with the unbreakable Rob van Kranenburg.
2. The P2P Models team is still looking for a project manager and a senior dev! Join a game-changing project about blockchain and a wonderful group (I suggest their latest blog post to get a sense of the level of conversation you can have at the coffee machine).
3. I will be in Porto on September 13 at the NGI Forum, moderating a session about decentralised tech with people who are getting their hands dirty and not just talking about it.

Blockchain ta mère
Ethereum is full of ponzis. Is that a problem? 🤔 It depends on your perspective, darling:
Ethereum may be playing a role here that is akin to those charities that provide free needles to drug users. Both drug usage and running a ponzi are punishable offences. Since criminalizing drugs doesn’t stop usage, perhaps the best we can do is provide a safe environment for drug users and ponzi players.
The eternal problem with cryptocurrencies is that everybody is talking about them, but few really understand how they work. And even fewer people can make them work (in terms of both personal competencies and machine capacity), which IMHO is a big governance problem (“peer network” ta mère).
Cory Fields from the MIT Media Lab Digital Currency Initiative published a detailed report on how a critical vulnerability in Bitcoin Cash was (safely) handled. He presents his writing as a warning to companies that are jumping on crypto without understanding how much engineering sophistication cryptocurrencies demand. Or, as he puts it:
I help develop and maintain Bitcoin Core, Bitcoin’s primary software implementation. Because of that work, I’m often asked at conferences and workshops what I consider to be Bitcoin’s greatest challenge in the future. My answer is always the same: avoiding catastrophic software bugs.
Somebody bothered to debunk a recent article from the Washington Post as a means of demonstrating that media coverage of bitcoin is still a total disaster.
Meanwhile, the World Bank has mandated the Commonwealth Bank of Australia to arrange the world’s first blockchain bond. Their pitch: blockchain could hugely streamline the process of issuing bonds, which has relied heavily on physical paperwork for the past 200 years. BTW, it’s called the “Blockchain Offered New Debt Instrument”, or “bond-i”, a nod to Sydney’s famous Bondi Beach. 🏄🏾‍♀️
Zooming out
Human-centered design has become the go-to operating system for innovation. However, is it really helpful in solving human problems like poverty and climate change? It can be argued that human-centered design is not architected to solve systemic problems:
Human-centered design is all about focus. It’s about observing the big picture and then zeroing in on a manageable set of insights and variables, and solving for those. By definition, this means the process pushes the designer to actively ignore many of a problem’s facets. And this kind of myopic focus doesn’t work when you’re trying to solve something systemic.
The article also cites a study on ride-sharing apps, a category of companies that rely heavily on user-centered design. It found that ride sharing adds 2.6 vehicle miles to city traffic for every one mile of personal driving it removes. Ride-sharing apps actually make traffic in cities worse. 😱😱
The hidden pollution of the internet, for dummies
Are the digital commons condemned to become “capital commons”? Recently, Katherine Maher, the executive director of the Wikimedia Foundation, wrote on Wired that Facebook and Google must do more to support Wikipedia:
If Wikipedia is being asked to help hold back the ugliest parts of the internet, from conspiracy theories to propaganda, then the commons needs sustained, long-term support – and that support should come from those with the biggest monetary stake in the health of our shared digital networks.
On the P2P Foundation blog, Maia Dereva reflects on the implications of “capital commons”: should digital giants do more and significantly address the long-term sustainability of the Digital Commons that Wikipedia represents? Are we going back to industrial paternalism? The proposed solution is “enhanced reciprocity licensing”, which would prohibit lucrative commercial entities from reusing common resources, or impose funding on them in return.
Data-driven capitalism is also accused of being one of the main drivers of the commons crisis in this post. BUT! The perspective is completely different. The most valuable part for me is how it challenges the granularity of the data collected to deliver public services, using health data as an example (and referring to ProPublica’s story Health Insurers Are Vacuuming Up Details About You — And It Could Raise Your Rates). Basically, the way we collect data - and create the personas and statistics that inform public services - plays a huge role in creating inequalities:
When a population is aggregated on high-level data points like age and location, we’re essentially being judged on a simple shared commons – all 18 year olds who live in Los Angeles are being treated essentially the same, regardless if one person has a lurking gene for cancer and another will live without health complications for decades. In essence, we’re sharing the load of public health in common – evening out the societal costs in the process. But once the system can discriminate on a multitude of data points, the commons collapses, devolving into a system rewarding whoever has the most profitable profile.
Bodies and selves
(Find the complete boobs-posting decision diagram in the article below)
Is Instagram a tech daddy? I think it might be. If so, my tech daddy sets rules that don’t always make sense. My tech daddy removes content without telling people. And my tech daddy, like all other daddies, has no idea what to do with breasts. So, like the curious girl I am, I’m testing his limits.
The 💪 story of a trans person trolling Instagram’s breast policy. Because not everything is black or white, nor fully normative. Good luck handling it all with algorithms. When we challenge automatic classification, we challenge first and foremost our human understanding and narration of the topic, which is unavoidably a narrow version of reality. Voices in the Harvard Business Review are even suggesting that we should use algorithms for less-biased decisions: they are not purely objective, but neither are humans. We can’t help simplifying complexity in order to understand it: let’s just remember from time to time that reality is convoluted. For instance: why don’t we hear fat women’s #MeToo stories?
While thin women were free to talk about sexual assault as being somehow divorced from desire — rape is about power, not sex — I didn’t have that luxury. As a fat woman, my body was seen as inherently undesirable. Fat women are expected to be grateful for any expressions that could be mistaken for want, including assault and harassment.
Facial recognition, or the new life of determinism: dive into the magic world of psychologist Michal Kosinski (the guy whose research on the correlation between Facebook activity and personality traits was famously exploited by Cambridge Analytica). Striving to push the boundaries of AI further (“I can be upset about us losing privacy, but it won’t change the fact that we already lost our privacy, and there’s no going back without destroying this civilisation.”), the researcher’s work goes straight into biological determinism, denying the influence of social and environmental factors. Sometimes he stumbles upon unexpected findings, like facial recognition of homosexuality (a piece of research dubbed “AI gaydar”).
Fun fact: a facial recognition software sold by Amazon mistakenly identified 28 members of US Congress as people who had been arrested for crimes. 😃
Here you can find some expert ideas about how to regulate facial recognition.
Apparently, some accents don’t work on Alexa or Google Home: as voice recognition advances, it’s leaving behind users with an accent. I feel discriminated against.
The Web we live in
Once, I seriously considered doing a PhD. I wanted to focus on the role of social media in supporting bottom-up movements (it was the Tahrir Square, Indignados and Occupy moment), and on the relationship between digital mobilisations and the physical space of cities. I’ll go back to those origins by sharing this complete (although quite US-centric) account of the steady sinking of all social media riot dreams from 2011 onwards.
Perhaps the simplest statement of the problem, though, is encapsulated in Facebook’s original mission statement (which the social network changed in 2017, after a backlash against its role in spreading misinformation). It was to make the world “more open and connected.” It turns out that this isn’t necessarily an unalloyed good. Open to what, and connected how? The need to ask those questions is perhaps the biggest lesson of all.
The dystopian cherry on top was the summer coverage of Black Elevation, a fake group on Facebook which created real protest rallies. Apparently the creators did so by building on the back of existing groups. Once they persuaded several members of a group to trust them, they had a “stamp of approval”, which let them build out their reach in the community.
The empire strikes back
In early August, The Intercept uncovered Google’s plans to go back to China: a search app that would “blacklist sensitive queries” could be launched in six to nine months, according to documents and people familiar with the Dragonfly plan. The move is certainly pragmatic: China now has more than 750 million internet users, equivalent to the entire population of Europe.
The Dragonfly disclosure of course sparked a backlash. A coalition of 14 human rights organizations issued an open letter addressed to Google’s CEO, Sundar Pichai. Google employees woke up from their golden cage and started circulating and signing a letter of protest. Which is quite good if you consider that it is precisely these privileged workers who can make the difference and pull the plug: more and more highly sought-after profiles are refusing job interviews at big tech companies because of their positions on ethical issues (check the hashtag #TechWontBuildIt). Another example of how the “us / them” narrative against big tech is more complicated than it seems. Most of their workforce is on “our” side (whatever that means).
Let’s go back to China. The biggest search engine in the world obeying Chinese censorship is a victory for the Chinese government. And a very dangerous precedent. Remember last year’s debate around the plans for a new Chinese ID / social ranking system? The surveillance system is expanding abroad, targeting the Uighur ethnic minority. Beijing has launched an unprecedented global campaign to get them back, or to monitor them where they are, via a global registry of Uighurs who live outside of China. What if companies based outside of China start (even indirectly) to support this?
To end on a positive riot note, researchers at Citizen Lab have figured out some ways to dodge censorship on WeChat: one of the more effective methods they’ve discovered seems to be sharing images instead of text, since text can be easily caught by censors.
Oh, dear!
Patrick Gerard
Facebook is pushing that "share a memory" junk where they make custom videos out of your old photos to boost engagement and I just literally got shown a bunch of happy cartoon characters dancing on my mom's grave. 🤦‍♂️
That’s all! Thanks for reading. If you enjoyed the ride you can express your support by inviting new people to subscribe.