Newsletter - Coda Story
https://www.codastory.com/tag/newsletter/

An unseen interview with Vladimir Kara-Murza: ‘Putin’s propaganda has taught us not to trust a lot of things’
https://www.codastory.com/disinformation/vladimir-kara-murza-russia-political-prisoner/
Thu, 07 Mar 2024

Following Alexei Navalny’s death, Vladimir Kara-Murza is now the highest-profile political prisoner in Russia. In this previously unseen interview from 2019, he talks about being poisoned, what keeps him awake at night and why people in the West shouldn’t take their freedom for granted.

On the day of Alexei Navalny’s funeral in Moscow last week, I held my nose and turned on Russian state television. The evening news on state TV, which is still watched by millions across Russia, led with a funeral. Except it wasn’t Navalny’s. Nikolai Ryzhkov, a former prime minister of the Soviet Union who died at the age of 94 on February 28, was buried the same day. He lay in state in Moscow’s main Christ the Savior Cathedral, surrounded by a handful of solemn apparatchiks from Russia’s ruling party. There was no mention on state TV of the alternative vision of Russia that was being buried that day, or of the tens of thousands of people who defied the heavy police presence and walked across the city to pay their final respects to Navalny.

With Navalny dead, the chilling title of Russia’s highest-profile political prisoner now belongs to Vladimir Kara-Murza. Kara-Murza, a 42-year-old historian, journalist and opposition politician, is currently held in Siberia, in the same type of punitive solitary confinement cell that Navalny had occupied before he was transferred to a penal colony in the Arctic, where he died on February 16. 

Kara-Murza, who comes from a long line of Russian dissidents, was arrested in April 2022 and sentenced to 25 years in prison for charges related to his criticism of President Vladimir Putin and the war in Ukraine. By then, he had already survived being poisoned twice, once in 2015 and again in 2017. In February 2021, a Bellingcat investigation uncovered that Kara-Murza had been followed by the same unit of Russia’s Federal Security Service that allegedly poisoned Navalny with a nerve agent in 2020. Kara-Murza was also a close associate of Boris Nemtsov, another slain Russian opposition politician who was gunned down in Moscow in 2015. 

The full-scale invasion of Ukraine, the eventual murder of Navalny, and Russia’s descent into a Stalinesque dictatorship were still only hypothetical scenarios when Kara-Murza and I sat down for an interview in Tbilisi, Georgia, five years ago. He was there for a conference. We talked about disinformation, his hopes for Russia and what makes him angry. (Spoiler: It was the complacency of the so-called “golden billion,” the people who live in democratic countries and take their freedoms for granted, while so much of the world is not free.)

The interview was supposed to become part of a larger Coda project that never got off the ground, thanks to Covid. In the years that followed, I assumed the interview had been lost amid Coda’s pandemic-era transition to a fully remote team, but recently we found the footage. We are publishing it now; the transcript below has been edited for clarity.

https://youtu.be/5mMNlEsp_7A

What keeps you awake at night?

The thought that Russia, my country, one of the most cultured and one of the most beautiful nations in the world has for two decades now been ruled by a corrupt and authoritarian kleptocracy that is stealing from its own people and that is violating the most basic rights and freedoms of its own people. That’s not a normal situation.

What do you think has been your country’s biggest mistake?

I will separate my country from the government of my country and I think the biggest mistake of the government of my country was the failure in the early 1990s to fully reckon with our communist and our Soviet past and to fully reckon with the mistakes and the crimes of that Soviet and that communist past. The Russian people were ready for it in the 1990s but the Russian government at that time was not up to the task.

I’ll say a word, and you finish the sentence. Disinformation is…

Lying.

Give me an example of a fake news story that fooled you.

I think almost two decades of Putin’s propaganda has taught us not to trust a lot of things that we hear so I’m not even able to think of one immediately.

Give me an example of a fake news story that has had a huge impact on the world.

Well, there was a story in 2014 on Channel 1 of Russian state television about Ukrainian soldiers crucifying a Russian child in the town of Sloviansk, and it caused, as it would for any normal human being, an outpouring of anger and grief and resentment. And then, of course, it turned out to be completely and utterly fake, just completely made up. This is just one small example in a vast sea of propaganda and lying and disinformation put forward by the Putin regime.

How do we stop democracy turning into plutocracy?

Well, as Winston Churchill said, democracy is a very flawed system but it’s the best one of everything that’s been created, so I think if a democracy functions properly and the institutions function, that in itself is the best guarantee against turning into plutocracy. Because when you have a government that is transparent and accountable to its own citizens, the citizens will not allow it to become a plutocracy.

Is technology helping dictators or democrats?

You know, the sun shines on both good people and on criminals, so I think in the same way, technology can be used for good and for bad and dictators have certainly been savvy very often with using modern technology and so should we be, and I can tell you that, you know, today in my country, in Russia, the internet and the social media are a major instrument that supports the civil society and the democratic movement and the pro-democracy movement, and I’ll give you just one example: Last year, in 2017, there was an investigative video put forward by the Anti-Corruption Foundation led by Alexei Navalny about the corrupt dealings of current Russian Prime Minister Dmitry Medvedev, and of course, if you watched Russian state television, you wouldn’t hear a single word about it and yet tens of millions of people watched that investigative video on YouTube, on Facebook, on Twitter, and tens of thousands went on the streets all over Russia to protest against it. And I think that gives a powerful example of how important and how influential independent information space in modern technology can be.

What’s the biggest benefit of the Trump presidency?

I think it’s a very… I think the Trump presidency in many ways is a very useful reminder to those people who may have harbored some illusions that our job will be done for us by somebody else, that somebody else will come in from the outside and solve all our problems, that it’s not going to happen and it shouldn’t happen because it’s only for us, it’s only for the citizens of Russia to effect change in our country, to change the political situation in our country, to return democracy to our country. It’s not going to be done by Trump or Obama or Bush or Merkel or Macron or anybody else. This is only for us to do, it’s for us to sort out the situation in our own country, and I think the attitude of the current U.S. administration is a very good reminder of that.  

What’s one thing you would tell President Trump?

I will tell him what I will tell any Western leader if I were to meet them, is that, you know, if your country and your government and your system claims to adhere to the values of democracy and human rights and rule of law, then act on it and please stop enabling the crooks and the kleptocrats and the Kremlin by giving them safe havens in your countries for their looted wealth, for their bank accounts, for their real estate, for their families, which is what the West has been doing for many, many years. And that is why it is so important for those countries that have passed the Magnitsky laws, which are the laws imposing personal targeted sanctions on crooks and human rights abusers, to implement those laws to the full extent, and that includes the United States. And that’s why it’s so important for the countries that have not yet passed the Magnitsky laws to pass them and to implement them.

What is one thing you would say to Vladimir Putin?

I have nothing to say to that man. He knows full well what he’s doing. He’s an intelligent person and everything he’s been doing to our country for the last two decades, he’s been doing on purpose and I have nothing to say to this man.

What is Putin’s biggest nightmare? 

The answer is very clear: It’s the people on the streets. We saw how scared and how frightened they were in December 2011, when we had tens of thousands of people on the streets of Moscow protesting against the Putin regime. For a few days, the regime was caught completely by surprise. For the first time in their time in power, they had lost the initiative and they were on the defensive, and you could see the terror in their eyes. Frankly, if you look even at the faces of the policemen who came to the Sakharov Avenue protest on December 24, 2011, when we had something like 120,000 people on the streets of central Moscow protesting against the Putin regime, you can read the fear in the eyes of those police officers, and you can read that the biggest thing they were afraid of is that they would be given the order to shoot. And they didn’t want to shoot, because these people are their friends, their neighbors, their relatives. That is the biggest nightmare for the Putin regime, just as it has been a nightmare for so many authoritarian regimes all over the world, including here in Georgia.

What should the Western liberal democracies fear more: the government of China or the government of Russia?

You know, I think human rights are universal and the rule of law is universal and the principles of democracy are universal, so I don’t want to sound as if I think different standards should be applied to different countries. No, human rights are for everybody, the citizens of China and the citizens of Russia. But I do think that it is important to remember that Russia is a member of the Council of Europe, that Russia is a member of the Organization for Security and Co-operation in Europe, and everything that Vladimir Putin has been doing both in his domestic and his foreign policy over the last two decades is breaking and violating the most fundamental rules of those organizations, and so I think for that reason Western countries should be much more indignant about what Vladimir Putin’s regime has been doing.

What makes you angry about the world?

About how few people care about the violations of the rights of people in other countries. It’s, you know, the so-called golden billion, the people who live in successful democracies. They very often forget that the vast majority of the population of our globe live in countries that are not free, live in conditions that are not free and they lack the basic rights and freedoms that so many people in the West take for granted. I think it’s very important for the people in the West not to lose sight of that.   

What makes you hopeful?

When I look at the tens of thousands of young people who have been and continue to come out to the streets of cities and towns all across Russia to voice their protest against the endemic corruption and the authoritarianism of the Putin regime despite the pressure and the dangers and the threats and the beatings and the arrests. That really makes me hopeful about the future of my country and about the future of the world as well. 

What’s one film everyone should see?

It’s going to be a tough competition, but given everything that we’ve just been talking about, I’d say watch the “Trial at Nuremberg” [“Judgment at Nuremberg”]. It’s a film that made one of the most profound impacts on me, I can tell you that, and that phrase there at the end of the film, when the young prosecutor is talking to this elderly American judge about the situation, I’m not going to give a spoiler, you’ll know what I mean when you see the film, but what that elderly judge answers is that “Yes, what you’re saying is correct, is factually correct, but there is nothing on this earth that makes it right.” I think that’s a very important message to remember in our time as well.

One book everyone should read?

My favorite is “Master and Margarita” by Bulgakov, but I think I’m biased towards Russian literature.

When was the last time you felt really scared?

I suppose it was in the early hours of February 2 of 2017, the second time I was poisoned in Moscow, and I knew what it was because it had happened before so I knew the symptoms and I knew I only had a few hours left of being conscious before falling into a coma again and it was… The scariest thing was not being able to breathe. When you make this movement that every person makes every day, every minute, to take in the air and you feel as if the air is not coming in, you feel as if you’re suffocating, and that’s very painful but also a very frightening experience. 

What does the world look like in five years?

Hopefully with a Russia that has a government that respects the rights of its own people and that respects the rule of law and democracy at home and that behaves as a responsible citizen on the international stage.

This piece was originally published as the most recent edition of the weekly Disinfo Matters newsletter.

How tech design is always political
https://www.codastory.com/authoritarian-tech/tech-design-ai-politics/
Thu, 29 Feb 2024

Social media companies have made many mistakes over the past 15 years. What if they’re repeated in the so-called AI revolution?

Facebook has a long-maligned yet still active feature called “People You May Know.” It scours the network’s data troves, picks out the profiles of likely acquaintances, and suggests that you “friend” them. But not everyone you know is a friend.

Anthropologist Dragana Kaurin told me this week about a strange encounter she had with it some years back.

“I opened Facebook and I saw a face and a name I recognized. It was my first grade teacher,” she told me. Kaurin is Bosnian and fled Sarajevo as a child, at the start of the war and genocide that took hundreds of thousands of lives between 1992 and 1995. One of Kaurin’s last memories of school life in Sarajevo was of that very same teacher separating children in the classroom on the basis of their ethnicity, as if to foreshadow the ethnic cleansing campaign that soon followed.

“It was widely rumored, first, that our teacher took up arms and shot at civilians, and second, that she had died during the war,” she said. “So it was like seeing a ghost.” The teacher, now at retirement age, had a profile that showed her membership in a number of ethno-nationalist groups on Facebook.

Kaurin spent the rest of that day feeling stunned, motionless. “I couldn’t function,” she said.

The people who designed the feature probably didn’t anticipate that it would have such effects. But even after more than a decade of journalists like The New York Times’ Kashmir Hill showing various harms it could inflict — Facebook has suggested that women “friend” their stalkers, sex workers “friend” their clients, and patients of psychiatrists “friend” one another — the “People You May Know” feature is still there today.

From her desk in lower Manhattan, Kaurin now runs Localization Lab, a nonprofit organization that works with underrepresented communities to make technology accessible through collaborative design and translation. She sees the “People You May Know” story as an archetypical example of a technology that was designed without much input from beyond the gleaming Silicon Valley offices in which it was conceived.

“Design is always political,” Kaurin told me. “It enacts underlying policies, biases and exclusion. Who gets to make decisions? How are decisions made? Is there space for iterations?” And then, of course, there’s the money. When a feature helps drive growth on a social media platform, it usually sticks around.

This isn’t a new story. But it is top of mind for me these days because of the emerging consensus that many of the same design mistakes that social media companies have made over the past 15 years will be repeated in the so-called “AI revolution.” And with its opaque nature, its ability to manufacture a false sense of social trust and its ubiquity, artificial intelligence may have the potential to bring about far worse harms than what we’ve seen from social media over the past decade. Should we worry?

“Absolutely,” said Kaurin. And it’s happening on a far bigger, far faster scale, she pointed out.

Cybersecurity guru Bruce Schneier and other prominent thinkers have argued that governments should institute “public AI” models that could function as a counterweight to corporate, profit-driven AI. Some states are already trying this, including China, the U.K. and Singapore. I asked Kaurin and her colleague Chido Musodza if they thought state-run AI models might be better equipped to represent the interests of more diverse sets of users than what’s built in Silicon Valley.

Both researchers wondered who would actually be building the technology and who would use it. “What is the state’s agenda?” Kaurin asked. “How does that state treat minority communities? How do users feel about the state?”

Musodza, who joined our conversation from Harare, Zimbabwe, considered the idea in the southern African context: “When you look at how some national broadcasters have an editorial policy with a political slant aligned towards the government of the day, it’s likely that AI will be aligned towards the same political slant as well,” she said.

She’s got a point. Researchers testing Singapore’s model found that when asked questions about history and politics, the AI tended to offer answers that cast the state in a favorable light.

“I think it would be naive for us to say that even though it’s public AI that it will be built without bias,” said Musodza. “It’s always going to have the bias of whoever designs it.”

Musodza said that for her, the question is: “Which of the evils are we going to pick, if we’re going to use the AI?” That led us to consider that a third way might be possible, depending on a person’s circumstances: to simply leave AI alone.

This piece was originally published as the most recent edition of the weekly Authoritarian Tech newsletter.

Samoa’s official Facebook page is a battleground between anti- and pro-vaxxers
https://www.codastory.com/polarization/samoa-vaccines-facebook/
Mon, 23 Dec 2019

Hi everyone, Inge here, Coda’s Impact Editor. This week, I want us to dip our toes into the real-life implications of vaccine skepticism. With two friends who are anti-vax, and my son currently attending one of the notoriously vaccine-skeptic Waldorf schools, I’ve heard all the arguments before. On both sides, the discussion is tainted with strong emotions about the wellbeing of our children.

As a parent, when your child is sick, it’s the worst feeling in the world: you feel helpless. As I write this, my son is lethargic on a couch with a nasty ear infection. But it’s just an ear infection, and I know it will pass. 

Parents in the small Pacific country of Samoa haven’t been that lucky. The island nation, with a population of less than 200,000, has been struggling with a major measles outbreak for a few months now. The disease has so far infected around 4,500 people and killed 76, more than 60 of whom were younger than four.

The outbreak has animated both sides of the vaccination argument. People from all over the world have been passionately arguing about the subject on the government’s official Facebook page.

This horrible tragedy shows what happens when vaccine skepticism takes a hold. 

Vaccination rates in Samoa plummeted to just 31% after July 2018, when two young children died after receiving vaccinations. It was a tragic accident. The nurse administering the vaccines had mixed the measles-mumps-rubella vaccine powder with expired muscle relaxant instead of water. But for those already fearful of vaccines, the deaths served as proof that vaccines are dangerous.

With children dying of measles, you may think this clearly shows how dangerous it is to decline vaccinations. But for those who worry vaccines might kill their children instead, a deep-rooted fear has led them to believe in the existence of a massive cover-up in Samoa. And they are sharing their views on the government’s official Facebook page.

Prior to December 5, the government’s official Facebook page had only seven reviews. The recommendations all praised the government and seemed to have come from a few of Samoa’s residents.

But that all changed when Samoa began a mandatory immunization drive in November to vaccinate 90% of the population. Authorities also arrested a prominent anti-vaccination campaigner, Edwin Tamasese, charging him with incitement against a government order. Tamasese had described the government’s vaccination program as “the greatest crime against our people” and falsely claimed vitamin C could cure the infected children.

Within a week, Samoa’s Facebook page became the battleground of a war between anti- and pro-vaxxers.

“A tyrannical government run by the cartel to arrest a man for giving Vit A and C to dying children with measles (who improved), while pharmagov gave out ineffective Tylenol and antibiotics. Government of Samoa places revenue and pride ahead of health safety,” one Facebook user wrote while giving a 0-star review.

Renee DiResta, a disinformation expert recently interviewed by Coda reporter Isobel Cockerell on how anti-vaxxers get around Instagram’s new hashtag controls, noticed an increase in negative reviews regarding the government’s new vaccination policy.

“A brigade of antivax Karens from the US descending upon the FB page of the Government of Samoa *to leave 1-star reviews* is...next level,” she tweeted.

In response to the reviews from anti-vaxxers, hundreds of Facebook users took to the page to support the government’s mandatory vaccine policy. 

For many, it’s hard to understand why people would be so vehemently anti-vax, and the outbreak in Samoa exemplifies how a refusal to vaccinate can kill. But for the anti-vax community, the belief that vaccines kill runs deeper, tied to a decline in trust in expertise.

As mentioned in almost every negative review on Samoa’s FB page, the community believes that the medical establishment, often described as Big Pharma, is paying scientists and governments to push unsafe vaccines. In their defense, it wouldn’t be the first time: medical companies have influenced research on the effects of sugar, exaggerated the importance of cereals for breakfast and downplayed the addictive nature of opioids, resulting in a massive addiction crisis in the U.S. These scandals have contributed to a growing distrust of science and medicine.

Starting in 2020, we are launching our new coverage on the war on science, and we’ll be investigating why people lose trust and how some bad actors take advantage for monetary gain or political power. We hope our coverage will uncover the many layers of this discussion.

My favorite Coda Stories this week:

We recently reported on the disinformation surrounding the measles outbreak in Ukraine. Like Samoa, Ukraine is one of the countries hardest hit, with more than 115,000 cases and 40 deaths since 2017.

Other recommended reads:

  • Disinformation has infiltrated the media and elections in the West. In Italy, Russian propaganda now seems to have played a role in a murder trial. (New York Times)
  • Read this special issue of the Columbia Journalism Review which focuses on disinformation. (CJR)
  • Listen to this Power 3.0 podcast interview with Coda Story’s co-founder Natalia Antelava, in which she discusses new and innovative ways to report disinformation, as well as the impact of authoritarian technologies.

What to expect from Authoritarian Tech in 2020
https://www.codastory.com/authoritarian-tech/authoritarian-tech-2020/
Thu, 19 Dec 2019

Hi, it’s Mariam Kiparoidze, an associate producer at Coda Story. If you subscribe to our Disinformation Matters newsletter, you might know me from there, but this is my first Authoritarian Tech newsletter to you.

And since we’re jumping into a new year, this week, I decided to ask tech experts for their predictions about authoritarian tech in 2020. Here’s what they said:

Cities will increasingly become smarter

“I think we will see ever more powerful or ever more ubiquitous smart city systems being installed around the world,” said Peter Bihr, managing director of The Waving Cat.

The Waving Cat is a research and strategy company in Germany that explores how emerging technologies can have positive social impact. 

Bihr sees a trend towards more surveillance and control from authoritarian regimes, like China, across all digital spheres, but most visibly around Smart City tech. 

The tech supply companies often come from supply chain management or data analytics fields, said Bihr. 

“So if the background is data analytics and tracking people’s behavior, then that is what they’ll do in whatever smart city project they do. They would track people in public spaces just as they would on the web.” 

Think of the huge plan Google’s Sidewalk Labs has for making Toronto’s shoreline smarter.

So what?

These seemingly benign mechanisms can turn dark pretty quickly. The tech companies use what they may have already learned in industrial settings and apply their findings to public spaces, said Bihr.

However, Bihr believes the future is not so grim. As big tech companies face a crisis of legitimacy and trust, the issue has sparked an active debate among policymakers.

“There’s also for the first time a debate in policy circles, that acknowledges that there are in fact human rights implications, and we need to first take a look at them,” Bihr told me.

Internet blackouts aplenty

“I think the most concerning development is the increasing use of internet blackouts to cover up protests and repression,” said Dr. Jennifer Cobbe, who researches technologies and law at the University of Cambridge. 

Throughout this past year we’ve seen anti-government protests and crackdown on dissent in a number of countries from Russia to Hong Kong. 

Check out Coda’s stories from Hong Kong and Lebanon, where governments have managed to turn tech into tools of repression.

Governments in Iran, Iraq, Sudan and Kashmir went as far as shutting down the internet to curb protests. “I’d be very surprised if we don’t see more of this in 2020,” said Dr. Cobbe.

Why it matters:

“For all the internet’s many problems, it’s still one of the primary ways by which vulnerable and oppressed people can reach the rest of the world,” said Dr. Cobbe.

“The more that authoritarian governments are willing to shut down the internet, the less chance those people have of reaching us, and the less chance we have of mobilizing to help them.”

Welfare is getting more digital in the coming year

While researching 2020 trends, I reached out to Big Brother Watch, a UK-based organization which exposes threats to privacy and civil liberties from new technologies.

“We'll see how dramatically public services are shifting into automation,” Big Brother Watch’s director Silkie Carlo told me. 

She says “obscure computer systems have embedded socio-economic policies that are harder to challenge than the paper-based bureaucracy we're used to.” 

Carlo says the decision-making process behind the allocation of welfare benefits is becoming increasingly digitized and opaque. At the same time, expanding systems of surveillance will govern the lives of many people.

Profiling and stereotyping may be an old issue, but what worries Carlo is that it is now happening under the guise of innovation.

“The very concept of these kinds of predictive analytics is dangerous, and it also necessitates growing systems of mass surveillance and data collection.”

Why you should care:

Carlo told me that even in democratic countries the relationship between citizens and the state is being rebalanced. “It’s not in our favor,” she said. 

There is no better example than the shocking story of Ruth Cherry, a young woman with multiple disabilities in Glasgow, Scotland. For Coda Story, journalist Emily Reynolds wrote about Cherry, who depends on a remote “telecare” system for help. 

My favorite Coda Story this week:

  • Coda Story’s Chaewon Chung explores how North Korean refugees’ lives have been affected after a hacker stole personal information from hundreds in a refugee community in South Korea. “For North Korean refugees, vulnerabilities in the computer system of the institution that holds so much of their information raised serious concerns,” writes Chaewon.

Recommended reading:

  • Reuters’ latest investigation reveals how former White House officials were involved in forming a secret surveillance unit to serve the United Arab Emirates’ agenda in spying on activists, journalists and dissidents. (Reuters)
  • The New York Times has published a great oral history project on what has become of tech in the past decade. (New York Times)

A final note this week:

I recommend this Power 3.0 podcast interview with Coda Story’s co-founder Natalia Antelava. In this episode, from The International Forum for Democratic Studies, Antelava discusses new and innovative ways to report disinformation, as well as the impact of authoritarian technologies.

We will be taking a season break from the Authoritarian Tech newsletter next week. I wish you all happy holidays!

Send your feedback or tips to me at mariam@codastory.com

The post What to expect from Authoritarian Tech in 2020 appeared first on Coda Story.

Data protection comes to India https://www.codastory.com/authoritarian-tech/data-protection-india/ Fri, 13 Dec 2019 13:15:39 +0000 https://www.codastory.com/?p=10402 After a number of delays, India has finally proposed a set of rules for data protection laws. While the proposed legislation could place restrictions on how companies like Facebook and Twitter use information from India’s 600 million internet users, it also risks balkanizing the internet with the types of government controls seen in countries like

The post Data protection comes to India appeared first on Coda Story.

After a number of delays, India has finally proposed a set of rules for data protection laws. While the proposed legislation could place restrictions on how companies like Facebook and Twitter use information from India’s 600 million internet users, it also risks balkanizing the internet with the types of government controls seen in countries like China.

According to the new rules, which were leaked on Tuesday, the Indian government would require technology companies to obtain consent from users before collecting and processing their personal data. A newly proposed government agency, the Data Protection Authority, would write specific rules, monitor how companies apply them and mediate disputes.

While the bill, dubbed the “Personal Data Protection Bill 2019,” addresses long-standing needs for data protection in India, it also raises worrying concerns about authoritarian overreach by the government.

According to the current draft, which is expected to be debated by lawmakers over the next few weeks, the rules would allow the government to “exempt any agency of government from application of Act in the interest of sovereignty and integrity of India, the security of the state, friendly relations with foreign states, public order.”

Of equal concern is how the bill mandates the use of public data by the government. One rule seeks to grant the government the power to ask any “data fiduciary or data processor” to hand over “anonymized non-personal data” with the aim of helping legislators deliver better governance and services to their citizens. 

While the bill is unclear, this could essentially allow the government to request user data from social media platforms like Facebook and Twitter. 

Critics of the bill have been quick to highlight their concerns. Udbhav Tiwari, a public policy advisor at Mozilla, said the bill would “represent new, significant threats to Indians’ privacy. If Indians are to be truly protected, it is urgent that the Parliament reviews and addresses these dangerous provisions before they become law.”

The draft has also come under fire from a judge who led the committee that drafted an earlier version of the bill. Justice BN Srikrishna said the new version is “dangerous” and could turn India into an “Orwellian state.”

“They have removed the safeguards,” said Srikrishna. “That is most dangerous. The government can at any time access private data or government agency data on grounds of sovereignty or public order. This has dangerous implications.”

Rahul Matthan, a partner at the Indian law firm Trilegal, said that businesses had real concerns as to whether the new data authority would have the capacity to manage all of its responsibilities. “We are expecting this Data Protection Authority to be at the standard of a G.D.P.R. without any experience,” Matthan told the New York Times. “That’s a tall ask.”

You can read a good breakdown of the bill in this Twitter thread by MediaNama.

My favorite Coda piece this week: 

Scant coverage has been given to Algeria, where a popular protest movement against an entrenched regime has mobilized weekly protests for over 40 weeks. Ahead of a presidential election last Thursday, we published this story by Layli Faroudi about how pro-regime social media accounts, dubbed “electronic flies”, targeted platforms like Facebook and Twitter with fake news and fear mongering.

Recommended reads:

  • China is about to become the first major economy to issue sovereign digital money. According to reports, a digital currency has been in development in China for several years and may be rolled out in a small-scale pilot in Shenzhen as early as this month, before expanding in 2020. Important questions about privacy and anonymity, as well as how much transaction information the government will be able to access, remain unanswered. (MIT Technology Review)
  • From offering users the chance to take a Tai Chi class, fly above New York City or even have a guided meditation with Jesus, virtual reality headsets like the Oculus Quest are having a major effect on retail, science and security. In this entertaining and often funny essay, Patricia Marx tries out the best of the Oculus store. (New Yorker)

If you have any tips about how governments and companies use authoritarian technology, you can send me an email at burhan@codastory.com or contact me on @BWazir1.


When the internet stopped being fun https://www.codastory.com/authoritarian-tech/internet-fun/ Thu, 05 Dec 2019 14:12:20 +0000 https://www.codastory.com/?p=10171 As this decade comes to a close, I keep hearing variations of the same nostalgia-tinged question about the internet: “When did we all stop having fun?” In November, the New York Times Magazine devoted an entire issue to the question. “The Internet Dream Became a Nightmare. What will become of it now?”  one headline ran.

The post When the internet stopped being fun appeared first on Coda Story.

As this decade comes to a close, I keep hearing variations of the same nostalgia-tinged question about the internet: “When did we all stop having fun?” In November, the New York Times Magazine devoted an entire issue to the question. “The Internet Dream Became a Nightmare. What will become of it now?” one headline ran. The digital edition was published with colorful nostalgic touches like fake pop-up ads and retro mouse cursors. 

In her new memoir, Trick Mirror: Reflections on Self-Delusion, the New Yorker’s Jia Tolentino tracks down the website she built for herself at ten years old. Using her father’s office computer, she blogged, writing prophetically: “I’m going insane! I literally am addicted to the web!”  

“In 1999, it felt different to spend all day on the internet,” Tolentino writes. “People had been gathering on the internet in open forums, drawn, like butterflies, to the puddles and blossoms of other people’s curiosity and expertise.”

I remember 1999 too. At my London primary school, a Canadian boy, Pierce, told us about Google. “You can search for anything on Google, anything in the world, and you don’t need a web address, you can just use normal words,” he said. “And it will give you the answers.”

No one had heard of it. The concept seemed too huge and all-consuming for us to understand.

“What’s it called again?”

“Google.” 

We laughed at the strange name, and thought Pierce was probably making it up.

“I remember what the internet was like before it was being watched, and there has never been anything in the history of man like it,” Edward Snowden said in the 2014 documentary Citizenfour, describing how children once freely conversed with world-renowned experts halfway across the world. “We’ve seen the chilling of that model towards something where people self-police their own views.”

I was too young to really take part in this great conversation. Instead I played Miniclip games and watched the badger mushroom video on a loop, streamed on a website that wasn’t YouTube. The internet was, at its heart, fun – the only spectre that loomed was my parents’ terror of “Chat Rooms.”

For Dr. Christopher Markou, an artificial intelligence expert at Cambridge University whom I met in October at a tech conference in Armenia, the arrival of Facebook was a joyous moment. “It was really cool! I could go on and be like, ‘Who likes Radiohead at my university?’ And I could go and make friends with everyone who has the same music as me,” he said. “But as soon as the newsfeed came in, I was like, ‘Oh no. I opt out.’”

What bothered Markou, who has ADHD, was how the newsfeed’s algorithm held his and his peers’ attention. “Now I’ve realized that the rest of the world has become like how I was told not to be,” he said.

TikTok clings to its joyful image

The story of TikTok, the short-form video app that’s a favorite platform among teens for selfie videos and singalongs, is something of a fable for the larger story of the internet. Faced with a looming U.S. national security investigation and a California lawsuit claiming TikTok sent reams of data to China, the company is scrambling to show that, at its heart, it’s still good fun.

Last week, I wrote about how the platform banned a user after she posted a make-up tutorial that discussed China’s imprisonment of Uyghur Muslims. TikTok said her phone was blocked in relation to a different video, a statement which appeared to fall flat when the platform then also removed the Uyghur video. The company said the video removal was a “human moderation error.”

In response to the criticism the platform received, TikTok’s head of safety, Eric Han, wrote a blog post reiterating the company’s “common goal of providing a platform that fulfils its core purpose of bringing creativity and joy to others.”

“The episode has highlighted a signature challenge facing TikTok: Famous for its lighthearted memes and singalong videos, the app increasingly finds itself facing scrutiny due to its close ties to a Chinese conglomerate that must adhere to the country’s strict censorship rules,” the Washington Post’s Drew Harwell and Tony Romm wrote last week.

But TikTok’s answer to staying “joyful” appears to be simply eliminating content that sparks joyless responses. According to a leak by a TikTok whistleblower, the platform hides videos of people with disabilities, disfigurements and autism – in the name of stamping out cyberbullying. 

My favorite Coda Stories this week: 

  • At the forefront of Europe’s battle for tech transparency. For Coda, reporter Filip Brokes took a look at the ongoing battle between the European Union and tech giants like Facebook and Google.
  • Inside the China Cables: Video surveillance, confessions and ‘de-extremification’ in Xinjiang. I took a closer look at some of the language used in last week’s leak of documents instructing Xinjiang authorities how to run the region’s vast network of concentration camps for Uyghurs and other Muslims.
  • And on the subject of the lost days of the internet, do read Eduard Saakashvili’s news brief on the sale of the .org domain to an investment firm. The firm, Ethos Capital, is owned by three Republican billionaire families, who will now have the power to charge higher prices to the thousands of nonprofits who use .org in their URL, and will also be able to control and censor anything that gets posted to those domains.

China & Russia: Two different approaches to ‘internet sovereignty’ https://www.codastory.com/authoritarian-tech/china-russia-internet-sovereignty/ Thu, 28 Nov 2019 17:11:35 +0000 https://www.codastory.com/?p=10149 When we launched our Authoritarian Tech channel a year ago, the talk of the town was a war of two internets: the “open” internet on one side and another doctrine, called “internet sovereignty,” on the other. The internet sovereignty side was represented by China and Russia, and when we needed a header image for our

The post China & Russia: Two different approaches to ‘internet sovereignty’ appeared first on Coda Story.

When we launched our Authoritarian Tech channel a year ago, the talk of the town was a war of two internets: the “open” internet on one side and another doctrine, called “internet sovereignty,” on the other.

The internet sovereignty side was represented by China and Russia, and when we needed a header image for our article about it, we made the one above.

But how united are China and Russia really, in their approach to the internet?

That’s the question a new report from the Hague Program for Cyber Norms, funded by the Dutch Ministry of Foreign Affairs, tries to answer.

On the surface, the two countries share a lot. For one, both national governments exhibit a kind of siege mentality about online information, viewing the free flow of ideas on the internet as a national security threat. “[B]oth countries see online information as the primary risk,” the report concludes.

But there is one crucial difference as well: China has a massive tech industry, and Russia… well, it’s not as big. From the crackdown on Uyghurs to the digital components of Belt and Road, a powerful Chinese tech industry fuels key pillars of national policy. Russia, despite some advances, is lagging behind.

Even in their hacking operations abroad, the two countries are different. Russia “seems to pursue a policy of political destabilization of Western countries,” while Chinese hackers “have mostly concentrated on economic targets”—stealing intellectual property to benefit domestic industries. In essence, Russia trolls, China steals.

On the whole, a picture emerges of China as a rising digital and economic superpower with long-term ambitions and serious investment in technology. Russia, on the other hand, comes across as a laggard that fears the internet at home while using it to muddy the waters abroad.

This week in Coda

If that report suggests the “authoritarian” countries differ in their cyberspace philosophy, the same can be said for “free” countries. It’s becoming increasingly clear that the EU, through initiatives like the GDPR, is paving a “third way,” distinct from both America and, say, China: a relatively free internet that still acts aggressively to contain the excesses of surveillance capitalism.

Regular Coda contributor Filip Brokes went to a conference attended by some of Europe’s leading minds in tech regulation. “We just want them [tech platforms] to share with the public the decisions behind building these algorithms,” Katarzyna Szymielewicz from the Polish NGO Fundacja Panoptykon told him. Read his dispatch here.

OTHER NEWS:

  • I recommend this thoughtful critique of “data is the new oil” metaphors. Yes, data is valuable, but calling it oil takes for granted that it’s a commodity, and suggests the best we can hope for is a better-regulated surveillance capitalism. (Real Life)
  • You probably saw, or at least heard of, Sacha Baron Cohen’s address about social media last week. Not everyone is enthusiastic about it, even those often critical of tech. The Verge’s Casey Newton, usually a sharp critic of poor social media moderation, has written an in-depth, thoughtful piece arguing that Cohen’s proposals are inadequate. (The Verge)
  • China used to be skeptical of Bitcoin, but now seems to be pivoting to adopting digital currency on a government level. This article argues that cryptocurrency could become the foundation for Chinese soft power. (Observer)
  • Fresh reporting on Western tech companies’ complicity in China’s crackdown in Xinjiang. (WSJ)
  • Why are data centers so often in abandoned industrial buildings? Sure, maybe it’s convenient. But what if it’s actually because of neoliberalism? That’s what this journal article tries to explain. (Work Organisation, Labour, and Globalisation)

My trip to Brussels made me think about the gig economy and the early industrial age https://www.codastory.com/authoritarian-tech/apps-traffic-rights/ Thu, 21 Nov 2019 14:30:06 +0000 https://www.codastory.com/?p=10042 Hi everyone, Inge Snip here, Coda Story’s Production Editor. In this week’s Authoritarian Tech newsletter, I’ll be exploring the price of convenience in our digital age. I’ll also examine how apps have helped create a world with more traffic jams, eroded labor rights while increasing inequality. “The gig economy resembles the early industrial age” I

The post My trip to Brussels made me think about the gig economy and the early industrial age appeared first on Coda Story.

Hi everyone, Inge Snip here, Coda Story’s Production Editor. In this week’s Authoritarian Tech newsletter, I’ll be exploring the price of convenience in our digital age. I’ll also examine how apps have helped create a world with more traffic jams and eroded labor rights, while increasing inequality.

"The gig economy resembles the early industrial age"

I am writing this newsletter from a beautiful Airbnb in Brussels. I ordered myself an Uber from the airport to get here, and I used Google Maps to find the nearest restaurants and grocery shops. 

Ten years ago, I would have stayed in an overpriced hotel or cramped hostel and used paperback copies of Lonely Planet travel guides to figure out how to take a bus. Remember the stress of wondering whether you were on the right bus in a country where you didn’t speak the language? I do.

Today, from the gig economy to infrastructure to digitized bureaucracy, one simple click lets us make money as freelancers, avoid traffic jams and navigate all aspects of modern life.

At Coda Story, we often cover the ways in which authoritarian regimes abuse and misuse technology to suppress their citizens. Perhaps it’s time to look at how we are enabling tech companies to abuse us in order to make profits. What is the price of our convenience?

I wanted to understand more about this, so I contacted Seda Gürses, Assistant Professor at the Delft University of Technology in the Netherlands.

Gürses researches what she calls “optimization” systems: software that adjusts its features depending on user behavior. The resulting data is then used in the company’s interest to extract value. 

“There are deliberate ways of scaling technologies to extract value, while externalizing certain costs onto others,” explained Gürses. 

The Google-owned app Waze is one example of how optimization works — it also illustrates some of the problems generated by modern tech. The app promises to optimize our paths and the time it takes to travel from A to B. It also regularly enhances its appeal by offering new options and even a new design so that we, the users, are happy. As a result, more people will start using Waze, which makes its investors happy. All of this adds to the company’s cache of location data — which Waze can then use to build new and better services. 

But by optimizing our paths, by redirecting us through service routes, Waze actually causes more congestion. Some of these roads weren’t designed for the amount of traffic generated by the app. If only a few people use the app, there’s no problem. But if a lot of people start using it, it actually increases congestion for everyone. 

“This problem has been vastly overlooked,” Alexandre Bayen, the director of UC Berkeley’s Institute of Transportation Studies, told CityLab. “It is just the beginning of something that is gonna be much worse.”
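The congestion effect described above can be sketched as a toy model. This is purely illustrative — the travel-time functions and numbers are invented, not real Waze or traffic-engineering data — but it shows the mechanism: when every driver selfishly switches to whichever route is momentarily faster, a narrow side street ends up carrying far more traffic than it was designed for.

```python
# Toy model of app-driven rerouting -- illustrative numbers only, not real
# Waze data. N drivers commute between two points: a highway, and a narrow
# residential side street a routing app can divert them onto.
N = 1000                    # drivers per hour
SIDE_DESIGN_CAPACITY = 20   # cars per hour the side street was built for

def t_highway(n):
    """Travel time in minutes on the highway with n cars; degrades slowly."""
    return 10 + n / 100

def t_side(n):
    """Travel time on the side street; a narrow road degrades sharply."""
    return 5 + (n / 20) ** 2

# Without the app, drivers follow the signed route: everyone on the highway.
baseline = t_highway(N)

# With the app, each driver switches to whichever route is currently faster.
# Moving one driver at a time until nobody gains approximates equilibrium.
side = 0
while t_side(side + 1) < t_highway(N - (side + 1)):
    side += 1

print(f"Drivers rerouted onto the side street: {side}")
print(f"Side street load vs. design capacity: {side / SIDE_DESIGN_CAPACITY:.1f}x")
print(f"Highway commute: {baseline:.1f} -> {t_highway(N - side):.2f} minutes")
```

In this sketch the rerouted drivers each shave under a minute off their commute, while the side street ends up carrying several times the load it was designed for — the cost is externalized onto the neighborhood, which is exactly the dynamic Bayen describes.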

But it’s not only our infrastructure that’s at stake; the same dynamic is playing out across the board. 

One such example is provided by Uber. Normally, companies pay for the time people spend at work. Uber only pays its drivers when they are ferrying customers. This means that they have externalized the risk of not having customers onto individual workers. 

“For all its app-enabled modernity, the gig economy resembles the early industrial age… it is truly a movement forward to the past,” writes Alexandrea Ravenelle, an assistant professor of sociology at Mercy College, in her book “Hustle and Gig: Struggling and Surviving in the Sharing Economy.”

The gig economy has had a profound effect on our cities. All over the world, urban residents have seen certain jobs in their neighborhoods disappear as investors buy up apartments and rent them out to tourists. This BBC article from last year summarizes a few studies uncovering the issues with Airbnb. It has led governments to impose stricter laws on Airbnb, but this is like putting a Band-Aid on a much larger problem. 

The problems I mentioned above are often referred to in academia and in policymaking as “AI accidents,” but Gürses explains to me that they are far from accidents; they are often deliberate. 

“I think calling these issues accidents is completely ignoring the whole political economy of it,” said Gürses. 

How did we get here?

Until the end of the 1990s, software came in a box, on a floppy disk or a CD-ROM. Companies could only change and update the software when a new version was released, every few years. 

Today, tech companies can make changes constantly. They can follow our behavior and adjust their software, experimenting with services in ways they couldn’t before. 

Companies track every move we make in their apps and collect all of our data. As a result, they know if the design of the software, or the features it offers, doesn’t work as planned. They can improve their software using this feedback, but they can also optimize user behavior in a certain direction, orienting us towards the interests of the company. 
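To make that concrete, here is a minimal, hypothetical sketch of the always-on experimentation this enables — the function and experiment names are invented, not any real company’s API. Users are deterministically bucketed into design variants, and the behavior of each bucket is then measured to decide which version “works”:

```python
import hashlib

# Hypothetical sketch of continuous product experimentation. Each user is
# deterministically assigned to a design variant; the company then compares
# behavioral metrics between variants and ships whichever one "wins."
def assign_variant(user_id: str, experiment: str,
                   arms=("control", "new_feed")) -> str:
    # Hash user + experiment name so the assignment is stable across
    # sessions without storing any extra state on the user's device.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# The same user always sees the same variant, so weeks of behavior can be
# attributed to one design -- something boxed-software releases never allowed.
print(assign_variant("user-42", "feed_ranking_v3"))
```

The point of the sketch is the asymmetry: a boxed release froze the design for years, while this loop lets a company adjust the product as often as it likes and observe how we respond.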

“We really need to stop talking about only data and algorithms, and instead focus on what forms of governance we want to have for these new technologies,” Gürses tells me when I ask her if there’s a solution.

My favorite Coda story this week

If you want more context, Keren Weitzberg wrote a story about mobile credit in Kenya. She finds that credit apps have not only given the government and private companies a wealth of private data, but that easy access to credit has also driven up personal debt levels.

But sometimes, technology can also shine a light on some of the darkest corners of the world. Read Isobel Cockerell’s article on how Uyghurs game TikTok’s algorithm to find a loophole in Xinjiang’s information lockdown.

In other news:

  • Have a look at this entertaining celebrity deepfake roundtable featuring Tom Cruise and Jeff Goldblum. (The Verge)
  • Under President Erdogan's censorship of the web, Turks can still easily find everyday information. But they never fully know what they're missing. (NYT)
  • Big technological shifts can aid the work of reformers. Unfortunately, the same shifts can also empower racists and propagandists. (The New Yorker)
  • Activists who want Congress to ban facial recognition have found a novel way to draw attention to the issue. (Vox)

“I want to use the governor’s race to continue to talk about Facebook.” https://www.codastory.com/authoritarian-tech/facebook-political-ads/ Thu, 14 Nov 2019 10:24:24 +0000 https://www.codastory.com/?p=9793 The news has been awash with stories about Facebook recently, but there’s one that I’ve enjoyed following the most. Adriel Hampton is a liberal activist from San Francisco. He made national headlines back in 2009 for being the first congressional candidate to announce his run on Twitter, a company that at the time Politico described

The post “I want to use the governor’s race to continue to talk about Facebook.” appeared first on Coda Story.

The news has been awash with stories about Facebook recently, but there’s one that I’ve enjoyed following the most.

Adriel Hampton is a liberal activist from San Francisco. He made national headlines back in 2009 for being the first congressional candidate to announce his run on Twitter, a company that at the time Politico described as a “trendy social networking site.”

Hampton lost the race then, and it’s pretty likely he’ll lose the election he’s just entered: California’s gubernatorial election in 2022. But he seems to be OK with that.

His campaign is focused on a central goal: “I want to use the governor’s race to continue to talk about Facebook,” he told me when we spoke on the phone.

Hampton wants to see Facebook reverse its policy of allowing politicians to run ads with false messages on the site, or at least bring some more attention to the company’s controversial rule. So on October 28 he registered for the gubernatorial race to further test Facebook’s policy. It’s earned him another online “first” — Hampton is the only politician or candidate in the U.S. who is ineligible to run false ads on the site.

“This person has made clear he registered as a candidate to get around our policies, so his content, including ads, will continue to be eligible for third-party fact-checking,” a Facebook spokesperson told Recode.

When I spoke with Hampton he was about 50% gleeful in describing Facebook’s one-man exception for his run and 50% earnest about his cause:

“For me this is really high stakes,” Hampton said. “I’m treating it like a joke because that’s a certain way to get yourself into the narrative. But, the pivot is to talk about how serious it is.”

“I feel like that's the most important thing I could do politically right now to ensure fair elections in 2020,” he continued. 

Hampton says a major motivation for his campaign is how he’s seen climate change denialism spread on social media platforms, through conservative news outlets and in statements made by President Trump.

“That threatens the lives of my family,” he said. “The last time the media really had a huge effect in bolstering the government's lies was the Iraq war. And at the time, two of my brothers were in the service and I was just totally blown away. And it really radicalized me.”

Since his last run a decade ago, he’s worked several jobs: as the political editor for the San Francisco Examiner, a social media consultant to progressive campaigns and as an investigator for the San Francisco city attorney’s office.

Hampton is critical of the business connections of California Governor Gavin Newsom and other former businessmen turned politicians: “They're kind of like golden retrievers for the corporations. I want to be more like a pit bull for the people.”

While 2022 is still a long way off, Hampton’s goal for now is to try to find more loopholes in Facebook’s policies and to get more people registering to run for office in order to run false ads: “I think that's what you have to do to win an activist battle against a large corporation. You have to make it terribly embarrassing.”

Facebook’s Supreme Court

I also asked Hampton what he thinks about another recent policy announcement from Facebook. By the end of the year, the company will create an Oversight Board made up of 40 part-time members tasked with reviewing policy decisions submitted by Facebook. According to the company, the board will act independently, although it will be funded by a trust paid for by Facebook. 

Writing in The New York Times, Kara Swisher enthusiastically embraced the move, even saying that she’d like to apply “to be judge and jury over Mark Zuckerberg” in an opinion piece headlined: “Facebook Finally Has a Good Idea.”

Hampton wasn’t quite as keen for a spot on a Facebook-funded supreme court. “No, I'm interested in getting elected to make such a board accountable to the people and not to Facebook,” he said. “We need independent oversight for Facebook that is elected by the people or is accountable directly to elected officials.”

He’s not the only person who is skeptical about the board. Dr. Jennifer Cobbe, who coordinates the Trust and Technology Initiative at Cambridge University, said she would also decline such an offer. “I wouldn’t join it if Facebook offered me all the money in the world,” she said. “I think generally this proposal is another attempt by Facebook to centralize as much power as possible in the hands of Facebook.”

Whiplash

Talking to Hampton, Cobbe, or other people in the tech field makes me think of all the various family members or friends who have closed their Facebook profiles, or at least deleted the app from their phones. It really does feel like we’re in the middle of a “watershed” moment as described by Hampton.

But here are some numbers I can’t get out of my head:

  • Facebook added 39 million monthly active users during its second quarter this year
  • Twitter added 5 million daily users in the same quarter
  • As of last year, over 25% of American households own a smart speaker like Alexa. (I wonder what that number will look like after the holidays)
  • 72% of Americans use some form of social media, according to the Pew Research Center, a number that keeps climbing

These numbers come from (another) opinion piece from the Times: “There is No Tech Backlash.” When I searched for the piece again to double-check the figures, I stumbled on this column in The Verge telling us that not only is the tech backlash real, it’s also “accelerating.” The next article in the Google search results had a different take: “The backlash to the tech backlash.”

Other great Coda stories

Russia is building a facial recognition system that, according to some, may even be bigger than China’s 200-million-camera system.

In Lebanon, WhatsApp is a major tool for fear and intimidation targeting protestors. Emily Lewis looks at how unverifiable rumors spread on the messaging app are impacting Beirut’s unprecedented mass demonstrations.

Most of the coverage about China’s social credit system paints the system as Orwellian. Rui Zhong takes a deeper look at the rise of the schemes.

Inside Xinjiang’s five-star propaganda tour https://www.codastory.com/authoritarian-tech/propaganda-tour-xinjiang/ Thu, 07 Nov 2019 07:29:38 +0000 https://www.codastory.com/?p=9627 Covering a crisis from the outside is a challenging, but often necessary, job. I have never set foot anywhere near China — and yet I’ve spent much of the past year reporting on stories from inside Xinjiang, the modern-day police state in the country’s northwest. I’ve covered the Uyghur humanitarian crisis from Europe and Turkey,

The post Inside Xinjiang’s five-star propaganda tour appeared first on Coda Story.

Covering a crisis from the outside is a challenging, but often necessary, job. I have never set foot anywhere near China — and yet I’ve spent much of the past year reporting on stories from inside Xinjiang, the modern-day police state in the country’s northwest. I’ve covered the Uyghur humanitarian crisis from Europe and Turkey, and by making long WhatsApp calls to Uyghurs who’ve escaped to Australia or Canada.

Naturally, I spend much of my time thinking about that Silk Road region in China’s far west. I think about the mostly destroyed old city in Kashgar. I think about how the clocks in Urumqi are set to Beijing time, even though it’s several time zones to the west. I think about the yawning silence – how the air feels when at least a million Uyghurs have been disappeared into concentration camps. I think about the apricots: one Uyghur source told me those in Xinjiang are the sweetest she’d ever had. She believes she’ll never taste one again. I wonder, perhaps selfishly, if I will one day — because of my reporting, that also seems unlikely. 

This week, I was in Tbilisi, Georgia — where Coda has a newsroom — exploring the eastern edge of the city, where a vast multimillion-dollar Chinese development lies, part of the Belt and Road project. It’s a strange, at times eerie place. A huge, gleaming five-star hotel, echoing with emptiness. Deserted strip malls; blank shop windows. And like everywhere in Tbilisi, the odd stray dog. As the wind picked up and the sun began to set, the smell of charcoal and cumin caught the air — suddenly, a clattery Uyghur restaurant emerged from the lengthening shadows. It’s run by one Uyghur woman, who’d been living in Tbilisi for two years. I was desperate to know her story, but for that afternoon I was content to eat her food, drink her tea, and look at the pictures on the walls: Uyghurs playing the dutar; camels running through the Gobi desert; traditional undemolished Uyghur homes.

Xinjiang has been dominating my thoughts this week after having several long phone calls with Olsi Jazexhi, an Albanian-Canadian journalist and academic who went to the region in August on a North Korea-style propaganda tour. For months, Jazexhi had been reading the reports about the Uyghur humanitarian crisis, and decided to investigate for himself. “I approached the Chinese embassy in Albania and presented myself to them as an Albanian journalist. I told them I’d seen so many stories and I don’t believe them to be true,” Jazexhi told me.

“I never imagined what the media were saying was true and that things were even worse – even more tragic.”

Jazexhi arrived at Urumqi airport on a hot day in mid-August, alongside a group of foreign journalists, mostly representing state broadcasters from countries along the Silk Road Economic Belt. He was greeted by his guides for the trip: a group of Chinese Communist Party officials. “We were given first class treatment,” Jazexhi said. “Wherever we went, state police were at our service. The traffic was stopped and the police opened the way for us. We were treated like presidents.”

The journalists stayed in five-star hotels while they toured the cities of Xinjiang and were lectured on China’s so-called “de-extremification” program to stamp out terrorism and separatism in the region. “They presented us with their own version of the history of Xinjiang,” Jazexhi said. Almost every night, the journalists were given a show. “A group of boys and girls were selected to sing and dance for us everywhere we went. They were using Uyghurs like monkeys in the zoo.”

“They wanted us to see these people singing and dancing so that when we came out of China we could tell people, ‘there are no concentration camps. There are kids who sing and dance, and they’re very happy.’”

The journalists toured the mosques, many of which Jazexhi described as simply fronts for Chinese party propaganda. “They wanted to show to the outside world that there were mosques in Xinjiang,” Jazexhi said. He ventured into one of the mosques near Urumqi’s grand bazaar. “But when I went in I found there was just a store.”

Many of the other mosques were simply shuttered or had been turned into museums. In Kashgar, the famous Id Kah mosque was open only on Fridays. “We saw only elders – people 55 years and above — praying. The youth were afraid to pray,” Jazexhi said. “In the sermons the imams praised Xi Jinping and the Communist Party.”

The centerpiece of the Xinjiang press trip was a visit to one of the region’s so-called “vocational training centers” — the vast concentration camps where a million Uyghurs are currently held. In a convoy of minibuses, the group drove out of the city of Aksu into the bare, open desert. The camp, which Jazexhi’s Chinese minders referred to as a “school,” was surrounded by rocky nothingness.

“It was in the middle of nowhere,” Jazexhi said. “It was a kind of Alcatraz – even if they could climb the walls and jump, they would be dead in the desert.”

Jazexhi tried to interview some of the camp’s inmates — to the consternation of the guards, who preferred that he watch another dance performance. “I told them, ‘I’m sorry, I did not come from Europe to watch a show,’” Jazexhi said.

When he did manage to talk to some Uyghurs, Jazexhi was deeply disturbed. “We understood that they were under total control of their Chinese guards and totally terrified to talk to us.” 

Jazexhi asked the inmates why they had been detained. “They started responding that a year ago they read the Quran, or posted on the internet that Muslims should pray five times a day. Things that are basic human rights we have in the West. Their only crime was that they are Muslims.”

The Aksu camp was a turning point for Jazexhi. “When I was there I understood the Chinese were playing with us,” he said. As the group was ferried back to the comfort of their five-star hotel, Jazexhi faced a discomfort familiar to any journalist who’s been on a state-funded press junket (I was reminded of a recent, lavish press lunch laid on by the Moscow Center for Innovations while I was reporting for a dispatch on their surveillance program).

“It’s a kind of irony, after all they paid for my trip and they were expecting me to behave myself,” Jazexhi said. “I questioned myself: should I reveal the truth, or should I — for the sake of the treatment the Chinese gave to us — lie. But by doing this I would ignore these poor people who are suffering in these concentration camps.”

Jazexhi made a series of YouTube videos about his experience in Xinjiang, and also posted footage of his time in the Aksu camp. You can watch it all here. He made the only real choice available to him: to tell the truth. “I know that now I am an enemy of China,” he said. “But at least I told the world what I saw.”

My favorite Coda Story this week: 

Continuing on the China theme, have a read of this dispatch by reporter Rui Zhong on the social credit scheme’s growing influence in the country. One of the most chilling elements Zhong picks out is the reality that citizens who don’t pay their debts live in fear of being outed as “debt-dodgers.” How? By having their faces plastered onto advertisements on Chinese TikTok.

And elsewhere...

Think your pets are safe from authoritarian tech? Think again. Last week, a Russian security researcher accidentally found a way to hack and take over all FurryTail automatic pet feeders across the globe. Anna Prosteva of St Petersburg, Russia, told business tech website ZDnet that the feeders’ vulnerability would allow a hacker to hijack the feeding schedules of more than 10,000 furry friends living around the world. Surveillance? More like purr-veillance.

“Every leap in technology is used to better monitor, trap, and villainize us.” https://www.codastory.com/authoritarian-tech/hiphop-privacy/ Thu, 31 Oct 2019 06:20:10 +0000 https://www.codastory.com/?p=9462 Regular Coda Story readers will know that much of our coverage of authoritarian technology examines issues around the use of biometrics, big data and surveillance. Often, the people who we interview (victims, academics and other experts) feel compelled to speak out about the black holes left open by technology. This week, I thought I’d take

The post “Every leap in technology is used to better monitor, trap, and villainize us.” appeared first on Coda Story.

Regular Coda Story readers will know that much of our coverage of authoritarian technology examines issues around the use of biometrics, big data and surveillance. Often, the people who we interview (victims, academics and other experts) feel compelled to speak out about the black holes left open by technology.

This week, I thought I’d take an entirely different tack. 

We don’t often get the chance to write about how these same issues are being documented by artists. In musical terms, concerns about the uses of technology are as old as sheet music. Over the years, subjects like automation (Kraftwerk), surveillance (M.I.A.) and the dangers looming in our near future (EMA) have all been extensively tackled by electronica, hip-hop and pop artists. 

A new song from an upcoming album by the American musician DJ Shadow puts some very contemporary issues like data gathering and surveillance front and center. The song, “Urgent, Important, Please Read,” features three rappers (Daemon, Rockwell Knuckles and Tef Poe) warning about addiction to mobile phones, data collection and the motives of tech companies.

You can listen to the song on Spotify and YouTube. Listeners will notice the track sounds impatient, much like the alert notifications that punctuate our daily lives. You may also trace an unmistakable through-line from older recorded works by groups like Public Enemy and A Tribe Called Quest.

https://www.youtube.com/watch?v=l8OE8kg7DoQ

Earlier this week, I interviewed two of the three rappers who appear on the song, Daemon and Rockwell Knuckles. Both live in St Louis, Missouri. 

Daemon told me he and Knuckles were inspired to write about aspects of technology after DJ Shadow sent them a message outlining the kind of oppressive mood he was hoping to create. “I felt like he [Shadow] kept things open-ended enough so we had plenty of room to play,” wrote Daemon, in an email. “From there it was all about trying to get into the right headspace.”

While “Urgent, Important, Please Read” doesn’t specifically mention artificial intelligence, both rappers think technology ultimately aids an ever more anonymous brand of policing. “We’re not out here dreading the rise of AI, that kind of concern is the luxury of a rich man,” wrote Daemon. “Every leap in technology is used to better monitor, trap, and villainize us. It’s hard to get worked up about the coming robot apocalypse when you’re already stuck in a system that is specifically engineered to take advantage of you until it kills you so that someone else can have a better life. And it’s designed in such a way that the people who benefit the most have maximum deniability.”

Rockwell said there were issues he felt he had to address. On “Urgent, Important, Please Read”, he raps about deleting his browser history: “Deleting every record, you know what I mean/ Puttin' mind on my money, downloadin', flee the scene.” 

Later in the same verse, he strikes a more pessimistic tone, realizing there is no escaping the digital world: “A futurist gotta say it, it’s odd/ The land of technology, the hacker is god/ Everybody charge or church up, get the bag.”

“I think I was trying to say that technology is the new religion,” said Rockwell, speaking by phone on Monday. “Everyone is on Instagram. The man that can control the technology can control the phone. So to me, your phone is like your church. Charge your church, open your social media and bow for prayer.”

“I think the key thing is we’re all trying to find a middle ground in how technology controls us all,” he added. “Our phones, alarm clocks, the things you buy, everything is pretty muddy.”

My favorite Coda Story this week:

Last week, Evan Gershkovich attended a two-day event in Sochi which featured a number of African heads of state. As Evan discovered, the Kremlin is leaning on its “anti-colonial” past for greater influence in Africa, offering trade and security initiatives in a bid to counter U.S. and European influence. Ayanda Dlodlo, South Africa’s State Security Minister, said, “As Vladimir Putin put it very well: Russia has not colonized my country.”

Elsewhere: 

More than a year after the Cambridge Analytica scandal rocked Facebook, it has emerged that the social media platform recently asked several U.S. hospitals to share anonymized data about their patients. According to the report, Facebook intended to match patient data with user data it had collected in a bid to understand which patients might need special care or treatment. The proposal never went past the planning phase and has been put on pause. (CNBC)

And finally: a fascinating account of how the West sometimes overestimates the Chinese government’s ability to push innovation. The author argues that China’s support for digital innovation has an uneven track record to date. China might be great on simpler technologies like solar cells, but lags in producing quality semiconductors, despite favorable government subsidies. (SupChina)

Pitch alert: We are seeking story ideas on worldwide anti-science movements. Send us your pitches for character-driven narratives explaining how anti-science movements are created and how they thrive. Email me at burhan@codastory.com

Selling happiness, promoting disinformation https://www.codastory.com/disinformation/selling-happiness-promoting-disinformation/ Mon, 28 Oct 2019 13:02:49 +0000 https://www.codastory.com/?p=9419 I wash myself using only soap, I pick out the cheapest shampoo at the store, and my idea of mental rest is either working on our little forest yard or taking a hike with our two dogs. I’ve never quite understood the wellness hypes — detoxing with weird juices, vitamin supplements, and putting “eggs” in

The post Selling happiness, promoting disinformation appeared first on Coda Story.

I wash myself using only soap, I pick out the cheapest shampoo at the store, and my idea of mental rest is either working in our little forest yard or taking a hike with our two dogs. I’ve never quite understood the wellness hype — detoxing with weird juices, vitamin supplements, and putting “eggs” in places they don’t belong.

But the $4 trillion global wellness industry keeps growing every year, in large part by selling happiness and healing through unscientific cures. The problem with the wellness industry isn’t only that it can be harmful to individuals — such as the Chinese woman who nearly died after using a homemade IV to inject fruit juice straight into her veins — its impacts can be felt on a much larger scale: from deciding not to vaccinate your children to voting against science-backed health policies.

How did we get here?

Decades ago, people went to doctors for health information. In 1966, more than three-quarters of Americans had great confidence in medical leaders. However, trust in the medical profession has declined sharply — only 34% trust medical leaders and only 23% express a great deal or quite a lot of confidence in the healthcare system — and the internet has provided access to a wealth of information previously inaccessible. 

Celebrities and social media influencers have taken advantage of this, with Gwyneth Paltrow’s Goop as perhaps the most famous example. We’re told we can lose weight forever with one simple trick, cleanse our bodies of “toxins” with a juice, or take a supplement or two to cure our insomnia.

“It’s almost now that we’re all obligated to do whatever we can, all the time, to try to improve our health and our wellbeing,” says Timothy Caulfield, a researcher, author and professor of health law and science policy at the University of Alberta and host of the show "A User's Guide to Cheating Death." 

Wellness used to be a blend of health and happiness — perhaps a walk on the beach or a massage. Now it has become a false antidote to the fear of modern life and death. 

What’s more, wellness influencers need your fear and distrust in order to sell you their products.

“Moving the kind of product that churns the wheels of the wellness-industrial complex requires a constant stream of fear and misinformation,” writes Jen Gunter, a California obstetrician-gynecologist, who has been dubbed Twitter’s resident gynecologist. “Look closer at most wellness sites and at many of their physician partners, and you’ll find a plethora of medical conspiracy theories: Vaccines and autism. The dangers of water fluoridation. Bras and breast cancer. Cellphones and brain cancer. Heavy metal poisoning. AIDS as a construct of Big Pharma.”

The power of social media and celebrity advice, the commodification of “happiness,” the urge to sell us more and more by pushing conspiracy theories and false information, and the resulting decline in trust in the medical field all contribute to a major public health crisis, one whose end is not yet in sight.

Next time you buy a 3-day cleanse to get rid of those “toxins,” you may want to think twice.  

My favorite Coda Stories this week

We’re expanding our coverage of the war on science to look at how anti-science movements are created and how they thrive. This article by Lily Hyde explains why Ukraine this year has seen the largest increase in measles in the world. 

It isn’t only commercial interests pushing health misinformation. One of the largest special ops conducted by the KGB in the 1980s was Operation Infektion, designed to spread disinformation that AIDS was a CIA invention. Daria Litvinova investigated in this multimedia piece, tracking how a disinformation spiral started by the Soviets is now killing Russians.

IN OTHER NEWS:

  • Dutch reporter is jailed for failing to reveal his source. Law enforcement previously intercepted conversations between the reporter and his source (DutchNews)
  • UK investigation reveals Russian hackers impersonated Iranian hackers (Coda Story)
  • Lithuanians are using software to fight back against fake news (The Economist)
  • How Ethiopia's ruling coalition created a playbook for disinformation (Global Voices)
  • The Energy 202: ExxonMobil goes on trial over accusations it misled investors about climate change costs (The Washington Post)

The Nobel prize recognizes the influence of lithium-ion batteries https://www.codastory.com/authoritarian-tech/nobel-lithium-ion-batteries/ Mon, 21 Oct 2019 13:43:43 +0000 https://www.codastory.com/?p=9335 Earlier this month, three scientists got the Nobel prize in chemistry for inventing the lithium-ion battery. “They created a rechargeable world,” reads the Royal Swedish Academy of Sciences press release announcing the award. Chances are, the phone that’s no more than a meter away from you, or on which you are reading these words, has

The post The Nobel prize recognizes the influence of lithium-ion batteries appeared first on Coda Story.

Earlier this month, three scientists were awarded the Nobel Prize in Chemistry for inventing the lithium-ion battery.

“They created a rechargeable world,” reads the Royal Swedish Academy of Sciences press release announcing the award.

Chances are, the phone that’s no more than a meter away from you, or on which you are reading these words, has a lithium-ion battery in it. That is why the invention matters: it made powerful electronic devices — like smartphones — portable in a way they hadn’t been before. This, in turn, set the stage for the smartphone revolution of the past decade and a half.

And the lithium-powered rise of the smartphone has had profound consequences, including political ones, because it’s largely through smartphones that internet browsing has permeated the world’s population at an unprecedented scale.

Earlier this year, three researchers tried to figure out what those political consequences have been.

“There is a heated debate on the political implications of the spread of internet...Previous studies — mostly carried out in one country setting — produced evidence for both optimistic and pessimistic points of view,” one of the authors, Sciences Po professor Sergei Guriev told me over email. “We decided to look at the global evidence. Our results are also suggesting that both optimists and pessimists are partially correct.”

For governments, the results don’t look good.

“The expansion of mobile internet networks leads to a reduction in confidence in government when the internet is uncensored,” the recent paper, called “3G Internet and Confidence in Government,” argues. By using statistical methods and survey results, Guriev and co-authors Ekaterina Zhuravskaya and Nikita Melnikov show there is a strong correlation between increased 3G internet access and distrust in government.

“What surprised us was the magnitudes of the effects, which were quite large,” Guriev said.

Disappointingly, they don’t find this strong effect when the internet in a given region is censored. So their research both vindicates and undermines Bill Clinton’s now-somewhat-infamous 2000 speech that touched on internet censorship.

“In the new century, liberty will spread by cell phone and cable modem,” Clinton said then. On this, the paper seems to agree with him: 3G helps expose government corruption.

But then Clinton went on: “Now there’s no question China has been trying to crack down on the Internet. Good luck! That’s sort of like trying to nail Jell-O to the wall.”

This now seems naive, and the 3G paper is one more piece of evidence suggesting you can nail Jell-O to the wall — that is, successfully censor the internet to stifle dissent. This is why Foreign Policy dubbed China’s former internet czar, Lu Wei, “The Man Who Nailed Jello to the Wall.” Internet censorship kind of works.

My favorite Coda Story this week

Speaking of 3G internet, it’s worth revisiting Keren Weitzberg’s piece on how smartphone lending has helped financial credit systems burrow deep into the lives of ordinary Kenyans. “We use smartphone data to build a financial identity for applicants,” said one company representative. That includes how often you call your mom.

OTHER NEWS:

  • Researchers at Georgia Tech have been experimenting with having AI read fiction to make it more empathetic. (LitHub)
  • Here’s a very well-done visualization of social media content removal requests by country, though it’s important to note that we can’t simply draw conclusions about which are the “worst” countries for internet censorship. Russia has one of the highest rates, but France has ten times more requests than China. Still, the data is fascinating and worth exploring. (Comparitech)
  • Digitized welfare systems are rising across the world. But a new report from a United Nations expert warns that the world “needs to alter course significantly...to avoid stumbling zombie-like into a digital welfare dystopia.” (UN)
  • China’s Xi Jinping ideology app apparently spies on its users. (Coda)

At a tech conference in Armenia, Moscow pushes its smart city solutions https://www.codastory.com/authoritarian-tech/moscow-smartcity-technology-tour/ Tue, 15 Oct 2019 07:06:19 +0000 https://www.codastory.com/?p=9131 Isobel Cockerell and I reported this week from the World Congress on Information and Technology where the City of Moscow dominated the exhibition floor at one of the tech industry’s largest global gatherings. A sponsor of the conference, its booth was the first thing you saw when you walked in. Interactive screens built into its

The post At a tech conference in Armenia, Moscow pushes its smart city solutions appeared first on Coda Story.

Isobel Cockerell and I reported this week from the World Congress on Information Technology, where the City of Moscow dominated the exhibition floor at one of the tech industry’s largest global gatherings. The city was a sponsor of the conference, and its booth was the first thing you saw when you walked in. Interactive screens built into its sweeping lipstick-red structure showed off some of the smart city technology that earned the capital a top spot in the UN’s international survey of best e-governance.

Kudos to Moscow for using algorithms to tackle its notorious traffic jams, but it’s the implementation this year of facial recognition technology in 40% of the city’s 162,000 cameras that has privacy experts and protestors concerned. During a break in the conference, I called Alena Popova, who leads the Ethics and Technology think tank in Moscow, to talk to her about a lawsuit she filed against the municipal government after it used facial recognition to identify and fine her for a one-woman protest.

She was pretty unequivocal about what this technology in Moscow is leading to: “I’m certain we’re moving towards total surveillance,” Popova said. “In reality this is technology that is being used to hunt down political opponents.”

Needless to say the Moscow delegation wasn’t pleased when Isobel and I sent them the published story (you can read the piece here). Their spokesperson said that the piece was full of “cold war and political activism stereotypes” and that we could have written the article while “staying at home on the sofa in NY.”

On one hand I can understand his frustration. In a keynote address, Russia’s Federal Tax Commissioner presented impressive innovations in filing digital returns and the co-founder of a Russian education start-up stood out with her forward-thinking plans for integrating tech with schooling. Maybe 90% of the technology on display from Moscow had nothing to do with surveillance.

But as the Moscow government continues a global tour of its smart city technology, it should find some better answers about data collection, privacy and the ethics behind facial recognition. The CEO of Moscow’s Agency of Innovations seemed baffled when I asked him whether the agency had anyone overseeing ethical concerns, answering that he didn’t think there was such a thing as a professional in the field of ethics. Whether it’s Moscow, Beijing, New Delhi or London, it’s hard to get excited about an electric scooter rollout or city-wide wifi when it coincides with the unchecked introduction of city-wide facial recognition systems.

Repeatedly at the three-day conference, entrepreneurs and researchers lumped Russia together with China. Author Jamie Metzl made it clear he thought a Russian scientist’s move to produce genetically modified children was irresponsible. Other panelists did the same, prompting one journalist to ask a Moscow representative at a press lunch organized by the delegation how it felt to be a trope for going “too far.” Her reply: “We’re pioneers. We always have been.”

Speaking of Cold War stereotypes

A few weeks ago I wrote about the politicians and pundits who can't stop resurrecting unhelpful Cold War metaphors as they try to find the right words for the technology age. At this tech gathering, I jotted down a couple of more helpful metaphors, used by speakers on panels, that I think help demystify some of the conversation around technology and regulation. The main theme is that the tools for taming tech and moderating regulation already exist. We’re by no means at an impasse.

Here are a few of my favorites:

Digital Twin

“Do I own digital Olga? Who owns digital Olga?” asked Olga Mack during a panel on digitizing civil administration. As CEO of an online contract management company, Parley Pro, Mack is 100% dependent on social networks, like LinkedIn, for her work, and like many of us, she’s a prisoner to their terms and conditions. Unlike with a bank or accounting service, she can’t pack up her records and leave if she’s unhappy with its services. Her profile on the platform is also much more than bank details: it is so close to her “real” self that it becomes a digital copy of her habits, likes and relationships. In other words, a true “digital twin.” “So who owns Olga?” If we think of the data social networks collect as a “digital twin,” the question of legal rights and legal justice seems much clearer. Why can’t we pack up our data in a “digital suitcase” and take it elsewhere?

Standard-gauge railway

“I’m glad you brought up trains,” began Dr. Christopher Markou, from Cambridge, during a panel about government administration and tech. If you can dig back far enough into middle school history class, you may remember talk of the 19th century gauge wars. “The initial conditions of a technology can determine how that technology diffuses, such that we still use the standard width of railroad track that was used back in the 1880s. How we establish the initial conditions for technology to develop has knock-on effects that are intergenerational,” Markou explained.

The one that got away

It would be remiss to leave you without a few choice quotations from the conference’s headliner, Kim Kardashian West, the real reason behind Coda’s deployment to the conference. I take my hat off to her work ethic and talent in business management, but you would think that after basically being created by the media, she’d have more curated answers to the questions from her interviewer, the author Magdalena Yesil:

Yesil: “The one business that you wish you could have done out there. The one that got away. Sometimes you look at a company and think that’s such a great idea...I wish I had done it.”
Kim: “I mean there’s so many. I mean inventing the computer, everything, I don’t know.”

To be fair, the other panelists also struggled: the founder of Giphy gave the postal service as an answer and Reddit co-founder Alexis Ohanian said the 2018 Armenian revolution (the conference this year was held in Yerevan, the Armenian capital).

Around the Web

Russia is far from the only country sparking concern over facial recognition systems. Buzzfeed reporter Pranav Dixit looks at India’s rollout of technology so advanced that “even plastic surgery may not be a dramatic enough remedy to avoid surveillance.”

The Verge reports on just how much Apple is bending to pressure from China by removing apps from its store that violate guidelines and “local laws.” In one schizophrenic case, Apple rejected a crowdsourced mapping app popular among Hong Kong protestors from its store, then approved the app, then rejected it again a few days ago. In a similar case, also reported by the Verge, one of my favorite news outlets, Quartz, had its mobile app removed from the Chinese App Store following its ongoing coverage of the Hong Kong protests.

Linking climate change and air pollution to intelligence https://www.codastory.com/authoritarian-tech/climate-change-air-pollution/ Mon, 07 Oct 2019 08:05:18 +0000 https://www.codastory.com/?p=8913 Picture this: a post-apocalyptic world, with humans now living on one of Jupiter’s moons. A young female scientist goes back to Earth to regenerate oxygen into the atmosphere using bees. As a sci-fi buff, it was only a matter of time before I’d see the movie “The IO” — a flopped Netflix Hollywood production. Critics

The post Linking climate change and air pollution to intelligence appeared first on Coda Story.

]]>
Picture this: a post-apocalyptic world, with humans now living on one of Jupiter’s moons. A young female scientist goes back to Earth to regenerate oxygen in the atmosphere using bees. As a sci-fi buff, it was only a matter of time before I’d see the movie “IO” — a flopped Netflix Hollywood production. Critics have been harsh, but the movie spoke to me in light of recent climate change events. It certainly wasn’t the first time a film or book predicted our future: think of President Trump in “The Simpsons,” AirPods in Star Trek, and the Black Mirror episode “Nosedive,” in which people have to rate each other to gain access to services (shades of China’s social credit system).

I thought of “IO” again last week when I was on a call with environmental technologists and discovered that there’s an abundance of scientific research exploring how CO2 — carbon dioxide — makes us stupid. One of the participants — Chris Adams, a director of The Green Web Foundation and an environmentally focused tech generalist — told me he started focusing on the issue after reading James Bridle's book New Dark Age. “It's the existential problem of our time, and this is backed up by the science.”

A study by the Yale School of Public Health found that air pollution caused a drop in intelligence levels, while a University College London study found that “higher amounts of greenhouse gas in the atmosphere could affect our memory, concentration and decision-making abilities.” 

In response to the research, tech makers have been working on nifty little devices to track how “stupid” you get — wherever you are — from CO2. 

Sidenote: the 1968 musical Hair “warned” we probably shouldn’t breathe in too much CO2, singing: “Welcome, sulfur dioxide. Hello, carbon monoxide. The air, the air is everywhere. Breathe deep, while you sleep, breathe deep.” (Besides sci-fi, I’m also an avid musical fan, don’t judge).

We already know that casinos have been pumping extra oxygen into the halls for decades to keep people awake longer; in China, students use extra oxygen before the national exams. Our air is being altered to make us act and think differently.

This made me wonder: would a repressive regime go as far as pumping extra CO2 into enclosed areas to make its citizens a little bit less intelligent? I know, purposely using CO2 to dumb down a nation sounds a bit outlandish. 

Or is it the perfect plot for another prescient sci-fi movie? Because it isn’t completely absurd. And that is how I wandered into the bizarre world of aerosol warfare:

  • The American military reportedly briefly looked into developing a non-lethal aphrodisiac bomb that would cause forces to become “irresistibly attracted to one another.” 
  • The CIA famously used aerosols to spray LSD to test mind-control techniques. It first came to light in 1977, when a former CIA employee testified that he and a colleague went to San Francisco in 1959 to “lure unsuspecting people to a party at which the two agents were to spray the air with LSD-25 as part of the agency's secret drug-testing.” 
  • The death of a scientist assigned to a secret U.S. biological warfare laboratory in 1953 turned out to be linked to the agency’s aerosol experiments — a story that inspired the 2017 six-part docudrama miniseries Wormwood.
  • And no one seems to be able to agree on whether or not a small French city was sprayed with LSD by the CIA in 1951. Did the same thing occur in the New York subway system? 

My Favorite Coda Story This Week

An interesting read this week is Lily Hyde’s piece on how a chemical polluted the air around Armyansk, a town of 22,000 people in northern Crimea, and how disinformation left the residents hanging in the air (pun intended). “As reports spread of what appeared to be a major health and environmental incident, authorities on both sides of the divide, in Russia and Ukraine, seemed more concerned with using the leak as a propaganda tool than addressing the needs of those affected or investigating the cause.”


The post Linking climate change and air pollution to intelligence appeared first on Coda Story.

]]>
8913
The world’s cities are becoming Singapore https://www.codastory.com/authoritarian-tech/the-worlds-cities-are-becoming-singapore/ Mon, 30 Sep 2019 14:44:11 +0000 https://www.codastory.com/?p=8844 Regular readers will be familiar with our coverage of how authoritarian technologies lurk around the infrastructure of smart cities. Our journalists have looked at how Western companies are aiding the surveillance architecture of smart cities in China; we have also detailed how technology is assaulting the lives of ordinary Zimbabweans. A new book, “The Smart

The post The world’s cities are becoming Singapore appeared first on Coda Story.

]]>
Regular readers will be familiar with our coverage of how authoritarian technologies lurk around the infrastructure of smart cities. Our journalists have looked at how Western companies are aiding the surveillance architecture of smart cities in China; we have also detailed how technology is assaulting the lives of ordinary Zimbabweans.

A new book, “The Smart City in a Digital World” (Emerald Publishing), provides a good overview of some of the challenges faced by local and national policymakers who are under pressure to innovate and save public finances. As the author Vincent Mosco demonstrates, smart city solutions often involve the outsourcing of data gathering and other services to companies like Amazon or Google. At the same time, in its most insidious form, the technology can be used to surveil minorities like Uyghurs in China. 

Mosco’s research shows that there are around 1,000 smart city projects in various stages of planning or development worldwide, and around half of these are located in China. India plans to build 100 new smart cities and rejuvenate another 500. 

I recently interviewed Mosco and began by asking him whether smart cities were like the utopian cities of the past. What follows are highlights of our conversation, edited for length and clarity.

Are the smart cities of today any different from the kinds of fiefdoms we have seen in the past — for example, Henry Ford’s Fordlandia, Disney’s Celebration or the large infrastructure projects built by planners like Robert Moses?

I think they are similar in certain specific respects. They embody a kind of private master-builder approach, whether it is Ford or Disney or a tech company like IBM. The smart city movement represents a prominent viewpoint in American urban history: that it takes great men — and it tends to be men — unencumbered by governments and regulation to build truly great cities. Companies like Google are the master builders of the digital world, and unlike in the past, they are not building an industrial society but an informational society. 

Builders of large infrastructure projects in the past often looked to governments or officials as partners. Is that still true? 

I don’t believe tech companies see it that way anymore. There is a sense of hubris — that governments have gotten it wrong in the past and that it takes private innovation to get it right. The key difference with the master builders of the industrial era was the sense that governments would be close partners to industrial leaders. While corporate execs would take the lead, the work would be done with government help and regulation that would assist and legitimize it. Today, private entrepreneurs take the lead. People like Peter Thiel and Elon Musk look on their projects as a way to get away from governments. 

Turning more directly to smart cities, we often see invasive digital technologies rolled out ahead of big events as a form of policing, I’m thinking of how Brazil and Russia used camera systems ahead of their FIFA World Cups, as well as the 2016 Summer Olympics. The technology then stays around long after the event has moved on. 

One of the key selling points of this technology is the ability to better root out crime and manage police and security forces. This began years ago, when IBM built an operations center in Rio de Janeiro ahead of the World Cup. Smart city technology was installed on the basis of saving money and better managing a part of a city, but underlying it all was facial recognition, which became a tool for mass surveillance. 

One of the points in my book is that smart city solutions in New York first grew out of an interest in bringing the 2012 Olympics to the city, and the redevelopment they spurred across huge swathes of the city caused more privatization and gentrification. 

Much has been written about biases in algorithms. What do you think can be done to counter this? 

In the smart city and in the digital world, if we are going to make use of algorithms, we need to make the process of developing and using them much more transparent. We rely on tech experts who know very little about the racial biases built into the systems. We need to open this up to access by private citizens. This doesn’t necessarily provide us with the solution. 

Technologies tend to embody the societies and social divisions in which they are created and used. We need to recognize from the start that the algorithms we deploy are biased. This will require more regulation, and it is no surprise that the technology industry would resist. My experience in this area shows that the communications and tech industry, starting with the introduction of the telegraph, has resisted regulation. There is nothing new about the resistance of Google and Amazon and Facebook to these efforts. 

Can ordinary people do anything? 

Urbanites need to take back their cities. We need to do this soon, before these technical systems are so influential in decision making that it would be difficult to redesign them to make them more human. 

More and more people are coming to recognize that putting down devices alone will not be enough. What makes smart cities particularly interesting, given how Google’s Sidewalk Labs is setting up its projects, is that they will track people simply by virtue of their being in the area. Cameras, point-of-sale registers, energy systems, communications, scanners, transportation, sidewalks and street lamps will monitor everything you use. You don’t have any opportunity to sign up for this level of surveillance. With a website, we can click the “I Agree” box. In smart cities, that opportunity won’t be there. 

Do you worry that populations in countries like India or China may be more susceptible to control from smart cities? 

We need to be deeply concerned about this. Singapore is becoming the laboratory for smart city development. China has taken a page out of Singapore’s book by applying technologies that are quite authoritarian, like the social credit system, which keeps track of all of one’s activities and uses them as an index of citizenship or worthiness. Being surveilled by it, whether at a demonstration or not, can impact your social credit score, government benefits, schooling and more. India is different in that while the state is involved in funding the projects, a lot of private companies are involved in overseeing them. 

Some people have gone so far as to call for a halt to the sales of authoritarian technology. Do you think a moratorium would be helpful? 

It may not go far enough, in my view. Authoritarian regimes will be defined narrowly to include nations like China. But we are seeing the rise of authoritarian tendencies in countries like the United States and Britain. I am concerned about Google and Amazon assisting the reprehensible immigration system in the U.S. where tech companies are rooting out refugees and sending them to their deaths back in their home countries. I think a halt is a good start, but we need to recognize that China and Saudi Arabia are not the only governments to be concerned about. Western governments need to be examined as well.

OTHER NEWS:

  • Earlier this week, I visited a new exhibition at the Tate Modern museum here in London. “Higher Resolution”, presented by a number of creators, including Romy Gad el Rab and Caroline Sinders, simulates platforms like Twitter and Facebook in everyday settings. Visitors are encouraged to sit in a public “living room,” a “town hall,” and even a “loo” and speak as loudly and as emphatically to strangers as they might on a social media platform. Those participants who were British seemed to embrace the idea with some initial hesitancy. The exhibition includes curated playlists and informative talks about subjects like artificial intelligence and the feminist internet.
  • In the forthcoming weeks, we are going to be looking at the use of facial recognition in public spaces like schools and private housing. This story looks at the backlash to the use of facial recognition systems in public housing in Detroit. 

One of our highlights this week was this piece about how a global network of Uyghurs living outside China are digging deep into the popular social media app TikTok (they are using the Chinese version, called Douyin) to uncover information about life in Xinjiang. From footage of demolished mosques to video of long lines of Uyghurs passing through security checkpoints, the story explains the extent of China’s determination to control its minority populations. 

The post The world’s cities are becoming Singapore appeared first on Coda Story.

]]>
8844
Governments good and bad converge on AI surveillance https://www.codastory.com/authoritarian-tech/ai-surveillance-global-spying/ Mon, 23 Sep 2019 12:04:07 +0000 https://www.codastory.com/?p=8661 Here at Coda, we take a Unified Field Theory approach to the global storylines exerting a gravitational pull on events, whether planetary or local. Social and political forces are interconnected, interrelated, sometimes coordinated (and best understood by investigating the actions of individual people). But in recent weeks, everything and everyone seem to be flocking together.

The post Governments good and bad converge on AI surveillance appeared first on Coda Story.

]]>
Here at Coda, we take a Unified Field Theory approach to the global storylines exerting a gravitational pull on events, whether planetary or local. Social and political forces are interconnected, interrelated, sometimes coordinated (and best understood by investigating the actions of individual people). But in recent weeks, everything and everyone seem to be flocking together. It’s starting to feel ridiculous.

Take the last few days of Ukraine news. Once upon a time, like perhaps in 2015 or 2016, activists working at anti-corruption NGOs in not fully consolidated democracies feared physical attacks; apart from personal safety, their big concern was usually where to find more funding. Now the Anti-Corruption Action Centre, or AntAC, which deals only with domestic corruption, has come under digital attack from high-tech Israeli mercenaries. And the U.S. president’s personal lawyer has dragged AntAC into a dizzying, high-stakes disinformation campaign as he seeks to invent a narrative to further Trump’s chances in the 2020 elections.

Roles are reversing, disinformation is coming from all sides, authoritarian tech is being pushed from and adopted in every direction, illiberal and liberal democracies, corrupt and rule-of-law governments, and the technologies they embrace are more entangled than ever.

In fact, authoritarian and liberal governments are moving toward each other lickety-split in their implementation of artificial intelligence and facial recognition to spy on their citizens and to build a pervasive surveillance system. More than 75 countries have developed or acquired AI technology so that they can surveil their citizens and monitor their public space. Huawei, Hikvision, NEC Corp, and IBM are the corporations around the world selling the most AI for surveillance (with Huawei far in the lead).

These are among the insights in a new, eye-opening Carnegie Endowment for International Peace report by Steven Feldstein, a leading scholar on advanced technology and governance. Feldstein determined that AI surveillance technology is spreading far more rapidly than previously understood. His key findings include:

  • Governments in autocratic and semi-autocratic countries are more prone to abuse AI surveillance than governments in liberal democracies. 
  • Liberal democracies are major users of AI surveillance. The index shows that 51 percent of advanced democracies deploy AI surveillance systems — a higher percentage than autocratic regimes. 
  • China is a major driver of AI surveillance worldwide, and adoption closely tracks with countries having signed on to China’s Belt and Road Initiative, suggesting the Chinese government is subsidizing and encouraging governments to purchase Chinese equipment. These tactics are particularly relevant in countries like Kenya, Laos, Mongolia, Uganda and Uzbekistan, which otherwise might not have access to this technology.

In other news:

For more context on that last one: Haiyun Ma, an assistant professor at Frostburg State University in Maryland, wrote an important research report in June for the Hudson Institute on the scope of the PRC’s anti-Muslim goals:

“Xinjiang’s so-called ‘de-extremification’ campaign clearly has become a struggle against Islam itself, which is meant to de-Islamicize the daily lives of Uyghur Muslims by criminalizing their normal religious practices. As a result, in large parts of western China, the Communist government’s policies toward Islam have become virtually indistinguishable from the demands made by Chinese anti-Muslim activists online. This toxic amalgam has led to some of the most egregious human rights abuses in today’s world.”

Write me: greenberg@codastory.com

The post Governments good and bad converge on AI surveillance appeared first on Coda Story.

]]>
8661
Are we really living in the age of a “cyber arms race” and an “information Iron Curtain?” https://www.codastory.com/authoritarian-tech/cyber-arms-race-information-iron-curtain/ Mon, 16 Sep 2019 07:37:24 +0000 https://www.codastory.com/?p=8588 Are we really living in the age of a “cyber arms race” and an “information Iron Curtain?” Cold War vocabulary is back and leading the headlines. The escalating trade war between the U.S. and China has resurrected the metaphors as politicians and TV pundits try to draw meaningful parallels between the nuclear and the technology

The post Are we really living in the age of a “cyber arms race” and an “information Iron Curtain?” appeared first on Coda Story.

]]>
Are we really living in the age of a “cyber arms race” and an “information Iron Curtain?” Cold War vocabulary is back and leading the headlines. The escalating trade war between the U.S. and China has resurrected the metaphors as politicians and TV pundits try to draw meaningful parallels between the nuclear and the technology eras.

But these analogies have expired for a reason: they just don’t apply anymore. As Justin Sherman writes for Wired, it’s possible that militarized Cold War language is producing “overly combative policies on emerging tech.”

There are a number of reasons why these metaphors are obsolete and our latest piece by Charles Rollet gets at what I think is the most telling difference — while the Cold War was defined by borders and competing political ideology, our tech age is more often borderless and apolitical. This is especially obvious when we look at collaborations between Western technologists and China.

Why did Anil K. Jain, a rockstar in biometrics research at Michigan State, present a paper in Xinjiang the same month that a UN human rights panel described the region as a “massive internment camp”? A 30-second Google search could have told Professor Jain — one of the world’s most influential computer scientists — that the ethnically Uyghur president of the university sponsoring the conference was arrested in 2017 and is facing imminent execution.

Here are a few more examples of some of those ties:

  • For years the New York State and California State Teachers’ Retirement Systems have held millions of dollars in shares in Hikvision, the world’s largest supplier of surveillance equipment, nicknamed China’s “Big Brother firm.” Also, the agency administering retirement savings for federal employees and members of the armed services is shifting its investment plan in a change that would expose its $50 billion in retirement funds to investments in Chinese companies. Marco Rubio is one of the loudest critics, writing that the decision would “effectively fund the Chinese government.”
  • Two years after a U.S. government ban outlawed Russian antivirus software, Kaspersky Lab’s tools are still installed on U.S. military networks and those of government defense contractors. The reason? It’s just too complicated to remove them. The ban came out of fear that the Kremlin has influence over the company and could weaponize the software for surveillance. The Department of Homeland Security said that all the software was removed this spring, but a Forbes investigation revealed that was not the case.
  • In a similar case, it seems that Chinese-made surveillance cameras are also so integrated into U.S. security infrastructure that it’s essentially impossible to comply with the recent Congressional ban on the technology.

Additional reading:

Buy the Apple Watch or...Die? The Wall Street Journal’s Joanna Stern writes that she has some “really mixed reactions” to Apple’s fear-based marketing strategy to sell its latest Apple Watch. I’m happy to pile on here. The video Tim Cook presented during last week’s keynote featured stories of people whose lives were literally saved by their watches: an Apple Watch alerted its pregnant owner that her heart rate was abnormal, prompting her to see a doctor and have an emergency C-section. Another watch automatically called 911 when an elderly Apple customer fell down. “Not only is that a bit of an icky place to be when selling gadgets, it could also become a liability if the Watch is ever unable to save someone’s life,” writes Stern in her live coverage of the keynote.
When it comes to carbon emissions, 1 AI model = 5 cars. Some impressive research from the University of Massachusetts, Amherst looks at the carbon footprint of deep learning. Turns out, training an AI model can emit nearly as much carbon as five average American cars over their lifetimes, including the manufacture of the cars. What’s also striking is how surprised scientists were by the figures — yet another example of how research around AI lags behind the development of the tech itself.

The post Are we really living in the age of a “cyber arms race” and an “information Iron Curtain?” appeared first on Coda Story.

]]>
8588
Smart doorbells and connected hoovers — when control meets convenience https://www.codastory.com/authoritarian-tech/smart-doorbells-and-connected-hoovers-when-control-meets-convenience/ Tue, 10 Sep 2019 00:06:28 +0000 https://www.codastory.com/?p=8531 This time last year, I stayed in an apartment in Chicago filled with household gadgetry. As soon a visitor approached the front gate, our phones would chirrup in warning. Then the doorbell rang, and their face promptly flashed up on our screens. Via an app, we could climate-control the house, even if we were hundreds

The post Smart doorbells and connected hoovers — when control meets convenience appeared first on Coda Story.

]]>
This time last year, I stayed in an apartment in Chicago filled with household gadgetry. As soon as a visitor approached the front gate, our phones would chirrup in warning. Then the doorbell rang, and the visitor’s face promptly flashed up on our screens. Via an app, we could climate-control the house, even if we were hundreds of miles away. My hosts, who were young parents, had a camera fixed above their toddler’s crib, which beamed live video straight to their cellphones so they could check on him when they were out to dinner. The vacuum cleaner skated around the house of its own accord – until the two-year-old tried to ride it.

I relished my time in the smart home. It gave us all an innate sense of being in control of our lives — protected, organized, not to mention temperature-regulated. But what’s the price of convenience when it means that companies – and potentially governments — can have access to the most intimate information of all: how we live behind closed doors?

This week, controversy abounded over the Ring doorbell-camera system, bought by Amazon for a reported $839 million last year and currently used by millions of security-conscious homeowners across the world. Fears are on the rise that Ring is creating a web of tightly surveilled neighborhoods. “Amazon’s home security company is turning everyone into cops,” Vice’s Caroline Haskins wrote in February. The Ring doorbell is integrated with a social media app called Neighbors, which allows users to upload video streams from outside their front doors and flag “suspicious” characters. Haskins looked at the app and found users were mostly flagging people of color. 

Ring has entered into hundreds of contracts with local police departments across the U.S., which means police can request, via Ring, that customers submit their doorbell footage as evidence in investigations. This week, Gizmodo reporter Dell Cameron revealed that Ring passes data about those who refuse to comply with the requests straight back to the police. And BuzzFeed News reported that while Ring claims not to use facial recognition tech, it has employed a “head of facial recognition research” in its Ukraine office.

Panopticon-like neighborhoods aside, Carl Miller, Research Director at the Centre for the Analysis of Social Media, made a salient point on Thursday: “One thing I've never considered,” he tweeted, “does the rise of things like Nest and smart homes means that it's now literally impossible to have a secret party when your parents are away and then desperate hoover up the broken glass?! That was the core of practically every 1990s teen drama.” 

Speaking of hoovers, better not make it a smart vacuum: it may be beloved by toddlers, but turns out it has the potential to be used as a roving home CCTV system, which could be hacked. “Since the vacuum has WiFi, a webcam with night vision, and smartphone-controlled navigation, an attacker could secretly spy on the owner and even use the vacuum as a ‘microphone on wheels' for maximum surveillance potential,” Leigh-Anne Galloway, cybersecurity resilience lead at Positive Technologies, told the Inquirer in an article about the devices last year.

Facebook’s dating app launch

Do you feel comfortable giving Facebook the keys to your love life? On Thursday, the tech giant launched its new Facebook Dating app, which creates matches using an algorithm powered by the seemingly bottomless cache of data it has on every user. I have to admit, I find the idea both chilling and alluring. After all, as a company, Facebook knows us better than almost anyone. For millennials like myself, it’s been there since adolescence: privy to all our crushes, obsessions, break-ups, loves, friendships and failures. Thousands of potential data points could help find the perfect match. But then we remember that only recently Facebook was fined $5 billion – a record amount – by the Federal Trade Commission for violating users’ privacy. With this in mind, Facebook put privacy front and center of its launch: “We’re committed to protecting people’s privacy within Facebook Dating so that we can create a place where people feel comfortable looking for a date,” the company said in a statement.

At a lecture at the London School of Economics in August, political philosopher Professor Michael Sandel appeared to predict that such an app would soon be in existence. You can listen to the talk, called “Will AI make thinking obsolete?” here. 

The post Smart doorbells and connected hoovers — when control meets convenience appeared first on Coda Story.

]]>
8531
Startups discuss the ethics of artificial intelligence in London https://www.codastory.com/authoritarian-tech/startups-discuss-the-ethics-of-artificial-intelligence-in-london/ Wed, 04 Sep 2019 10:59:18 +0000 https://www.codastory.com/?p=8504 As Coda Story readers are aware, such are the ethical concerns over the widening use of artificial intelligence that a number of campaigns have sprung up to limit or ban its use. In the United Kingdom, in the case of facial recognition, some privacy advocates have argued that an outright ban should be on the

The post Startups discuss the ethics of artificial intelligence in London appeared first on Coda Story.

]]>
As Coda Story readers are aware, such are the ethical concerns over the widening use of artificial intelligence that a number of campaigns have sprung up to limit or ban its use. In the United Kingdom, in the case of facial recognition, some privacy advocates have argued that an outright ban should be on the table. In the U.S., the House Education and Labor Committee will hold hearings on how AI is impacting workers and their jobs once Congress returns in September.

Some aspects of this debate could be heard during a panel discussion titled “Smarter Than Us: The Rise Of AI” in Old Street, London on Thursday evening. In an upstairs room in the heart of Silicon Roundabout, around 140 attendees, including programmers and public sector workers, ate pizza and listened to five speakers discuss how their companies are harnessing AI to improve sectors like digital identities, healthcare, journalism and even real estate. 

A number of the panelists spoke about how their businesses gather information like historical property sales data, images and case studies about Type 2 Diabetes, while trying to preserve some sense of privacy and anonymity. 

Samuel Rowe, Research and Policy Executive at Yoti, a company with a digital identity app, spoke about the lack of guidelines around facial recognition. “There are insufficient safeguards in place for a lot of facial recognition technologies,” he said. “However, from this interrogation, at least in my opinion, there comes an opportunity for radical transparency from organizations working in this field.”

Avinash Bajaj, who previously worked at a health tech firm called Biolink.Tech, said his company opted not to use a cloud solution for storing information belonging to patients. “We wanted privacy at the center and at the core of everything we do,” he said. “So we came up with this aspect of private AI where we did not store any private information on our cloud. We stored all the information on users’ devices.”

Unsurprisingly, governments have struggled to meet the regulatory demands of AI. Only this year did the European Commission announce the launch of a pilot project that hopes to draft ethical rules for developing and applying artificial intelligence technologies. On privacy and data governance, the Commission said: “Citizens should have full control over their own data, while data concerning them will not be used to harm or discriminate against them.”

Afterwards, as the panel took questions from the audience, I was reminded of how predatory and illegal conduct in other industries, like banking, prompted government intervention only after calamities like the financial crisis of 2008. A number of people in the audience repeated the often-heard mantra that regulation stifles innovation. Someone said self-regulation would be enough. One person commented: “I want that facial recognition to work, I want that convenience. But at the same time, I don’t want to give up my privacy.”

No one seemed willing to address the wider moral dilemma: With smart assistants on nearly every phone, can we trust those who watch over us to go unwatched themselves?

OTHER NEWS:

  • We have an upcoming story on Delhi’s plans to roll out a citywide surveillance system with over 300,000 cameras. The project, which would allow ordinary residents to access camera footage in their neighborhoods, has been characterized by a lack of transparency. You can find an interesting primer here. (IEEE Spectrum)
  • As demonstrations against the Chinese government enter their 12th week, some businesses in Hong Kong are fast adopting cryptocurrency. One department store has announced it will now accept Bitcoin, Ether and Litecoin at all its locations. Earlier this month, demonstrators were seen withdrawing Hong Kong dollars from ATMs and banks and converting them to U.S. dollars — the Hong Kong dollar is pegged to its U.S. counterpart. (Business Telegraph)

Elsewhere in Coda Story: In the aftermath of the collapse of Italy’s government, this is a good week to look at how a weaponized social media has had a whiplash effect on politics in the country. One of our contributors has profiled liberal politician Laura Boldrini, who comes under regular attack from supporters of the hard-right, anti-migrant League party and the anti-establishment Five Star Movement. As one political scientist in Rome puts it, “Facebook and Twitter are a gutter.”

The post Startups discuss the ethics of artificial intelligence in London appeared first on Coda Story.

]]>
8504
The abstraction of life into a…data trace https://www.codastory.com/authoritarian-tech/data-tool-politics/ Mon, 26 Aug 2019 06:20:46 +0000 https://www.codastory.com/?p=8427 Authoritarian tech, even in its computerized form, is not as new as you might think. This month, the journal Security Dialogue published an article that shows that. It’s called “Sensing, territory, population,” and it studies the deployment of something called the Hamlet Evaluation System (HES) in the Vietnam war. The HES was a kind of

The post The abstraction of life into a…data trace appeared first on Coda Story.

]]>
Authoritarian tech, even in its computerized form, is not as new as you might think. This month, the journal Security Dialogue published an article that shows just that. It’s called “Sensing, territory, population,” and it studies the deployment of something called the Hamlet Evaluation System (HES) in the Vietnam War. The HES was a kind of proto-big-data program, which let the U.S. military collect and aggregate data on small settlements (hamlets) around Vietnam so that it could keep track of the war’s progress. The paper’s author wants us to think about what it meant for a complex, controversial war to be transformed into a series of data points:

“I argue that acts of translating the rich texture of hamlet and village life into an objectified information format constituted a unique form of ‘epistemic violence,’ rooted...in the pure abstraction of life into a digitally stored data trace.”

So, in 1967, the U.S. military was already doing something that we can now see as the “abstraction of life into a...data trace.” 

If this isn’t authoritarian tech, what is? That’s not a rhetorical question. We at Coda chose to create a channel called “Authoritarian Tech” because we think that something interesting is happening at the intersection of politics and technology. But what is that something, and is it a coherent phenomenon we can name? More and more, I think the “abstraction of life into a digitally stored data trace” is at the center of the answer.

I’m reading a book called New Dark Age: Technology and the End of the Future, by James Bridle. It argues that digital technology has, among other things, reduced our ability to know and act on the present. This is a paradox: The proliferation of data, the universal availability of more and more information, and the increased sophistication of computer models all suggest the opposite. But, to Bridle, when life gets digitized, we lose sight of it. Crucially, this distortion is political. When we pass the world through a technological lens, we get a version that reflects the political views of those who created the technology. Twitter, say, reflects a certain view of what a public sphere means, how political discourse happens, and what a user is interested in seeing. 

“Technology is not mere tool making...it is the making of metaphors,” Bridle writes. “In making a tool, we instantiate a certain understanding of the world that...is capable of achieving certain effects in that world.”

So here is how I see this as connected to authoritarian tech: Technology is a tool, but it is a tool shaped heavily by corporations and governments with specific political agendas. The technology we use, in turn, shapes our view of the world and our ability to make decisions, and circumscribes the realm of political possibility. 

This is a political power grab, away from the democratic subject, and towards unaccountable product designers, government censors, and technocrats. “Computation, at every scale, is a cognitive hack,” Bridle writes, “offloading both the decision process and the responsibility onto the machine.”

Further reading:

  • Web browsers are uniting to stop Kazakhstan’s plans to monitor citizens’ internet use. (Axios)
  • Climate change endangers the internet. This article about it is interesting, but it focuses entirely on the literal physical effects of a warming planet on internet infrastructure. There’s surely a political-sociological side to this story that is just as interesting, if more speculative. (Gizmodo)
  • How Spain’s far-right Vox party is mastering social media to reach young voters. (Open Democracy)
  • AI is being pitched as a way to reduce hate speech on social media. There is one problem: *drum roll* the AI might itself be racist. (Vox)
  • Is technology actually addictive? Skip the tepid debates and read this in-depth psychology paper trying to answer the question in good faith. (Journal of Public Policy & Marketing)

WhatsApp, TikTok and the question of the chicken and the egg https://www.codastory.com/authoritarian-tech/authoritarian-tech-whatsapp-tiktok-and-the-question-of-the-chicken-and-the-egg/ Fri, 23 Aug 2019 07:50:23 +0000 https://www.codastory.com/?p=8336

The post WhatsApp, TikTok and the question of the chicken and the egg appeared first on Coda Story.

You’ve probably heard of rural India’s WhatsApp problem: hate speech, forwarded rumors, and sometimes incited lynch mobs. Both the press and the Indian government have blamed the platform for this, and Coda has tracked some of India’s attempts to legislate the problem away.

But others have suggested these weren’t really WhatsApp problems — they were problems associated with social and religious issues in India that manifested on WhatsApp. “Technology is what we make of it,” wrote Indian economist Mihir Sharma. “If we in India choose to use convenient messaging to form lynch mobs, that tells us more about India than it does about WhatsApp.”

A big Wired story this week seems to vindicate this approach — that WhatsApp was the facilitator, not the cause. India’s latest scapegoat is the Chinese video app TikTok, which has been hosting caste-based hate videos. “TikTok is fueling India's deadly hate speech epidemic,” reads the article headline. The actual body of the article is more equivocal, talking about centuries-old, entrenched caste issues as “massive problems [TikTok] faces in the country,” thereby granting that TikTok did not exactly create India’s caste system.

Most technology scholars I’ve talked to fall pretty squarely on the entrenched-problem side of things: That is, they think hate and violence issues that manifest online ultimately reflect real-world issues. To be fair, it’s hard to find people who disagree with that statement.

But for journalists, platform-based analysis is low-hanging fruit: You find a few instances of hate speech that weren’t taken down, interview a gullible user or two, quote a few concerned experts, and you have an exposé of a platform unable to contain the spread of digital violence. It’s not wrong to criticize these platforms, of course, but we see these kinds of articles over and over. This summer, we have seen this done for YouTube in Brazil and YouTube in the US, and one publication did a deep dive essentially listing YouTube channels it thinks should have been deleted but weren’t. (They were probably right — but you could write these pieces regularly.) These articles all point to the same conclusion: A platform is censoring things, but it should be censoring more things.

Again, it may well be true that the platforms aren’t censoring enough, and criticism is healthy. But the Wired story I mention here quotes a government official who said TikTok is “degrading culture.” Does anyone consider such a statement to be remotely true?

OTHER NEWS

  • This interview with a data scientist offers one of the best explanations I’ve seen of the current era of Artificial Intelligence. It connects the exaggerated hype around AI to the shady practices of those who use it. And it offers an amazingly intuitive and morally clear explanation of what exactly it means for AI to be “racist.” Highly recommended. (Logic Magazine)
  • We’ve written before about the threat posed by “cyber sovereignty” to the free internet. Some have even warned of a “splinternet,” where the web bifurcates into a restricted authoritarian network and a “free” internet. Now, a new Foreign Affairs op-ed argues that the West should welcome and even accelerate this split. That is, authoritarian countries should be kicked off the free web. It’s a strange argument, but worth reading. (Foreign Affairs)
  • Speaking of cyber sovereignty, Russia’s attempts to create a sovereign internet could make things harder for its infamously brazen hackers. (The Register)
  • Arzu Geybullayeva, a previous Coda contributor, reports on how the Turkish government is abusing Twitter’s own rules to get its enemies banned (Global Voices). That reminds us of a similar phenomenon Umer Ali reported for us from Pakistan (Coda Story).
  • Is tech addictive? I think so, but Vox’s Ezra Klein debates an author who thinks that’s not a useful way to talk about it. Though his ideas and books seem interesting, his dismissal of tech addiction strikes me as unpersuasive. (Vox)
  • A ranking of the world’s most-surveilled cities. The top ones are in China. It’s worth taking a look just to get a sense of the numbers of cameras — they are very large. (Comparitech)
  • Once again, a massive database of private information is found to be badly secured. (The Guardian)

Pentagon piloting swarms of surveillance balloons; Thiel upset with Google; and China’s social credit system https://www.codastory.com/authoritarian-tech/pentagon-piloting-swarms-of-surveillance-balloons-thiel-upset-with-google-and-chinas-social-credit-system/ Tue, 13 Aug 2019 07:29:31 +0000 https://www.codastory.com/?p=8313

The post Pentagon piloting swarms of surveillance balloons; Thiel upset with Google; and China’s social credit system appeared first on Coda Story.

The Guardian recently published a story about how the Pentagon is piloting swarms of surveillance balloons that will watch over several U.S. states in order to support criminal investigations. They would essentially track all vehicles.

These kinds of law enforcement projects have become unremarkable. After all, what can we do to halt the march of technology? New products come to market, and their most invasive applications seem to follow soon after. It is likely only a short step from this pilot program to full deployment across the country. 

A side note: For the past few weeks, I’ve been talking to historians of technology for another Coda project, and a few of them have argued that our conversations about technology are framed inadequately. Our debates about technology, they say, are too much about technology itself: Is AI dangerous? Is facial recognition racist? Is social media bad for our politics?

Back to the Guardian article on surveillance balloons, where there’s an interesting quote in the piece: “[I]f they decide that it’s usable domestically, there’s going to be enormous pressure to deploy it,” said an ACLU representative.

Notice the word “pressure.” What is this “pressure”? Where is it coming from?

The premise behind these questions is that technology is the actor. Once you have the technology, it changes society in a certain way. Certainly, people talk about regulation, but it is usually conceived of as a Sisyphean task, perpetually a step behind the tech. But what these scholars have suggested to me is that we’re missing the human element: The way we design and deploy technology, as well as the way we talk about it, are human constructs.

So let’s revisit this balloon program: The U.S. is testing a balloon mass-surveillance system. If it works, it may be deployed across the country. There would be “pressure” for this to happen. But the pressure is not neutral, or inherent in the technology. According to Allied Market Research, the global video surveillance market will be worth more than $80 billion by 2025, while U.S. defense companies spend over $100 million on lobbying members of Congress every year. The company behind the balloons, Sierra Nevada, spent over a million dollars last year. These processes, not the balloons themselves, are the cause of proliferating mass surveillance.

A similar “just because we can” attitude in tech circles also came up in a recent New York Times op-ed from, of all people, Palantir founder Peter Thiel.

Thiel is upset that Google opened an artificial intelligence lab in China, which could easily expose its AI advances to the Chinese military. “A.I. is a military technology,” writes Thiel. He quotes Google saying that “A.I. and its benefits have no borders.” Here, Thiel criticizes a similar mentality, a pressure, if you will, to advance and spread innovation, the idea that more technology in more places and more deployments is always good.

A similar attitude is the subject of an article Charles Rollet reported for us last week, on the role of Western academic institutions in developing “ethnicity detection” technology in China. It seems that there, the idea that scientific knowledge is always good, and should be shared, became an alibi for developing racist technology.

In a newsletter from three months ago, I mentioned a researcher who had suggested that China’s social credit system might become a kind of trade barrier by discouraging Chinese consumers from purchasing foreign products. Something not entirely dissimilar is beginning to happen, but instead of consumers, the social credit system is targeting foreign companies directly, according to Axios.

An interesting example: Remember when China told U.S. airlines to stop listing Taiwan as an independent country? Apparently, a potential punishment for non-compliance was to reflect “your company’s serious dishonesty” through the social credit system.

OTHER NEWS:

  • “Gamification” has been a buzzword for years, fueled by excitement about using technology to make boring tasks more exciting. But a Logic Magazine essay, built around a Lyft driver’s personal experience, persuasively argues that gamification makes it easier to exploit workers. Workers, rather than seeing a situation in which they get mistreated, focus on beating “the game.” The illusion of possible victory makes them both more productive and more docile. Of course, governments are happily applying these lessons to ideology — recall China’s now-notorious Xi Jinping education app that gamified party indoctrination. (Logic Magazine)
  • China’s state-sponsored hackers are likely hacking video games on the side — yet another snapshot into the links between government and criminal hacking. (MIT Technology Review)
  • In April, we reported on two Saudi sisters who escaped their country by breaking into an app that regulated their travel. Media organizations are reporting those travel restrictions have been relaxed. (Channel 4)
  • New U.S. government rules will make it more difficult to legally go after algorithmic discrimination. (Vice)
  • A new report analyzes the possibility of “rogue” countries like North Korea setting up an alternative financial system based on cryptocurrency. This would help them evade US sanctions. (Foundation for Defense of Democracies)
