The ubiquity of the digital world isn’t hard to see even for the average person. We now use the internet, digital devices and smart technologies for everything. It seems every facet of our lives involves some kind of digital tool, from the smallest interaction to the most mundane of daily tasks. We work, do business, have fun, and sustain relationships all within the digital realm.
The full effect of the internet and rapidly advancing technology on our lives is hard to ascertain, especially as it feels we are discovering those effects in real time. While most people understand the pervasiveness, I don't think they appreciate just how much digital data intervenes in their lives, or what this implies for their constitutional rights in a purported democracy. If the advent of the all-encompassing digital world also means all-encompassing surveillance, just how free are you? And what does the intimacy of digital technology mean when you account for race, gender, and class?
A Tale of Two Apps
In recent years there has been increased interest in exposing the shady business of selling people's personal info to third parties. Recently, HBO talk show host John Oliver did an episode on data brokers in which he demonstrated to Congress just how easy it all is by building ad-targeting profiles and serving them ads.
But I’d like to compare two stories regarding applications used on the phone and how that data is then sold, resold, and used for surveillance purposes.
The first story, "Leaked Location Data Shows Another Muslim Prayer App Tracking Users," was published by Vice's Motherboard in 2021 and details how an ICE contractor was able to purchase location data from a Muslim prayer app. The second, "What Your Period Tracker App Knows About You," published by Consumer Reports in 2020, details what health data period trackers have access to and what they can do with it. You can already see the similarities in these stories: how intimate the services are, and the vulnerable populations that make up their users.
For those who don't know, practicing Muslims pray five times a day. Each prayer must be performed facing the Kaaba in Mecca, and at prayer times that shift throughout the year. Prayer apps quickly became a popular, 21st-century way to make this all easier. Salaat First is one of these apps, with a staggering 10 million downloads.
Motherboard reports that the location data being logged and sold was so granular that one app user's phone reported their physical location every two minutes as they walked near a mosque. This location data is sold to Predicio, a French company. Predicio, in turn, sells it to US government contractors, who then share that data with law enforcement agencies like Immigration and Customs Enforcement (ICE), Customs and Border Protection (CBP), and the FBI. Predicio is part of a larger, complex data supply chain that has been handing Muslim users' data over to the very authorities most likely to surveil them. And Predicio doesn't just collect data from the Salaat First app: it also gathers location data from Fu*** Weather, a weather app with more than one million downloads. Data brokers often use multiple sources to collect, aggregate, and combine countless data points on you, building a profile they can then sell to others. This raises questions about apps' privacy policies, and about any claims of "anonymity," as we'll see in the next story.
“These invasions of privacy mean real world harm to vulnerable groups.”
Users of a period tracker app called "Flo" experienced similar invasions of privacy with their personal information. Flo, which has 50 million downloads, is an app people use to track their periods, whether to plan a pregnancy, monitor hormonal issues, or prevent pregnancy (although professionals have criticized the accuracy of these predictive tools as "natural contraception," so be careful!). The data these kinds of reproductive health apps gather is deeply personal and intimate.
The types of information logged into the app can include: dates of your period, number of times you have sex, whether you are trying for a baby, whether you have unprotected sex, whether you've had a miscarriage, and whether you're near menopause. Consumer Reports tells of one woman, Heather, who used the app despite her privacy concerns. To balance those concerns with her need for a period tracker, she tried to use the app anonymously, giving neither her name nor her email. Consumer Reports found that even anonymous users "have no guarantee that their information won't be shared in some way with third parties for marketing and other purposes."
Unlike Salaat First, which was sharing location data, Flo holds far more sensitive medical and health information. The sharing of personal health data has serious implications that could severely impact your life in unseen and indirect ways. Consumer Reports says the sale of health data has been known to affect your ability to get life insurance, what you're charged for coverage, and the interest rates on your loans, and can even lead to workplace discrimination. And unlike medical records, health information stored in apps is not covered by HIPAA.
These invasions of privacy mean real world harm to vulnerable groups. An app specifically designed to help Muslims stay faithful sells their location data to unsavory government police and intelligence agencies, who are likely to use it to abuse and violate their rights. An app ostensibly aimed at helping women with their reproductive goals forks over their sensitive medical data to a black box of third parties that could have all types of consequences for them.
On a broader scale, technological advancement and the spread of internet-connected digital devices have paved the way for our new knowledge economy. A far cry from factory or manufacturing work, Big Data has become the name of the game. Donell Holloway writes in The Conversation, "Surveillance capitalism…uses a business model based on the digital world, and is reliant on "big data" to make money. The data used in this process is often collected from the same groups of people who will ultimately be its targets." Ultimately, if this business model is to make money (which is the overriding drive), then you must log, surveil, and scrutinize every action your target takes online, creating a behavioral portfolio that furthers commercial goals.
A recent Restore The Fourth article, “Why Free Markets are incompatible with the Surveillance State,” remarked on the knowledge economy that our own taxpayer dollars are going not to capital improvements but to 17 spy agencies with a combined budget of $80 billion. In the knowledge economy, information is power and “our market system is founded on open and unfettered access to knowledge.” But shouldn’t there be some protections? Some knowledge we don’t want being shared? Did this unregulated market help the users of Salaat First or Flo?
The term "surveillance capitalism" was coined by Professor Shoshana Zuboff of Harvard. In an interview, she remarks that while amassing huge amounts of personal data is an issue, there is a second, more insidious component of all this that we don't account for. The actors within surveillance capitalism "now develop 'economies of action,' as they learn to tune, herd, and condition our behavior with subtle and subliminal cues, rewards, and punishments that shunt us toward their most profitable outcomes." Taken far enough, this could erode our autonomy and even our democracy.
But I think there are also some healthy criticisms of Zuboff and her book that can give us insight into how to move forward and fix this problem. The most useful and in-depth criticism of Zuboff's ideas comes in a book review in The Baffler by Evgeny Morozov. One of his points is that Zuboff is trapped within the logic of capitalism and cannot see outside it. Morozov argues that the book is actually a "warning against 'surveillance dataism,'" not surveillance capitalism, and that Zuboff never answers the all-important why question: "surveillance capitalists engage in surveillance capitalism because this is what the imperatives of surveillance capitalism demand." In a similar vein, I found this blog post by a teacher incredibly helpful for examining the scholarly efficacy of Zuboff's book.
Overall, I personally don’t believe replacing surveillance capitalism and all its innate perverse incentives with another more managed and “cleaner” digital capitalism is going to be what gets us out of this quagmire.
Race, Gender, and Class
Everyone’s data in surveillance capitalism is being shared and sold and repackaged to be used against them.
But who has the FBI, ICE, or police come to their doorstep? Who is discriminated against in the workplace for being pregnant? Whose abortion clinic visit is being logged and sold to the highest bidder? Whose credit scores and insurance prices are affected to the point where they can’t find housing or get affordable healthcare?
The racial, gender, and class fault lines remain as intact as ever, and all but guarantee that state violence and/or debilitating poverty will be the unshakeable future for most. It's these people who will experience real-world harm. The Internet of Things treats constitutional rights as "damage" or blockages and routes around them by oversharing every scrap of data, to the point where it has us re-litigating the hard-won victories of the civil rights era.
One only has to look at the surveillance apparatuses coordinating to thwart peaceful protestors and gravely needed reforms to realize how far along we are into what I'd describe as a virtual noose. Government and private-sector surveillance technologies, combined with the personal data brokerage industry, will crush rebellions before they begin, understand you better than you understand yourself, and overdetermine the course of our lives globally.
We need swift government action and a concrete next step would be having Congress pass the Fourth Amendment Is Not for Sale Act. Here is a letter signed by 48 privacy and civil liberties orgs, including Restore The Fourth, urging the Senate and House Judiciary Committees to hold hearings on a bipartisan bill that would stop law enforcement agencies from acquiring, from apps or data brokers, data that would otherwise require a warrant to acquire directly.
Without a radical change in course, the digital world will continue to fuel inequality and discrimination, to bolster the profits of the rich and political aims of the powerful.