Join The Dots: Data WTF?


Kam Sandhu and Matt Kennard discuss the ways in which our data is being used by data brokers, tech elites and financial technology companies in order to make assessments of us. Is all data credit data?



Story Transcript

KAM SANDHU: Welcome to Join The Dots, I’m Kam Sandhu.
MATT KENNARD: And I’m Matt Kennard.
KAM SANDHU: Today we’re gonna be talking about the world’s new oil: data. We’ve been covering data, privacy, and fintech all this month on Real Media. The other week, Wired released a long read called “Big Data Meets Big Brother as China Moves to Rate Its Citizens”, about the development of a social credit system that would rank the trustworthiness of citizens, using their data, as part of a national score. Ultimately, this will affect who gets a mortgage and where your kids go to school, and it’ll rate you on things like who you know, how much TV you watch and where you hang out, and whether each of those counts as a positive or a negative. The Chinese government said this would enhance trust and build sincerity. It starts as a voluntary process, becoming mandatory by 2020.
MATT KENNARD: We know how much information data giants like Google and Facebook have about us, but what we don’t talk about enough is what they’re doing with this information and what impact it’s gonna have on our lives in the future, because all this data is stored from the first time we use one of their services. We don’t think consciously enough about what that means for our privacy, our financial future and every other part of our lives.
KAM SANDHU: One of the companies that will be providing some of this information to the Chinese government is a credit scoring agency called Sesame Credit. That’s one of the two big Chinese data giants. They create credit ratings from the information that they gather, because financial companies have been among the most aggressive and advanced data gatherers; they’re the most interested in building a profile of you to figure out whether you’re creditworthy. Douglas Merrill, the former chief information officer at Google, coined the phrase “all data is credit data,” meaning all the information you create online could be used in some way to affect your credit rating.
How does this play out with our information? In May, we spoke to Beverley Skeggs, a sociologist at Goldsmiths, who had just completed some research on Facebook and how it’s able to track you. Here she is explaining what this phrase means.
BEVERLEY SKEGGS: Every time you use any device whatsoever, you’re sending signals about your behavior that will be used in a model to predict what your credit rating will be, depending on who’s buying that data. It’s being assessed for its value. If it’s assessed as coming from high net worth individuals, it will be put up for auction and traded. For others, it will be profiled and possibly siloed into a low net worth category, where advertisers competing to buy particular data will put less interest into the adverts on their browser.
KAM SANDHU: How much do we know about the data broker industry?
BEVERLEY SKEGGS: Very, very little. They are beyond regulation; they refuse to appear before the US Senate or the Federal Trade Commission. I’d say we know very, very little about them. They are the real dark side of the net, because they are the people who are compiling, assessing, classifying and trading your data, and for a new generation, if you are a child born now, they will be assessing your data right from the start. They’ll be building profiles on you. We can find out how they’re doing it, but we have no idea what you’re being traded for, what your value is; the literal economic transaction, we do not know. I would say they are the most powerful forms of classification that we now experience socially.
KAM SANDHU: Who do they sell this information to?
BEVERLEY SKEGGS: Well, they sell it to each other. They will sell it to banks. They will sell it to other data companies. Take Experian: in our research, we found a lot of Experian and Acxiom on Facebook, so they will be selling data to Facebook, and Facebook will be selling data to them as well. There’s a lot of trade, as they all try to work out who the high worth individuals are that they can sell on to either banks or advertisers.
KAM SANDHU: That was Beverley Skeggs talking to us in May about her Facebook research. She told us at the time that Facebook had recently patented the right to use your social circle, who you know among your friends and the ratings created about them, in assessments of you. You might have seen a stream of stories about the kinds of things Facebook has been doing with data, like buying information from other companies and tracking you when you’re off the platform, and a lot of the companies involved are financial companies like Acxiom and Experian, who will be making these kinds of credit ratings about you.
MATT KENNARD: It’s very hard to escape the pressure to take part in these platforms, because with your credit rating now being decided by your use of Facebook or other platforms, you don’t really have the option of just shying away from it all; many things you need in society now come with the prerequisite that you use these platforms. In The Daily You, Joseph Turow gives examples of the kinds of categories and reputation silos that are used, such as “soccer and SUVs”, “mortgage woes”, “first digs” and so on.
KAM SANDHU: These are the categories used to make decisions about you, like what ads you’re gonna be shown and what rating you’re gonna get, but the ways they actually put this information together to create these categories are totally new, and not just the ones we might think of, like our age, our name, our location. There are loads of different ways they’re able to gather this information. We spoke to Dr. Tom Fisher, a research officer at Privacy International.
TOM FISHER: In some important ways, we are losing control over our identity. They are building up this one single picture of us based on the data they gather about us. We no longer have control over how we present ourselves, who we are. For instance, there are Kenyan credit scoring apps which, on a daily basis, will upload the entire contents of your phone to their servers, then analyze it to offer people credit based on things like how you organize your address book, whether you write in capital letters, even things like how often you call your mother, which now affect your credit rating. The fintech sector is really about that expansion in the use of data, and in the amount of data they’re gathering about us to make these judgements.
One concerning development is that when we fill in an online form to apply for credit, for instance, how we fill in that form now becomes more important than what we write on it, because they are analyzing information like what device you’re using. Are you using a high-end iPhone, and therefore more likely to be rich and wealthy, versus a lower-end Android device? Also things like your location, which they can map in quite detailed ways to know whether you live in a place where you’re less likely to repay your loan. We used to call this kind of thing “redlining” for mortgages, where certain populations and certain groups were discriminated against because of where they lived.
MATT KENNARD: That was Tom Fisher, talking about how we are losing the ability to decide how we present ourselves, and how companies can increasingly get hold of information about us which they might not have been able to get before this technology existed, when they had to go through convoluted channels and really abide by the law. Now there’s a lot of information about us out there which can be accessed without our knowledge and without our consent.
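As a purely hypothetical illustration of the form analysis Fisher describes, here is a minimal Python sketch of a behavioural credit score. Every feature, weight and threshold below is invented for this example; real alternative-credit models are proprietary and far more complex.

```python
# Toy behavioural scorer: every signal and weight here is hypothetical,
# invented to illustrate the kind of inputs Fisher describes.
from dataclasses import dataclass

@dataclass
class Application:
    device_model: str                   # "iPhone 15" vs a budget Android
    all_caps_fields: int                # form fields typed in ALL CAPS
    seconds_on_terms: float             # time spent on the terms page
    neighbourhood_default_rate: float   # historical default rate nearby

def behavioural_score(app: Application) -> float:
    """Return a toy score in [0, 1]; higher looks 'safer' to the lender."""
    score = 0.5
    if "iphone" in app.device_model.lower():
        score += 0.15                              # device as a wealth proxy
    score -= 0.05 * app.all_caps_fields            # 'rule-follower' heuristic
    if app.seconds_on_terms > 30:
        score += 0.10                              # read the terms -> 'lower risk'
    score -= 0.3 * app.neighbourhood_default_rate  # digital-era redlining
    return max(0.0, min(1.0, score))

print(behavioural_score(Application("budget-android", 2, 3.0, 0.4)))  # ≈0.28
print(behavioural_score(Application("iPhone 15", 0, 60.0, 0.05)))     # ≈0.74
```

The last deduction is the point: the applicant’s neighbourhood enters the score directly, which is exactly the redlining by other means that Fisher warns about.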
KAM SANDHU: This is totally different to a normal kind of capitalist transaction, because the value of the data isn’t the same as, say, buying a coffee somewhere; they’re able to have this longer relationship with the things you’re giving up. There was an article in the FT called “Big Tech Makes Vast Gains at Our Expense”, which explains this and talks about the huge power imbalance in these kinds of transactions.
MATT KENNARD: Another way that people are conned is that the companies make it as difficult as possible to actually read their privacy policies. I think one lawyer said it would take a year to read the privacy documents for the average person’s number of apps on their phone, so obviously no one’s gonna do that. Everyone sort of knows that box that asks you to read the privacy policy: you don’t really do it, you just press “accept”, so they count on the fact that people aren’t gonna check what’s happening. I don’t think that’s really going to change.
KAM SANDHU: This comes back to the idea internet marketers use, that customers are getting a fair deal for using these services online: in exchange for a free service, they should get access to your data. In reality, we’re not actually getting that fair deal, and here’s Tom Fisher explaining one example of why not.
TOM FISHER: I was speaking to a friend of mine a couple of months ago who was using a fitness tracker tied to her insurance company, and she would get a free smoothie every month in exchange for basically having her location tracked constantly, wherever she was. Think about what something as simple as location data reveals about you: you know from it where someone lives, where someone works; you could probably work out where their boyfriend or girlfriend lives, their wife, their mistress, all these people, based on their movements; where you go, where you like to shop, all these things. In exchange for a free smoothie, in the case of the health insurer, you’re giving them all this knowledge and information about your life. It can’t be a fair deal or fair trade with the companies that are doing this.
KAM SANDHU: That was Tom Fisher from Privacy International explaining one example of why we don’t get a fair deal for our data.
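Fisher’s claim about location data is easy to demonstrate. Here is a toy Python sketch, over a handful of fabricated points, showing how two crude rules applied to nothing but timestamped coordinates recover a probable “home” and “work”; a real tracker’s trace would be far denser.

```python
# Fabricated location trace: (timestamp, coarse lat/lon cell).
from collections import Counter
from datetime import datetime

points = [
    (datetime(2017, 11, 6, 2, 30),  (51.47, -0.02)),
    (datetime(2017, 11, 6, 3, 10),  (51.47, -0.02)),
    (datetime(2017, 11, 6, 11, 0),  (51.52, -0.08)),
    (datetime(2017, 11, 6, 15, 0),  (51.52, -0.08)),
    (datetime(2017, 11, 7, 1, 45),  (51.47, -0.02)),
    (datetime(2017, 11, 7, 10, 30), (51.52, -0.08)),
]

def most_common_cell(hour_lo: int, hour_hi: int):
    """Most frequent cell visited within the given hours of the day."""
    tally = Counter(cell for t, cell in points if hour_lo <= t.hour < hour_hi)
    return tally.most_common(1)[0][0]

print("probable home:", most_common_cell(0, 6))   # where the phone sleeps
print("probable work:", most_common_cell(9, 17))  # where it spends the day
```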
MATT KENNARD: Research by Joseph Turow in 2015 found that most people in America were not evaluating the privacy conditions of any of the apps they were using. They were more in a state of resignation about the fact that their data was being taken by these companies. There was no sort of informed decision-making process at all.
KAM SANDHU: Tom Fisher mentioned health insurance companies in that example. There was an article in Bloomberg by Cathy O’Neil where she talked about what would happen to the health insurance industry when big data gets involved. She said it would essentially encourage these companies to price out the sick and put up prices on healthy people, which completely defeats the risk-pooling nature of insurance. But these are corporations who want to make a lot of money, and that’s something we perhaps need to remind ourselves of when we talk about big data and technology and how positive we can often be about it.
MATT KENNARD: I was going to say, essentially what needs to happen is there needs to be a Magna Carta for the internet age, a new bill of rights that deals with the new technology we’re now living with, because people are not aware of this as an issue that is gonna impact their everyday lives. Some kind of movement needs to start, to ensure people’s privacy and to actually take these companies to court and stop them being able to mine the data.
KAM SANDHU: It’s something that’s totally worked in their favor, though, because I think people are not just in a state of resignation but kind of tired, and can’t figure out what is actually going on with their information. They hear that, yeah, tracking your location is really, really bad, but what are people doing with these bits of information? The reality is that this is something that could affect your life, your standing, your next application. The whole modus operandi has been to obfuscate, so that people don’t know what these companies are actually capable of.
MATT KENNARD: One way round it as well is that you get technologists producing products with privacy issues in mind and then outflanking the big data companies. That happened with WhatsApp and encryption. They didn’t encrypt, and then other usable services like Telegram came online, so people switched to them. WhatsApp presented the fact that they started encrypting as some sort of idealistic thing; I think one of the founders said he grew up in the former Soviet Union and knew how bad government surveillance could be, and that’s why they did it. That was rubbish. What was happening was they were getting outflanked by other technologies which were using encryption, and people were leaving. That could happen with all these big companies, and I think after Snowden a lot of them had to change how they operate, at least in a PR sense, because although everyone sort of suspected what was happening, Snowden really blew the lid on it. If you could get a Facebook where your data wasn’t mined and sold on to corporations, people would start using it tomorrow, so that could happen, and hopefully it will.
KAM SANDHU: Fingers crossed. You’ve actually got a book coming out soon, which is about corporations and the shadow world that they operate in.
MATT KENNARD: The book we’re doing is about the mechanisms corporations use to obtain power, and obviously a major way a corporation can gain power is to have information about the citizenry. The major thesis of the book is that the state is actually being eaten by the corporation: ever since the corporation existed as an institution, there’s been a trend of it eating away at the state, bit by bit. Facebook now definitely holds more information about this populace than the state does, so what does that mean? Especially as the state is shrinking and has fewer and fewer people employed directly by it, with most things subcontracted out to private corporations, it’s all part of the same thing. The power imbalance is such that I don’t really think there are many ways we can fight back at the moment. As we’ve said, the resignation issue is a major one with technology: people are aware that their information’s being used, but at the moment that does not translate into anger at the company doing it. I think that until things start happening where people are put in prison, or people can directly link increased insurance premiums to this, nothing much will happen, but that will happen eventually.
KAM SANDHU: Well, we spoke to Brett Scott, a financial activist and author, about the kind of short-term problems technology has gotten very good at solving, and here he is talking about what kinds of problems that leaves us in the long term.
BRETT SCOTT: The general modus operandi of the technology sector, not only the financial technology sector, is that you fixate upon short-term problems, or you artificially create short-term problems, and you sell stuff like that to people. You say, “I’ll sell you a solution to your short-term problem, but what we’ll never tell you about is the long-term consequences of all of you using our short-term solution,” because you can’t sell long-term consequences. Think about the standard startup pitch format. If you ever go to any startup pitching contests, they always have this format: “I’ve identified a problem. Here’s my solution for the problem, and here’s how you monetize the solution,” right? Here’s how you market this to people; this is what you tell them, the short-term use value of this thing. That’s the only thing a venture capitalist is gonna back, because that’s the only thing you can sell in the short term, right?
There’s this whole structure of capitalism which involves having to pitch short-term solutions, and at an individual level that might be fine, but when you start to zoom out and look at the collective effects of these things, you’ll find people being locked into gigantic systems that they do not understand. Our mobile phones are a very good example of this. They’ve been pitched as being useful things, which they are, but it’s like, “Oh, by the way, we also created a huge surveillance system to track every moment of your life in the process of offering you these useful things. We didn’t really mention that in any of the startups that created this type of technology.” That’s the big problem you’ll find, and there’s no economic incentive for anyone to ever tell you what the implications of the technology are, so there’s a chronic problem in technology ethics right now, where startup entrepreneurs are rewarded for their individual successes but are never asked to take responsibility for the collective consequences of all of this technology put together. That’s been going on for a long time. But there’s some cool stuff you can do with technology, too. I’m not totally against it; I’m just sick of the overwhelming, sickly sweet, insipid optimism around technology.
KAM SANDHU: Let’s talk a little about algorithms, because Cathy O’Neil, who we mentioned with the Bloomberg article on health insurance and big data, wrote an excellent book called “Weapons of Math Destruction”, which is all about algorithms. She’s also got a great TED talk, if you want to check that out, about not putting blind faith in big data. She explains how these algorithms, which are increasingly deciding who gets a loan, who gets access to some kind of service, who gets an interview, can replicate the kinds of biases we hold in our daily lives. It matters very much who is making them and how much these algorithms are checked, and they exploit people’s fear of math and science to stop them asking questions about what goes into an algorithm and what it takes to work these things out. There’s a feeling that because it’s a machine spitting out an answer, it’s somehow objective, but it will just “automate the status quo”, in the words of Cathy O’Neil. In her book she discusses Douglas Merrill, the man behind the “all data is credit data” slogan, who now runs a company called ZestFinance, designed to offer better rates than payday lenders. In this short excerpt, she explains the feedback loops these algorithms can create.
MATT KENNARD: “Merrill proclaims that all data is credit data, in other words, anything goes. ZestFinance buys data that shows whether applicants have kept up with their cell phone bills, along with plenty of other publicly-available purchase data. As Merrill promised, the company’s rates are lower than those charged by many payday lenders. A typical $500 loan at ZestFinance costs $900 after 22 weeks; 60% lower than the industry standard. It’s an improvement, but is it fair? The company’s algorithms process up to 10,000 data points per applicant, including unusual observations, such as whether applicants use proper spelling and capitalization on their application form, how long it takes them to read it, and whether they bother to look at the terms and conditions. Rule followers, the company argues, are better credit risks. That may be true, but punctuation and spelling mistakes also point to low education, which is highly correlated with class and race, so when poor people and immigrants qualify for a loan, their sub-standard language skills might drive up their fees. If they then have trouble paying those fees, this might validate that they were a higher risk to begin with and might further lower their credit scores. It’s a vicious feedback loop, and paying bills on time plays only a bit part.”
KAM SANDHU: That was an excerpt explaining how these algorithms can create feedback loops that just reinforce your social position, or perhaps leave you even worse off, because of all this new information being used to make decisions about you. The sketch below simulates that loop; after it, here’s Tom Fisher again, explaining why it’s important to pay attention to who is making these algorithms.
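This is a stylised simulation of the loop O’Neil describes, not ZestFinance’s actual model: every number is invented. A fixed penalty for “sub-standard language skills” raises the fee, the higher fee raises the default rate, and defaults drag the score down further in each round.

```python
# Invented numbers throughout; the shape of the loop is the point.
def simulate(score: float, spelling_penalty: float, rounds: int = 5) -> float:
    for _ in range(rounds):
        effective = score - spelling_penalty
        fee = 100 + (1.0 - effective) * 400           # worse score -> higher fee
        default_rate = fee / 1000                     # higher fee -> more defaults
        score = max(0.0, score - 0.2 * default_rate)  # defaults lower the score
    return round(score, 3)

print("no penalty:  ", simulate(0.6, 0.0))  # score drifts down slowly
print("with penalty:", simulate(0.6, 0.2))  # erodes faster, every round
```

Because the penalised applicant pays a higher fee in every round, their default rate is always higher, so the gap between the two scores widens over time: the penalty compounds.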
TOM FISHER: On the one hand, you’ve got the people who are writing and developing these algorithms, often far, far away from the groups being affected by things like alternative credit scoring methods. These are often credit scores aimed at the poorest and most vulnerable people, those without traditional credit files, and a lot of the growth in these kinds of projects is in Africa, yet they’re being written by computer scientists and data scientists in California. So we have to understand how these algorithms are developed and how they work, because it’s so easy for them to discriminate against certain groups. We see it in predictive policing, where the algorithms predict crime rates based on historical data, but that historical data was gathered by police forces which themselves had certain ideas about race and were recording crimes differently for black people versus white people. We then see an algorithm reflecting those biases.
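Fisher’s policing example can be put in code too. In this toy model, with all numbers invented, two districts have identical true crime, but district A was historically over-recorded; patrols follow recorded crime, and recording follows patrols, so the initial bias never washes out.

```python
# Toy predictive-policing loop: invented numbers, illustrative only.
TRUE_CRIME = {"A": 100, "B": 100}   # identical underlying reality
recorded = {"A": 60, "B": 40}       # biased historical records

for year in range(5):
    total = sum(recorded.values())
    patrols = {d: recorded[d] / total for d in recorded}  # model's allocation
    # More patrols in a district -> a larger share of its crime is recorded.
    recorded = {d: TRUE_CRIME[d] * patrols[d] for d in recorded}
    print(year, {d: round(share, 2) for d, share in patrols.items()})

# Prints {'A': 0.6, 'B': 0.4} every year: the algorithm faithfully
# reproduces the historical bias, never the equal underlying reality.
```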
KAM SANDHU: We’ve also been following the advancement of the biometric ID scheme in India, which now has more than a billion people enrolled and is the world’s largest biometrics database. While it started as voluntary, just an ID to help you get your welfare provisions, it’s become increasingly mandatory and has expanded quite quickly. A lot of other details are now linked to this biometric ID, which includes your iris scan, your fingerprints and a photograph for facial recognition; it also takes in things like the marriage registry, and it’s increasingly demanded for filing things like your tax records. This is creating one of the biggest mass surveillance projects the world has seen, and it’s continuing apace; however, there are activists fighting back.
MATT KENNARD: There was recently a battle at the Indian Supreme Court. A group of activists took the government to court over this invasion of privacy, and they won against a government that was arguing people didn’t have a right to privacy.
KAM SANDHU: That case isn’t over. The Supreme Court ruling that Indians did have a fundamental right to privacy was a big, big win; however, the activists are still fighting other elements of this ID system. We spoke to Usha Ramanathan, a law researcher in India, about the kinds of arguments the government used to take away people’s privacy.
USHA RAMANATHAN: The attorney general said not only that there was no fundamental right, but that there was no right to privacy at all, so that was one dramatic moment when a whole course changed in the public eye. Then, earlier this year, he helped us further by asking the court what makes people think they have an absolute right over their bodies. Now, the point about these questions, as we quickly realized, is not that the question was being asked by the wrong person; the question itself was wrong. Plainly, the question is not, “Why do you think you have a right to privacy?” but, “Why does the state think it needs to take away our right to privacy?” It’s not about our having an absolute right over our bodies; why does the state feel it needs to deny it? Then, somewhere along the argument in court, when they were arguing about the right to be forgotten, the attorney general said, “You may want to be forgotten but the state will never want to forget you.” He was plainly telling us: here we are, we have our power over you, and we will use that power to keep you where we want you.
I understand that to a certain extent when technology comes in… but at some point you have to pause and see if it’s okay or not, and if it’s okay for the state to function like this. That pausing never happened. With this government, it’s a kind of aggression that you don’t see in democratic states, and a kind of coercion that we never see in a democratic state. It is saying, “We have power and we can penalize you and we can paralyze you, so you better do our bidding, or else.” I don’t know how they think they will use all this information; like many other people, even the corporates don’t know the various ways in which they will use the information, but having all of this data gives them opportunities to create ways in which they can use it. The first thing is databasing: seeding it in every database, creating the potential to converge all of this. Then, we’ll see how we use it.
KAM SANDHU: The “seeding” Ramanathan was talking about there is the idea of attaching your ID number to things like your mobile phone number, in places like courtrooms or shops, to find out where people have been. It finds out where you’ve been, where you hang out, and also where you don’t hang out, I guess. There was a point Ramanathan made, which is that biometrics has advanced in countries with lax regulation and poor populations. Tom Fisher talked about Kenya, but elsewhere in Africa, and in India, biometric ID has been aggressively pushed on populations, whereas here we just see it on iPhones and maybe at passports and borders. Over in these countries it’s far more aggressive, and there’s far more coercion between tech companies and the state.
MATT KENNARD: They often pair it with medical projects and different things, so they say to people, “Well, we’ll treat you if you give us your data,” you know? A lot of people in those sorts of countries obviously can’t say “no” to that. I also came across the fact that the British government funds quite a lot of biometric data programs in the developing world, so that’s taxpayers’ money from here going towards getting people’s data in the developing world.
KAM SANDHU: One of the things that came out when the privacy ruling was made, in a WikiLeaks release, though Ramanathan had actually suspected it already, was that these biometric ID companies were largely US companies, private homeland security contractors, so all this information was going back to the US deep state. The WikiLeaks release showed software taking all this information from these biometric software companies back to the CIA.
MATT KENNARD: Yeah. The other thing we haven’t really talked about is that these are platforms people use knowingly, but there’s also the technology, made mainly in the west, which surveils people in ways they’re not aware of. The UK and the US lead the world in producing cyber technology that can be used by repressive governments to spy on their own citizens. Here we have a major one called Gamma Group; I’m not gonna say too much, I don’t want to get sued, but they’re very powerful, and they were exposed as providing technology to the dictatorship in Bahrain. That is another element of all this: the privatization of the security state itself. If you look at who Snowden was working for when he leaked, he was working for the NSA, but through a private contractor, Booz Allen Hamilton, which was gathering data for the NSA. That in itself should scare everyone, because of course the government gets the go-ahead to surveil, but the information is essentially held by a private company that can do what it wants with it. It also backfires on the government itself, because the Snowden leak would’ve been much more difficult if it had all been kept within the state; as soon as you start subcontracting, you let a lot of people and a lot of interests into information that they really shouldn’t have.
KAM SANDHU: We’re gonna go back to Usha Ramanathan now, because we asked her why this situation in India matters worldwide, and here she is talking about why the interests driving it are corporate interests that affect us all.
USHA RAMANATHAN: These are multinational corporate interests, and these are also interests which connect to intelligence agencies, and to governments having an interest in other governments and other peoples.
KAM SANDHU: Mm-hmm.
USHA RAMANATHAN: That’s the second thing, and the third is that data, as they keep telling us now, is the new oil. If the 20th century was dominated by oil, this century is going to be dominated by data. But if we thought it was only about data in terms of understanding the world around us: the market now demands new products, and human beings have become that product, in a sense. We are now generating data, and … calls it the “trickle up phenomenon”, when he says, “The people of this country don’t have wealth but they do have data, so if they can’t buy with us in the market, they can certainly give us their data, so the data will trickle up and then wealth will be created.”
The way it’s being said: when it started, it was said that what we have is data, and we leave digital footprints, and fintech companies, for instance, will be willing to provide us services without asking too much in transaction fees; they will just ask for our detailed information, detailed information about ourselves. Then they can give us credit, so credit became the key to explaining why it’s all right to collect all of that data. Today, when they say “trickle up”, they say, “Well, if you have only data about yourself, give it to us and you will get something in return: you will get credit, and we will be able to watch you and give you what you want.”
The most interesting thing about this project, I think, is that it’s not a project being promoted by the state; it is being pushed using state power.
KAM SANDHU: Right.
USHA RAMANATHAN: … has come by using state power, but it’s being promoted in marketing terms like most other things are marketed.
KAM SANDHU: That was Usha Ramanathan talking to Real Media about what’s going on in India at the moment. She also mentions that the companies pushing this kind of payment technology and these biometric ID schemes have said it’s about financial inclusion. However, financial inclusion also means financial literacy, whereas these people haven’t been told anything about the kinds of systems they’re being put onto, systems they essentially can’t escape. Bev Skeggs, who we saw at the start, said in a piece we recently wrote that this was demonstrative of the stealth with which these companies enter people’s personal lives.
MATT KENNARD: There’s been a big push for digital payment platforms and fintech in India, and here we see a softer version of that, with the push that a cashless society is gonna be the great progressive hope for the future.
KAM SANDHU: We spoke to Brett Scott about how these fintech companies and the financial industry promote their ideas for the future as kind of an inevitable thing for society.
BRETT SCOTT: What the technology industry often will do is to speak about a future that they would like to see and present it as being completely inevitable and obvious that it will happen, all right? A lot of the sort of tech futurism is really the sort of desires of technology companies projected out as being obvious things that supposedly will happen. When people say, “We’re inevitably moving towards a cashless society, we are inevitably going to this,” it’s largely the marketing departments of the financial technology companies that are trying to create the sense that this is what you want, right? This is why the companies like Visa, for example, will be running all the adverts all the time in London saying, “Cash-free and proud.” They’re trying to engineer this feeling that there’s something wrong with cash.
I mean, cash is a public utility, right? Anybody can use it. There’s no requirement to be interacting with a private company in order to use cash, whereas private companies have to advertise and make themselves desirable, which is why these private payments companies are on this offensive against cash. But, yes, the financial technology industry more generally has a lot of agendas in trying to tell us there’s this inevitable automation and digitalization of finance. There are the banks, there’s an underlying layer of payments companies, and the fintech companies are built on top of that. All of them have agendas in pushing for a cashless society, because cash represents a form of payment they don’t make any fees from.
KAM SANDHU: We’ve talked about many elements of how the financial technology industry is advancing aggressively into our lives and making these kinds of decisions about us. Will this change in the coming years, as a generation of people grow up with this information being held about them and these decisions being made? Will it affect the way we are online? Would it mean we change, or think about, who’s in our social circles based on what’s better for our credit rating? Do you think that’s what’s gonna happen?
MATT KENNARD: I think so, and I think it probably already is happening. Maybe not in the literal ways you’re talking about, but people are aware that prospective employers will be looking at their social media, and that potentially stuff about you will be kept forever and could come out at a later date. I also think that technology, as I said earlier, will in some ways catch up, and that different platforms will have to come up to replace the ones that obviously don’t have our best interests at heart. Cryptocurrency is quite an interesting one now, firstly because Bitcoin has just skyrocketed in a couple of months, but also because it bypasses the state in a way we’ve never been able to do as a civilization with fiat currencies, which have been around since capitalism began, whereby the central bank, allied to the government, can print money when it wants, with no real accountability mechanism. Now dispersed network currencies have really bypassed it, and I think the price spike in Bitcoin is probably a sign that people see it as the future, because the networked model is what’s gonna happen.
In summary, corporations can get as much information about you as possible in nefarious ways, as a means to control you and to sell later on, or corporations can be founded on the idea that privacy is important, that we should respect it, and that we will not give your data to other people to make money off. In the marketplace, the latter will win, because most people want a good product allied with their privacy rights being guaranteed, so the incentive, even in a capitalist sense, is to provide services allied with privacy rights. At the moment it hasn’t happened, but we’re at the start of this whole thing; we’ve only just begun.
KAM SANDHU: Yep, I agree. We’ve talked about the bad potentials of technology because they’re not articulated enough, and we haven’t got to the point where we’re really discussing privacy, and I think we do need new regulation. There is some EU regulation, the General Data Protection Regulation, that’s coming in, and it’s a step in the right direction, but there’s a lot more work that needs to be done. Whether we’re gonna come anywhere close to keeping it after Brexit, I’m not sure; it doesn’t look like May is very concerned about people’s privacy.
I think, also, if people start igniting a conversation about privacy and human rights, things that have been so eroded and so pushed aside by the big tech elites we see now, then hopefully it’ll spur on a conversation about why we need to recenter the rights of human beings against corporations, so that’s one hope I’m holding out for.
MATT KENNARD: I’m quite pessimistic, merely because of what happened over the IP Bill, the Investigatory Powers Bill, which was passed recently with barely a whimper, and which basically legalized what Snowden had revealed. Snowden’s revelations showed that Prism and other programs could basically do a Google search of your private email and Facebook accounts. The IP Bill, seeing that, just said, “Well, before we have another leaker, let’s legalize it,” and no one has said anything about it. That gives the police power to search your internet history without even a suspicion that you’ve committed a crime. Now, if that can happen and no one really lifts a finger, what hope do we have that people are actually gonna care that their information’s getting mined off to different corporations? I don’t know if there’s gonna be a tipping point. I don’t really understand why people don’t care, but it’s probably because it hasn’t actually flipped into real-world consequences yet. As you say, it will, and it has, but in a way that’s indirect enough that people can’t trace the causality.
KAM SANDHU: Yeah. Well, the whole point is that they’ve been hiding what they’re capable of, so I think it’s probably gonna be quite shocking when people start to see it having an effect on their daily lives; for now, though, it’s still secrecy. We don’t know how they make these decisions. But, as you said earlier, people are getting more used to their social media profiles being looked at by employers, and to other things that will affect their financial standing.
We’re gonna finish with a last bit from Usha Ramanathan. Here she is talking about how these kinds of biometric ID schemes and other data mining push a presumption of guilt onto the public, who are forced to prove their innocence to forces like corporations, which haven’t had a good track record themselves.
USHA RAMANATHAN: Reporting to the state is the central part of this, on the basis that this country is full of corrupt people, black marketeers, black money holders, money launderers, terrorists, so if you refuse, and if you ask, “Why are you asking us to do this?”, it is only because you are corrupt and you want to protect your corruption. This categorization of a whole population as all of these things, unless they regularly prove themselves otherwise, has been one of the most ugly parts of this project; for the state to talk about its people like this has been really tragic. It’s also interesting because just prior to this project coming into being, if you take the previous ten years, 2000 to 2010, you’ll see that major scams came out, and all of those were collaborative scams between politicians, bureaucrats, and corporates. Then they will sit down and monitor whether we are good, clean, harmless people. Think about what that means. This is the project.

