Watching You Watch Me


By Varya Srivastava

The COVID-19 pandemic has been a double whammy for surveillance around the world. To begin with, the isolation of the pandemic and the call for social distancing have forced everything online. From work to social interaction, the internet has become the backbone enabling global collaboration, pandemic response and economic activity. It is the glue holding us together as we weather this storm. It lets us work from home, catch up with friends and family, stay updated on the evolving situation, order groceries and receive medical advice, among other things. And as we increasingly use the internet to maintain some degree of normalcy, governments and tech companies are actively using it to watch and track us.

A drastic change in 2020 vis-à-vis surveillance has been the increase in the scale of online engagement and in the sophistication of the analytics tools that track our usage patterns and digital footprints. This has opened multiple debates: Is surveillance ethical? Is privacy more important than public safety and health? Who has rights over our data?

While the answers to these questions are complex and subjective, one thing is clear: the choices we make in times of crisis will inform the post-COVID reality. Navigating surveillance in the times of COVID therefore requires realising and recognising three critical components of our surveillance reality: ubiquity, omnipresence, and stakeholders.

Surveillance is Ubiquitous

First, we tend to think of the surveillance architecture in terms of physical components: CCTV cameras, drones, facial recognition, GPS tracing, and so on. That is true, but it is not the complete truth. Surveillance is also what data these physical components collect and how they make sense of it. Biometric or location data means nothing in itself. The real power of surveillance comes from the underlying algorithms and the interaction of data collected across different physical components. For instance, using facial recognition for policing and law enforcement is not necessarily problematic in itself. It becomes problematic when the underlying algorithm and the data-collection process are compromised by existing human biases, and when those biases lead it to target citizens of a particular locality, community or identity. According to a 2019 federal study in the United States, Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Alternatively, it becomes problematic when data collected for purpose X is used for purpose Y without consent.

Modern surveillance architectures are ubiquitous and pervasive. They have the potential to track our every move, thought and detail through sophisticated tracking systems and analytics tools. Further, with the conception of Smart Cities, the state itself is taking a back seat. Smart city projects in India, for example, are designed in active collaboration with major tech companies. Because the government has limited technical expertise, the minute considerations of implementation fall to the private sector. The Indore and Chandigarh smart cities are cases in point.

Therefore, it is now more important than ever to use this potential in a cautious, accountable and ethical manner, and to recognise that behind the technical construction of a ubiquitous surveillance system is a team of humans with biases, subjective thoughts and interests, and room for unintentional error.

Our Past Will Become Omnipresent

We all cringe from time to time at our old Facebook posts and tweets, and are embarrassed by childhood stories and decisions we made growing up. The good thing is that we can delete that post or tweet from our timeline, or bury that memory at the back of our mind. But not for long. With a robust surveillance system, our past becomes omnipresent. It is no longer something we can grow out of or overcome. It is as alive and present as our ‘present’ reality.

A rather harmless example of this is the way we browse the internet or shop online. Websites and browsers save our histories, and based on our past actions they suggest websites to visit or items to buy. This does not have monumental consequences: at best it saves us some time, at worst it robs us of options we would have liked better. The omnipresence of our past becomes problematic, however, when it inhibits our ability to overcome it. This detrimental impact is most evident in the banking and lending system, which is increasingly adopting data-based lending practices. In doing so, lenders are not only limiting people's ability to overcome their socio-economic or cultural past, but also robbing them of a better future.

It’s Not Just About Governments

Surveillance is not just a government issue. Increasingly, surveillance capabilities are not in state hands at all. Major tech companies like Google, Huawei, IBM, Facebook and Amazon form the foundation of this ubiquitous and omnipresent surveillance architecture: they build the physical products and design the algorithms that run it. For them, the surveillance architecture provides rich user insights and patterns that they monetise through advertising and product development. To understand our surveillance reality, therefore, is to understand the role, capability and stake each public and private actor has in it. Solving this equation is important in determining accountability and liability, and in building trust.

Actualising Sous Surveillance

Parallel to the process of understanding the post-COVID surveillance reality is the important task of building capabilities to make data collection transparent and accountable, and to make its logic, intentions and algorithms ethical. This requires actualising sous surveillance: from the French sous ('under'), the term means 'surveillance from below'. The Republic of Estonia has had great success in this regard. After the 2007 cyberattacks, the Estonian government worked to make its digital infrastructure more secure and transparent. It did this by ensuring, first, that all citizens can see and control the data the state collects and, second, that the data is protected and government institutions are individually responsible for data processing and security. Another great example is the city of Barcelona. By creating a digital participatory programme, the city has crowdsourced the governance of its smart city and open-sourced city data via the DECODE Project. The DECODE Project is particularly remarkable, as it decentralises data governance and lets individuals choose who gets access to their data. In both Estonia and Barcelona, citizens control data flows and structures in transparent and accountable relationships with the government.
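The Estonian and Barcelona models share a mechanical core: a registry of consent that the citizen controls, plus an access log the citizen can audit. A minimal sketch of that idea follows; all names here are hypothetical illustrations, not the actual DECODE Project or Estonian X-Road API.

```python
# A toy "sous surveillance" consent registry: the citizen, not the
# institution, decides who may read each category of their data, and
# every access attempt (allowed or not) is logged for later audit.

class ConsentRegistry:
    def __init__(self):
        # (citizen, data_category) -> set of authorised institutions
        self._grants = {}
        # append-only audit trail the citizen can inspect
        self.access_log = []

    def grant(self, citizen, category, institution):
        self._grants.setdefault((citizen, category), set()).add(institution)

    def revoke(self, citizen, category, institution):
        self._grants.get((citizen, category), set()).discard(institution)

    def request_access(self, citizen, category, institution):
        allowed = institution in self._grants.get((citizen, category), set())
        self.access_log.append((citizen, category, institution, allowed))
        return allowed

registry = ConsentRegistry()
registry.grant("alice", "mobility", "city-transport-dept")

assert registry.request_access("alice", "mobility", "city-transport-dept")
assert not registry.request_access("alice", "mobility", "ad-broker")

# Consent is revocable, and the failed attempt still appears in the log.
registry.revoke("alice", "mobility", "city-transport-dept")
assert not registry.request_access("alice", "mobility", "city-transport-dept")
```

The design choice that matters is the audit trail: even denied requests are recorded, so the watched can see who tried to watch them.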

Without sous surveillance, we are creating a dystopian post-COVID reality of paranoia, behavioural alteration and curbed freedom. To prevent this, immediate checks are required on every instance of monitoring and data collection.

At this stage of development, dismantling the existing surveillance architecture is improbable. Instead, a sustained approach that involves civil society in building checks and balances on the surveillance apparatus is critical, as is building trust, accountability and authenticity into the processes of data collection. As individuals are being watched, they must be allowed to watch what the watcher is watching.
