June 6, 2020, 7:04 p.m.

The Post-Covid March to Remote Worker Surveillance

DisAssemble

Image via Claudio Shwarz

I run a Philosophy and Ethics in Technology salon in London. Its members are individuals who are involved in many different fields, but all have a special interest in technology. Each month we tackle issues and questions relating to technology. This month we discussed the topic:

“Watching Your Workers: How Surveillance Technology Can Change Remote Working”

Some insightful themes and solutions emerged, which are worth sharing here.

A New Capacity for Spying

One of the things that is striking about changing the paradigm of work is that new ‘capacities’ emerge. Managers can now easily spy (I won’t use quotes for that word!) on employees using a variety of methods: tracking their typing, viewing their screens, or a plethora of others. This relates to an idea discussed by the philosopher Peter-Paul Verbeek: our relationship to our world changes not just through technological extension of existing abilities, but also because technology and society allow whole new behaviours and behaviour choices to appear. In this case, the opportunity to monitor employees.

In our salon we discussed how these new capacities, engendered by the mixture of new social dynamics and technology, allow digital surveillance to happen. Social dynamics of trust, transparency and habit have changed, and technology makes surveillance possible. The dynamics of how someone is monitored, shaped by the type of work they do (work that seems easily quantifiable in digital form) and the medium they use (i.e. a computer), are fundamentally different from how work operated before digital technology.

In this new dynamic, the question “can we?” becomes less relevant. The question becomes “should we?”. Often it is not questioned at all; it is simply done.

An unpalatable frontier

Even within the umbrella of “should we” comes the question of palatability. Is worker surveillance palatable to the employer and employee? It shouldn’t be surprising that, for a myriad of reasons, this is unpalatable to the employee, but the effect on the employer can be a questionable one as well.

In an article we discussed, an NYT employee installed time- and screen-tracking software and asked his manager to use it to monitor him. The manager did indeed do that, but began to feel ‘icky’. This is a major issue: new capacities for technological action on the part of managers appear, and managers themselves have to confront their own ethical boundaries.

A depersonalised human

It’s not just that it’s icky, however; it’s that it is far easier to depersonalise the human at the other end of the computer. Our salon discussed how these technologies make it far easier to compress employees into a set of quantitative metrics. Things like mouse movements and keyboard activity can be tracked. This sets a dangerous precedent, as we noted that these aren’t reflective of work output. Designing, for example, may involve sketching on paper or just thinking, perhaps away from the computer. Coding may involve a lot of reading, which may be perceived as inactivity. Moreover, because data is intrinsically reductive, it is easy to fool, and would likely be subject to abuse and corruption. We noted how easy it would be to create incentives to compete on these narrow metrics rather than on other, likely more qualitative outputs (team building, learning, etc.).

Indeed, one of the articles we read was about Taylorism, a managerial strategy created a century ago that worked by:

“breaking down tasks into inputs, outputs, processes and procedures that can be mathematically analysed and transformed into recipes for efficient production.”

Needless to say, this resulted in people being treated like machines, with employers carefully timing each action and squeezing efficiency out of people by making them complete mindless tasks.

We felt that this ‘depersonalised human’ is a distinct danger with surveillance tech. Interestingly, some of us mentioned that we already feel as though we are being depersonalised. Some of us mentioned calendars and standups being used for purposes they weren’t meant for, that is, as ‘evidence’ of productivity.

The new abnormal

And these existing, creeping forms of depersonalisation point to the problem of normalisation: the worry that if this behaviour becomes normalised, the ‘ickiness’ will dissipate. If these technologies are treated as something everyone uses, then people won’t feel as ‘icky’ using them, because this type of spying will come to feel natural. Indeed, treating people as digital objects or resources could begin to feel normal, just as economic theory and Taylorism made it feel in the non-digital world (e.g. ‘human resources’).

It follows that it’s only unnatural because it’s not currently how workers are treated online (at least, mostly), but there’s no reason why it couldn’t become the norm, especially with the precedent set by Taylorism.

We also discussed Jeremy Bentham’s panopticon, a prison design in which prisoners’ cells were arranged around a central hub of guards, so that the prisoners never knew if they were being watched. In a digital, globalised society we have become used to being watched. The philosopher Foucault used the panopticon as a metaphor for society: it wasn’t always normal to be observed, he noted; it was only through the nation state and the institutions supporting this behaviour that being watched became an expected state to be in.

Presidio Modelo, a panopticon

If surveillance tech is supported by institutions as remote work marches forward, then the abnormality of digital surveillance could become the norm.

A panopticon for the untrustworthy

A lack of trust from managers towards employees is certainly one way it could become normal. We discussed how employers are being pushed by some of these ‘time-tracking’ companies (such as Time Doctor), and even by the capitalist system at large, not to trust employees. We discussed how this makes little sense: people should be trusted, as they likely would not be at the company at all if they had no interest in contributing (at least for employees with employment mobility). Additionally, if someone isn’t working or contributing, it may be difficult to understand why, and surveillance may become the ‘go-to’ response to deeper issues. This is a psychological, organisational and societal issue, which can be challenging to parse.

The underclass always loses

But of course, poorer people and those deemed ‘unskilled’ are almost always trusted less. We discussed how individuals who have more to lose, or who are treated as more disposable, will be less likely to protest against these surveillance methods. They can also often be viewed by those in power as grifters. Indeed, as in many other areas of society, they stand to lose disproportionately under new systems of power as remote working spreads.

So, how can we ensure that these themes don’t come about? We discussed some solutions, below.

Aggregate, don’t individualise

We felt that any surveillance tech, if it must be used at all, shouldn’t be individualised in a way that allows people to be spied upon or surveilled without accounting for their qualitative output. Methods such as screenshots or keystroke tracking are not only rife with ethical issues but also ineffectual. Instead, tracking workers in aggregate to find key patterns is a more useful way to understand how people behave, and which tools, methods or contexts might improve workers’ lives and how they can contribute to an organisation.
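As a purely illustrative sketch of what “aggregate, don’t individualise” could mean in practice (the field names, the ‘active hours’ metric and the minimum group size of five are all assumptions of mine, not any real product’s API), per-person records might be collapsed into team-level summaries before anyone looks at them, with groups too small to be anonymous suppressed entirely:

from collections import defaultdict
from statistics import median

# Hypothetical example: aggregate activity data at the team level only.
MIN_GROUP_SIZE = 5  # suppress any group too small to keep individuals anonymous

def team_summary(events):
    """events: iterable of dicts like {"team": "design", "active_hours": 5.5}"""
    by_team = defaultdict(list)
    for e in events:
        by_team[e["team"]].append(e["active_hours"])

    summaries = {}
    for team, hours in by_team.items():
        if len(hours) < MIN_GROUP_SIZE:
            continue  # too few people: reporting this group could identify individuals
        summaries[team] = {
            "people": len(hours),
            "median_active_hours": median(hours),
        }
    return summaries

The point of a design like this is that no individual row ever reaches a manager’s dashboard; only patterns at the level of a team do.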

Champion workers’ rights

We thought ‘knowing your rights’ was a vital step in defending against this surveillance, yet workers’ rights regarding their work computers are sadly limited. In the UK, it is perfectly fine for employers to monitor employees’ emails and web history. The company just needs to tell employees (and in fact it doesn’t always have to: EPA guidance allows for covert surveillance of employees). The current provisions of the Data Protection Act are pathetically limited, with only ‘guidance’ suggested:

“If e-mails and/or internet access are, or are likely to be, monitored, consider, preferably using an impact assessment, whether the benefits justify the adverse impact.”

Given the issues discussed, and the onset of remote work, this is something that needs addressing.

Champion change

The only way these laws will change is by championing change. We discussed how just talking about it, whether in person or on the internet, is vital. Monitoring may come to seem like something that naturally goes hand in hand with remote work, and challenging this narrative will require a great deal of discussion at all levels. Even challenging the surveillance paradigm through metaphors can help. Noting that conversations in offices are not monitored, even though the company owns the building, is a useful analogous framing.

Perhaps more than that: work computers are certainly company property, but what occurs on them on platforms that aren’t related to work should not be. Computers are now far too intertwined in our lives; they are more like infrastructure. In Changing Things: The Future of Objects in a Digital World, the authors liken digital content to a sort of ‘fluid assemblage’, composed from a wide variety of technologies, systems and data into what the user sees on screen. How digital content is owned may need a conceptual change, and it is likely that only by actively and loudly championing change can we make this happen.

P.S. If you’re interested in joining our Philosophy and Ethics in Tech Salon, email me at vikramsinghbc at gmail dot com! All are welcome!
