Surveillance Tech Robotizes Employees, Eroding Trust and Well-Being

The costs to employee well-being may significantly diminish, and perhaps even negate, the financial gains surveillance seems to promise.

In some Chinese elementary schools, students wear attention-monitoring headbands that tell teachers, by lighting up in different colors, how focused each student is. A classroom robot also assesses their health and participation. Parents keep tabs on this real-time surveillance data, noting their child’s attention; if it dips too low, they’ll often punish their children at home. To parents, teachers, and school administrators, this all makes sense as a way to prepare children for their future in the workplace, where employees are emotionally surveilled through wearable wireless sensors that relay data to algorithms. These systems track emotional states, including anger and depression, so that managers can analyze the effectiveness of company policies and procedures. Power imbalances in China make brain surveillance effectively involuntary.

China is looking over its shoulder, not to see whether the rest of the world is judging, but to make sure it is still leading the way. By embracing monitoring, artificial intelligence, and precognition (predicting bad behavior before it occurs), it seeks better behavior and a competitive advantage. In a recent article, The Atlantic reports that Chinese President Xi Jinping is “using artificial intelligence to enhance his government’s totalitarian control—and he’s exporting this technology to regimes around the globe.”

To many American executives, these measures, or ones like them, seem compelling from an efficiency and profits perspective. That’s partly why a variety of surveillance technologies are rapidly becoming available to monitor employees in the workplace or at home. Remote work, which the pandemic has helped proliferate, may be subject to even more monitoring than in-office work, as employers seek to replace close proximity and “over the shoulder” management presence with digital monitoring.

Beyond cameras, recorded calls, and logged emails and chats, newer technologies are more invasive and ubiquitous than most people realize. As we watch the Chinese people dive into the deep end of surveillance, it’s worth asking: What impact might this mass experiment have on the American context? How far will companies go in watching and analyzing what their employees do, and at what cost? The costs to employee well-being may significantly diminish, and perhaps even negate, the financial gains surveillance seems to promise. Surveillance could also open up new avenues of discrimination.

Computer software offers the most visible and popular monitoring right now, particularly for remote work. Among the product names are ActivTrak, HiveDesk, Teramind, Time Doctor, WorkExaminer, and WorkTime, and they vary considerably in invasiveness, features, and visibility to the employee. These products, installed on work machines, track communications like email and chat, log keystrokes, monitor search history, take periodic screenshots, and record how long and how often employees use applications, among other things.


“Get an accurate picture of each employee’s performance and intent,” ActivTrak states, describing its product. “Make informed management decisions and eliminate uncertainty about suspected behavior.” It promises to help you manage remote workforces by helping you “monitor employee activity including working hours, engagement, and productivity behaviors of remote workers.” Commendable transparency, you might say. It would be difficult to claim any of this monitoring is hidden. ActivTrak also highlights many positive and generally accepted uses of the system, including opportunities for training, identifying who is overworked and when, and detecting the use of social media or entertainment. Other claims, such as identifying anomalous and high-risk behavior (fraud, sabotage, leaks, exfiltration, intellectual-property theft), enter more controversial territory.

The Orwellian vision of the future office, in which employees are tracked through phones, badges, implants, and retinal scanners, may have been mistaken. That sort of surveillance regime assumes that employees will carry, wear, or interact with particular devices, which is transparency of a sort. Other systems avoid this visibility entirely.

Behind gentle terms like Individual Behavior Recognition, you will find technologies that can sense, identify, and recognize without the need for anything worn or connected. AcousticID[2] may be able to use your gait (body shape, posture, walking movement) to identify you with 96.6 percent accuracy, using acoustic signals (outside the range of human hearing) and the Doppler effect. Many versions of a Wi-Fi-based system, including ones called BeSense[3] and FreeSense[4], can track movements and locations using the signals from existing devices. DeepMV[5] (MV = multi-view) leverages the fact that each individual is often in the presence of multiple device-free data sources (Wi-Fi, smart devices, Bluetooth beacons, acoustics); these can be combined with deep learning, a kind of machine learning that relies on certain artificial neural networks, to better recognize human activity. These technologies are already being developed to map how employees meet and work together[6] and to combat sedentary or unproductive behavior (e.g., CareFi[7]).
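To make the multi-view idea concrete, here is a minimal, purely illustrative sketch in Python (using PyTorch) of how features from two device-free “views,” say Wi-Fi channel measurements and ambient acoustics, might be fused by a small neural network to guess what a person is doing. The feature sizes, activity labels, and network layers are assumptions for demonstration only, not the architecture of DeepMV or of any product described above.

```python
# Illustrative only: a toy multi-view activity classifier. All inputs are synthetic,
# and the model is untrained; it shows the fusion pattern, not a real system.
import torch
import torch.nn as nn

ACTIVITIES = ["sitting", "walking", "typing"]  # hypothetical labels

class MultiViewNet(nn.Module):
    """One small encoder per sensing 'view'; outputs are concatenated and classified."""
    def __init__(self, wifi_dim: int = 64, acoustic_dim: int = 32):
        super().__init__()
        self.wifi_enc = nn.Sequential(nn.Linear(wifi_dim, 32), nn.ReLU())
        self.acoustic_enc = nn.Sequential(nn.Linear(acoustic_dim, 32), nn.ReLU())
        self.head = nn.Linear(64, len(ACTIVITIES))  # fused features -> activity scores

    def forward(self, wifi: torch.Tensor, acoustic: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.wifi_enc(wifi), self.acoustic_enc(acoustic)], dim=1)
        return self.head(fused)

# Random tensors standing in for real Wi-Fi channel features and acoustic features.
wifi_batch = torch.randn(8, 64)
acoustic_batch = torch.randn(8, 32)

model = MultiViewNet()
logits = model(wifi_batch, acoustic_batch)  # shape: (8, 3)
predictions = [ACTIVITIES[i] for i in logits.argmax(dim=1).tolist()]
print(predictions)  # meaningless without training; illustrates the output format only
```

In a real deployment, each encoder would be trained on labeled sensor recordings; the point is simply that no wearable device is required, only the ambient signals themselves.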

Soon your phone’s microphone will be able to listen to your computer keystrokes (at work or at home). This will help determine how you’re using your computer, alerting your boss that you are chatting or gaming when you should be, say, coding.[8] Of course, this seems open to misfires. Keystrokes could mean many things: socializing on work time, productive exchanges between colleagues, or notes on a revolutionary idea that suddenly came to mind.
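For intuition, here is a toy sketch, in Python, of the most basic ingredient behind acoustic keystroke sensing: short, sharp bursts of energy in a microphone recording stand out against a quiet baseline and can be counted. The sample rate, threshold, and synthetic audio are assumptions for illustration; the system in reference 8 relies on far more sophisticated signal processing and machine learning to recognize which operations are being performed.

```python
# Toy illustration, not the method from reference 8: count keystroke-like transients
# by flagging audio frames whose short-term energy spikes far above the median.
import numpy as np

SAMPLE_RATE = 16_000  # Hz (assumed)

def count_keystroke_like_events(audio: np.ndarray, frame_ms: int = 20,
                                threshold: float = 4.0) -> int:
    """Count frames whose energy exceeds `threshold` times the median frame energy."""
    frame_len = int(SAMPLE_RATE * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).mean(axis=1)
    baseline = np.median(energy) + 1e-12  # avoid division issues on silent clips
    return int(np.sum(energy > threshold * baseline))

# Synthetic one-second clip: quiet background noise plus three sharp "clicks".
rng = np.random.default_rng(0)
audio = 0.01 * rng.standard_normal(SAMPLE_RATE)
for start in (2_000, 6_000, 11_000):
    audio[start : start + 200] += 0.5 * rng.standard_normal(200)

print(count_keystroke_like_events(audio))  # a small count driven by the injected clicks
```

Even this crude detector shows why misfires are easy: it registers that typing happened, not what the typing was for.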

Just because a technology is available does not mean every company will use it. And just because monitoring feels invasive does not mean it does harm. So what is the evidence for the effect of surveillance and monitoring on the well-being of employees?

It can seem like workplace surveillance is, in a sense, psychologically neutral. You might think that, for workers, the fact that they’re being monitored could fade into their mental background, thereby not affecting their attitude or mood at work. But that looks to be an incorrect assumption. Monitoring at work can lead employees to experience stress, anxiety, depression, health complaints, anger, and fatigue, particularly if they suspect that their monitored activity will provide a pretext for negative repercussions.[9]

You might say, then, that being monitored is its own job demand, in the sense of the job demands-resources (JD-R) model. Researchers have, for instance, observed positive associations between monitoring and both stress and job demands.[10] This is particularly concerning when managers monitor employees in order to directly ratchet up job demands; that pairing may have synergistically harmful effects on employee well-being. It may explain why researchers have observed that electronic performance monitoring decreases mood only on difficult tasks.[11] There appears to be a beneficial association between limited, performance-focused monitoring and well-being, as well as a negative association between intense monitoring and well-being. “The increase in efforts to regulate . . . means that more cognitive resources will be devoted to the task at hand.”[12] This means that less intense, more performance-focused monitoring is likely a better combination than broader, more comprehensive monitoring.

As companies seek to surveil employees with little pushback, employers may directly discriminate against people who oppose these measures. Efforts have long been under way to make it possible to select more surveillance-compliant employee personalities.[13] Companies have even looked into structuring monitoring in a way that makes employees more willing to accept it.[14]

Algorithms that process monitoring data are likely to discriminate along gender and racial lines in ways that will be hard to identify and resist.[15] With respect to more overt monitoring, women are much less accepting of camera surveillance.[16] There may also be a gender difference in how performance monitoring is used among coworkers, with male employees using performance feedback for status seeking and competition with other men, to the exclusion of women.[17] While female employees have similar aspirations, company culture may encourage counterproductive uses of performance monitoring that tend to disadvantage them.

Intense monitoring was already well under way in China, but the coronavirus has given the Chinese government a ready pretext for accelerating its mass surveillance even further. The Guardian recently reported that many in China worry that surveillance deployed for the pandemic will outlast it:

“I don’t know what will happen when the epidemic is over. I don’t dare imagine it,” said Chen Weiyu, 23, who works in Shanghai. Every day when Chen goes to work, she has to submit a daily health check to her company, as well as scan a QR code and register in order to enter the office park. “Monitoring is already everywhere. The epidemic has just made that monitoring, which we don’t normally see during ordinary times, more obvious,” she said.

With all the potential harm of extensive monitoring, it may make more sense to pursue mutual organizational trust instead: relying on results-oriented performance indicators and giving top talent the freedom to flourish (and a reason to stay). A recent piece on the tech website VentureBeat underlined this well: “The constant AI-powered surveillance risks turning the human workforce into a robotic one.”

Brian Harward is a research scientist at Ethical Systems.

References

1. Nelson, J. S. (2019). Management culture and surveillance. Seattle UL Rev., 43, 631.

2. Xu, W., Yu, Z., Wang, Z., Guo, B., & Han, Q. (2019). AcousticID: Gait-based human identification using acoustic signal. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 3, 1-25.

3. Gu, Y., Zhang, X., Liu, Z., & Ren, F. (2019). BeSense: Leveraging WiFi channel data and computational intelligence for behavior analysis. IEEE Computational Intelligence Magazine, 14, 31-41.

4. Xin, T., Guo, B., Wang, Z., Wang, P., Lam, J. C. K., Li, V., & Yu, Z. (2018). FreeSense: A robust approach for indoor human detection using Wi-Fi signals. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2, 1-23.

5. Xue, H., Jiang, W., Miao, C., Ma, F., Wang, S., Yuan, Y., … & Su, L. (2020). DeepMV: Multi-view deep learning for device-free human activity recognition. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4, 1-26.

6. Li, Q., Gravina, R., Li, Y., Alsamhi, S. H., Sun, F., & Fortino, G. (2020). Multi-user activity recognition: Challenges and opportunities. Information Fusion.

7. Yang, J., Zou, H., Jiang, H., & Xie, L. (2018). CareFi: Sedentary behavior monitoring system via commodity WiFi infrastructures. IEEE Transactions on Vehicular Technology, 67(8), 7620-7629.

8. Yu, Z., Du, H., Xiao, D., Wang, Z., Han, Q., & Guo, B. (2018). Recognition of human computer operations based on keystroke sensing by smartphone microphone. IEEE Internet of Things Journal, 5, 1156-1168.

9. Day, A., Scott, N., & Kelloway, E. K. (2010). Information and communication technology: Implications for job stress and employee well-being. In New developments in theoretical and conceptual approaches to job stress. Emerald Group Publishing Limited.

10. Backhaus, N. (2019). Context sensitive technologies and electronic employee monitoring: A meta-analytic review. In 2019 IEEE/SICE International Symposium on System Integration (SII) (pp. 548-553). IEEE.

11. Davidson, R., & Henderson, R. (2000). Electronic performance monitoring: A laboratory investigation of the influence of monitoring and difficulty on task performance, mood state, and self-reported stress levels. Journal of Applied Social Psychology, 30, 906-920.

12. Holman, D., Chissick, C., & Totterdell, P. (2002). The effects of performance monitoring on emotional labor and well-being in call centers. Motivation and Emotion, 26, 57-81.

13. White, J. C., Ravid, D. M., & Behrend, T. S. (2020). Moderating effects of person and job characteristics on digital monitoring outcomes. Current Opinion in Psychology, 31, 55-60.

14. Chen, J. V., & Ross, W. H. (2007). Individual differences and electronic monitoring at work. Information, Community and Society, 10, 488-505.

15. Kellogg, K. C., Valentine, M. A., & Christin, A. (2020). Algorithms at work: The new contested terrain of control. Academy of Management Annals, 14, 366-410.

16. Stark, L., Stanhaus, A., & Anthony, D. L. (2020). “I don’t want someone to watch me while I’m working”: Gendered views of facial recognition technology in workplace surveillance. Journal of the Association for Information Science and Technology.

17. Payne, J. (2018). Manufacturing masculinity: Exploring gender and workplace surveillance. Work and Occupations, 45, 346–383. https://doi.org/10.1177/0730888418780969

Lead image: Stock-Asso / Shutterstock