CONCEPTUAL ANALYSIS article

Front. Hum. Dyn., 19 September 2023
Sec. Digital Impacts
This article is part of the Research Topic Socio-Legal, Ethical, Technical and Medical Considerations on Neuroprivacy and Brain-Machine Interaction Technologies in the era of A.I.

Neurosurveillance in the workplace: do employers have the right to monitor employees' minds?

  • 1Institute of Biomedical Ethics and History of Medicine, Digital Society Initiative, University of Zurich, Zurich, Switzerland
  • 2Institute of Biomedical Ethics and History of Medicine, University of Zurich, Zurich, Switzerland

The use of neurotechnologies for surveillance in the workplace has the potential to affect the entire working population of the world. Currently, with the help of neurodevices, employers could analyze brain data from employees to assess their cognitive functions (such as mental capacity and efficiency), cognitive patterns (such as response to stress), and even detect neuropathologies. Workers' brain data obtained with wearable neurodevices could serve employers for purposes such as promotion, hiring, or dismissal. Neurodevices could also be used as new micromanagement tools, aimed at monitoring employees' attention at work. Additionally, they can be implemented as tools for self-control for workers, as the feedback provided about their current cognitive state can help improve the outcomes of ongoing tasks and ensure safety. Recent studies have shown that while employees may recognize the potential benefits of using such technology for self-monitoring purposes, they have a negative perception of its implementation in the workplace. Surprisingly, only a few scientific papers specifically address the issues of neurosurveillance in the workplace, while international frameworks have not yet provided precise responses to these new intrusive methods of monitoring workers. The overall goal of this paper is to discuss whether employers should be allowed to use neurosurveillance technologies in the workplace to monitor their employees' minds and, if so, under what circumstances. The authors take a hypothetical scenario of neurosurveillance in the workplace using EEG-based devices as a starting point for their analysis. On this basis, three key ethical issues are identified: an increasing power imbalance in the employment relationship, a new threat to employees' privacy, and a risk of neurodiscrimination.

1. Introduction

Although surveillance at the workplace has existed for a long time, technological advances are providing employers with new and more sophisticated ways to monitor their employees' productivity and behavior at work. Scholars have raised concerns that widespread surveillance of employees goes far beyond what is reasonable and necessary (Ball, 2010). Video surveillance, movement tracking by means of company-provided smartphones or wearable devices, and the use of software to monitor employees' emails and internet activities, among other methods, give employers very detailed insights into employees' behavior and into how they spend their time at the workplace.

Additionally, the sudden shift to remote work during the COVID-19 pandemic contributed largely to the demand for and implementation of new digital tools for employee surveillance and has further blurred the lines between workers' professional and personal lives (Harwell, 2020).1 According to a recent survey of 2,000 employers, 78% of them reported using monitoring software to track employees' performance and/or online activity. The study also included 2,000 employees who work in a remote or hybrid capacity; 30% of them did not believe their employers were actively monitoring their online activities, and 15% of them did not even know that this was possible (EXPRESSVPN, 2021).

The reasons behind this surveillance trend may vary depending on the industry, the company culture, and the specific job responsibilities of employees. While some employers may use surveillance to ensure that their employees are performing their job, others may be motivated by security concerns. A recent important change in work management is the emergence of digital labor platforms, which digitally connect the demand for particular tasks within a certain geographical area with private persons who are ready to perform those tasks (Eurofound, 2021). In this case, surveillance and algorithmic management become an integral part of the business model (Vallas and Schor, 2020). At the same time, policymakers are struggling to keep up with the rapid pace of business solutions for non-standard work relationships (Aloisi and Gramano, 2019; Commission, 2021; Proposal for a Directive of the European Parliament of the Council on Improving Working Conditions in Platform Work, 2021).

The latest stage in workplace surveillance may involve the implementation of wearable neurotechnology devices to monitor the concentration level, the degree of alertness or fatigue, and the emotional states of employees at the workplace (Wexler and Reiner, 2019; Niso et al., 2023). These devices rely mainly on electroencephalography (EEG), a technology long established in medical practice. Despite being sensitive to noise, EEG remains the most accessible technology for recording and analyzing brain activity due to its non-invasive nature, portability, cost-effectiveness, relative simplicity, and exceptional temporal resolution of <1 millisecond (Gevins et al., 1999). It is now possible to identify neurocognitive states like mind wandering, effort withdrawal, and inattentional phenomena (Dehais et al., 2020). Moreover, with the help of neurodevices, the employer could gain access to the employees' mental workload (Maior et al., 2017).
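
To make the kind of analysis described above more concrete, the following minimal Python sketch illustrates how an attention-related index could, in principle, be derived from EEG band power. It is only an illustration under stated assumptions (a single channel sampled at 256 Hz, and the classic beta/(alpha + theta) "engagement" ratio used in the human factors literature); commercial devices rely on proprietary, more elaborate pipelines.

    # Illustrative sketch only: estimating an attention-related index from EEG band power.
    # Assumptions: one EEG channel, 256 Hz sampling rate, beta/(alpha + theta) ratio as a
    # crude proxy for engagement. Not the algorithm of any specific commercial device.
    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed sampling rate in Hz

    def band_power(signal, fs, band):
        """Average spectral power of `signal` within the frequency `band` (in Hz)."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
        mask = (freqs >= band[0]) & (freqs < band[1])
        return psd[mask].mean()

    def engagement_index(signal, fs=FS):
        """beta / (alpha + theta) ratio, often used as a rough attention proxy."""
        theta = band_power(signal, fs, (4, 8))
        alpha = band_power(signal, fs, (8, 13))
        beta = band_power(signal, fs, (13, 30))
        return beta / (alpha + theta)

    # Synthetic noise standing in for 10 seconds of one EEG channel.
    rng = np.random.default_rng(0)
    fake_eeg = rng.normal(size=FS * 10)
    print(f"Engagement index: {engagement_index(fake_eeg):.3f}")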

Brain monitoring devices can be easily utilized as new micromanagement tools aimed at monitoring employees' attention at work (Patel et al., 2022). Additionally, they can serve workers as tools for self-understanding and self-control, as the feedback provided about their current cognitive state can help improve the outcomes of ongoing tasks and safety (Wexler and Reiner, 2019).

In the consumer market there are various wireless brain-sensing devices that utilize EEG technology to measure and analyze brain activity and provide personalized feedback to users in order to improve productivity (Johnson, 2017). For example, the EEG-based helmet by SmartCap is designed to detect fatigue and alert workers in heavily machine-driven industries such as mining, aviation, and gas.2 Surprisingly, at least tens of thousands of workers worldwide are already using this device (Farahany, 2023).

Potentially, brain scanning devices at the workplace could assist people in understanding their focus and productivity, and help avoid accidents caused by lack of concentration and increased fatigue. These methods may help to enhance safety measures for high-risk professions that require a high level of concentration, such as long-distance truck drivers, airplane pilots, and medical emergency responders. Such systems can detect a decrease in attention level and alert the driver or operator, and even intervene if the individual is unable to regain focus, reducing the risk of accidents and improving overall safety.
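
As a purely hypothetical sketch of the alerting and intervention logic just described, the snippet below smooths incoming attention scores and escalates from a warning to an intervention when the score stays low for too long. The thresholds, window size, and class name are illustrative assumptions, not the API of any real device.

    # Hypothetical alerting logic: warn when a smoothed attention score drops below a
    # threshold, and escalate to an intervention if it stays low for several readings.
    from collections import deque

    class FatigueAlerter:
        def __init__(self, threshold=0.4, window=5, max_low_readings=3):
            self.threshold = threshold
            self.recent = deque(maxlen=window)      # rolling window of scores in [0, 1]
            self.max_low_readings = max_low_readings
            self.low_streak = 0

        def update(self, attention_score):
            self.recent.append(attention_score)
            smoothed = sum(self.recent) / len(self.recent)
            if smoothed < self.threshold:
                self.low_streak += 1
            else:
                self.low_streak = 0
            if self.low_streak >= self.max_low_readings:
                return "intervene"   # e.g., hand over control to an automated safety system
            if self.low_streak > 0:
                return "warn"        # e.g., audible alert to the driver or operator
            return "ok"

    # Example: feed in a stream of attention scores, one per second.
    alerter = FatigueAlerter()
    for score in [0.8, 0.7, 0.5, 0.35, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3]:
        print(alerter.update(score))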

Surprisingly, despite the existence of extensive research on the topic of workplace surveillance (Ball, 2010), only a few papers specifically address the issue of neurosurveillance at work from ethical and legal perspectives (Hopkins and Fiser, 2017; Martinez et al., 2022; Farahany, 2023). Moreover, a recent survey revealed that while individuals may see the potential benefits of using such technology for self-monitoring purposes, they have a negative perception of its implementation at the workplace (Midha et al., 2022). The reasons for this negative perception were concerns about privacy, data validity and misinterpretation, and personal identity (Midha et al., 2022).

Undoubtedly, the potential development of neurosurveillance tools raises serious concerns about the risk of excessively invasive forms of surveillance and privacy violation, which may grow largely unchecked due to the lack of legal protections against these practices (Aricò et al., 2020; Park et al., 2020). Given the power imbalance between employers and employees, it is clear that workers have limited possibility to refuse neurosurveillance without suffering some disadvantage, or even risking the loss of their jobs.

Therefore, if effective regulations are not adopted, the future world of work could be one in which it becomes normal for employers to require employees to use devices (like EEG-based headsets or similar devices) that collect their brain data during working hours. This may involve a serious violation of workers' privacy rights, and in particular of the so-called "right to mental privacy". Moreover, these practices may enable new forms of discrimination ("neurodiscrimination"). In addition, neurosurveillance procedures may cause constant stress for workers due to the permanent monitoring of their minds and the fear of losing their jobs because of inadequate mental performance or emotional problems. It may even happen that employers are tempted to sell their employees' brain data by making it accessible to third parties.

The overall goal of this paper is to consider whether employers should be allowed to use neurosurveillance technologies at the workplace, and, if yes, under what circumstances. To this purpose we will use a hypothetical scenario.

2. A hypothetical scenario

The following hypothetical scenario may help to illustrate the issues at stake if neurosurveillance at the workplace were implemented.

Mr. George Smith owns the Poseidon company, which specializes in producing luxury goods such as designer clothing, bags, and shoes. He is an enthusiast of the use of new technologies to improve productivity, and has just read a book about Taylorism, a system of factory management developed by Frederick W. Taylor in the late nineteenth century to identify the fastest way for workers to do their job. Taylor analyzed each individual movement of factory workers to determine which were essential and timed the workers with a stopwatch. The goal of the system was to eliminate all unnecessary movements so that the employees, following a machine-like routine, could significantly increase their efficiency.

Mr. Smith has learnt that some factories in China are already requiring workers to wear electroencephalography-based headsets to monitor their concentration level at work (Fuller, 2018). He has also read that, according to some reports, this measure led to a significant increase in the productivity of those companies (Chan, 2018). Mr. Smith thinks: "Why couldn't I do the same in my own factory? After all, my lawyer told me that there is no law specifically prohibiting employers from asking employees to wear neurodevices at work".

Based on these considerations, Mr. Smith decides to make it obligatory for his 120 employees to wear an EEG-based device in the shape of a helmet. The system will measure their brain activity and assess more precisely the level of concentration and performance of each employee. In the long run, the system is expected to help the factory manager make decisions about whether or not to continue individual employment relationships, depending on whether the workers are able to adapt to the increased level of concentration and efficiency that is expected from them.

These new rules spark a strong discussion among employees. Some are highly motivated to use the technology because they believe their work efficiency is higher than that of some of their colleagues, and this system would finally do them justice. However, others strongly oppose the new monitoring method, arguing that it would be "a step too far" in the employer's control over them and "contrary to their dignity". The debate takes place during a trade union meeting, in which all employees must vote for or against the new requirement. In the end, the results are evenly split and the vote is deadlocked.

Mr. Smith is informed of the heated debate at the trade union meeting and decides to take a step back from the proposed measure. Wearing the helmet will not be mandatory, but those employees who agree to use it will receive a 15 percent salary increase.

Two months after the implementation of this measure, Mr. Smith analyzes the data from the neurodevices and finds that those employees who have used the helmet have substantially improved their level of concentration at work and their productivity. To reward them, he decides that, in addition to the increased salary, they will have priority in choosing their work shifts and their holiday period. They will also have better prospects of continued employment and promotion within the company.

As soon as the other workers learn that the information from the neurodevices has been used for such management decisions, to the detriment of those who did not agree to wear the helmet, a protest breaks out. They file a complaint with the Labour Ministry alleging that their employer violates labour regulations and discriminates against them merely for refusing to allow him to assess their cognitive capacity and productivity based on their mental data.

This fictitious though not unrealistic scenario leads to the following fundamental questions: Should employers be permitted to monitor their employees' minds by means of neurodevices? If so, under what circumstances?

Before starting with the analysis of the case scenario, it is important to make three preliminary remarks:

a) The case scenario deals with the simplest technology available today to monitor mental states, namely EEG-based neurodevices. These enable the detection and interpretation of neural signals without the capacity to modify them. For example, the devices developed by Emotiv include, according to its advertisement, EEG buds for "workplace wellness, safety, and productivity" that "help to measure and analyze changes in your levels of stress and attention using EEG and Emotiv's proprietary machine learning algorithms" (EMOTIV, 2023). Emotiv also specifies that the device "cannot read thoughts or feelings, but it can provide easy-to-understand feedback on your level of stress and attention to empower you to make better choices" (EMOTIV).

 It must be mentioned, however, that the future steps in the use of neurotechnologies at the workplace remain uncertain. If, instead of using EEG-based devices, employers take advantage of more advanced technologies (such as fMRI or invasive brain-computer interfaces), new and more complex ethical and legal issues will emerge. Some of these advancements have made it possible, to some extent, to read minds (Tang et al., 2023). Moreover, new devices may not only monitor but also enhance the level of concentration by means of transcranial direct current stimulation (tDCS) (Karthikeyan et al., 2021). This technology may also help workers fight fatigue in safety-critical professions such as firefighting, nursing, and emergency medicine (Cofas, 2019).

b) Although the neurodevice records and analyzes brain activity, manufacturers are able to bypass specific medical device regulations (such as the EU Regulation on Medical Devices) [Regulation (EU) 2017/745 of the European Parliament of the Council, 2017] by defining the product's purpose of use as "lifestyle and wellbeing". As a result, consumer neurodevices need only comply with general safety requirements [Directive 2001/95/EC of the European Parliament of the Council of 3 December 2001 on General Product Safety, 2001]. The European Court of Justice (ECJ) has supported this strategy by proposing a narrow understanding of the notion of "medical device" [Brain Products GmbH v BioSemi VOF and Others, 2012]. According to the ECJ, "in situations in which a product is not conceived by its manufacturer to be used for medical purposes, its certification as a medical device cannot be required" (Brain Products GmbH v BioSemi VOF and Others, 2012). Consequently, non-invasive wearable EEG-based neurodevices intended to "read" brain activity could potentially be used at the workplace without requiring a conformity assessment as medical devices, in a similar way as other wearable devices and health apps that collect lifestyle data (Brassart Olsen, 2020). However, this is not the case for neurodevices intended for brain stimulation that penetrate the cranium to modify neuronal activity in the brain, as they are subject to regulation as medical devices (Commission Implementing Regulation (EU) 2022/2347 of 1 December 2022 laying down rules for the application of Regulation (EU) 2017/745 of the European Parliament and of the Council as regards reclassification of groups of certain active products without an intended medical purpose).

c) It is important to take into account the risk of dual use of this technology (Ienca and Ignatiadis, 2020). In our context, this means that neurotechnologies that were initially developed for medical or military purposes may spill over into daily life and the consumer market, leading to changes in ethical attitudes towards them and, consequently, to new legal practices (Marchant and Gulley, 2010; Shein, 2022). Consumer-oriented technologies are generally viewed more favorably when there is no power imbalance at stake, such as in everyday use. However, step by step, society could become more familiar with neurodevices and might trade private information for the daily comforts of technological progress (Zuboff, 2019). This highlights the importance of establishing, as early as possible, appropriate boundaries for the implementation of neurotechnologies in general, and in the workplace in particular.

3. Ethical issues with neurosurveillance at the workplace

As with digitalization in general, the countries that lead innovation are usually also the first to set the relevant norms for its implementation. In this regard, it is very likely that the USA and China, as the unsurpassed leaders in brain research investment, will largely influence the standards for the use of neurotechnologies outside the clinical context and will shape society accordingly (Kosal and Putney, 2023). That is why international regulations need to be examined in order to determine whether they are ready to respond to the new challenges posed by the use of neurodevices at the workplace.

The three ethical issues that need to be considered in this context are: (a) an increasing power imbalance in the employment relationship; (b) a new threat to employees' privacy; and (c) the risk of neurodiscrimination.

3.1. An increasing power imbalance in the employment relationship

Historically, the struggle for a fair balance in the relationship between employers and employees has been ongoing for centuries. During the Industrial Revolution, employers had an overwhelming advantage over employees. Workers had a very low level of protection against exploitation, and they were often subject to long working shifts, low wages, and inhuman working conditions. These abuses led in the early twentieth century to widespread social protests and labor strikes, which in turn prompted governments to take legal measures to protect workers' rights. Modern labor law is the result of this process.

However, neurotechnological developments and the possibility of monitoring workers' cognitive and emotional states open up a new chapter in the employment relationship, and may involve a step backward in the promotion of workers' rights. The prospect of neurosurveillance at the workplace creates a new condition in this relationship that was not negotiated before, since the employer becomes able not only to control productivity and behavior at work, but also to understand how exactly employees' brains react to their tasks. The employer could now gain insights into some mental patterns behind employees' work productivity and, for example, compare working results and productivity with emotional states and concentration levels throughout the day.

It might be argued that neurosurveillance at the workplace has the potential to increase productivity and safety in high-risk professions. In particular, some argue that neurotechnology could be effective and at the same time less intrusive than other monitoring tools, such as constant video recording at the workplace. In addition, transparent ethical rules and adequate data protection policies could mitigate this inconvenience (Farahany, 2023). More generally, it can be argued that it is perfectly reasonable that employers would like to monitor their employees' level of concentration, particularly in high-risk professions (such as workers using metal cutting machines) or when the life of a third party is at stake (such as high-speed train drivers).

On the other hand, it is also important to consider the potential negative consequences of such monitoring. First of all, this puts employees in a vulnerable position, as their cognitive and emotional states can be monitored and potentially used against them. Indeed, it is plausible that some employers would not resist the temptation of using this information for purposes beyond safety protection. Specifically, possessing such information could lead employers to use the data to compare and evaluate employees' performance, to make decisions related to their working conditions, and to decide about possible bonuses and promotions.

Employees are human, and it is unrealistic to expect them to maintain a consistent level of concentration throughout the workday. Factors outside of work, such as issues in their personal lives or changing hormone levels, can influence their mood, concentration, and work results. Moreover, such monitoring could also reveal sensitive states, such as mental health conditions or personal problems, that employees may not want to share with their employers.

For instance, there is no evidence to suggest that a driver with depression would necessarily pose a risk to passengers. On the contrary, it could be argued that work itself can provide a sense of purpose and help individuals cope with personal difficulties. Therefore, allowing new surveillance tools could harm employees who are recovering from personal struggles while trying to maintain an active working and social life.

As a result, the use of neurotechnologies at the workplace has the potential to negatively impact employees' self-determination. When individuals are aware that their brain activity is being monitored or analyzed, they may feel pressured to self-censor or modify their behavior to align with perceived expectations or norms (Farahany, 2023). Employees may become cautious about exploring unconventional ideas, or engaging in open discussions, if they fear that their neurological data could be used against them or negatively impact their professional evaluations. This phenomenon is commonly referred to as “self-censorship.”

Even the claim that neurosurveillance increases productivity is not obviously true. Research has shown that increased monitoring in the workplace can actually be counterproductive, as it may decrease productivity because employees risk being more focused on how to evade the system than on performing their tasks (Siegel et al., 2022). A culture of surveillance and control can create a toxic work environment where employees are under constant pressure to perform and meet certain standards. This, in turn, can lead to reduced motivation and job satisfaction, ultimately harming both the employee and the employer and destroying the fragile balance in their relations.

Surprisingly, not many international documents specifically address the issue of employee surveillance. The international document that serves as a foundation for privacy in labour relations and defines non-binding standards is the Code on Protection of Workers' Data (1996) (hereinafter, "the Code"), issued by the International Labour Organization (International Labour Office Meeting of Experts on Worker's, 1997). According to the Code, if workers are monitored, they should be informed in advance of the reasons for monitoring, the time schedule, the methods and techniques used, and the data to be collected (Article 5 of the Code). Furthermore, the employer must minimize the intrusion on the workers' privacy. Secret monitoring should only be permitted if it aligns with national legislation or if there are reasonable grounds to suspect criminal activity or serious wrongdoing by an employee. Continuous monitoring should only be allowed if it is necessary for ensuring health and safety or protecting property (Article 6.14 of the Code) (see more in Technical and Ethical Guidelines for Workers' Health Surveillance, 1998).

Article 12.2 of the Code emphasizes the obligation to inform and consult workers' representatives before introducing any electronic monitoring of workers' behavior in the workplace. This involvement of trade unions in the negotiation and implementation of neurosurveillance tools may help protect workers' interests. Remarkably, the current activity of the international global unions plays a very tangible role in pushing forward guiding principles for the protection of workers' rights. For example, a recent practice consists in the use of international framework agreements for implementing social and labor policy standards in transnational corporations and fostering a more equitable global labor environment (International Framework Agreement, European Observatory of Working Life, 2019).

Strategies already exist for future collective bargaining against extreme monitoring, constant surveillance, and real-time performance feedback on workers. For instance, in 2023 the UNI Global Union, a federation affiliated with unions in 150 countries, approved a Guide for Workers and Trade Unions on algorithmic management (Union, 2023). Importantly, many of the recommendations in that document are relevant to the context of neurosurveillance. To summarize some key points: (1) It is crucial to give workers and unions sufficient notice and trial periods when introducing new technology. (2) Workers have the right to understand the underlying logic of algorithms used in decision-making processes, and to appeal the results. (3) Occupational safety and health risk assessments should be conducted regularly to address the potential impact of surveillance technology on health and safety. (4) Monitoring should prioritize mutual trust, respect, job satisfaction, worker autonomy, and privacy. (5) Workplace monitoring results should not be used for disciplinary purposes, unless there is a severe violation of conduct, nor for individual productivity targets.

It must be stressed that although the ILO Code and the UNI Global Union Guidelines are not legally binding documents, their significance should not be underestimated, as they can serve as sources of inspiration for hard law legislation or soft law arrangements worldwide (Wallach, 2011).

The use of neurotechnologies at the workplace will probably soon become the subject of collective discussions between employers and employees. Trade unions should be proactive in preparing compelling arguments to safeguard workers from neurosurveillance methods.

3.2. A new threat to employees' privacy

While challenges to privacy can arise from the processing of any personal data, the processing of brain data raises specific ethical issues due to its direct connection to one's inner life and personhood. In addition, it should be stressed that neurotechnology has the potential to access not only conscious brain processes but also subconscious processing over which individuals have limited or no mindful control at all. Brain signals can be used to capture and predict health (neurological) status, individual preferences, attitudes, and behavior even without the person's awareness. As a result, brain data admit no clear separation between the processed data and the human brain (Ienca et al., 2018). Moreover, some experts consider brain scans to be comparable to unique fingerprints, as they provide a distinct depiction of an individual's brain, and therefore they can be regarded as biometric data (Finn et al., 2015). Unsurprisingly, scholars supporting the notion of "neurorights" have suggested the need to formally recognize a "right to mental privacy", which would aim to prevent the use of brain data by others without the individual's free informed consent (Ienca and Andorno, 2017). The need for specific provisions on the protection of private mind-related information (through mental privacy and neuroprivacy) seems to enjoy a high degree of acceptance and recognition among experts (Ienca, 2021).

The use of neurotechnology in the workplace, as outlined in our scenario, would allow an employer to have access to information about an employee that goes far beyond what is typically revealed through conventional means such as candidate interviews, employer assessments, and monitoring. With the assistance of neurodevices, employers could, at least to some extent, become aware of the employees' feelings, emotions, and mental workload. The prospect of employers having access to their employees' brain data would increase the vulnerability of the latter and undermine their power of self-determination. Since no one can truly control their own brain activity, this intrusive and paternalistic form of monitoring leaves employees with no escape or ability to hide. It also means that brain data could potentially be combined with other personal information about the worker, such as their gender, age, or health status, for management purposes. This combined data could then be used to create analytics that invade privacy, harm someone's reputation, or lead to discrimination. Employers could also make important decisions based on predictions or subjective opinions that may not be reliable or accurate. On the normative level, the right to privacy is formally recognized by the Universal Declaration of Human Rights (Art. 12) (Universal Declaration of Human Rights, 1948), the European Convention on Human Rights (hereafter "ECHR") (Art. 8) (European Convention on Human Rights, 1950), and by most domestic laws worldwide.

However, although privacy is a fundamental right, it is not an absolute right, as it can be subject to reasonable limitations in the common interest of society.3 This also applies to labor relationships, where the workers' right to privacy can be limited by some forms of monitoring at the workplace, as well as to serve legitimate interests of public safety and national security.

In our hypothetical scenario, with the deployment of neuromonitoring devices at the workplace, the employer aims to achieve various goals such as increasing safety, enhancing productivity, and optimizing the employment process. While employers may be interested in pursuing all these objectives, that legitimate interest needs to be balanced against the potential invasion of the employees' privacy.

In this regard, it is important to note that the right to respect for private life also applies to employees during working hours and in the workplace. The European Court of Human Rights (hereafter “ECtHR”, “the Court”) supported this view in the Case of Niemietz v. Germany (Application no. 13710/88) (1992). The judges determined that there is no fundamental reason why the concept of “private life” should exclude professional or business activities “given that most people have the opportunity to establish and develop relationships with others during the course of their work” [Case of Niemietz v. Germany (Application no. 13710/88), 1992].

Over the years, the Court has reviewed numerous disputes concerning data protection and the right to respect for the private life of employees. As surveillance methods and technologies have progressed over time, the Court has adapted its approach to ensure the continued protection of these rights under the European Convention on Human Rights (Guide to the Case-Law of the European Court of Human Rights, 2022). For instance, there are cases regarding the surveillance of non-professional phone calls from professional premises [Case of Halford v. The United Kingdom (Application no. 20605/92), 1997], GPS monitoring [Affaire Florindo De Almeida Vasconcelos Gramaxo C Portugal (Requête no 26968/16), 2023], the monitoring of an employee's work computer, including their personal emails [Resolution CM/ Res, 2010], the opening of files stored by an employee on a computer provided by his employer for work purposes [Case of Libert v. France (Application no. 588/13), 2018], and pictures taken via a video recording showing the conduct of an identified or identifiable employee at his workplace (López Ribalda Others v. Spain - 1874/13 8567/13, 2019).

While the ECtHR has not directly addressed the issue of neurosurveillance at the workplace, its jurisprudence has developed criteria that should be considered when assessing the legitimacy of an employer's surveillance methods. In this regard, the Case of Bǎrbulescu v. Romania (Application no. 61496/08) (2017) deserves special attention. In this case, the Court's Grand Chamber addressed the monitoring of an employee's personal Yahoo Messenger account, followed by the employee's dismissal. The Court recognized that "the employer has a legitimate interest in ensuring the smooth running of the company, and that this can be done by establishing mechanisms for checking that its employees are performing their professional duties adequately and with the necessary diligence". However, the Court found that the employer had failed to fulfill certain obligations. Specifically, in addition to considering the proportionality and subsidiarity of monitoring methods, the employer was required to inform employees in advance about the possibility of their communications being monitored and to provide information regarding the nature and extent of the monitoring. In this case, the Court defined six criteria for assessing the necessity of an intrusion into an employee's privacy:

1. Notification: whether the employees have been notified in advance about the possibility and nature of monitoring.

2. Extent of intrusion: whether the employees knew the degree of monitoring and intrusion into their privacy, including the monitoring of communication flow and content, the duration and scope of monitoring, and the number of people with access to the monitoring results.

3. Legitimate reasons: whether the employer has provided legitimate justifications for monitoring the employees' communications, with a higher justification required for monitoring the content of communications.

4. Proportionality: whether the aim pursued by the employer through monitoring could have been achieved using less intrusive methods.

5. Consequences: what consequences the monitoring had for the employee subjected to it, and whether the results were used by the employer to achieve the declared aim of the measure.

6. Adequate safeguards: whether the employer has provided adequate safeguards, particularly when the monitoring operations are intrusive, including ensuring that the employer cannot access the actual content of communications without prior notification to the employee.

Based on these criteria, if our hypothetical case were to be reviewed by the European Court of Human Rights, it is likely that the use of neurosurveillance tools for employee monitoring would be considered unlawful. The employer would likely fail to demonstrate compliance with the principles of proportionality (European Data Protection Supervisor, 2019), subsidiarity, and transparency. The principle of proportionality would require the employer to show that the use of neurodevices in the workplace is necessary and justified, considering the potential intrusion into employees' privacy. If less invasive methods were available to control and monitor employees' performance, such as performance evaluations, goal setting, and regular feedback, the use of neurosurveillance may be deemed disproportionate. The principle of subsidiarity would demand that the employer justify why traditional monitoring methods are insufficient to achieve their management objectives. Transparency relates to the need to provide accurate information about the monitoring and about the commitments and performance of the actors involved, while accountability refers to the imperative of holding institutions and actors responsible for those commitments and that performance. Neurosurveillance technology is still in its early stages, and there may be concerns about its accuracy; its use may result in incorrect conclusions about an employee's behavior or performance. The employer should therefore provide clear technical documentation to ensure the transparency of the technology and accountability for its use.

Thus, it appears from the case study that it is highly unlikely that the use of neurodevices in the workplace would be justified as a reasonable limitation of privacy if the purpose is just to monitor the worker's performance.

Could the workers' consent to the use of neuromonitoring devices give employers a valid legal basis for using them? The legal validity of the employee's consent to the use of neurodevices is problematic for at least the following reasons. Firstly, it raises the issue of the real voluntariness of the employee's consent. As employees are inherently dependent on their employers, their consent could have been given under coercion, that is, out of fear of job loss or of losing some additional financial incentive, as outlined in our scenario. In this regard, the workers' consent is per se not sufficient to justify the neuromonitoring. For example, it has been argued that "consent can only be an appropriate lawful basis if a data subject is offered control and is offered a genuine choice with regard to accepting or declining the terms offered or declining them without detriment" (Guidelines on Consent Under Regulation 2016/679, European Data Protection Board, 2016).

Secondly, consent should not only be free but also “informed”. It can only be valid if both parties are aware of the type of information that might be revealed through neurosurveillance. But the data obtained from an employee's brain signals may uncover unconscious or implicit information that the employee is unaware of and cannot control. Furthermore, this information may reveal sensitive health data, such as the predisposition to a serious mental illness.

Thirdly, while an employee may hypothetically provide consent for the use of their information for a specific purpose, such as monitoring their fatigue level for safety reasons, there is a possibility that the data could be intentionally or unintentionally used for other purposes (for instance, the comparative evaluation of workers' efficiency) without the employee's awareness or permission.

Additionally, as research advances, previously obtained data may be re-analyzed with greater precision in the future, without the data subject's permission and awareness. Such analysis could disclose the employee's cognitive and mental capabilities, unconscious biases, attitudes, and beliefs, and could have a substantial impact on their professional and private life. In other words, the risks associated with gathering and analyzing workers' brain data are significant, and consent by itself does not solve those issues.

In sum, granting employers access to workers' brain data raises concerns regarding the control and appropriate use of such data, whether it is used as a safety measure or as a tool for evaluating employees' performance and cognitive abilities. The specific functionality of an EEG device will depend on the quality of the data collected and the algorithms used for analysis. It is necessary to reach a consensus that certain privacy-sensitive statistical correlations must not be derived from brain data outside the clinical context for surveillance purposes. The employer should not use brain data to assess cognitive functions (such as capacity and efficiency), cognitive patterns (such as response to stress and sleep), or emotional responses, or to detect neuropathologies at an early stage.

More concretely, obtaining data about employees' cognitive abilities for purposes such as promotion, hiring, or dismissal should be prohibited, because these methods are intrusive and violate workers' dignity.

As a recent example of a regulatory initiative in this area within the EU, the European Parliament suggested an amendment to the draft EU AI Act that prohibits "the placing on the market, putting into service or use of AI systems to infer emotions of a natural person in the areas of law enforcement, border management, in workplace and education institutions" [Amendments adopted by the European Parliament on 14 June 2023 on the proposal for a regulation of the European Parliament and of the Council on laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts, COM (2021) 0206 – C9-0146/2021 – 2021/0106 (COD)].

3.3. Risk of neurodiscrimination

Although discrimination at work based on characteristics such as race, gender, age, religion, disability, or sexual orientation is very negatively perceived in most contemporary societies and is prohibited by law, the modern world is still struggling with this phenomenon. The protection mechanisms are relatively recent, as they have emerged only over the last 80 years, and they are constantly challenged because certain forms of discrimination are difficult to prove and investigate. In addition, emerging technologies, particularly those that are AI-based, bring new challenges to anti-discrimination efforts. For example, if an AI algorithm is trained on historical data that reflects discriminatory practices, it may continue to perpetuate those practices in the future. Additionally, the use of AI in hiring and recruitment processes can lead to discriminatory outcomes even if the algorithms are designed with diversity and inclusion in mind (Drage and Mackereth, 2022).

Our case scenario shows that the use of neurotechnologies further complicates the already difficult challenge posed by discrimination. Employers may use mental data from their employees to make decisions regarding hiring, firing, pay, job assignments, promotions, layoffs, training, fringe benefits, or any other term or condition of employment, leading to unequal treatment based on automated estimations of cognitive abilities. More concretely, employers could analyze statistics on fatigue levels and concentration, and select employees who are less prone to taking breaks during the workday and able to maintain a higher level of concentration throughout their shifts. While such practices may be deemed unfair, they may not necessarily be illegal. However, they can create barriers for job seekers who do not possess certain neurological characteristics, contributing to a lack of diversity and the perpetuation of existing inequalities in the workplace.

By allowing neurosurveillance at work, we could introduce new discriminatory criteria for employee selection. Someone could be dismissed only because their cognitive abilities do not match the employer's expectations or the average level of the group. Employers may give priority to employees who are less likely to make mistakes or experience stress or burnout, and take actions such as demotion or dismissal based on these predictions, whether consciously or unconsciously.

This new form of discrimination [so-called "neurodiscrimination" (Ienca et al., 2022)] could result in reduced opportunities for individuals who have lower cognitive abilities (memory, attention, or decision-making skills) or lower emotional stability. In addition, if a particular industry adopts the same neurodevice for employee selection, it could create barriers for job seekers who do not meet those characteristics and hinder their access to employment opportunities. This would result in a lack of diversity in the workplace and perpetuate existing inequalities.

One could argue that the competitive employment market already imposes high standards for certain jobs, and that employers can easily detect candidates who do not fit the job requirements through other criteria such as education level, test tasks, and multilevel interviews. However, dismissal based on neurotechnological screening mechanizes the hiring process and removes the human touch from the employer-employee relationship. This creates a world where everything is subject to a strict mathematical estimation of biological processes, ignoring the fact that the work environment is based on many factors, including the personalities of employees, their level of empathy, creativity, and other intangible qualities. Moreover, this could result in the biased and unfair treatment of employees who have lower cognitive or emotional profiles, but are equally capable of performing their job duties.

It is important to emphasize that lower levels of attention or cognitive abilities are not necessarily connected to health issues, mental illnesses, or genetic predispositions. They can be influenced by various factors, such as multitasking, insufficient sleep, or personal problems (Alhola and Polo-Kantola, 2007; Müller et al., 2021). However, in some cases, signs of decreased concentration could indicate neuroatypical characteristics. The term "neuroatypical" is often used interchangeably with "neurodivergent" and is commonly associated with individuals who have neurological conditions such as autism spectrum disorder, attention-deficit/hyperactivity disorder, or dyslexia (Bewley and George, 2016). While there is a growing body of research exploring the experiences of minority groups based on race, gender, and sexual orientation, there is relatively little work focusing on individuals who belong to the neurodiversity spectrum. Consequently, there is a lack of research investigating the factors that influence workplace experiences and outcomes for neurodiverse individuals (LeFevre-Levy et al., 2023). The use of neurodevices in the workplace could have even more disadvantageous effects on employees with neuroatypical characteristics, exacerbating the lack of diversity.

Another argument against the use of neurotechnology as a micromanagement tool to select candidates for a job is that it would be nearly impossible to prove that the employer used precisely this criterion in decision-making. Employers are already adept at avoiding traces of discriminatory decision-making. This means that, in practice, it would be very difficult for employees to defend themselves against neurodiscrimination and to prove that the decision to hire, fire, or promote them was based on discriminatory criteria such as the cognitive capacity measured by neurodevices.

Advocates of neurotechnology in the workplace may argue that individuals are free to decline neurotechnology screening during the job interview or the work process, and that as more people actively refuse this practice, the likelihood of it being involuntarily imposed on employees by their employers decreases. While this argument may be reasonable for some societies, it may not hold for others. For instance, in some low- and middle-income countries where individuals live in poverty and are highly dependent on their employers, the use of neurosurveillance technologies could be more readily accepted. This concern is also relevant for dictatorial regimes, where human rights frameworks are not properly implemented. In these cases, employees may find themselves coerced into accepting the requirement imposed by employers and unable to object to wearing such devices.

Surprisingly, it seems that human rights instruments do not explicitly include a specific ground to protect individuals from discrimination based on their neural or mental characteristics, which is referred to as “neurodiscrimination” (Ienca and Ignatiadis, 2020).

Nevertheless, the European Convention on Human Rights includes an open clause that allows the protection against discrimination based on characteristics or circumstances that are not explicitly listed as grounds of discrimination. According to Article 14, the Convention “guarantees the enjoyment of rights and freedoms without discrimination on any grounds, such as sex, race, color, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth, or any other status.”

In cases relating to discrimination, the ECtHR has shown that the term "any other status" includes many other situations that are not explicitly included in that list (Guide on Article 14 of the European Convention on Human Rights on Article 1 of Protocol No. 12 to the Convention Prohibition of Discrimination, 2023). The Court has also ruled that the human rights protection from discrimination includes both direct and indirect discrimination. Indirect discrimination may take the form of "disproportionately prejudicial effects of a general policy or measure which, though couched in neutral terms, has a particular discriminatory effect on a particular group" (Guide on Article 14 of the European Convention on Human Rights on Article 1 of Protocol No. 12 to the Convention Prohibition of Discrimination, 2023). This occurs when a rule that applies to everyone equally actually puts certain groups at a disadvantage. For example, in a recent case [Case of Negovanović Others v. Serbia (Applications nos. 29907/16), 2022], four blind chess players from Serbia claimed that they were discriminated against by being denied certain financial awards provided under the Sporting Achievements Recognition and Rewards Decree. The decree granted financial benefits to sighted chess players who won medals at the Chess Olympiad but excluded blind chess players who won medals at the Blind Chess Olympiad. The Court found that the differential treatment based on disability constituted discrimination. The blind chess players and sighted chess players were engaged in the same activity and had achieved similar international accolades, placing them in analogous situations. The Court determined that there was no objective and reasonable justification for the differential treatment, as the prestige and significance of the achievements should not depend on disability [Case of Negovanović Others v. Serbia (Applications nos. 29907/16), 2022]. Regarding our case scenario, whether the European Court of Human Rights would label the neurotechnological evaluation of employees' cognitive abilities as neurodiscrimination remains uncertain at this point in time. To date, cases specifically addressing neurosurveillance and neurodiscrimination have not been brought before the Court.

There is a critical need for extensive normative and legal research, as well as broader public discussion, to address the complexities of neurodiscrimination. These are important questions that require thorough examination and consideration. To adequately address this emerging issue, we would need to reach a consensus about the serious ethical and legal problem posed by the neuromonitoring of employees and the potential discriminatory implications of this practice. First of all, the selection of employees based on their neural or mental characteristics and cognitive capacities should be considered neurodiscrimination, and it should therefore be prohibited.

4. Conclusion: should employers be permitted to monitor their employees' minds?

Taking into account the fast development of neurotechnologies, it is still challenging to give a straightforward answer to the question of whether employers should be allowed to use neurotechnologies to monitor their employees' mental states. While acknowledging the potential benefits of using such devices in the interest of workers themselves, it is also crucial to anticipate their potential impact on workers' rights. As Nita Farahany has pointed out, the use of brain wearables in the workplace carries implications beyond safety, productivity, and employee stress, as it directly relates to the dignity of workers (Farahany, 2023). Since the implementation of neurotechnologies in the workplace is not yet widespread, it is not too late to correct the course of action in this area.

At the level of international labor rights, it is critical to reach a consensus on the permissible boundaries for utilizing neurotechnologies in the workplace and to define what qualifies as intrusive and unacceptable forms of neurosurveillance. While the mere goal of increasing productivity does not seem to justify this technology, the improvement of workplace safety conditions may, at least in some very specific cases (such as high-risk professions), provide sufficient grounds for it.

However, it is unlikely that companies will invest in neurotechnologies solely in order to increase employees' safety. Since the success of a business is typically measured in financial terms, there may be other profit-driven goals for introducing neurotechnologies in the workplace, such as cost optimization, facilitating micromanagement decisions, and increasing productivity. Therefore, at the legislative level, we need to anticipate employers' attempts to implement intrusive neurosurveillance methods. Having this potential development in mind, it is vital to ensure a fair interplay between the principles of proportionality and subsidiarity to guarantee that the use of neurodevices in the workplace is really necessary and justified.

Taking into account the principle of proportionality, we suggest that neurotechnologies should not be used just to increase the companies' profit, but primarily to contribute to the workers' safety and to the interest of society. In addition, the principle of subsidiarity requires employers to show that there are no other alternative safety measures of comparable effectiveness that are less intrusive on employees' mental privacy. Thus, both principles have to be considered at the time of assessing the justification for the use of neurosurveillance tools at the workplace.

Recent studies indicate that employees are more likely to use safety equipment regularly if they perceive it as beneficial for themselves (Siegel et al., 2022). So, it would be desirable to promote a culture of trust between all stakeholders and only implement neurodevices at the workplace as self-check tools for monitoring fatigue. Trade unions are called to play an active role in this process.

Additionally, as neurotechnology can be used to assess the cognitive abilities of employees, the primary risk is the impact on inclusivity and diversity within the labor relationship. Neurotechnology has the capacity to redefine societal perceptions of what constitutes a “normally gifted” individual and can challenge conventional understandings of a normal state of consciousness. Therefore, there is a critical need for extensive normative and legal research, as well as broader public discussion, to prevent the risk of neurodiscrimination. It is essential to establish legal standards providing that the evaluation of cognitive abilities by means of neurotechnologies in the workplace will be regarded as a discriminatory measure.

In sum, it is time to acknowledge the potential advantages of implementing neurotechnologies in the workplace and, at the same time, establish clear limits for safeguarding the workers' dignity and rights.

Author contributions

EM has prepared the first draft of the manuscript. RA substantially contributed to manuscript revision. All authors contributed to the article and approved the submitted version.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnotes

1. ^According to a New York Times estimate, eight of the ten largest private U.S. employers track the productivity metrics of individual workers, many in real time.

2. ^As SmartCap advertises, “where fatigue is a problem, SmartCap is the solution”. See: https://www.smartcaptech.com/life-smart-cap/ (accessed March 30, 2023).

3. ^Art. 8.2, European Convention on Human Rights (1950). Available online at: https://www.coe.int/en/web/human-rightsconvention/the-convention-in-1950 (accessed May 30, 2023).

References

SmartCap (2023). Where Fatigue Is a Problem, SmartCap Is the Solution. Available online at: https://www.smartcaptech.com/life-smart-cap/ (accessed March 30, 2023).

Case of Florindo De Almeida Vasconcelos Gramaxo v. Portugal (Application no. 26968/16) (2023). European Court of Human Rights. Available online at: https://hudoc.echr.coe.int/eng#%22itemid%22:%22001-221474%22 (accessed May 30, 2023).

Alhola, P., and Polo-Kantola, P. (2007). Sleep deprivation: impact on cognitive performance. Neuropsychiatr Dis Treat. 3, 553–567.

Aloisi, A., and Gramano, E. (2019). Artificial intelligence is watching you at work. Comp. Labor Law Policy J. 41, 95–121.

Aricò, P., Sciaraffa, N., and Babiloni, F. (2020). Brain–computer interfaces: toward a daily life employment. Brain Sci. 10, 157. doi: 10.3390/brainsci10030157

Ball, K. (2010). Workplace surveillance: an overview. Labor History 51, 87–106. doi: 10.1080/00236561003654776

Bewley, H., and George, A. (2016). Neurodiversity at Work. London: National Institute of Social and Economic Research.

Brain Products GmbH v BioSemi VOF and Others (2012). Judgment of the Court (Third Chamber). Case C-219/11. Available online at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A62011CJ0219 (accessed November 22, 2012).

Brassart Olsen, C. (2020). To track or not to track? Employees' data privacy in the age of corporate wellness, mobile health, and GDPR. Int. Data Priv. Law 10, 236–252. doi: 10.1093/idpl/ipaa004

Case of Bărbulescu v. Romania (Application no. 61496/08) (2017). European Court of Human Rights. Available online at: https://hudoc.echr.coe.int/fre#{%22itemid%22:[%22001-177082%22]} (accessed May 30, 2023).

Case of Halford v. The United Kingdom (Application no. 20605/92) (1997). European Court of Human Rights (Chamber). Available online at: https://hudoc.echr.coe.int/tur#{%22itemid%22:[%22001-58039%22]} (accessed May 30, 2023).

Case of Libert v. France (Application no. 588/13) (2018). European Court of Human Rights. Available online at: https://hudoc.echr.coe.int/rus/#{%22itemid%22:[%22001-181273%22]} (accessed May 30, 2023).

Case of Negovanović and Others v. Serbia (Applications nos. 29907/16) (2022). European Court of Human Rights. Available online at: https://hudoc.echr.coe.int/eng#{%22itemid%22:[%22001-215183%22]} (accessed May 30, 2023).

Case of Niemietz v. Germany (Application no. 13710/88) (1992). European Court of Human Rights (Chamber). Available online at: https://hudoc.echr.coe.int/rus#{%22itemid%22:[%22001-57887%22]} (accessed May 30, 2023).

Code on Protection of Workers' Data (1996). International Labour Organization. Available online at: https://www.ilo.org/wcmsp5/groups/public/—ed_protect/—protrav/—safework/documents/normativeinstrument/wcms_107797.pdf (accessed May 30, 2023).

Cofas, A. V. (2019). Energizing the Brain: Combating Worker Fatigue Using Wearable Neurotechnology. Texas A&M Today. Available online at: https://engineering.tamu.edu/news/2022/01/energizing-the-brain-combating-worker-fatigue-using-wearable-neurotechnology.html (accessed May 30, 2023).

European Commission (2021). Proposal for a Directive of the European Parliament and of the Council on Improving Working Conditions in Platform Work, COM/2021/762. Available online at: https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex:52021PC0762 (accessed March 30, 2023).

Dehais, F., Lafont, A., Roy, R., and Fairclough, S. (2020). A neuroergonomics approach to mental workload, engagement and human performance. Front. Neurosci. 14, 268. doi: 10.3389/fnins.2020.00268

Directive 2001/95/EC of the European Parliament and of the Council of 3 December 2001 on General Product Safety (2001). Text with EEA Relevance. Available online at: https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A32001L0095 (accessed December 3, 2001).

Drage, E., and Mackereth, K. (2022). Does AI debias recruitment? Race, gender, and AI'S “eradication of difference”. Philos. Technol. 35, 89. doi: 10.1007/s13347-022-00543-1

EMOTIV (2023). Enterprise Neurotechnology Solutions. Available online at: https://www.emotiv.com/workplace-wellness-safety-and-productivity-mn8/ (accessed May 30, 2023).

Eurofound (2021). Riso, S. Monitoring and Surveillance of Workers in the Digital Age. Available online at: https://www.eurofound.europa.eu/data/digitalisation/research-digests/monitoring-and-surveillance-of-workers-in-the-digital-age#s-01 (accessed December 16, 2021).

European Convention on Human Rights (1950).

European Data Protection Supervisor (2019). EDPS Guidelines on Assessing the Proportionality of Measures that Limit the Fundamental Rights to Privacy and to the Protection of Personal Data. Available online at: https://edps.europa.eu/data-protection/our-work/subjects/necessity-proportionality_en (accessed May 30, 2023).

EXPRESSVPN (2021). EXPRESSVPN Survey Reveals the Extent of Surveillance on the Remote Workforce. Available online at: https://www.expressvpn.com/blog/expressvpn-survey-surveillance-on-the-remote-workforce/#ethics (accessed May 30, 2023).

Farahany, N. A. (2023). The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. England: St. Martin's Press.

Finn, E. S., Shen, X., Scheinost, D., Rosenberg, M. D., Huang, J., and Chun, M. M. (2015). Functional connectome fingerprinting: identifying individuals using patterns of brain connectivity. Nat. Neurosci. 18, 1664–1671. doi: 10.1038/nn.4135

Francis Chan, T. (2018). China Is Monitoring Employees' Brain Waves and Emotions - and the Technology Boosted One Company's Profits by $315 Million. Business Insider. Available online at: https://www.businessinsider.com/china-emotional-surveillance-technology-2018-4?r=US&IR=T (accessed May 30, 2023).

Fullerton, J. (2018, April 30). Mind-reading tech being used to monitor Chinese workers' emotions. The Telegraph. Available online at: https://www.telegraph.co.uk/news/2018/04/30/mind-reading-tech-used-monitor-chinese-workers-emotions/ (accessed May 30, 2023).

Gevins, A., Smith, M., McEvoy, L., Leong, H., and Le, J. (1999). Electroencephalographic imaging of higher brain function. Philos. Trans. R. Soc. Lond. B Biol. Sci. 354, 1125–1133. doi: 10.1098/rstb.1999.0468

Guide on Article 14 of the European Convention on Human Rights and on Article 1 of Protocol No. 12 to the Convention: Prohibition of Discrimination (2023). European Court of Human Rights. Available online at: https://www.echr.coe.int/Documents/Guide_Art_14_Art_1_Protocol_12_ENG.pdf (accessed May 30, 2023).

Guide to the Case-Law of the European Court of Human Rights (2022). Available online at: https://www.echr.coe.int/Documents/Guide_Data_protection_ENG.pdf (accessed May 30, 2023).

Guidelines on Consent under Regulation 2016/679, European Data Protection Board (2016). Guidelines 05/2020 on Consent under Regulation 2016/679 Adopted by the European Data Protection Board (‘EDPB'). Available online at: https://ec.europa.eu/newsroom/article29/items/623051/en (accessed May 30, 2023).

Harwell, D. (2020, April 30). Managers turn to surveillance software, always-on webcams to ensure employees are (really) working from home. The Washington Post. Available online at: https://www.washingtonpost.com/technology/2020/04/30/work-from-home-surveillance/ (accessed January 31, 2023).

Hopkins, P. D., and Fiser, H. L. (2017). “This position requires some alteration of your brain”: on the moral and legal issues of using neurotechnology to modify employees. J. Bus. Ethics 144, 783–797. doi: 10.1007/s10551-016-3182-y

Ienca, M. (2021). On neurorights. Front. Hum. Neurosci. 15, 701258. doi: 10.3389/fnhum.2021.701258

Ienca, M., and Andorno, R. (2017). Towards new human rights in the age of neuroscience and neurotechnology. Life Sci. Soc. Policy 13, 1–27. doi: 10.1186/s40504-017-0050-1

Ienca, M., Fins, J. J., Jox, R. J., Jotterand, F., Voeneky, S., and Andorno, R. (2022). Towards a governance framework for brain data. Neuroethics 15, 20. doi: 10.1007/s12152-022-09498-8

Ienca, M., and Ignatiadis, K. (2020). Artificial intelligence in clinical neuroscience: methodological and ethical challenges. AJOB Neurosci. 11, 77–87. doi: 10.1080/21507740.2020.1740352

Ienca, M., Jotterand, F., and Elger, B. S. (2018). From healthcare to warfare and reverse: How should we regulate dual-use neurotechnology? Neuron 97, 269–274. doi: 10.1016/j.neuron.2017.12.017

International Framework Agreement, European Observatory of Working Life (2019). Available online at: https://www.eurofound.europa.eu/observatories/eurwork/industrial-relations-dictionary/international-framework-agreement (accessed May 30, 2023).

International Labour Office (1998). Technical and Ethical Guidelines for Workers' Health Surveillance (OSH No. 72). Geneva: International Labour Office. Available online at: https://www.ilo.org/wcmsp5/groups/public/@ed_protect/@protrav/@safework/documents/normativeinstrument/wcms_177384.pdf (accessed May 30, 2023).

Johnson, S. (2017). Brainwave Headsets are Making Their Way Into Classrooms—for Meditation and Discipline. EdSurge. Available online at: https://www.edsurge.com/news/2017-11-14-brainwave-headsets-are-making-their-way-into-classrooms-for-meditation-and-discipline (accessed November 14, 2017).

Karthikeyan, R., Smoot, M. R., and Mehta, R. K. (2021). Anodal tDCS augments and preserves working memory beyond time-on-task deficits. Sci. Rep. 11, 19134. doi: 10.1038/s41598-021-98636-y

Kosal, M., and Putney, J. (2023). Neurotechnology and international security: Predicting commercial and military adoption of brain-computer interfaces (BCIs) in the United States and China. Polit. Life Sci. 42, 81–103. doi: 10.1017/pls.2022.2

LeFevre-Levy, R., Melson-Silimon, A., Harmata, R., Hulett, A. L., and Carter, N. T. (2023). Neurodiversity in the workplace: Considering neuroatypicality as a form of diversity. Ind. Org. Psychol. 16, 1–19. doi: 10.1017/iop.2022.86

López Ribalda and Others v. Spain - 1874/13 and 8567/13 (2019). European Court of Human Rights. Available online at: https://hudoc.echr.coe.int/fre#{%22itemid%22:[%22002-12630%22]} (accessed May 30, 2023).

Maior, H., Wilson, M., and Sharples, S. (2017). Workload alerts—using physiological measures of mental workload to provide feedback during tasks. ACM Trans. Comput. Hum. Int. 1, 380. doi: 10.1145/3173380

Marchant, G., and Gulley, L. (2010). National security neuroscience and the reverse dual-use dilemma. AJOB Neurosci. 1, 20–22. doi: 10.1080/21507741003699348

Martinez, W., Benerradi, J., Midha, S., Maior, H. A., and Wilson, M. L. (2022). “Understanding the ethical concerns for neurotechnology in the future of work,” in 2022 Symposium on Human-Computer Interaction for Work (CHIWORK 2022) (New York, NY: Association for Computing Machinery), 1–19. doi: 10.1145/3533406.3533423

Midha, S., Wilson, M. L., and Sharples, S. (2022). “Ethical concerns and perceptions of consumer neurotechnology from lived experiences of mental workload tracking,” in 2022 ACM Conference on Fairness, Accountability, and Transparency (FAccT '22) (New York, NY: Association for Computing Machinery), 564–573.

Müller, S. M., Schiebener, J., Brand, M., and Liebherr, M. (2021). Decision-making, cognitive functions, impulsivity, and media multitasking expectancies in high versus low media multitaskers. Cognit. Proc. 22, 593–607. doi: 10.1007/s10339-021-01029-2

Niso, G., Romero, E., Moreau, J. T., Araujo, A., and Krol, L. R. (2023). Wireless EEG: a survey of systems and studies. Neuroimage 269, 119774. doi: 10.1016/j.neuroimage.2022.119774

Park, S., Han, C.-H., and Im, C.-H. (2020). Design of wearable EEG devices specialized for passive brain–computer interface applications. Sensors 20, 4572. doi: 10.3390/s20164572

Patel, V., Chesmore, A., Legner, C. M., and Pandey, S. (2022). Trends in workplace wearable technologies and connected-worker solutions for next-generation occupational safety, health, and productivity. Adv. Int. Syst. 4, 2100099. doi: 10.1002/aisy.202100099

Proposal for a Directive of the European Parliament and of the Council on Improving Working Conditions in Platform Work, COM/2021/762 (2021). Available online at: https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex:52021PC0762 (accessed December 9, 2021).

Regulation (EU) 2017/745 of the European Parliament and of the Council (2017). Medical Devices, Amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and Repealing Council Directives 90/385/EEC and 93/42/EEC (Text with EEA Relevance). Available online at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32017R0745 (accessed April 5, 2017).

Resolution CM/ResDH(2010)79 (2010). Execution of the Judgment of the European Court of Human Rights Copland against the United Kingdom (Application No. 62617/00, judgment of 3 April 2007, final on 3 July 2007). European Court of Human Rights. Available online at: https://hudoc.echr.coe.int/fre#%22itemid%22:[%22001-99457%22] (accessed April 3, 2007).

Shein, E. (2022). Neurotechnology and the law. Commun. ACM 65, 16–18. doi: 10.1145/3542816

Siegel, R., König, C. J., and Lazar, V. (2022). The impact of electronic monitoring on employees' job satisfaction, stress, performance, and counterproductive work behavior: a meta-analysis. Comput. Hum. Behav. Rep. 8, 100227. doi: 10.1016/j.chbr.2022.100227

Tang, J., LeBel, A., Jain, S., and Huth, A. G. (2023). Semantic reconstruction of continuous language from non-invasive brain recordings. Nat. Neurosci. 12, 1–9. doi: 10.1038/s41593-023-01304-9

Technical and Ethical Guidelines for Workers' Health Surveillance (1998).

UNI Global Union (2023). Algorithmic Management: Opportunities for Collective Action. A Guide for Workers and Trade Unions. Available online at: https://uniglobalunion.org/news/report-unions-repsonse-to-algorithmic-management/ (accessed May 30, 2023).

Universal Declaration of Human Rights, United Nations (1948). Available online at: https://www.un.org/en/about-us/universal-declaration-of-human-rights

Vallas, S., and Schor, J. (2020). What do platforms do? Understanding the gig economy. Ann. Rev. Sociol. 46, 857. doi: 10.1146/annurev-soc-121919-054857

Wallach, S. (2011). The Medusa Stare: surveillance and monitoring of employees and the right to privacy. Int. J. Comp. Lab. Law Ind. Relat. 27, 2. doi: 10.54648/IJCL2011013

Wexler, A., and Reiner, P. (2019). Oversight of direct-to-consumer neurotechnologies. Science 363, 234–235. doi: 10.1126/science.aav0223

Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York, NY: Public Affairs.

Keywords: neurosurveillance, neurorights, labor rights, neurotechnology and brain-machine interface, neuromonitoring, neurodiscrimination

Citation: Muhl E and Andorno R (2023) Neurosurveillance in the workplace: do employers have the right to monitor employees' minds? Front. Hum. Dyn. 5:1245619. doi: 10.3389/fhumd.2023.1245619

Received: 23 June 2023; Accepted: 01 September 2023;
Published: 19 September 2023.

Edited by:

Gianluigi M. Riva, Bocconi University, Italy

Reviewed by:

Marta Sosa, University of Milano-Bicocca, Italy
Giulia Pesci, Università degli Studi di Milano, Italy

Copyright © 2023 Muhl and Andorno. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Ekaterina Muhl, ekaterina.tikhomirova@ibme.uzh.ch
