Global HR Lawyers

Mind-reading tech in the workplace – a good thing, or the end of the world as we know it?

03 August 2023

On 8 June the ICO released its first report on neurotechnology, a fast-emerging technology that records and processes data taken directly from an individual’s brain and nervous system (“neurodata”).


With the recent 2023 zeitgeist around artificial intelligence and the range of data privacy issues that can arise from its use, the ICO is certainly being kept on its toes as more novel data processing technologies are created.

In its report, the Commissioner discusses the neurotechnology currently available on the market (both invasive, such as deep brain stimulation devices, and non-invasive, such as headbands that read and interpret signals given off by the brain), the regulatory issues it raises, and the sectors in which neurotech is currently used or expected to be used.

While the medical sector is currently the major player in this space, with treatment of nervous system disorders being the most obvious and possibly most impactful use of the technology, the ICO also considers other areas in which it envisages this futuristic processing being rolled out. One such area is the workplace, from using neurotech for recruitment purposes to safety and productivity monitoring. Unsurprisingly, the ICO does not see this as risk-free, and there are several issues that organisations will need to keep in mind when considering, developing and implementing the use of neurotech in these situations.

Similar to AI, there is a potential for discrimination where the models the technology is built on themselves contain bias, which can produce inaccurate data or assumptions about people. A particular issue arises around neurodivergent people, whose brains and nervous systems may exhibit patterns that differ from those recorded from neurotypical individuals. If a neurotech system flags a person’s neurodata as undesirable or negative, that person could miss out on a job opportunity, be passed over for promotion, and so on, solely because of the system’s ingrained bias, which is clearly an area of concern. It is therefore important that neurotech systems are “trained” using data covering as wide a range of patterns as possible in order to mitigate this risk of ingrained bias, something that organisations looking to gain access to these systems would be well-advised to keep in mind.
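
By way of illustration only, the short Python sketch below shows the kind of representation check a developer or deploying organisation might run over a neurotech model’s training data before relying on its outputs. The dataset, the “neurotype” grouping and the “passed_screening” outcome column are hypothetical assumptions for the purpose of the example and are not drawn from the ICO report or any real product.

    # Hypothetical sketch: check whether a screening model's training data, and
    # its outcomes, are balanced across neurotype groups. Column names are
    # illustrative assumptions only.
    import pandas as pd

    def representation_report(df: pd.DataFrame,
                              group_col: str = "neurotype",
                              outcome_col: str = "passed_screening") -> pd.DataFrame:
        """Summarise how well each group is represented and how it fares."""
        summary = df.groupby(group_col)[outcome_col].agg(
            count="count",     # training examples per group
            pass_rate="mean",  # proportion with a positive outcome
        )
        summary["share_of_data"] = summary["count"] / len(df)
        return summary

    # Example usage with made-up data
    df = pd.DataFrame({
        "neurotype": ["neurotypical"] * 90 + ["neurodivergent"] * 10,
        "passed_screening": [1] * 70 + [0] * 20 + [1] * 3 + [0] * 7,
    })
    print(representation_report(df))
    # A large gap in share_of_data or pass_rate between groups is a prompt to
    # gather more representative training data or to question the model's outputs.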

Another question for those processing neurodata is how to identify what the appropriate legal basis for processing would be. There are calls for explicit consent to be the sole legal basis available, but the innate nature of neurodata makes this problematic: individuals will have limited or no control over what data they are consenting to the processing of, as their neurodata output will be subconsciously generated. This makes consent as a basis difficult, and potentially impossible, to justify. In a workplace context there is also the perpetual difficulty that the power imbalance between employers and potential or current employees poses, which generally invalidates consent as a basis.

Organisations considering putting neurotech in place will have to think carefully about how they comply with the principle of data minimisation, i.e. processing only that data which is directly relevant and necessary to accomplish a specified purpose. Neurodata is generated subconsciously, and therefore the data that is collected may not be entirely relevant to the purpose it is being used for. How will organisations recognise what is and isn’t relevant without processing and analysing the data that is collected? This may be a question more for the tech whizzes developing this technology; however, businesses would be well-advised to ensure parameters can be put in place so that only the data that is required is collected and processed.
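
Again purely by way of illustration, the sketch below shows one way such parameters might look in practice: a simple allow-list that keeps only the neurodata fields mapped to a declared purpose and discards everything else before storage. The field names and purposes are hypothetical assumptions, not features of any real system or ICO requirement.

    # Hypothetical data-minimisation filter: only fields mapped to a declared
    # purpose are retained; everything else the device emits is discarded
    # before storage. All names are illustrative only.
    ALLOWED_FIELDS = {
        "fatigue_monitoring": {"alertness_score", "timestamp"},
        "safety_alerts": {"alertness_score", "attention_lapse_flag", "timestamp"},
    }

    def minimise(raw_reading: dict, purpose: str) -> dict:
        """Return only the fields required for the stated purpose."""
        allowed = ALLOWED_FIELDS.get(purpose)
        if allowed is None:
            raise ValueError(f"No configured purpose: {purpose}")
        return {k: v for k, v in raw_reading.items() if k in allowed}

    # Example: the device emits far more than the purpose requires
    raw = {
        "alertness_score": 0.62,
        "attention_lapse_flag": False,
        "raw_eeg_channels": [0.1, 0.4, 0.3],  # not needed, so never stored
        "emotional_valence": 0.2,             # not needed, so never stored
        "timestamp": "2023-08-03T09:00:00Z",
    }
    print(minimise(raw, "fatigue_monitoring"))
    # {'alertness_score': 0.62, 'timestamp': '2023-08-03T09:00:00Z'}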

Addressing data subject rights is also a question yet to be answered. The complexity of neurodata, and questions about how it is or will be presented, will certainly pose issues when it comes to subject rights: for example, how will neurodata be dealt with following a subject access request (an area of ongoing focus for the ICO, see our latest article on employer SARs here)? The hope is that the ICO will address these questions in its planned formal guidance on neurotechnology. However, this sits on a very long to-do list for the Commissioner, with international transfers taking centre stage recently (see our article on the recent passing of the Data Privacy Framework here) and AI also contributing to the Commissioner’s workload (see here for our interactive map of AI regulation around the world). The recent emergence, rapid development and complex regulatory questions surrounding neurotech mean that specific guidance may not be forthcoming particularly soon!

