How might the emerging trend of employers microchipping employees impact the workplace and can it ever be justified?
Ever since the 1990s, convicted criminals have been ‘tagged’ to ensure compliance with bail conditions and tagging continues to this day.
Is it not absurd that the Ministry of Justice is scaling back the tagging system at the very time when employers are increasingly considering going one step further and microchipping their employees?
We have a public sector-driven criminal tagging system that critics claim is lenient on offenders, while at the same time some in the private sector are promoting the microchipping of the wider employed population, subjecting them to increased monitoring.
Microchipping is not new. Valuable possessions, like works of art and furniture, are chipped for security and insurance purposes. If you own a pet it is likely that you will have had your cat or dog chipped – the benefit is that, if they are lost, they can be readily identified on a national database and safely returned to you.
But what about microchipping humans? Since the 1990s we have also been implanting microchips into humans, enabling the microchipped individual to do things like access their home and car, switch on lights and log in to their computer.
In recent years, matters have progressed and some employers are now suggesting that their employees should wear microchips, inserted superficially under the skin.
Surprisingly, the reason given by employers for this is not restricted to security or health and safety. The ‘opt-in’ is often offered on the basis that it will make life easier for the employee, as it saves time in carrying out various tasks such as: accessing the office without a pass, logging on to the employer’s IT system without passwords, operating copiers and buying food and drink in the workplace without the need for cash or a card.
Are they secure?
However, there are a number of serious issues with the practice of microchipping employees, including the obvious legal concerns about human rights, privacy and data protection.
Questions need to be asked, such as what data is being gathered by the employer, how is it being processed and stored, how long is it being kept and what exactly is it being used for? Even more importantly, what rights does the employee have to challenge that data and require it to be destroyed in the future?
Microchips can easily be read and cloned, so their security is not guaranteed. Employers therefore cannot say, with confidence, that the information collated by their employee’s microchip is secure.
In addition to these serious legal concerns, we must not forget that the process of microchipping a human is an invasive procedure and requires explicit written consent.
A risk too far?
Many argue that employees may not give true and informed consent when agreeing to be microchipped. Employees who refuse to be chipped may consider such a request unjustified and feel that it damages the implied duty of trust and confidence between employee and employer.
Certain veterinary and toxicological studies in animals suggest that there is a risk of cancer at the injection site, which would justify further investigation to establish whether there is any such risk in humans.
In certain sectors, identifying employees in the workplace using badges, tags or wristbands, together with passwords and security checks, is not sufficient; there, fingerprint, retina or facial recognition are known and accepted solutions.
It is therefore unclear why some employers have chosen such a controversial and invasive process in sectors that do not appear to have compelling reasons to microchip. In my view, the microchipping of employees will not become the norm in the UK.
Surveillance pros and cons
Let’s not forget, in this apparent dystopian nightmare of compulsory invasive surveillance by employers, we are currently all under constant surveillance, which does bring many personal and societal benefits.
Most of us carry, all day, every day, our mobile phone with the location service ‘on’, allowing location-based apps and websites to use information from GPS networks to determine our location.
One reason parents provide their school children with mobile phones is to enable them to track where they are. Our satnavs and tracking devices in our cars, whether personal or work-related, track our movements when driving, and in most public places, we can be tracked by CCTV.
Every time we use our debit/credit card we leave a ‘trail’, mapping where we have paid for goods, services, train or air travel.
Whilst we recognise the benefits of being under surveillance, let us never stop asking challenging questions to ensure innovative processes that keep us all connected and safe do not go one step too far.
It’s important to ensure that every process that subjects individuals to surveillance is reasonable, justified and goes no further than is necessary.
In other words, surveillance must be proportionate and carried out only in order to achieve a legitimate and lawful aim. In the workplace, this guidance is not merely persuasive; it is undeniable.