A workplace wellbeing app might seem like a simple and helpful tool – a mood check-in, some stress management advice, or a chatbot asking how your week has gone. But behind that supportive language, some systems are also quietly analysing your voice, writing style and digital behaviour for signs of psychological distress.

These tools are already on the market – aimed at workplaces, universities and healthcare. They are framed as early-intervention systems that promise to cut costs and identify problems before they become serious. Unfortunately, companies are under no obligation to report using them, so data about how widespread they are is lacking.

The basic idea behind these tools is that behaviour leaves patterns. Artificial intelligence (AI) systems trained on large datasets learn to recognise signals associated with particular mental health conditions, and when similar signals appear in new data, the system produces a probability estimate.

For many people, the surprising part is how much ordinary behaviour can reveal. Voice recordings can pick up changes in rhythm, pitch and hesitation. Language models can analyse word choice and emotional tone. Smartphone data has also been explored as a way of tracking changes in sleep, movement and social interaction – all without the person doing anything out of the ordinary.

But detecting a statistical signal is very different from identifying a genuine problem. Human behaviour is deeply contextual. Someone may speak slowly because they are tired, nervous or communicating in a second language. Reduced online activity might simply reflect a busy week.

Even well-designed systems will make mistakes. A person who is genuinely struggling may not show the behavioural patterns the system was trained to recognise, while someone else may be incorrectly flagged as being in distress.

The pressure to develop these tools is real.
The World Health Organization estimates that depression and anxiety cost the global economy US$1 trillion (£800 billion) a year in lost productivity. Universities report rising demand for counselling, and employers are dealing with burnout and stress-related absence. Automated early-warning systems can seem like an attractive answer.

When wellbeing becomes surveillance

But this technology can change something fundamental about how mental health is understood. Traditionally, mental health is assessed through conversations between a person and a therapist, where context matters enormously. These systems work differently, inferring psychological states from behavioural traces that were never intended to communicate emotional information.

Once those inferences are made, they can influence decisions well beyond healthcare. Assessments of someone’s emotional state could shape workplace programmes, student support systems or insurance models, affecting how institutions judge a person’s reliability or suitability for a role. In effect, psychological states become a new kind of data.

There are particular risks for some groups. Neurodivergent people often communicate in ways that differ from the norms assumed by many datasets. Someone speaking in a second language may pause more frequently, producing speech patterns an algorithm could misinterpret. A person going through grief or illness may display signals that resemble those associated with mental health conditions – without actually having one.

Used carefully by healthcare professionals, these tools could have genuine value – helping therapists spot early warning signs of deteriorating mental health.
But the same capability looks very different when deployed across a workplace or university without people’s knowledge.

At a minimum, people should know when these tools are being used, what data is being analysed and whether the system has been independently tested. A claim that software can detect distress is not, on its own, enough.

Mohammad Hossein Amirhosseini does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.