Do you have ADHD voice?

Our voices can reveal so much about us, from clues about class and geographical origin to our mood. But it turns out they also carry complex cognitive patterns that could unlock new understanding of our minds.

Simon AI is a new AI-powered analysis system that picks up on these patterns in our voices to offer insights about neurodiversity and mental health. Developed after decades of research by tech company Psyrin, the technology uses 260 distinct voice biomarkers – from speech rhythms to tonal patterns – to identify neurodivergent traits such as ADHD and autism with up to 88 per cent accuracy. It’s already being used in US clinics to support mental health assessments for schizophrenia, bipolar disorder and depressive disorder, but Psyrin sees broader potential in preventative healthcare. Simon AI isn’t intended to replace traditional diagnosis; instead, it aims to give people data-driven insights about their minds in minutes rather than months.

I took the Simon assessment on my phone at home. The prompt showed me a detailed illustrative image, and I was asked to describe everything I noticed for 60 seconds. It was like recording a voicenote for a friend, describing what I saw and how the picture made me feel. Within a few seconds of hitting submit, Simon had compared my voice patterns to datasets from clinically diagnosed individuals, and I was given my result with access to a full voice report.

I am someone who has wondered my whole life if I am neurodivergent, especially now that social media feeds are constantly pathologising everyday behaviour (from having an inner monologue or procrastinating to sleep positions). So receiving an instant, research-backed assessment was both affirming and enlightening – though it turns out my results, from speech rhythm to volume, were very neurotypical. Following my 60-second assessment, I spoke with Psyrin’s co-founder and CTO Dr Julianna Olah about how voice analysis could transform our approach to mental wellbeing, and what our daily speech patterns reveal about us.

Hey Julie! What can we tell about our brains and our wellbeing from our voice?

Dr Julianna Olah: A lot of people are surprised by how much we can glean from studying voice, but speech is an incredibly complex cognitive process. Think about how you can instantly tell if someone who is ‘fine’ isn’t really fine over the phone. Clinical mental health evaluations already assess things like speech speed, tone and coherence, but with Simon we can measure 260 different elements of your voice. Machine learning picks up delicate and complex things that the human ear and brain can’t translate the same way. Simon uses voice to tell with a high degree of accuracy the likelihood of someone being neurodivergent, but voice changes can also be early indicators of conditions like Alzheimer’s or Parkinson’s, and you can tell whether someone is pregnant or not from their voice.

For neurodiversity and mental health conditions, how much speech do you need to make an accurate assessment? 

Dr Julianna Olah: With just one minute of speech, we achieve 72 per cent accuracy in matching clinical diagnoses. This increases to 83 per cent for ADHD and 88 per cent for autism with five-minute speech samples. However, it’s important to understand that Simon provides probabilities rather than a diagnosis. If you get a 70 per cent ADHD result, it means your voice profile matches 70 per cent of clinically diagnosed individuals in our dataset – so if we take 100 people who have been diagnosed with ADHD, 70 of them will have a similar voice profile to you. This is a tool for reflection, prompting self-exploration or further medical diagnosis if needed.

Often it’s said that women mask conditions like autism – does this masking also happen with speech?

Dr Julianna Olah: We specifically account for sex in our assessments because women are historically underdiagnosed in neurodevelopmental conditions. Women often mask their neurodivergent traits more effectively due to socialisation. AI models, if curated well, can be more equal and less biased than a psychiatrist. Humans are amazing at many things, but they’re bad at avoiding cultural and social influence.

For example, you’re more likely to get a borderline personality disorder diagnosis if you are a woman, whereas you’re more likely to be diagnosed with narcissism if you are a man. From a personal perspective, it’s also frustrating that if you’re a strong-minded woman or have deep interests, people automatically say you have autism. As a woman, not conforming to societal expectations can lead to this labelling, even though men are free to behave the same way without judgment. Simon helps to strip back that societal and cultural input and assess you purely on an objective metric.

What do you think about the current culture of self-diagnosis, driven by TikTok and wider social media trends? 

Dr Julianna Olah: We need to be thoughtful about how we use diagnostic labels. These terms were originally created as clinical tools to connect people with treatment, not as personality descriptors or moral judgments.

What concerns me is how these labels can be weaponised when taken out of their helpful context, and how some people get labelled without understanding what that means for them as a unique human being. Instead of saying ‘I have ADHD’, it might be more meaningful to express, ‘I want you to listen to me, even if I go off on tangents or speak quickly.’ You may have a label of ADHD, but you might find that that’s come from a really divergent mode of thinking where your vocabulary is very wide and you can switch and link topics quickly when you speak – you can consider that and think, this isn’t just ADHD, I’m actually a poet! It’s about understanding our unique patterns rather than collecting labels. We’re trying to shift the conversation from ‘what’s wrong with me?’ to ‘how does my mind work?’

What’s the end goal with this technology?

Dr Julianna Olah: We envision voice biomarkers becoming a standard tool in mental health assessments, from wellness spaces to clinical diagnosis. The key is making these insights accessible while maintaining user dignity and data privacy. I understand people can be sceptical about new technologies like AI, but it’s such a broad umbrella and it’s important to ask whether you’re afraid of technological exploitation or the technology itself. We never sell data to third parties and ensure users get valuable support in exchange for their participation in improving the technology. That should be the bare minimum in healthcare. Our goal is to use this incredible technology to ensure people are getting the support they need. We believe voice will play a crucial role in the future of prevention, early intervention and mental health support.
