
Apple's Commitment to Mental Health?

At Apple’s 2023 Worldwide Developers Conference (WWDC), the tech giant unveiled many exciting new projects, including its first major new product in nearly a decade, the Apple Vision Pro. This mixed-reality headset has been the main talking point for many, but alongside the futuristic hardware, Apple also announced a new range of physical and mental health tools that will soon be available to all iPhone and Apple Watch users.


As a marketing aficionado, I have always been captivated by Apple’s marketing strategies: from selling a lifestyle, to product placement, to rethinking the need for advertising altogether, all of which have allowed the company to enjoy unparalleled revenue growth. It is no surprise that Apple consistently ranks within the top four of the Fortune 500, year after year.


The new updates to the built-in ‘Health App’ “will allow users to log their daily emotions and moods, see valuable insights and easily access assessments and resources” by letting users continuously reflect on their current state of mind. The feature presents a sliding scale of emotions, ranging from very unpleasant to very pleasant, with each feeling accompanied by a different colour. From there, users can identify what may be contributing to their feelings, spotting trends and patterns, whether these stem from associations like family, or from lifestyle factors such as sleep or exercise.
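To make the mechanics concrete, here is a minimal Swift sketch of what such a mood log might look like under the hood. This is a hypothetical data model of my own, not Apple’s actual HealthKit API; the type and property names are invented purely for illustration.

import Foundation

// Hypothetical model of a mood log entry -- an illustration of the kind
// of data the Health App feature described above might capture. This is
// NOT Apple's actual API; all names here are invented.
struct MoodLogEntry {
    // Sliding scale from very unpleasant (-1.0) to very pleasant (+1.0).
    let valence: Double

    // Possible contributing factors the user can tag.
    enum Association: String, CaseIterable {
        case family, work, sleep, exercise, health, relationships
    }
    let associations: [Association]
    let date: Date
}

// Example: logging a slightly unpleasant mood attributed to poor sleep.
let entry = MoodLogEntry(valence: -0.4, associations: [.sleep], date: .now)

// A trivial "insight": average valence per association across entries,
// mirroring the trend and pattern identification the feature promises.
func averageValence(for association: MoodLogEntry.Association,
                    in entries: [MoodLogEntry]) -> Double? {
    let matching = entries.filter { $0.associations.contains(association) }
    guard !matching.isEmpty else { return nil }
    return matching.map(\.valence).reduce(0, +) / Double(matching.count)
}

Even this toy version shows how quickly subjective tags become “insights”: the averages are only as meaningful as the self-reported valence behind them.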



Apple proudly demonstrated how this tool will give users the ability to reflect on their mental state and “help build emotional awareness”. Additionally, Apple claims that the Health App will enable users to take the same depression and anxiety assessments used in clinics, which will supposedly help them “determine their risk levels, connect to resources available in their region, and create PDFs shareable with their doctor”. Sumbul Desai, M.D., Apple’s vice president of Health, said that this new software feature was created to “empower people to take charge of their own health journey”.


“Mental health is important, but often overlooked, and we’re excited to introduce features that offer valuable new insights to provide users with an even better understanding of their health,” Desai added. “These insights help support users in their daily decisions and offer more informed conversations with their doctors.”


Yet I have several concerns about this new feature. Not only is the tool based solely on subjective emotional appraisals, but how securely is this data really stored?


According to a study reported by Forbes, roughly 2.5 quintillion bytes of new consumer data are created each day, and the pace is only accelerating. So, whilst this allows businesses, especially in the tech industry, to gain rich and timely customer insight, it also threatens the privacy of every user. Author Cathy O’Neil even goes as far as to say that “big data increases inequity and threatens democracy”, and claims that “algorithms are mathematical models that have harmful outcomes by encoding socio-political biases and enable predatory companies to selectively advertise to vulnerable people”. Her book, Weapons of Math Destruction, offers a critical look at the growing use of third-party data algorithms, arguing that they have the power to manipulate conversations and are, in essence, misleading tools used merely to sell products.

Over the last few years, there has been a major push for better consumer privacy regulations, data protection, and transparency about the information companies collect about users, challenging tech giants like Apple and Google to give their users the ability to choose whether apps and websites can access their data, as well as to explain how that data will be used.


Whilst Apple has introduced a system that offers randomised email addresses to use when signing up for new apps and services, the company still has a long way to go: it must find a bulletproof way of securing this data, or not collect it at all.


Now, what about the health concerns of this new tool? Software cannot possibly, accurately, and sustainably diagnose its owner to the same standard as a licensed healthcare professional.


Making mental health resources accessible, and providing information that allows people to acknowledge their own mental health and take matters into their own hands, is not a bad thing at all. I believe it is important to offer easy and convenient ways for people to recognise mental health concerns, along with appropriate resources for further assistance. But this new feature includes built-in anxiety assessments that even suggest next steps based on assessed risk.


This technology was created to screen for depression, anxiety, and cognitive decline. Such algorithms can already analyse mobility, sleep patterns, physical activity, heart rate, and even your typing behaviour to flag whether you should be concerned about your physical or mental health. But mental health conditions are incredibly difficult to diagnose because they can manifest very differently from person to person, with different symptoms at very different times. Sometimes patients cannot be diagnosed at all, owing to the complexity of their symptoms or other comorbid health problems.
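To illustrate why this is so fragile, here is a deliberately naive Swift sketch of a threshold-based risk score over the kinds of signals mentioned above. It is a hypothetical illustration: the thresholds and signal names are invented, and it bears no relation to Apple’s actual (unpublished) models.

import Foundation

// A deliberately naive risk heuristic over passively sensed signals.
// All thresholds below are invented for illustration.
struct DailySignals {
    let hoursSlept: Double
    let stepCount: Int
    let restingHeartRate: Double  // beats per minute
    let typingSpeedChange: Double // fractional change vs. personal baseline
}

func naiveRiskScore(_ day: DailySignals) -> Int {
    var score = 0
    if day.hoursSlept < 6 { score += 1 }           // short sleep
    if day.stepCount < 3_000 { score += 1 }        // low mobility
    if day.restingHeartRate > 80 { score += 1 }    // elevated resting HR
    if day.typingSpeedChange < -0.2 { score += 1 } // 20% slower typing
    return score
}

// The fragility is obvious: a night shift, an injury, or a lazy holiday
// can trip every threshold at once with no change in mental health.
let flaggedDay = DailySignals(hoursSlept: 5.0, stepCount: 1_200,
                              restingHeartRate: 85, typingSpeedChange: -0.3)
print(naiveRiskScore(flaggedDay)) // 4 -- "high risk", for benign reasons

Real systems are of course far more sophisticated than this toy, but the underlying problem is the same: the signals are proxies, and proxies confound context with condition.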


So, while we can train machine-learning models on large pools of data, mental health carries enormous complexity, and a wrong diagnosis could have incredibly harmful effects on an individual, both physically and psychologically.


Today, the momentum for digital software and AI solutions in mental health is building rapidly. However, my mixed feelings remain. I believe in making mental health more accessible and as user-friendly as possible. It’s important for this information to be approachable, available, and welcoming. But it is vital for health-based platforms to be precise and error-free and for the data to be transparent, secure, and protected… unfortunately, I am not convinced that we are there just yet.

