By: Julie Do
Apple’s new iPhone may have received its biggest makeover yet for its 10th anniversary, but it isn’t the only thing Apple is looking to change. Soon, people will be using Siri for more than just directions and basic phone utilities.
The tech giant has plans to make Siri better at detecting and responding to depression in its users. The announcement means that people could start turning to their phone for emotional support.
For Faisal Hemani, a third-year computer engineering student, it’s an alternative that he thinks can allow other students to de-stress when they feel like there’s nobody else to talk to.
“The workload can be suffocating at times, especially during exam season,” he said.
“Having someone there for you is very important, but can be the most difficult to find when everybody is isolating themselves in order to study. It would be a great idea to get Siri to recognize that, because [students are] probably the ones interacting with [Siri] the most.”
A 2016 study by the Canadian Association of College and University Student Services found that a fifth of Canadian post-secondary students suffer from depression, anxiety or other mental health issues.
Yet not everybody welcomes the possibility of people turning to artificial intelligence (A.I.) for emotional support.
Alan Faigal, an early childhood education professor at Ryerson, says that even though having the option is a great start, it just doesn’t replace the benefits of human interaction.
“The idea of having face-to-face communication with students is lacking now, and I find that when I ask them to come in to see me during office hours, the simple act of being with a human being – being face-to-face – is already sometimes so comforting to them and it lowers the anxiety,” he said.
He thinks it’s a good start for Apple, but that they should focus on programming Siri so that it could bridge people to other resources, such as psychiatrists and counsellors. This is a concern being echoed by both medical and programming experts. Apple’s move with Siri has raised red flags about whether A.I. reliance could do more harm than good for those with mental illnesses.
Dr. Miranda Lang, a family physician at Humber River Hospital, worries that a phone app alone cannot accurately diagnose depression. She says a medical professional makes their diagnosis based on history of in-person conversations and physical examinations.
“My concern is that another diagnosis that presents in a similar way may be missed, or that such a diagnosis would be delayed due to a false reassurance that depression was the only factor,” she said.
Wei Sun, a computational medicine trainee at SickKids, says that the tech company will face many different barriers as it tries to redevelop Siri to respond to nuances like tone and inflection.
“They’ll have to do a lot of research, particularly in the field of language processing,” he said. “It deals with human language and they’ll have to research what kind of vocabulary a person would use to indicate they had depression… [including] sentence structure, [and] all the linguistic components that computers can analyze to determine whether or not a person has depression.”
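The kind of analysis Sun describes can be pictured with a small sketch. This is purely illustrative: the word lists and features below are hypothetical, and a real system (Apple's included) would use far more sophisticated language models rather than simple counts.

```python
# Illustrative only: a toy feature extractor of the sort a text
# classifier for depression-related language might build on.
# The word lists here are hypothetical examples, not clinical criteria.
import re

NEGATIVE_WORDS = {"hopeless", "tired", "alone", "worthless", "empty"}
FIRST_PERSON = {"i", "me", "my", "myself"}

def extract_features(text: str) -> dict:
    """Count simple linguistic signals in a message."""
    words = re.findall(r"[a-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "first_person": sum(w in FIRST_PERSON for w in words),
        "negative_words": sum(w in NEGATIVE_WORDS for w in words),
        "avg_sentence_len": len(words) / max(len(sentences), 1),
    }

feats = extract_features("I feel so tired and alone. Nobody gets it.")
```

Features like these would feed a statistical model trained on labelled examples; as Sun notes, assembling that vocabulary and those linguistic components is itself a major research effort.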
To diagnose accurately, Siri would have to tap into users’ private conversations.
“This is unavoidable,” Sun said, “but it can be managed.”
The issue of privacy is something that Sun hopes Apple will be able to make perfectly clear to its users so that both sides are completely aware of what they’re asking and signing up for.
“Consent is crucial,” he said.
Apple is one of several tech giants moving into mental health. Facebook and Google have said that they plan to improve their own A.I. software and put more funding towards the betterment of mental health.