In response to the news that Alexa will be used by the NHS to give health advice, please find below comments from experts as part of our expert comments series.
Ask #Alexa to diagnose your health problems as #NHS announces 'world-first' partnership with Amazon https://t.co/3NmUIPUvBj pic.twitter.com/MOrQXB9w7c
— David Icke (@davidicke) July 10, 2019
Expert Comments:
David Emm, Principal Security Researcher at Kaspersky:
“We know that people are relying on these devices more and more, and their popularity is growing. They do have their benefits and they are convenient; however, at their core they are smart listeners, and they have made headlines in recent times because of this, leaving a degree of scepticism around them.
“We also know that Amazon is storing and analysing the data that these devices collect, which raises cybersecurity concerns about how this data will be used. These devices will be privy to sensitive health data, so it must be made clear to the public how that data will be protected. Whatever the benefits they can provide the NHS, it is essential that Amazon is totally transparent about this, to give consumers the assurance they need that their data is well safeguarded. It is also important that people are given a way to opt out of their data being stored, if they choose to do so.”
Matt Walmsley, EMEA Director at Vectra:
“This is a new way to access existing information from NHS Choices, and the type of enquiry data is unlikely to be any more sensitive than that generated by using the website to access the same clinical information and advice. Accessing high-quality information with assured provenance is important when using the internet to research medical issues. However, the existing broader privacy concerns and risks around the anonymisation, storage and re-use of smart speakers’ recordings of our spoken interactions remain. Caveat emptor: users need to be informed about, and comfortable with, how Amazon and NHS Choices are processing and using their data.”
Boris Cipot, Senior Security Engineer at Synopsys:
“From a convenience standpoint, such an offering provides instant answers for users seeking health advice, which is especially valuable if they aren’t feeling well. And yet, data protection is definitely a concern now that Amazon Alexa has teamed up with the NHS to offer UK users health-related advice through voice search. Not only will the data be saved, but it remains unclear how the collected user data will be used. For instance, if an insurance provider gained access to user-specific data, it could potentially place users into risk categories based on the advice they sought, which could in turn lead to increased insurance rates for those deemed high risk. Doctor-patient confidentiality could also be circumvented through this method of data collection, since a doctor isn’t actually involved, thereby nullifying patient privacy protection policies.
“Data privacy regulations such as the GDPR set out how private data needs to be treated and what the repercussions can be if data isn’t handled appropriately. We are still in the early phases of data usage transparency: a standardised way in which user data is protected and its intended use is declared by those collecting it. A few questions I would want answered as a user include: what happens to the backups saved within data storage facilities, and how do companies such as Amazon delete the back-end data after the specified retention period?
“Users should approach such services with care. Remember that home assistants are connected to the internet, so the questions you ask and the data you provide are only so private.”
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.