Ralf Sydekum, Technical Manager, F5 Networks, discusses whether artificial intelligence can meet the empathetic demands of patient care.
Healthcare-related artificial intelligence (AI) is developing fast, and advances in critical diagnosis have revolutionised patient care.
However, there is still a clear dividing line between automating manual tasks and mediating relationships between the medical profession and patients. Therefore, can an AI-enabled app really replace a human doctor and still provide quality patient care?
AI Doctors
There is no doubt that AI is making its presence felt in hospitals today, including detecting diseases earlier and more accurately. According to the American Cancer Society, AI can now review and translate mammograms with 99% accuracy and 30 times faster than before, dramatically reducing unnecessary biopsies.
With five billion people in the world having no access to surgery or primary medical care, there is clearly a pressing need for substantial technological intervention. Recent use cases of note include the UK’s National Health Service (NHS) trialling symptom-checker chatbot apps to triage primary care patients. Users submit their symptoms to the app and quickly receive a recommended action drawing on algorithms, clinicians, and data analytics.
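To make the triage idea concrete, the sketch below shows a minimal, hypothetical rule-based symptom check in Python. It is not how the NHS chatbot pilots actually work; the symptom lists, thresholds, and recommendations are illustrative assumptions only, and real systems combine machine-learned models, clinician oversight, and analytics.

```python
# Hypothetical, rule-based symptom triage sketch (illustrative only).
from dataclasses import dataclass

# Symptoms assumed to warrant immediate emergency care.
RED_FLAGS = {"chest pain", "difficulty breathing", "severe bleeding"}

# Symptoms assumed to warrant a GP appointment rather than self-care.
GP_SYMPTOMS = {"persistent cough", "unexplained weight loss", "high fever"}

@dataclass
class TriageResult:
    action: str   # recommended next step
    reason: str   # symptom that drove the recommendation

def triage(symptoms: list[str]) -> TriageResult:
    """Map reported symptoms to a coarse recommended action."""
    reported = {s.strip().lower() for s in symptoms}

    hit = reported & RED_FLAGS
    if hit:
        return TriageResult("Call emergency services now", hit.pop())

    hit = reported & GP_SYMPTOMS
    if hit:
        return TriageResult("Book a GP appointment", hit.pop())

    return TriageResult("Self-care and monitor; re-check if symptoms worsen",
                        "no urgent symptoms reported")

if __name__ == "__main__":
    print(triage(["headache", "persistent cough"]))
    # TriageResult(action='Book a GP appointment', reason='persistent cough')
```

Even in this toy form, the design choice is visible: the app only recommends a next step; the clinical decision still sits with a human.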
Across the sector, AI is already being used in various ways to sift through colossal amounts of data to highlight treatment mistakes and workflow inefficiencies. It is even helping healthcare systems to avoid unnecessary hospitalisations. Other intriguing examples that are being trialled include facial tracking systems to determine whether a patient is distressed based on muscle movements in their nose, lips, cheeks, or eyebrows.
In fact, the reality of an all-encompassing AI Doctor may be closer than we think.
Recently, subscription health service provider Babylon Health launched AI software capable of beating human doctors in a medical exam. It is a remarkable achievement and suggests a near future where AI can play a major role in supporting local medical practitioners and relieving the burden of rising community demands.
Health and safety
While AI is evolving fast, major concerns remain about the technology’s ultimate capacity to make accurate diagnoses or properly empathise with patients.
Crucially, the role of patient empathy in healthcare must be the primary consideration. An inevitable lack of emotional intelligence means there is always a danger that crucial psychological or behavioural nuances are missed, and people are put at risk. Doctor-patient relationships are especially important when individuals suffer from conditions such as stress or depression. AI critics also decry its possible role in diminishing human interaction, depersonalising the healthcare environment, and damaging community spirit.
Security issues are another significant obstacle to widespread AI adoption. According to McAfee Labs, healthcare was 2017’s most targeted sector in terms of breach incidents. Unfortunately, too many applications are still built without security in mind, while monitoring systems and other wearables are frequently hacked by cybercriminals hunting for sensitive personal data. Meanwhile, malicious automated bots are spreading throughout the world’s healthcare sectors, collecting, tracing, and swiftly retrieving data with greater sophistication than ever.
Ransomware attacks are also on the rise. Earlier this year, the SamSam ransomware wreaked havoc across the healthcare sector, exploiting insecure Remote Desktop Protocol (RDP) connections and vulnerable JBoss systems to spread its infections.
These are worrying trends, and there is no margin for error when lives are at stake. It is therefore vital that healthcare providers stay ahead of the technological curve, including by intelligently managing traffic from a single platform that controls user access and delivers robust application and network security. It is essential to know what makes your apps vulnerable, as well as how they can be attacked. Only then can you implement appropriate solutions. App developers also have a duty to do better and ensure applications, in public or private cloud environments, are fully tested for scale, performance, and security.
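One small, concrete piece of "managing traffic intelligently" is throttling automated bots at a single entry point. The Python sketch below is a minimal, assumed example of per-client rate limiting with a token bucket; the client identifiers, rates, and status codes are illustrative, and a real deployment would sit behind an application delivery controller or WAF and also enforce authentication, TLS, and input validation.

```python
# Hypothetical per-client rate limiting at a single entry point (sketch only).
import time
from collections import defaultdict

class TokenBucket:
    """Allow at most `rate` requests per second, with bursts up to `burst`."""
    def __init__(self, rate: float, burst: int):
        self.rate = rate
        self.burst = burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# One bucket per client identifier (e.g. API key or source IP) -- assumed scheme.
buckets: dict[str, TokenBucket] = defaultdict(lambda: TokenBucket(rate=5, burst=10))

def handle_request(client_id: str) -> int:
    """Return an HTTP-style status: 200 if served, 429 if throttled."""
    return 200 if buckets[client_id].allow() else 429

if __name__ == "__main__":
    # A scripted bot hammering the endpoint is throttled once the burst is spent.
    statuses = [handle_request("bot-123") for _ in range(15)]
    print(statuses)  # the first ~10 requests succeed, the rest return 429
```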
The prognosis
Innovation in the health sector has huge potential. Early AI use-cases and cognitive apps are already making an influential and welcome difference. In the future, it may even be possible for AI to diagnose, as well as adequately and empathetically care for you. For now, the potentially impassive world of a full-service ‘download a doctor’ will have to wait.
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.