Wild Health Summit says the future of healthcare is… fax machines?
I attended the Wild Health Summit 2018 this week. The two things most commonly mentioned in talks were fax and paper. As the co-founder of Transhumanism Australia, which investigates emerging tech, the Summit was an eye opener as to the opportunities for change in healthcare.
International speakers were aghast that the Australian health system still relies heavily on fax, and at our defensiveness over its use, given groundbreaking developments in data exchange technologies that could bring us leaps and bounds forward.
There was also mention of the Singularity, Elon Musk’s and Stephen Hawking’s warnings of the dangers of AI, and Nick Bostrom’s Superintelligence. However, much of the Summit was focused on practical implications, use cases and imperatives for change.
When it comes to data, cybersecurity and the ability to share data are key issues.
Dr John Halamka, a renowned healthcare CIO and Professor of Medicine at Harvard, mentioned that as the data custodian of a large hospital, what keeps him up at night is cybersecurity. Dr Halamka looks after 11 petabytes of data containing the health records of millions of patients dating back to the 1980s, while also having to share more and more data with more and more parties.
In terms of exchanging data, Fast Healthcare Interoperability Resources (FHIR) is the latest standard being used, and it is being adopted by tech giants like Google, Apple and Amazon. FHIR joins together siloed systems that use different data formats and structures: health data is broken into data elements, each tagged with a unique identifier.
In a world where health data mostly lives in documents, FHIR allows developers to create Application Programming Interfaces (APIs) that provide access to individual data elements. The architect of the FHIR standard is Australia’s very own Grahame Grieve.
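As a rough illustration of the idea, a FHIR resource is structured JSON in which every resource declares its type and carries a unique identifier, and clients typically read it over a plain REST API (e.g. `GET {base}/Patient/{id}`). A minimal sketch, with invented patient values:

```python
import json

# A minimal, hand-written FHIR "Patient" resource (illustrative values only).
# In practice this JSON would be fetched from a FHIR server endpoint.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1980-04-01"
}
"""

patient = json.loads(patient_json)

# Every resource is tagged with its type and a unique identifier,
# so siloed systems can exchange and reference the same data element.
assert patient["resourceType"] == "Patient"
print(patient["id"])                 # the unique identifier
print(patient["name"][0]["family"])  # an individual structured data element
```

Because each element is uniformly tagged, two systems with different internal formats can still agree on what a given piece of data is.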
One of the challenges with health data is that there is a lot of it and it is very messy. According to Eyal Oren, Product Manager at Google Brain, a single patient can generate 150,000 data points, and this does not even take into account imaging and genomic data. Very few machine learning models can read all these data points, let alone a human doctor, so in practice much of this data goes unused.
Another complexity with health data is jargon, which makes sense to a doctor reading it in the context of other medical notes, but in isolation can mean something else entirely. To a layperson, HD can mean hard drive; to a doctor it can mean Huntington’s disease.
Google Brain’s Eyal Oren spoke about one of its AI projects to assist in medical diagnosis. FHIR was adopted, and a machine learning model was trained on 150,000 patient data points plus unstructured clinician notes. Google’s approach was to use all available data rather than filtering and cleaning it first; as Eyal Oren pointed out, it is easier to make algorithms adapt to humans than to make humans provide clean, structured data. Because more data points were used, the result was a more accurate diagnosis than one made by a human doctor.
AI won’t be replacing human doctors just yet. Until we have Artificial General Intelligence (AI which can think like a human) which some believe is 10–20 years away, AI will only augment and assist doctors.
Today’s AI is only trained to diagnose specific illnesses and recommend specific therapies, according to Professor Enrico Coiera, Director of the Centre of Health Informatics at Macquarie University.
Interpreting medical imaging using computer vision is a popular area in machine learning. For example, applications have been developed for interpreting chest x-rays for tuberculosis. Enrico Coiera states machines have become so good at interpreting images there is a question as to whether the profession of radiology is still relevant.
In the long term, radiology could just be integrated into regular diagnostics, where a doctor reads your x-rays, along with your genomic data and other data that is standard today such as blood work results. All with the help of AI.
When AI begins interpreting medical images more accurately than a human, with no specialist knowledge or even human needed, the radiology sector will need to think of new business models, Enrico Coiera states. Radiologists perform a number of tasks but are paid based on number of images read. When an AI does the image reading, it is unclear who should be paid for that.
Once radiology can be performed on the spot by doctors using handheld image scanners, that is where true disruption occurs: not because the human has been removed from the process, but because the whole business model has changed.
However, machine vision and diagnosis aren’t easy. Even IBM Watson, IBM’s multi-billion-dollar ‘cognitive computing’ service, didn’t get it right, coming under fire for cancer treatment suggestions that were considered unsafe. IBM has since let go thousands of workers in its Watson Health division.
Using AI will free up doctors to provide more patient care. Dr John Halamka stated that clinicians are spending 50% of their time doing administrative work that is best done by machines. AI could reduce the burden of documentation and help clinicians keep up to date with the latest clinical research and treatments.
Having access to AI and data could significantly improve the accuracy of diagnosis. Dr John Halamka suggested a scenario of a patient coming in with puffy eyes: if you have access to a database of puffy-eye cases, you can narrow down which disease it might be.
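The narrowing-down step is essentially a lookup over a symptom database. A toy sketch of the idea, with an entirely hypothetical (and medically simplified) database:

```python
# Hypothetical symptom database: each record links a condition to the
# symptoms observed with it. Values are illustrative only.
records = [
    {"condition": "nephrotic syndrome", "symptoms": {"puffy eyes", "swelling"}},
    {"condition": "allergic reaction",  "symptoms": {"puffy eyes", "rash"}},
    {"condition": "migraine",           "symptoms": {"headache"}},
]

def candidates(symptom):
    """Return the conditions whose recorded symptoms include the observed one."""
    return [r["condition"] for r in records if symptom in r["symptoms"]]

# One observed symptom narrows three possibilities down to two.
print(candidates("puffy eyes"))
```

A real clinical system would match on many symptoms at once and rank by likelihood, but the principle is the same: more shared data means a smaller candidate list for the doctor to consider.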
In developing countries like India, the Gates Foundation is launching a service to track tuberculosis, including who is on medications to treat it, and whether they are taking their medication. All this is done on a mobile, over slow networks, using store-and-forward methodologies.
In South Africa, the Gates Foundation is tracking HIV patients and HIV lab results and identifying the efficacy of treatments using biometrics such as fingerprints and retinal scans. The data is being made available on flip-phones.
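Store-and-forward simply means writing records locally first and forwarding them when a connection becomes available, which is why it suits slow or intermittent mobile networks. A minimal sketch, where the queue and upload function are hypothetical:

```python
from collections import deque

# Hypothetical local outbox: records are stored on the device first,
# then forwarded in order once the network is reachable.
outbox = deque()

def record_observation(obs):
    """Recording always succeeds locally, even with no connectivity."""
    outbox.append(obs)

def sync(upload, network_up):
    """Forward queued records while a connection is available; return count sent."""
    sent = 0
    while network_up and outbox:
        upload(outbox.popleft())
        sent += 1
    return sent

# Usage: two observations recorded offline, then synced later.
record_observation({"patient": "tb-001", "took_medication": True})
record_observation({"patient": "tb-002", "took_medication": False})
uploaded = []
print(sync(uploaded.append, network_up=True))  # 2
```

The design choice is that the device never blocks on the network: the clinic keeps collecting data, and the forward step drains the queue whenever connectivity returns.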
During the startup pitches, many of the products demoed showed that startups are taking a mobile-first approach to developing products.
In dermatology, there are applications that let you take a picture with your smartphone and have an AI interpret it for you. SkinVision is one mobile app which helps assess skin cancer.
MetaOptima’s DermEngine, an end-to-end dermatology platform, provides a handheld scanner and phone camera attachment that can deliver on-the-spot diagnosis of skin cancer.
As of today, Australia has not adopted FHIR for its national health record system. Nor is AI being used extensively, or a mobile-first approach being taken, in healthcare.
Cheryl McCullagh, Director at the Sydney Children’s Hospitals Network, stated that we are an innovative country willing to take risks, but a major constraint is budget and a lower level of funding compared to other countries. For example, Cheryl is keen to digitise all data but lacks the human resources to do it fast enough.
Will change be driven bottom up by patients? A point was made that the majority of people who currently interact with the healthcare system might be digitally illiterate, but their carers are not. Conversations post-event buzzed about change, no doubt driven by the dialogue in the summit.
Keen to hear your thoughts on what needs to change in healthcare.
I only write about emerging tech and transhumanism. Follow me on Medium if you also want to know how to make sense of a world impacted by AI, blockchain and other emerging technologies.