DeepMind's artificial intelligence (AI) can detect almost 50 different eye diseases by analyzing three-dimensional retinal scan images.
DeepMind, the British artificial intelligence company famous for the AlphaGo algorithm that defeated the world champion at the strategy game Go, was acquired by Google in 2014. Its AI can now detect almost 50 different eye diseases by examining three-dimensional retinal scan images. According to an article published in the British journal Nature Medicine, the algorithm, developed using anonymized health data, can detect diseases such as macular degeneration and diabetic retinopathy with a success rate close to that of specialist physicians.
It can also inform patients about the treatment they should follow and what they need to do urgently. The software is primarily intended to make specialist physicians' work easier: when a disease is detected, the findings on which that decision was based are shared with the physician, and the parts of the image that may be related to the disease are labeled with a prediction percentage. This helps the physician make a more accurate assessment. Examining three-dimensional retinal scan data can take a long time, and during the wait patients may develop problems that lead to temporary vision loss. Artificial intelligence could shorten this time, and patients thought to have a serious problem could be moved to the front of the queue.
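To make the idea of percentage-labeled findings concrete, here is a minimal, purely illustrative Python sketch; the region names, conditions, scores and threshold are invented assumptions, not DeepMind's actual output format.

```python
# Purely illustrative sketch: attach prediction percentages to labelled regions
# of a retinal scan so a physician can see why a referral was suggested.
# Region names, conditions, scores and the threshold are invented assumptions,
# not DeepMind's real output format.
from dataclasses import dataclass

@dataclass
class RegionFinding:
    region: str        # part of the scan flagged by the model
    condition: str     # disease the region may indicate
    probability: float # model confidence, 0.0-1.0

def summarize_for_physician(findings, threshold=0.5):
    """Keep only reasonably confident findings, strongest evidence first."""
    flagged = [f for f in findings if f.probability >= threshold]
    return sorted(flagged, key=lambda f: f.probability, reverse=True)

scan_findings = [
    RegionFinding("macula", "macular degeneration", 0.87),
    RegionFinding("optic disc", "glaucoma", 0.12),
    RegionFinding("peripheral retina", "diabetic retinopathy", 0.64),
]

for f in summarize_for_physician(scan_findings):
    print(f"{f.region}: possible {f.condition} ({f.probability:.0%})")
```

Sorting the flagged regions by confidence reflects the idea in the text: the physician sees the strongest evidence behind the decision first, rather than a single unexplained verdict.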
How DeepMind Works
What is particularly important about the research, according to DeepMind's co-founder Mustafa Suleyman, is that the AI has a level of 'explainability' that can increase doctors' trust in its recommendations. "It is possible for the clinician to interpret what the algorithm is thinking," he said. "They can look at the underlying segmentation. In other words, our AI software doesn't work like a mysterious black box that simply reveals results." DeepMind's AI was trained on a database of about 15,000 eye scans stripped of identifying information. DeepMind worked with clinicians to label areas of disease, and the system attaches a percentage score to these labeled regions in its output. The two-and-a-half-year project required a major investment from DeepMind and involved Moorfields researchers as well as 25 other employees. Suleyman called the findings a 'research breakthrough' and said the next step is to prove that the AI works in a clinical setting, which will take several years.
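The quote above points at a two-stage structure: a segmentation step whose output the clinician can inspect, followed by a classification step that produces the percentage scores. Below is a minimal sketch of that flow under those assumptions; the array shapes, class lists and placeholder "networks" are invented for illustration and are not DeepMind's code.

```python
# Toy two-stage pipeline: segment the scan into tissue types, then classify the
# segmentation map into diagnosis probabilities. Shapes, class lists and the
# placeholder "networks" are assumptions for illustration, not DeepMind's code.
import numpy as np

TISSUE_CLASSES = 15   # assumed number of tissue/feature types
DIAGNOSES = ["normal", "macular degeneration", "diabetic retinopathy"]

def segment(scan: np.ndarray) -> np.ndarray:
    """Stand-in for the segmentation network: voxel-wise tissue probabilities."""
    logits = np.random.rand(*scan.shape, TISSUE_CLASSES)   # placeholder values
    return logits / logits.sum(axis=-1, keepdims=True)

def classify(tissue_map: np.ndarray) -> dict:
    """Stand-in for the classification network: diagnosis probabilities."""
    pooled = tissue_map.mean(axis=(0, 1, 2))                # summarize tissue volumes
    scores = np.random.rand(len(DIAGNOSES)) * (1 + pooled[: len(DIAGNOSES)])
    probs = scores / scores.sum()
    return dict(zip(DIAGNOSES, np.round(probs, 2)))

scan = np.zeros((64, 64, 64))      # fake 3-D retinal scan volume
tissue_map = segment(scan)         # intermediate map the clinician can inspect
print(classify(tissue_map))        # e.g. {'normal': 0.21, ...}
```

Keeping the segmentation map as an explicit intermediate result is what gives the clinician something interpretable to look at, rather than a single end-to-end score.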
Accelerating Eye Scans
Mustafa Suleyman says that one of the reasons DeepMind and Moorfields undertook the research project was that clinicians were 'overwhelmed' by the demand for eye scans, and that patients are at risk of losing their eyesight because doctors cannot attend to the scans in time. He said, "If you have a sight-threatening condition, you want to be treated as soon as possible. Unlike in A&E, where a staff nurse will talk to you, make an assessment of how serious your condition is and use that assessment to decide how quickly you will be seen, when an eye scan is sent in there is no system for prioritizing medical intervention based on the severity of your scan. Having eye scans read by artificial intelligence will speed up the whole process. It's clear how DeepMind has the potential to transform public hospitals. In the future, we envision a person going to their local high street optician and getting an eye scan, and this algorithm identifying patients with sight-threatening diseases at a very early stage of the condition."
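A triage mechanism of the kind Suleyman describes could, in principle, be as simple as ordering incoming scans by a predicted urgency category. The sketch below assumes four referral categories and is illustrative only, not the actual clinical pathway.

```python
# Hypothetical triage sketch: order incoming eye scans by predicted urgency so
# the most sight-threatening cases are reviewed first. The categories and the
# queue itself are illustrative assumptions, not the actual clinical pathway.
import heapq
import itertools

URGENCY = {"urgent": 0, "semi-urgent": 1, "routine": 2, "observation only": 3}

class ScanQueue:
    def __init__(self):
        self._heap = []
        self._order = itertools.count()   # tie-breaker: first come, first served

    def add(self, patient_id: str, predicted_category: str):
        heapq.heappush(
            self._heap, (URGENCY[predicted_category], next(self._order), patient_id)
        )

    def next_patient(self) -> str:
        return heapq.heappop(self._heap)[2]

queue = ScanQueue()
queue.add("patient-001", "routine")
queue.add("patient-002", "urgent")
queue.add("patient-003", "semi-urgent")
print(queue.next_patient())   # -> patient-002: the urgent case jumps the queue
```

The urgent case is seen first regardless of arrival order, which is exactly the prioritization that Suleyman says is missing from the current process.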
Not Limited to the Eye
Artificial intelligence's detection of diseases is not limited to the eye. An algorithm developed by MIT researchers Tuka Alhanai, Muhammad Ghassemi and James Glass can predict whether a person is depressed by analyzing their text and voice. While therapists try to understand a person's mental state through questions, answers and observation, the AI does so by looking at the content the person produces on certain topics. Of course, where a therapist diagnoses depression, the AI only makes a prediction. Snapchat has developed similar artificial intelligence software: an application that can rate how happy the people in a selfie are on a scale of 1 to 5. By analyzing social media posts, this kind of software could help identify people who are 'thought' to be depressed and allow precautions to be taken.
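As a loose illustration of the general idea rather than the MIT model itself, the sketch below combines crude text and audio features in a standard classifier; the features, lexicon and tiny training set are all invented for illustration.

```python
# Toy multimodal sketch: combine crude text and audio features in a standard
# classifier to *flag* possible depression. The features, lexicon, data and
# model are invented for illustration and are unrelated to the MIT system.
import numpy as np
from sklearn.linear_model import LogisticRegression

NEGATIVE_WORDS = {"tired", "alone", "hopeless", "sad"}   # assumed toy lexicon

def text_features(transcript: str) -> list:
    words = transcript.lower().split()
    negative_share = sum(w in NEGATIVE_WORDS for w in words) / max(len(words), 1)
    return [float(len(words)), negative_share]

def audio_features(mean_pitch_hz: float, words_per_second: float) -> list:
    return [mean_pitch_hz, words_per_second]              # assumed acoustic cues

# Tiny fabricated training set: text features + audio features, with labels
X = np.array([
    text_features("i feel tired and alone most days") + audio_features(110.0, 1.4),
    text_features("work was fine and the weekend was great") + audio_features(180.0, 2.6),
    text_features("everything feels hopeless lately") + audio_features(105.0, 1.2),
    text_features("looking forward to the trip next week") + audio_features(175.0, 2.8),
])
y = np.array([1, 0, 1, 0])   # 1 = flagged for follow-up, 0 = not flagged

model = LogisticRegression().fit(X, y)
sample = text_features("i feel sad and tired") + audio_features(112.0, 1.3)
print(model.predict_proba([sample])[0][1])   # probability of being flagged
```

The output is a probability, not a diagnosis, which mirrors the point made above: the AI only makes a prediction that a clinician would still need to confirm.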
August 2024