AI app could help diagnose HIV more accurately

Pioneering technology developed by researchers from UCL and Africa Health Research Institute (AHRI) could transform the ability to accurately interpret HIV test results, particularly in low- and middle-income countries.

Academics from the London Centre for Nanotechnology at UCL and AHRI, supported by the BRC, used deep learning (artificial intelligence/AI) algorithms to improve health workers’ ability to diagnose HIV using lateral flow tests in rural South Africa.

Their findings are published today in Nature Medicine. By harnessing mobile phone sensors, cameras, processing power and data sharing capabilities, the team developed an app that can read test results from an image taken by end users on a mobile device. It may also be able to report results to public health systems for better data collection and ongoing care.

A team of more than 60 trained field workers at AHRI first helped build a library of more than 11,000 images of HIV tests taken in various conditions in the field in KwaZulu-Natal, South Africa, using a mobile health tool and image capture protocol developed by UCL.

The UCL team then used these images as training data for their machine-learning algorithm. They compared how accurately the algorithm classified images as either negative or positive, versus users interpreting test results by eye.
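The workflow described above — a library of labelled test images used to train a classifier that outputs a positive/negative call — can be illustrated with a minimal sketch. The actual study used deep learning on photographs of test strips; here a simple logistic-regression classifier on synthetic one-dimensional "strips" stands in for it, and all data, band positions, and parameters below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_strip(positive, n=64):
    """Synthetic 1-D 'test strip': noise plus a control band, and a test band if positive."""
    strip = rng.normal(0.0, 0.1, n)
    strip[10:14] += 1.0            # control band (present on every valid test)
    if positive:
        strip[40:44] += 0.8        # test band (present only for positives)
    return strip

# Build a labelled training set, standing in for the field-collected images.
X = np.array([make_strip(i % 2 == 1) for i in range(200)])
y = np.array([i % 2 for i in range(200)])

# Logistic regression trained by gradient descent (a stand-in for the deep model).
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(positive)
    grad = p - y                              # gradient of the log-loss
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

# Evaluate on fresh synthetic strips, mirroring the accuracy comparison in the study.
Xt = np.array([make_strip(i % 2 == 1) for i in range(100)])
yt = np.array([i % 2 for i in range(100)])
pred = (1.0 / (1.0 + np.exp(-(Xt @ w + b))) > 0.5).astype(int)
accuracy = (pred == yt).mean()
```

The key idea is the same as in the study: once the classifier has seen enough labelled examples, it makes the positive/negative call from the image alone, removing the variability of reading a faint band by eye.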

The machine learning classifier was able to reduce errors in reading the rapid diagnostic tests (RDTs), correctly classifying RDT images with 98.9% accuracy overall, compared to traditional interpretation of the tests by eye (92.1%).

A previous study of users of varying experience in interpreting HIV RDTs showed the accuracy varied between 80% and 97%.

Lead author and Director of i-sense Professor Rachel McKendry (UCL London Centre for Nanotechnology and UCL Division of Medicine) said: “This research shows the positive impact mobile health tools can have in low- and middle-income countries, and paves the way for a larger study in the future.”

First author Dr Valérian Turbé (UCL London Centre for Nanotechnology), an i-sense researcher in the McKendry group, said: “If these tools can help train people to interpret the images, you can make a big difference in detecting very early-stage HIV, meaning better access to healthcare or avoiding an incorrect diagnosis. This could have massive implications on people’s lives, especially as HIV is transmissible.”

The team now plan a larger evaluation study to assess the performance of the system with users of differing ages, genders and levels of digital literacy.

Photo credit: Africa Health Research Institute