Researchers at IISc, in collaboration with Aster-CMI Hospital, have developed an AI tool that can identify the median nerve in ultrasound videos and detect carpal tunnel syndrome (CTS). The study was published in IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control.
CTS arises when the median nerve, which runs from the forearm into the hand, is compressed at the carpal tunnel in the wrist, resulting in numbness, tingling, or pain. It is one of the most common nerve-related disorders, particularly affecting people who perform repetitive hand movements, such as office staff who work at keyboards, assembly line workers, and sportspersons.
Doctors currently use ultrasound to visualise the median nerve and assess its size, shape, and any potential abnormalities. “But unlike X-rays and MRI scans, it’s hard to detect what’s going on in ultrasound images and videos,” explains Karan R Gujarati, first author and former MTech student at the Department of Computational and Data Sciences (CDS), IISc. “At the wrist, the nerve is quite visible, its boundaries are clear, but if you go down to the elbow region, there are many other structures, and the boundaries of the nerve are not clear.” Tracking the median nerve is also important for treatments that require doctors to administer local anaesthesia to the forearm or block the median nerve to provide pain relief.
To develop their tool, the team turned to a machine learning model based on a transformer architecture, similar to the one powering ChatGPT. The model was originally developed to detect dozens of objects simultaneously in YouTube videos. The team stripped out its computationally expensive components to speed it up and cut the number of objects it tracks down to just one: the median nerve. To train the model, they collaborated with Lokesh Bathala, Lead Consultant Neurologist at Aster-CMI Hospital, to collect and annotate ultrasound videos from both healthy participants and people with CTS. Once trained, the model was able to segment the median nerve in individual frames of the ultrasound video.
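In broad strokes, a model of this kind turns each video frame into a sequence of patch tokens, runs them through a transformer encoder, and decodes a single foreground mask for the tracked object. The PyTorch sketch below illustrates that pattern; it is a minimal illustration, not the authors’ published architecture, and the class name, layer sizes, and patch-based design are all assumptions made for exposition.

```python
# Minimal sketch (assumed, not the published model): per-frame
# single-object segmentation with a small transformer encoder.
import torch
import torch.nn as nn

class SingleObjectSegmenter(nn.Module):
    def __init__(self, patch=16, dim=128, depth=4, heads=4):
        super().__init__()
        # Embed non-overlapping image patches as tokens.
        self.to_tokens = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        # One output channel: foreground probability of the single
        # tracked object (here, the median nerve).
        self.to_mask = nn.ConvTranspose2d(dim, 1, kernel_size=patch,
                                          stride=patch)

    def forward(self, frame):                 # frame: (B, 1, H, W)
        tok = self.to_tokens(frame)           # (B, dim, H/p, W/p)
        b, d, h, w = tok.shape
        seq = tok.flatten(2).transpose(1, 2)  # (B, h*w, dim) token sequence
        seq = self.encoder(seq)               # self-attention over patches
        tok = seq.transpose(1, 2).reshape(b, d, h, w)
        return torch.sigmoid(self.to_mask(tok))  # (B, 1, H, W) mask

model = SingleObjectSegmenter()
frames = torch.randn(8, 1, 256, 256)  # a batch of ultrasound frames
masks = model(frames)                 # per-frame nerve masks in [0, 1]
```

Restricting the decoder to a single output channel is the code-level analogue of the simplification described above: tracking one object is far cheaper than detecting dozens, which is what makes real-time use plausible.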
“Imagine a video of an autonomous car. If the car is moving on the road, you want to track the car,” explains corresponding author Phaneendra K Yalavarthy, Professor at CDS. “In the same way, we are able to track the nerve throughout the video.”
The model was also able to automatically measure the cross-sectional area of the nerve, a key measurement used to diagnose CTS. Currently, a sonographer performs this measurement manually. “The tool automates this process. It measures the cross-sectional area in real time,” explains Bathala. The researchers say the tool reported the cross-sectional area of the median nerve with more than 95% accuracy at the wrist.
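Once the nerve is segmented in a frame, the cross-sectional area follows from simple pixel counting. A hedged sketch, assuming the ultrasound machine reports the physical pixel spacing; the function name, threshold, and spacing values below are illustrative assumptions, not details from the paper:

```python
# Assumed approach: count foreground pixels in the predicted mask and
# scale by the physical area of one pixel.
import numpy as np

def cross_sectional_area_mm2(mask, pixel_spacing_mm=(0.1, 0.1), thresh=0.5):
    """Area of the segmented nerve in mm^2 from a probability mask."""
    pixel_area = pixel_spacing_mm[0] * pixel_spacing_mm[1]  # mm^2 per pixel
    return float((np.asarray(mask) > thresh).sum()) * pixel_area

# Example: a 30 x 40 pixel nerve region at 0.1 mm/pixel -> 12 mm^2.
mask = np.zeros((256, 256))
mask[100:130, 80:120] = 1.0
print(cross_sectional_area_mm2(mask))  # 12.0
```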
Although many machine learning models have been developed to screen CT and MRI scans, very few exist for ultrasound videos, especially nerve ultrasound, explains Yalavarthy.
“Initially, we trained the model on one nerve. Now we are going to extend it to all nerves in the upper and lower limbs,” says Bathala. He adds that the tool has already been deployed in a pilot test at the hospital. “We have an ultrasound machine connected to an additional monitor where the model is running. I can look at the nerve, and at the same time, the software tool is also delineating the nerve. We can see its performance in real time.”
Bathala says that the next step would be to find ultrasound machine manufacturers who can integrate the tool into their systems. “This kind of tool can assist any doctor. It can reduce the inference time,” he says. “But of course, the final diagnosis will need to be done by the physician.”