Researchers are using machine learning to screen for autism in children

Researchers at Duke Engineering and the Duke School of Medicine have created an app to screen young children for signs of autism by scanning their reactions as they watch movies, the university said on 11 July.

The app uses a video coding algorithm that tracks facial movements revealing the child’s emotions and ability to pay attention, both of which can indicate autism risk. The initial study was conducted with an app distributed through the Apple App Store and built on Apple’s ResearchKit open-source development framework.

The first results from the pilot study of this five-year project began coming in last year, yielding what the university describes as “new insights about autism spectrum disorder (ASD)” and an approach that “has the potential to transform how children’s development is screened and monitored”.

The app first administers caregiver consent forms and survey questions, then uses the phone’s selfie camera to record videos of young children’s reactions while they watch, on the device’s screen, movies designed to elicit autism risk behaviours such as patterns of emotion and attention.

The videos of the child’s reactions are sent to the study’s servers, where automatic behavioural coding software tracks the movement of video landmarks on the child’s face and quantifies the child’s emotions and attention. For example, in response to a short movie of bubbles floating across the screen, the video coding algorithm looks for movements of the face that would indicate joy.
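As a rough illustration of how such landmark-based coding works, the sketch below uses MediaPipe’s Face Mesh, an open-source face landmark tracker, as a stand-in for the study’s behavioural coding software; the smile metric is a simplified, hypothetical proxy for joy, not the Duke team’s actual measure.

```python
# A hedged sketch: MediaPipe Face Mesh stands in for the study's
# behavioural coding software; the "joy" proxy below is illustrative only.
import cv2
import mediapipe as mp


def smile_scores(video_path):
    """Yield a crude per-frame joy proxy: mouth-corner elevation
    relative to the lip centre, from normalised landmark coordinates."""
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as face_mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue  # no face detected in this frame
            lm = result.multi_face_landmarks[0].landmark
            # Landmarks 61/291 are the mouth corners; 13/14 the inner lips.
            corner_y = (lm[61].y + lm[291].y) / 2
            centre_y = (lm[13].y + lm[14].y) / 2
            # Image y grows downwards, so raised corners give a positive score.
            yield centre_y - corner_y
    cap.release()
```

Each frame yields one score, so a whole clip reduces to a time series that can be averaged or thresholded by downstream analysis.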

“Babies who go on to develop autism typically don’t pay attention to social cues,” Geraldine Dawson, director of the Duke Center for Autism and Brain Development, said in an article published by Wired and cited by the university in its announcement. “They’re more interested in non-social things, like toys or objects. They’re also less emotionally expressive. They smile less, particularly in response to positive social events.”

Guillermo Sapiro, professor of electrical and computer engineering, is using Amazon Web Services together with the machine learning frameworks TensorFlow and PyTorch to build algorithms that connect children’s facial expressions and eye movements to potential signs of ASD. His group also uses these cloud computing tools to develop privacy filters for the images and videos it collects.
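The sketch below shows, in PyTorch, the general shape of such a model under stated assumptions: the feature vector (for example, an average smile score and counts of attention shifts) and the tiny network are hypothetical illustrations, not the Duke group’s actual architecture or data.

```python
# A hypothetical sketch of a risk classifier over per-video behavioural
# features; feature set, network size, and labels are all placeholders.
import torch
import torch.nn as nn


class RiskClassifier(nn.Module):
    def __init__(self, n_features: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # one logit: odds of elevated ASD risk
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


model = RiskClassifier()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # binary screening outcome

features = torch.randn(64, 8)                  # placeholder feature vectors
labels = torch.randint(0, 2, (64, 1)).float()  # placeholder screen labels

for _ in range(200):  # toy training loop on the placeholder data
    optimiser.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimiser.step()
```

In practice a screening model of this kind would be trained on coded behavioural features rather than raw video, which keeps the inputs small and makes the privacy filtering mentioned above easier to apply before anything leaves the device.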

Through the app, the Duke team was able to collect behavioural data from around 1,700 children, far more than the 50 to 100 participants typical of an ASD study. With that amount of data in hand, the researchers have so far found the app to be almost 90 percent accurate for some subsets of behaviours.
