R01LM013364
Project Grant
Overview
Grant Description
A Mobile Game for Domain Adaptation and Deep Learning in Autism Healthcare - Project Summary
Neuropsychiatric disorders are the single greatest cause of disability due to non-communicable diseases worldwide, accounting for 14% of the global burden of disease. The current standards of care suffer from subjectivity, inconsistent delivery, and limited access with growing waitlists. New informatics solutions, in particular artificial intelligence (AI) that can port to more ubiquitous mobile health devices and that are not restricted for use in clinical settings, have great potential to complement or even replace aspects of the standards of care.
We propose to develop a novel informatics solution for one of the most pressing mental health burdens, autism, which has increased in incidence by more than 600% since 1990. Autism is among the fastest-growing pediatric concerns today and is highly representative of many other neuropsychiatric conditions. We have invented a prototype mobile system called Guess What (guesswhat.stanford.edu) (GW) that turns the focus of the camera on the child through a fluid social engagement with his/her social partner. This engagement reinforces prosocial learning while simultaneously measuring the child's developmental learning progress.
At its simplest level, the GW app challenges the child to imitate social and emotion-centric prompts shown on the screen of a smartphone held just above the eyes of the individual with whom the child is playing. More importantly, as a home-based, repeat-use system, GW uses computer vision algorithms and emotion classifiers integrated into gameplay to detect emotion in the child's face via the phone's front camera, automatically determining whether the child's expression agrees with the displayed prompt while capturing features such as gaze, eye contact, and joint attention.
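The per-frame logic described above can be sketched as follows. This is a minimal illustration only: the actual GW classifiers are not public, so `predict_emotion` here is a hypothetical stand-in for a trained model, and the emotion label set is assumed for demonstration.

```python
import numpy as np

# Hypothetical label set; the real app's emotion categories may differ.
EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def predict_emotion(face_crop: np.ndarray) -> str:
    """Placeholder classifier mapping a face crop to an emotion label.
    A deployed system would run a trained CNN here; this toy version
    derives an index from a simple per-channel statistic."""
    scores = face_crop.mean(axis=(0, 1))  # toy per-channel means
    return EMOTIONS[int(scores.sum()) % len(EMOTIONS)]

def frame_agrees_with_prompt(face_crop: np.ndarray, prompt_emotion: str) -> bool:
    """True when the detected emotion matches the on-screen prompt,
    i.e. the child successfully imitated the displayed expression."""
    return predict_emotion(face_crop) == prompt_emotion

def session_agreement_rate(face_crops, prompt_emotion: str) -> float:
    """Fraction of frames in a gameplay clip that agree with the prompt,
    a per-session signal that could feed progress tracking."""
    hits = sum(frame_agrees_with_prompt(f, prompt_emotion) for f in face_crops)
    return hits / len(face_crops)
```

In a real pipeline the agreement check would also gate which frames are retained as labeled training examples, since the prompt itself supplies the label.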
Preliminary work with more than 20 autistic children resulted in positive user feedback, evidence of high engagement for both the parents and children, and, importantly, evidence of clinically meaningful gains in socialization. A single session produces 90 seconds of enriched social video and sensor data, opening up an exciting opportunity for the gameplay itself to passively generate labeled computer vision libraries. These libraries enable the development of better models with higher diagnostic precision going forward.
Our proposed project will show that GW can (a) serve as a mobile therapy that can be used repeatedly by families to target core deficits of autism while inherently tracking progress during use, and (b) serve as a distributed system to crowdsource the acquisition of new labeled image libraries for AI models. These models can automatically classify diagnostic features relevant to autism and extend to other sectors of mental health and even beyond.
Funding Goals
To meet a growing need for investigators trained in biomedical informatics research and data science by training qualified pre- and post-doctoral candidates; to conduct research in biomedical informatics, bioinformatics, and related computer, information, and data sciences; to facilitate management of electronic health records and clinical research data; to prepare scholarly works in biomedicine and health; to advance biocomputing and bioinformatics through participation in federal initiatives relating to biomedical informatics, bioinformatics, and biomedical computing; and to stimulate and foster scientific and technological innovation through cooperative research and development carried out between small business concerns and research institutions, through Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) grants.
Grant Program (CFDA)
Awarding / Funding Agency
Place of Performance
Stanford
California
94305
United States
Geographic Scope
Single Zip Code
Related Opportunity
Analysis Notes
Amendment: Since the initial award, total obligations have increased 377%, from $667,434 to $3,183,539.
The Leland Stanford Junior University was awarded Project Grant R01LM013364, "Mobile Game for Autism Healthcare: AI-Powered Deep Learning Solution," worth $3,183,539, from the National Library of Medicine in July 2021, with work to be completed primarily in Stanford, California, United States. The grant has a duration of 4 years 8 months and was awarded through assistance program 93.879, Medical Library Assistance. The Project Grant was awarded through grant opportunity Research Project Grant (Parent R01 Clinical Trial Not Allowed).
Status
(Ongoing)
Last Modified 7/3/25
Period of Performance
7/2/21
Start Date
3/31/26
End Date
Funding Split
$3.2M
Federal Obligation
$0.0
Non-Federal Obligation
$3.2M
Total Obligated
Activity Timeline
Transaction History
Modifications to R01LM013364
Additional Detail
Award ID FAIN
R01LM013364
SAI Number
R01LM013364-1951318948
Award ID URI
SAI UNAVAILABLE
Awardee Classifications
Private Institution Of Higher Education
Awarding Office
75NL00 NIH National Library of Medicine
Funding Office
75NL00 NIH National Library of Medicine
Awardee UEI
HJD6G4D6TJY5
Awardee CAGE
1KN27
Performance District
CA-16
Senators
Dianne Feinstein
Alejandro Padilla
Budget Funding
| Federal Account | Budget Subfunction | Object Class | Total | Percentage |
|---|---|---|---|---|
| National Library of Medicine, National Institutes of Health, Health and Human Services (075-0807) | Health research and training | Grants, subsidies, and contributions (41.0) | $1,289,298 | 100% |