*Eugene Agichtein, Ph.D., Elizabeth Buffalo, Dmitry Lagun, Alan I. Levey, Cecelia Manzanares, JongHo Shin, Stuart Zola
Emory University
Oral Presentation – Research Track
Saturday, Sept 29, 2012: 11:50 AM – 12:10 PM – LK130

*Presenting Speaker

(Research in Progress)

Background
Alzheimer’s disease (AD) affects over 5.4 million Americans, and this number is expected to grow significantly in the coming decade. A critical goal of AD research is to improve methods of early diagnosis and enable early intervention. Recently, the Visual Paired Comparison (VPC) task has shown the potential to predict cognitive impairment up to three years prior to clinical diagnosis. VPC is a recognition memory task that measures the proportion of time an individual spends viewing a novel picture compared to a recently seen (familiar) picture. Because they remember the familiar picture, cognitively normal individuals spend more time viewing the novel picture; individuals at high risk for developing AD do not show this preference. However, clinical application of VPC is limited by the need for expensive eye-tracking equipment, a dedicated testing facility, and on-site trained personnel.
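
As a concrete illustration of the novelty-preference measure, the sketch below computes it from per-image viewing times. This is a minimal sketch; the function and variable names are ours rather than part of any published VPC protocol.

def novelty_preference(time_on_novel: float, time_on_familiar: float) -> float:
    """Fraction of total viewing time spent on the novel image.

    A value near 0.5 means no preference (chance level); intact
    recognition memory typically produces values well above 0.5.
    """
    total = time_on_novel + time_on_familiar
    if total <= 0:
        raise ValueError("no viewing time recorded")
    return time_on_novel / total

# Example: 6.5 s on the novel image and 3.5 s on the familiar image
# yields a 65% novelty preference.
assert abs(novelty_preference(6.5, 3.5) - 0.65) < 1e-9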

Objective
We aim to develop VPC-Web, an automated version of the VPC test that would enable cognitive assessment of patients and research subjects on any computer with Internet access. VPC-Web employs a specialized user interface to track a patient’s examination of the images without eye-tracking equipment or specialized personnel, together with machine learning techniques to accurately identify the memory status of subjects by mining their image-viewing behavior. These tools could substantially streamline both clinical translational research and the methods currently used for diagnosing cognitive deficits.

Methods
The present study was designed to establish the feasibility of VPC-Web in a group of elderly subjects with and without memory loss. VPC-Web can be administered either in the lab or over the Internet. Subjects are presented with blurred pictures and are instructed to move an oval “viewport” with a computer mouse or trackpad to reveal part of the picture in sharp detail, simulating foveal vision. Initial testing and validation of VPC-Web was performed with over 50 presumed-normal subjects recruited over the Internet; these data were used to optimize the VPC-Web interface. Subsequently, VPC-Web was administered to 12 elderly individuals diagnosed with memory impairment and 11 healthy, age-matched controls. Both studies were approved by the Emory IRB. The image-examination data were analyzed for novelty preference and further mined with automated classification algorithms (e.g., Support Vector Machines).
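
As a rough sketch of the classification step, the snippet below fits a Support Vector Machine (via scikit-learn) to per-subject viewing features. The three features shown (novelty preference, mean viewport dwell time, cursor path length) and the synthetic data are illustrative assumptions; the abstract does not enumerate the behavioral features that were actually mined.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per subject with assumed features
# [novelty_preference, mean_dwell_time_s, cursor_path_length_px];
# 11 controls (label 0) and 12 memory-impaired subjects (label 1).
controls = rng.normal([0.65, 1.2, 7000.0], [0.05, 0.3, 900.0], (11, 3))
impaired = rng.normal([0.51, 1.6, 9000.0], [0.05, 0.3, 900.0], (12, 3))
X = np.vstack([controls, impaired])
y = np.array([0] * 11 + [1] * 12)

# Standardize the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict(X[:2]))  # predicted labels for the first two subjects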

Results
Twenty-one of 23 subjects (91%) successfully completed the test. In addition, performance on the task differed significantly between the two groups: 65% novelty preference for controls versus 51% for impaired subjects (p=0.035, Wilcoxon rank-sum test), similar to performance on the eye-tracking-based VPC task. In randomized cross-validation experiments, machine learning methods allowed VPC-Web to classify impaired vs. normal control subjects with an accuracy of 86%, a specificity of 91%, and a sensitivity of 80%. These levels are competitive with the best-known manually administered tests currently used in clinical practice.
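
Performance figures of this kind can be estimated with randomized (shuffle-split) cross-validation, pooling predictions on held-out subjects and reading accuracy, sensitivity, and specificity off the confusion matrix. The sketch below reuses the synthetic setup from the Methods snippet; the split size and repeat count are assumptions, not the study’s actual protocol.

import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import ShuffleSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Same synthetic features and labels as in the Methods sketch.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.65, 1.2, 7000.0], [0.05, 0.3, 900.0], (11, 3)),
               rng.normal([0.51, 1.6, 9000.0], [0.05, 0.3, 900.0], (12, 3))])
y = np.array([0] * 11 + [1] * 12)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Repeatedly hold out a random 30% of subjects and pool the
# held-out predictions across all splits.
cv = ShuffleSplit(n_splits=50, test_size=0.3, random_state=0)
y_true, y_pred = [], []
for train_idx, test_idx in cv.split(X):
    clf.fit(X[train_idx], y[train_idx])
    y_true.extend(y[test_idx])
    y_pred.extend(clf.predict(X[test_idx]))

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / len(y_true)
sensitivity = tp / (tp + fn)  # impaired subjects correctly flagged
specificity = tn / (tn + fp)  # controls correctly cleared
print(f"accuracy={accuracy:.2f} sensitivity={sensitivity:.2f} "
      f"specificity={specificity:.2f}")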

Conclusions
These preliminary data demonstrate that VPC-Web can distinguish normal from memory-impaired subjects, providing an accessible, inexpensive way to detect a prodromal phase of Alzheimer’s disease. VPC-Web has the potential to dramatically increase the number of people who can be assessed for cognitive disorders and to significantly improve our ability to identify those at risk. Current and future research plans include improving the usability of the test, refining the detection algorithms, and performing a large-scale longitudinal study.
