Parkinson’s Disease Diagnosis and Severity Assessment Using Ground Reaction Forces and Neural Networks

2020 ◽  
Vol 11 ◽  
Author(s):  
Srivardhini Veeraragavan ◽  
Alpha Agape Gopalai ◽  
Darwin Gouwanda ◽  
Siti Anom Ahmad
2011 ◽  
Vol 106 (2) ◽  
pp. 915-924 ◽  
Author(s):  
Mark W. Rogers ◽  
Robert Kennedy ◽  
Sonia Palmer ◽  
Monika Pawar ◽  
Maggie Reising ◽  
...  

People with Parkinson's disease (PD) frequently have difficulties generating anticipatory postural adjustments (APAs) for forward propulsion and lateral weight transfer when initiating gait. This impairment has been attributed to deficits in motor planning and preparation. This study examined the preparation of APAs prior to an imperative cue to initiate forward stepping. A startling acoustic stimulus (SAS) was used to probe the state of preparation of the APA in eight PD (off medication) and seven matched control subjects. Subjects performed visually cued trials involving a pre-cue light instructing them to prepare to step, followed 3.5 s later by a go-cue light to rapidly initiate stepping. In random trials, a SAS (124 dB) was presented at −1,500, −1,000, −500, −250, −100, or 0 ms before the go-cue. Subjects also performed self-initiated steps. Ground reaction forces (GRFs), center of pressure (CoP) changes, and electromyographic (EMG) signals were recorded. The SAS triggered APAs in 94 ± 11% (PD) and 96 ± 8% (control) of trials, at latencies of 89 ± 4 ms (PD) and 97 ± 3 ms (control) earlier than Control trials. The temporal profile of APA preparation was similar between groups. However, peak EMG, GRF, and mediolateral CoP amplitudes were reduced in PD. SAS-evoked APAs at 0 ms matched Control trial APAs and were enhanced compared with self-initiated stepping. These results demonstrate that people with mild to moderate PD can plan and prepare the appropriate APA sequence prior to the expected cue to initiate gait; however, the prepared APAs are underscaled in magnitude.
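The APA latencies reported above are typically measured from a force-plate signal such as the mediolateral CoP trace. As a minimal sketch of how such an onset might be detected, the function below flags the first sample that deviates from a quiet-standing baseline by more than a fixed number of standard deviations. The threshold rule, baseline window, and sampling rate are assumptions for illustration, not the exact criterion used in this study.

```python
import numpy as np

def apa_onset_ms(cop_ml, fs, baseline_ms=500, k=3.0):
    """Return the onset time (ms) of a mediolateral CoP shift.

    Onset is defined here as the first sample whose deviation from the
    baseline mean exceeds k baseline standard deviations -- a common
    threshold rule, assumed for illustration.

    cop_ml      -- 1-D array of mediolateral CoP positions
    fs          -- sampling rate in Hz
    baseline_ms -- duration of the initial quiet-standing window
    """
    n_base = int(baseline_ms * fs / 1000)
    baseline = cop_ml[:n_base]
    mu, sd = baseline.mean(), baseline.std()
    # First index where the deviation crosses the threshold.
    exceeds = np.abs(cop_ml - mu) > k * sd
    idx = int(np.argmax(exceeds))
    if not exceeds[idx]:
        return None  # no threshold crossing anywhere in the trace
    return idx * 1000.0 / fs
```

For example, a synthetic 2 s trace sampled at 1 kHz with a lateral shift beginning at 800 ms yields an onset estimate of 800.0 ms.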


2017 ◽  
Vol 50 ◽  
pp. 75-82 ◽  
Author(s):  
Moataz Eltoukhy ◽  
Christopher Kuenze ◽  
Michael S. Andersen ◽  
Jeonghoon Oh ◽  
Joseph Signorile

2018 ◽  
Vol 28 (10) ◽  
pp. 1850035 ◽  
Author(s):  
Francisco J. Martinez-Murcia ◽  
Juan M. Górriz ◽  
Javier Ramírez ◽  
Andres Ortiz

Spatial and intensity normalizations are nowadays a prerequisite for neuroimaging analysis. Influenced by voxel-wise and other univariate comparisons, where these corrections are key, they are commonly applied to any type of analysis and imaging modality. Nuclear imaging modalities such as PET-FDG or FP-CIT SPECT, a common modality used in Parkinson’s disease diagnosis, are especially dependent on intensity normalization. However, these steps are computationally expensive and, furthermore, may introduce deformations in the images, altering the information contained in them. Convolutional neural networks (CNNs), for their part, introduce position invariance to pattern recognition and have been proven to classify objects regardless of their orientation, size, angle, etc. A question therefore arises: how well can CNNs account for spatial and intensity differences when analyzing nuclear brain imaging? Are spatial and intensity normalizations still needed? To answer this question, we trained four different CNN models based on well-established architectures, with and without different spatial and intensity normalization preprocessing steps. The results show that a sufficiently complex model, such as our three-dimensional version of AlexNet, can effectively account for spatial differences, achieving a diagnosis accuracy of 94.1% with an area under the ROC curve of 0.984. Visualization of the differences via saliency maps shows that these models correctly find patterns matching those reported in the literature, without applying any complex spatial normalization procedure. However, intensity normalization, and the particular scheme chosen, proves very influential in the results and accuracy of the trained model and must therefore be carefully accounted for.
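Since the abstract singles out the choice of intensity normalization scheme as decisive, a brief sketch of two common schemes may help. The functions below scale a 3-D volume either to its maximum intensity or to the mean of a reference region; both are generic, widely used approaches, not necessarily the exact preprocessings compared in the paper, and the reference mask is a hypothetical input.

```python
import numpy as np

def normalize_to_max(vol):
    """Scale intensities so the volume maximum equals 1.

    A simple global scheme; sensitive to single bright voxels.
    """
    return vol / vol.max()

def normalize_to_reference(vol, ref_mask):
    """Divide by the mean intensity inside a reference region.

    With a non-specific-binding region as reference (an assumption
    here), this yields binding-ratio-like values, a scheme often used
    for FP-CIT SPECT.

    vol      -- 3-D intensity array
    ref_mask -- boolean array of the same shape selecting the region
    """
    return vol / vol[ref_mask].mean()
```

The two schemes rescale the same volume quite differently, which is one way the "type" of intensity normalization can shift a downstream classifier's inputs and accuracy.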

