Fatigue Monitoring in Running Using Flexible Textile Wearable Sensors

Sensors ◽  
2020 ◽  
Vol 20 (19) ◽  
pp. 5573
Author(s):  
Mohsen Gholami ◽  
Christopher Napier ◽  
Astrid García Patiño ◽  
Tyler J. Cuthbert ◽  
Carlo Menon

Fatigue is a multifactorial and complex phenomenon that affects how individuals perform an activity. Fatigue during running causes changes in normal gait parameters and increases the risk of injury. To address this problem, wearable sensors have been proposed as an unobtrusive and portable system to measure changes in human movement as a result of fatigue. Recently, a category of wearable devices that has gained attention is flexible textile strain sensors because of their ability to be woven into garments to measure kinematics. This study uses flexible textile strain sensors to continuously monitor running kinematics and a machine learning approach to estimate the level of fatigue during running. Five female participants wore the sensor-instrumented garment while running to a state of fatigue. In addition to the kinematic data from the flexible textile strain sensors, the perceived level of exertion was monitored for each participant as an indication of their actual fatigue level. A stacked random forest machine learning model was used to estimate the perceived exertion levels from the kinematic data. The machine learning algorithm achieved a root mean squared error of 0.06 and a coefficient of determination of 0.96 in participant-specific scenarios. This study highlights the potential of flexible textile strain sensors to objectively estimate the level of fatigue during running by detecting slight perturbations in lower extremity kinematics. Future iterations of this technology may lead to real-time biofeedback applications that could reduce the risk of running-related overuse injuries.
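
The abstract does not detail the stacking configuration, so the following is only a minimal scikit-learn sketch of a stacked random-forest regressor, assuming windowed strain-sensor features as inputs and a normalized perceived-exertion score as the target; the data, feature count, and meta-learner are placeholders, not the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): participant-specific regression of a
# normalized perceived exertion score (0-1) from windowed textile strain-sensor features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 24))          # hypothetical feature matrix: rows = time windows
y = np.clip(rng.random(500), 0, 1)      # hypothetical normalized exertion targets

# "Stacked" random forests: two forests whose out-of-fold predictions feed a ridge meta-learner.
stack = StackingRegressor(
    estimators=[
        ("rf_shallow", RandomForestRegressor(n_estimators=200, max_depth=5, random_state=0)),
        ("rf_deep", RandomForestRegressor(n_estimators=200, random_state=1)),
    ],
    final_estimator=Ridge(),
)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("R^2 :", r2_score(y_te, pred))
```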

Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 300
Author(s):  
Mark Lokanan ◽  
Susan Liu

Protecting financial consumers from investment fraud has been a recurring problem in Canada. The purpose of this paper is to predict the demographic characteristics of investors who are likely to be victims of investment fraud. Data for this paper came from the Investment Industry Regulatory Organization of Canada's (IIROC) database between January 2009 and December 2019. In total, 4575 investors were coded as victims of investment fraud. The study employed a machine-learning algorithm to predict the probability of fraud victimization. The machine learning model deployed in this paper predicted the typical demographic profile of fraud victims as investors who are classified as female, have poor financial knowledge, know the advisor from the past, and are retired. Investors who are characterized as having limited financial literacy but a long-time relationship with their advisor have reduced probabilities of being victimized. However, male investors with low or moderate-level investment knowledge were more likely to be preyed upon by their investment advisors. While not statistically significant, older adults, in general, are at greater risk of being victimized. The findings from this paper can be used by Canadian self-regulatory organizations and securities commissions to inform their investor protection mandates.
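
As an illustration of this kind of victimization model (not the study's actual pipeline), the sketch below fits a logistic regression on a few hypothetical demographic fields and outputs victimization probabilities; the variable names and codings are assumptions.

```python
# Illustrative sketch only: predicting the probability of fraud victimization
# from hypothetical demographic fields. Real IIROC variables and coding differ.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

df = pd.DataFrame({
    "gender": ["F", "M", "F", "M"],
    "financial_knowledge": ["poor", "moderate", "poor", "low"],
    "knows_advisor": ["yes", "no", "yes", "yes"],
    "retired": ["yes", "no", "yes", "no"],
    "victim": [1, 0, 1, 0],
})

features = ["gender", "financial_knowledge", "knows_advisor", "retired"]
model = Pipeline([
    ("encode", ColumnTransformer([("cat", OneHotEncoder(handle_unknown="ignore"), features)])),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(df[features], df["victim"])
print(model.predict_proba(df[features])[:, 1])   # estimated victimization probabilities
```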


2021 ◽  
Author(s):  
Aria Abubakar ◽  
Mandar Kulkarni ◽  
Anisha Kaul

Abstract In the process of deriving the reservoir petrophysical properties of a basin, identifying the pay capability of wells by interpreting various geological formations is key. Currently, this process is facilitated and preceded by well log correlation, in which petrophysicists and geologists examine multiple raw log measurements for the well in question, indicate geological markers of formation changes, and correlate them with those of neighboring wells. This activity of picking the markers of a well is performed manually, and the process of 'examining' can be highly subjective and thus prone to inconsistencies. In our work, we propose to automate the well correlation workflow by using a Soft-Attention Convolutional Neural Network to predict well markers. The machine learning algorithm is supervised by examples of manual marker picks and their corresponding occurrence in logs such as gamma ray, resistivity, and density. Our experiments have shown that the attention mechanism, specifically, allows the Convolutional Neural Network to focus on relevant features or patterns in the log measurements that suggest a change in formation, making the machine learning model highly precise.
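
The paper's exact architecture is not given here, so the following PyTorch sketch only illustrates the general idea of a soft-attention 1-D CNN over multi-curve log windows; the curve count, layer sizes, and per-depth output format are assumptions.

```python
# Conceptual sketch of a soft-attention 1-D CNN for marker picking (illustrative, not the
# authors' exact network). Inputs: a window of log curves (gamma ray, resistivity, density)
# per depth sample; output: a per-depth marker probability.
import torch
import torch.nn as nn

class SoftAttentionMarkerNet(nn.Module):
    def __init__(self, n_curves=3, hidden=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_curves, hidden, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3), nn.ReLU(),
        )
        # Soft attention: one weight per depth sample, normalized along the depth axis.
        self.attn = nn.Conv1d(hidden, 1, kernel_size=1)
        self.head = nn.Conv1d(hidden, 1, kernel_size=1)

    def forward(self, x):                    # x: (batch, n_curves, depth)
        h = self.conv(x)                     # (batch, hidden, depth)
        w = torch.softmax(self.attn(h), dim=-1)
        h = h * w                            # re-weight features by attention
        return torch.sigmoid(self.head(h)).squeeze(1)   # (batch, depth) marker probabilities

logs = torch.randn(2, 3, 512)                # two windows, three curves, 512 depth samples
print(SoftAttentionMarkerNet()(logs).shape)  # torch.Size([2, 512])
```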


2021 ◽  
Vol 2021 ◽  
pp. 1-16
Author(s):  
Mohammad Nahid Hossain ◽  
Mohammad Helal Uddin ◽  
K. Thapa ◽  
Md Abdullah Al Zubaer ◽  
Md Shafiqul Islam ◽  
...  

Cognitive impairment has a significantly negative impact on global healthcare and the community. Preserving cognition and mental retention among older adults becomes increasingly difficult with aging. Early detection of cognitive impairment can reduce the risk that an extended course of the disease progresses to permanent mental damage. This paper aims to develop a machine learning model to detect and differentiate cognitive impairment categories, such as severe, moderate, mild, and normal, by analyzing neurophysical and physical data. Keystroke dynamics and a smartwatch were used to extract each individual's neurophysical and physical data, respectively. An advanced ensemble learning algorithm, the Gradient Boosting Machine (GBM), is proposed to classify the cognitive severity level (absence, mild, moderate, and severe) based on Standardised Mini-Mental State Examination (SMMSE) questionnaire scores. Pearson's correlation and a wrapper feature selection technique were used to analyze and select the best features, and the proposed GBM algorithm was then trained on those features, achieving an accuracy of more than 94%. This paper adds a new dimension to the state of the art by predicting cognitive impairment from neurophysical and physical data together.
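
A minimal sketch of the described workflow, assuming a tabular feature matrix derived from keystroke and smartwatch data: a Pearson-correlation filter, a wrapper-style selector, and a gradient-boosting classifier for the four severity classes. Feature names, thresholds, and data are placeholders, not the paper's.

```python
# Minimal sketch (not the paper's exact pipeline): correlation filter + wrapper selection
# feeding a gradient-boosting classifier for four severity classes.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 12)),
                 columns=[f"feat_{i}" for i in range(12)])   # hypothetical keystroke/smartwatch features
y = rng.integers(0, 4, size=200)                             # 0=absent, 1=mild, 2=moderate, 3=severe

# 1) Filter: drop features that are almost uncorrelated with the SMMSE-based label.
corr = X.apply(lambda col: np.corrcoef(col, y)[0, 1]).abs()
kept = corr[corr > 0.01].index.tolist()

# 2) Wrapper: recursive feature elimination built around the boosting model itself.
gbm = GradientBoostingClassifier(random_state=0)
selector = RFE(gbm, n_features_to_select=6).fit(X[kept], y)
selected = [c for c, keep in zip(kept, selector.support_) if keep]

print("selected:", selected)
print("CV accuracy:", cross_val_score(gbm, X[selected], y, cv=5).mean())
```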


2017 ◽  
Author(s):  
Aymen A. Elfiky ◽  
Maximilian J. Pany ◽  
Ravi B. Parikh ◽  
Ziad Obermeyer

Abstract. Background: Cancer patients who die soon after starting chemotherapy incur costs of treatment without benefits. Accurately predicting mortality risk from chemotherapy is important, but few patient data-driven tools exist. We sought to create and validate a machine learning model predicting mortality for patients starting new chemotherapy. Methods: We obtained electronic health records for patients treated at a large cancer center (26,946 patients; 51,774 new regimens) over 2004-14, linked to Social Security data for date of death. The model was derived using 2004-11 data, and performance was measured on non-overlapping 2012-14 data. Findings: 30-day mortality from chemotherapy start was 2.1%. Common cancers included breast (21.1%), colorectal (19.3%), and lung (18.0%). Model predictions were accurate for all patients (AUC 0.94). Predictions for patients starting palliative chemotherapy (46.6% of regimens), for whom prognosis is particularly important, remained highly accurate (AUC 0.92). To illustrate model discrimination, we ranked patients initiating palliative chemotherapy by model-predicted mortality risk and calculated observed mortality by risk decile. 30-day mortality in the highest-risk decile was 22.6%; in the lowest-risk decile, no patients died. Predictions remained accurate across all primary cancers, stages, and chemotherapies, even for clinical trial regimens that first appeared in years after the model was trained (AUC 0.94). The model also performed well for prediction of 180-day mortality (AUC 0.87; mortality 74.8% in the highest-risk decile vs. 0.2% in the lowest). Predictions were more accurate than data from randomized trials of individual chemotherapies, or SEER estimates. Interpretation: A machine learning algorithm accurately predicted short-term mortality in patients starting chemotherapy using EHR data. Further research is necessary to determine generalizability and the feasibility of applying this algorithm in clinical settings.
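
The evaluation described (AUC plus observed mortality by predicted-risk decile) can be reproduced on any risk model; the sketch below shows the mechanics on synthetic data and is not the study's EHR-derived model.

```python
# Illustrative sketch of AUC plus observed mortality by predicted-risk decile;
# the data and model here are placeholders, not the study's EHR model.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))                                    # placeholder EHR-derived features
y = (rng.random(5000) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)    # synthetic 30-day death labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
risk = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

print("AUC:", roc_auc_score(y_te, risk))
deciles = pd.qcut(risk, 10, labels=False, duplicates="drop")
print(pd.Series(y_te).groupby(deciles).mean())                     # observed mortality per risk decile
```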


Author(s):  
Sachin Dev Suresh ◽  
Ali Qasim ◽  
Bhajan Lal ◽  
Syed Muhammad Imran ◽  
Khor Siak Foo

The production of oil and natural gas contributes a significant amount of revenue in Malaysia, thereby strengthening the country's economy. The flow assurance industry faces impediments to the smooth operation of transmission pipelines, of which gas hydrate formation is the most important: it disrupts normal pipeline operation by plugging the line. Under high-pressure and low-temperature conditions, gas hydrates are crystalline structures consisting of a network of hydrogen bonds between host molecules of water and guest molecules of the incoming gases. Industry uses different types of chemical inhibitors in pipelines to suppress hydrate formation. To overcome this problem, a machine learning algorithm has been introduced as part of risk management strategies. The objective of this paper is to utilize a Machine Learning (ML) model, Gaussian Process Regression (GPR), as a new approach to mitigating the growth of gas hydrates. The input parameters are the concentration and pressure of Carbon Dioxide (CO2) and Methane (CH4) gas hydrate systems, whereas the output parameter is the Average Depression Temperature (ADT). The parameter values are taken from available data sets, enabling GPR predictions to be assessed in terms of the Coefficient of Determination (R2) and Mean Squared Error (MSE). The results showed that the GPR model provided the highest R2 values for training and testing data, at 97.25% and 96.71%, respectively. The MSE values for GPR were also the lowest for training and testing data, at 0.019 and 0.023, respectively.
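
A hedged sketch of the modelling step, assuming the two inputs per sample are a hydrate-inhibitor concentration and a system pressure and the target is ADT; the kernel choice, units, and synthetic data below are assumptions rather than the paper's configuration.

```python
# Hedged sketch (not the paper's tuned model): Gaussian Process Regression mapping
# an assumed inhibitor concentration and pressure to Average Depression Temperature (ADT).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(0.1, 5.0, 300),     # hypothetical concentration (wt%)
                     rng.uniform(2.0, 10.0, 300)])   # hypothetical pressure (MPa)
y = 0.8 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.1, 300)   # synthetic ADT values (K)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True, random_state=0)
gpr.fit(X_tr, y_tr)
pred = gpr.predict(X_te)
print("R2 :", r2_score(y_te, pred))
print("MSE:", mean_squared_error(y_te, pred))
```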


2021 ◽  
Author(s):  
Gábor Csizmadia ◽  
Krisztina Liszkai-Peres ◽  
Bence Ferdinandy ◽  
Ádám Miklósi ◽  
Veronika Konok

Abstract Human activity recognition (HAR) using machine learning (ML) methods is a relatively new approach for collecting and analyzing large amounts of human behavioral data using special wearable sensors. Our main goal was to find a reliable method that could automatically detect various playful and daily routine activities in children. We defined 40 activities for ML recognition, and we collected activity motion data by means of wearable smartwatches running special SensKid software. We analyzed the data of 34 children (19 girls, 15 boys; age range: 6.59-8.38 years; median age = 7.47 years). All children were typically developing first graders from three elementary schools. The activity recognition was a binary classification task which was evaluated with a Light Gradient Boosted Machine (LGBM) learning algorithm, a decision-tree-based method, with 3-fold cross-validation. We used the sliding window technique during signal processing, and we aimed to find the best window size for the analysis of each behavior element to achieve the most effective settings. Seventeen activities out of 40 were successfully recognized with AUC values above 0.8. The window size had no significant effect. The overall accuracy was 0.95, which is in the top segment of previously published HAR results. In summary, the LGBM is a very promising solution for HAR. In line with previous findings, our results provide a firm basis for a more precise and effective recognition system that can make human behavioral analysis faster and more objective.
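
As a rough sketch of the described pipeline (not the study's code), the snippet below extracts simple statistics from sliding windows of a placeholder tri-axial smartwatch stream and scores a LightGBM binary classifier with 3-fold cross-validated AUC; the window length, step, and features are assumptions.

```python
# Minimal sliding-window HAR sketch under stated assumptions: raw tri-axial signals,
# one target activity vs. the rest, simple per-window statistics as features.
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
signal = rng.normal(size=(60_000, 3))          # placeholder accelerometer stream (x, y, z)
labels = rng.integers(0, 2, size=60_000)       # placeholder per-sample activity labels

def windowed_features(sig, lab, win=200, step=100):
    X, y = [], []
    for start in range(0, len(sig) - win, step):
        w = sig[start:start + win]
        X.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                 w.min(axis=0), w.max(axis=0)]))
        y.append(int(lab[start:start + win].mean() > 0.5))   # majority label of the window
    return np.array(X), np.array(y)

X, y = windowed_features(signal, labels)
clf = LGBMClassifier(n_estimators=300, random_state=0)
print("3-fold AUC:", cross_val_score(clf, X, y, cv=3, scoring="roc_auc").mean())
```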


Sensors ◽  
2020 ◽  
Vol 20 (6) ◽  
pp. 1557 ◽  
Author(s):  
Ilaria Conforti ◽  
Ilaria Mileti ◽  
Zaccaria Del Prete ◽  
Eduardo Palermo

Ergonomics evaluation through measurements of biomechanical parameters in real time has great potential for reducing non-fatal occupational injuries, such as work-related musculoskeletal disorders. Assuming a correct posture helps avoid high stress on the back and on the lower extremities, while an incorrect posture increases spinal stress. Here, we propose a solution for the recognition of postural patterns through wearable sensors and machine-learning algorithms fed with kinematic data. Twenty-six healthy subjects equipped with eight wireless inertial measurement units (IMUs) performed manual material handling tasks, such as lifting and releasing small loads, with two postural patterns: correct and incorrect. Kinematic parameters, such as the range of motion of the lower limb and lumbosacral joints, along with the displacement of the trunk with respect to the pelvis, were estimated from IMU measurements through a biomechanical model. Statistical differences were found for all kinematic parameters between the correct and the incorrect postures (p < 0.01). Moreover, as the weight of the load in the lifting task increased, changes in hip and trunk kinematics were observed (p < 0.01). To automatically identify the two postures, a supervised machine-learning algorithm, a support vector machine, was trained, and an accuracy of 99.4% (specificity of 100%) was reached by using the measurements of all kinematic parameters as features. Meanwhile, an accuracy of 76.9% (specificity of 76.9%) was reached by using only the kinematic parameters related to the trunk segment.
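
A minimal sketch of the classification step, assuming each sample is a vector of IMU-derived kinematic features (ranges of motion, trunk displacement) labelled correct or incorrect; the kernel, feature count, and data are placeholders.

```python
# Illustrative sketch (not the authors' exact feature set): an SVM separating correct
# from incorrect lifting postures using IMU-derived kinematic features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical features: hip/knee/ankle/lumbosacral ranges of motion, trunk displacement.
X = rng.normal(size=(260, 9))
y = rng.integers(0, 2, size=260)               # 1 = correct posture, 0 = incorrect posture

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(svm, X, y, cv=5).mean())
```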


Sensors ◽  
2020 ◽  
Vol 20 (15) ◽  
pp. 4299 ◽  
Author(s):  
Eui Jung Moon ◽  
Youngsik Kim ◽  
Yu Xu ◽  
Yeul Na ◽  
Amato J. Giaccia ◽  
...  

There has been strong demand for the development of an accurate but simple method to assess the freshness of food. In this study, we demonstrated a system to determine food freshness by analyzing the spectral response from a portable visible/near-infrared (VIS/NIR) spectrometer with a Convolutional Neural Network (CNN)-based machine learning algorithm. Spectral response data from salmon, tuna, and beef incubated at 25 °C were obtained every minute for 30 h and then categorized into three states of “fresh”, “likely spoiled”, and “spoiled” based on time and pH. Using the obtained spectral data, a CNN-based machine learning algorithm was built to evaluate the freshness of the experimental objects. In addition, the shift-invariant property of the CNN can minimize the effect of variation caused by using multiple devices in a real environment. The accuracy of the obtained machine learning model in predicting freshness from the spectral data was approximately 85% for salmon, 88% for tuna, and 92% for beef. Therefore, our study demonstrates the practicality of a portable spectrometer for food freshness assessment.
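
A conceptual sketch of a small 1-D CNN over a VIS/NIR spectrum with three output classes; the number of wavelength bins and the layer sizes are illustrative assumptions, not the study's trained network.

```python
# Conceptual sketch of a 1-D CNN over VIS/NIR spectra with three output classes
# ("fresh", "likely spoiled", "spoiled"); layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class SpectralCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.AdaptiveAvgPool1d(8),
        )
        self.classifier = nn.Linear(32 * 8, n_classes)

    def forward(self, x):                      # x: (batch, 1, n_wavelengths)
        h = self.features(x).flatten(1)
        return self.classifier(h)              # class logits

spectra = torch.randn(4, 1, 288)               # four spectra, 288 hypothetical wavelength bins
print(SpectralCNN()(spectra).shape)            # torch.Size([4, 3])
```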


Sensors ◽  
2020 ◽  
Vol 20 (22) ◽  
pp. 6413
Author(s):  
Victor A. Convertino ◽  
Steven G. Schauer ◽  
Erik K. Weitzel ◽  
Sylvain Cardin ◽  
Mark E. Stackle ◽  
...  

Vital signs have historically served as the primary method to triage patients and resources for trauma and emergency care, but they have failed to provide clinically meaningful predictive information about patient clinical status. In this review, a framework is presented that focuses on potential wearable sensor technologies that can harness the necessary electronic physiological signal integration with a current state-of-the-art predictive machine-learning algorithm, providing early clinical assessment of hypovolemia status to impact patient outcome. The ability to study the physiology of hemorrhage using a human model of progressive central hypovolemia led to the development of a novel machine-learning algorithm known as the compensatory reserve measurement (CRM). The CRM has demonstrated greater sensitivity, specificity, and diagnostic accuracy in detecting hemorrhage and the onset of decompensated shock than all standard vital signs and hemodynamic variables. The development of the CRM revealed that continuous measurements of changes in arterial waveform features represent the most integrated signal of physiological compensation for conditions of reduced systemic oxygen delivery. In this review, sensor technologies that include photoplethysmography, tonometry, ultrasound-based blood pressure, and cardiogenic vibration are analyzed in detail and identified as potential candidates for harnessing the arterial waveform analog features required for real-time calculation of the CRM. The integration of wearable sensors with the CRM algorithm provides a potentially powerful medical monitoring advancement to save civilian and military lives in emergency medical settings.
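
The CRM implementation itself is not described here; purely as an illustration of the kind of arterial-waveform feature extraction the review discusses, the sketch below detects beats in a synthetic PPG-like signal and builds simple per-beat features that a trained model could map to a compensatory reserve estimate.

```python
# Purely illustrative sketch: simple beat-level waveform features from a synthetic
# PPG-like signal. This is NOT the CRM algorithm, whose implementation is not given here.
import numpy as np
from scipy.signal import find_peaks

fs = 100                                              # sampling rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

peaks, _ = find_peaks(ppg, distance=fs * 0.5)         # systolic peaks, at least 0.5 s apart
beat_interval = np.diff(peaks) / fs                   # beat-to-beat interval (s)
peak_amplitude = ppg[peaks]                           # pulse amplitude at each peak

# Per-beat feature vectors (interval, amplitude) of the kind a trained model could map
# to a 0-1 compensatory reserve estimate.
features = np.column_stack([beat_interval, peak_amplitude[1:]])
print(features[:5])
```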


Author(s):  
Rahayu Abdul Rahman ◽  
Suraya Masrom ◽  
Nor Balkish Zakaria ◽  
Sunarti Halid

The external auditor is one of the governance mechanisms for mitigating corporate managerial misconduct and thereby enhancing the credibility of accounting information. Thus, the main objective of this study is to develop a machine learning model that predicts a firm's auditor choice, which signals the quality of its auditing and financial reporting processes. This paper presents fundamental knowledge on the design and implementation of machine learning models based on four selected algorithms, tested on a real dataset of 2,262 firm-year observations of companies listed on the Malaysian stock exchange from 2000 to 2007. The performance of each machine learning algorithm on the auditor choice dataset was observed for three groups of features, namely firm characteristics, governance, and ownership. The findings indicate that the machine learning models achieve better accuracy with the ownership feature group, mainly with the Naïve Bayes algorithm. Keywords: Auditor Choice, Machine Learning, Prediction
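
A hedged sketch of the comparison across feature groups, using Naive Bayes on synthetic data; the group compositions, auditor-choice coding, and sample construction are assumptions, not the study's dataset.

```python
# Hedged sketch: comparing Naive Bayes accuracy across three hypothetical feature groups.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2262
groups = {
    "firm_characteristics": rng.normal(size=(n, 4)),   # e.g., size, leverage, ROA (assumed)
    "governance": rng.normal(size=(n, 3)),             # e.g., board size, independence (assumed)
    "ownership": rng.normal(size=(n, 3)),               # e.g., family, institutional stakes (assumed)
}
y = rng.integers(0, 2, size=n)                           # binary auditor-choice label (assumed coding)

for name, X in groups.items():
    acc = cross_val_score(GaussianNB(), X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```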

