Monitoring Occupational Sitting, Standing, and Stepping in Office Employees With the W@W-App and the MetaWearC Sensor: Validation Study

10.2196/15338 ◽  
2020 ◽  
Vol 8 (8) ◽  
pp. e15338 ◽  
Author(s):  
Judit Bort-Roig ◽  
Emilia Chirveches-Pérez ◽  
Francesc Garcia-Cuyàs ◽  
Kieran P Dowd ◽  
Anna Puig-Ribera

Background: Replacing occupational sitting time with active tasks has several proposed health benefits for office employees. Mobile phones and motion sensors can provide objective information in real time on occupational sitting behavior. However, the validity and feasibility of using mobile health (mHealth) devices to quantify and modify occupational sedentary time are unclear. Objective: The aim of this study was to validate the new Walk@Work-Application (W@W-App), including an external motion sensor (MetaWearC) attached to the thigh, for measuring occupational sitting, standing, and stepping in free-living conditions against the activPAL3M, the current gold-standard device-based measure for postural behaviors. Methods: In total, 20 office workers (16 [80%] female; mean age 39.5, SD 8.1 years) downloaded the W@W-App to their mobile phones, wore a MetaWearC sensor attached to the thigh with a tailored band, and wore the activPAL3M for 3-8 consecutive working hours. Differences between the two measures were examined using paired-samples t tests and Wilcoxon signed-rank tests. Agreement between measures was examined using concordance correlation coefficients (CCCs) with 95% CIs, Bland-Altman plots (mean bias, 95% limits of agreement [LoA]), and equivalence testing techniques. Results: The median recording time was 237.5 (SD 132.8) minutes for the W@W-App+MetaWearC and 240.0 (SD 127.5) minutes for the activPAL3M (P<.001). No significant differences in sitting (P=.53), standing (P=.12), or stepping time (P=.61) were identified. The CCC identified substantial agreement between the two measures for sitting (CCC=0.98, 95% CI 0.96-0.99), moderate agreement for standing (CCC=0.93, 95% CI 0.81-0.97), and poor agreement for stepping (CCC=0.74, 95% CI 0.47-0.88). Bland-Altman plots indicated that sitting time (mean bias –1.66 minutes, 95% LoA –30.37 to 20.05) and standing time (mean bias –4.85 minutes, 95% LoA –31.31 to 21.62) were underreported. For stepping time, a positive mean bias of 1.15 minutes (95% LoA –15.11 to 17.41) was identified. Equivalence testing demonstrated that the estimates obtained from the W@W-App+MetaWearC and the activPAL3M were equivalent for all variables except stepping time. Conclusions: The W@W-App+MetaWearC is a low-cost tool with acceptable accuracy that can objectively quantify occupational sitting, standing, stationary, and upright time in real time. Because it provides real-time feedback to users, this tool can positively influence occupational sitting behaviors in future interventions. Trial Registration: ClinicalTrials.gov NCT04092738; https://clinicaltrials.gov/ct2/show/NCT04092738
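The agreement statistics reported above (Lin's CCC, Bland-Altman mean bias with 95% limits of agreement) are standard formulas that can be computed from paired minute totals. A minimal Python sketch of those formulas, for illustration only; this is not the authors' analysis code, and the variable names are assumptions:

```python
from statistics import mean, pvariance, stdev

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient (CCC) between two
    paired measurement series (e.g., app minutes vs activPAL minutes)."""
    mx, my = mean(x), mean(y)
    cov = mean((a - mx) * (b - my) for a, b in zip(x, y))
    return 2 * cov / (pvariance(x) + pvariance(y) + (mx - my) ** 2)

def bland_altman(x, y):
    """Mean bias and 95% limits of agreement for paired measures."""
    d = [a - b for a, b in zip(x, y)]
    bias = mean(d)
    spread = 1.96 * stdev(d)          # sample SD of the differences
    return bias, bias - spread, bias + spread
```

A negative mean bias, as reported for sitting and standing time, means the first measure (here the app) records fewer minutes on average than the reference device.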

2019 ◽  
Author(s):  
Judit Bort-Roig ◽  
Emilia Chirveches-Pérez ◽  
Francesc Garcia-Cuyàs ◽  
Kieran P Dowd ◽  
Anna Puig-Ribera



2014 ◽  
Vol 11 (7) ◽  
pp. 1318-1323 ◽  
Author(s):  
Gemma Cathrine Ryde ◽  
Helen Elizabeth Brown ◽  
Nicholas David Gilson ◽  
Wendy J. Brown

Background: Prolonged occupational sitting is related to poor health outcomes. Detailed data on sitting time at desks are required to understand and effectively influence occupational sitting habits. Methods: Full-time office employees were recruited (n = 105; mean age 40.9 ± 11.5 years; BMI 26.1 ± 3.9; 65% women). Sitting at the desk and in other work contexts was measured using a sitting pad and the activPAL for an entire working week. Employees used a diary to record work hours. Time spent at work, sitting at work, and sitting at the desk; the number of sit-to-stand transitions at the desk; and the number of bouts of continuous desk sitting < 20 and > 60 minutes were calculated. Results: Average time spent at work was 8.7 ± 0.8 hours/day, with 67% spent sitting at the desk (5.8 ± 1.2 hours/day) and 4% in other workplace settings. On average, employees got up from their desks 3 times/hour (29 ± 13/day). Sitting for more than 60 consecutive minutes occurred infrequently (0.69 ± 0.62 times/day), with most sit-to-stands (80%; 23 ± 14) occurring before 20 minutes of continual sitting. Conclusion: The findings provide highly detailed insights into desk-based sitting habits, highlighting the large proportion of time spent sitting at desks, but with frequent interruptions.
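The desk-sitting metrics described above (total sitting time, sit-to-stand transitions, and bouts shorter than 20 or longer than 60 minutes) can be derived from a chronological posture log. A hedged sketch, assuming consecutive same-posture records have already been merged into bouts; the event format and names are illustrative, not from the study:

```python
def desk_sitting_summary(events):
    """Summarise a working day given chronological (posture, minutes)
    bouts, with posture one of "sit", "stand", or "step"."""
    sit_minutes = 0.0
    sit_to_stands = 0        # sitting bouts ending in an upright posture
    bouts_lt20 = bouts_gt60 = 0
    prev = None
    for posture, minutes in events:
        if posture == "sit":
            sit_minutes += minutes
            if minutes < 20:
                bouts_lt20 += 1
            elif minutes > 60:
                bouts_gt60 += 1
        elif prev == "sit":
            sit_to_stands += 1
        prev = posture
    return {"sit_minutes": sit_minutes, "sit_to_stands": sit_to_stands,
            "bouts_lt20": bouts_lt20, "bouts_gt60": bouts_gt60}

# Example day: 15 min sitting, 2 min standing, a 70 min sitting bout,
# 5 min stepping, then 10 min sitting.
day = [("sit", 15), ("stand", 2), ("sit", 70), ("step", 5), ("sit", 10)]
```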


Author(s):  
S. Karnouskos

An old saying from the telecom world states that nothing can really be considered a service unless you are able to charge for it. As we move toward a service-oriented society, the need to pay in real time for a variety of services via different channels anywhere, anytime, and in any currency increases. According to Gartner (www.gartner.com), worldwide mobile phone sales totaled 816.6 million units in 2005, a 21% increase from 2004. Due to their high penetration rates, mobile devices are an interesting candidate for real-time payment scenarios. Several efforts have already been made (Karnouskos, 2004), but as new technology comes aboard, new capabilities come along with it. Near Field Communication (NFC) is such a technology; thanks to broad industry support and its low cost (in comparison with similar technologies), it may become dominant in short-range communication among a variety of devices, including mobile phones. NFC is well equipped to facilitate mobile payments with little interference from the user's side.


2011 ◽  
Author(s):  
Christopher S. Walsh ◽  
Tom Power

Author(s):  
Gabriel de Almeida Souza ◽  
Larissa Barbosa ◽  
Glênio Ramalho ◽  
Alexandre Zuquete Guarato

2007 ◽  
Author(s):  
R. E. Crosbie ◽  
J. J. Zenor ◽  
R. Bednar ◽  
D. Word ◽  
N. G. Hingorani

2019 ◽  
Vol 2019 ◽  
pp. 1-14 ◽  
Author(s):  
Yong He ◽  
Hong Zeng ◽  
Yangyang Fan ◽  
Shuaisheng Ji ◽  
Jianjian Wu

In this paper, we propose an approach to detect oilseed rape pests based on deep learning, which improves the mean average precision (mAP) to 77.14%, an increase of 9.7% over the original model. We deployed this model on a mobile platform so that every farmer can use the program, which diagnoses pests in real time and provides suggestions on pest control. We designed an oilseed rape pest imaging database with 12 typical oilseed rape pests and compared the performance of five models; SSD w/Inception was chosen as the optimal model. Moreover, to raise the mAP, we used data augmentation (DA) and added a dropout layer. The experiments were performed on the Android application we developed, and the results show that our approach clearly surpasses the original model and is helpful for integrated pest management. Compared with past work, this application has improved environmental adaptability, response speed, and accuracy, and it has the advantages of low cost and simple operation, making it suitable for pest monitoring missions with drones and the Internet of Things (IoT).
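The mAP figure quoted above is the mean of per-class average precision (AP), each computed from a ranked list of detections against the class's ground-truth count. A minimal Python sketch of the standard all-point interpolated AP; it assumes detections have already been matched to ground truth by IoU, and the class names used in the example are illustrative, not from the paper's database:

```python
def average_precision(scored_hits, n_gt):
    """AP for one class. scored_hits: (confidence, is_true_positive)
    pairs; n_gt: number of ground-truth boxes for the class."""
    hits = sorted(scored_hits, key=lambda h: -h[0])  # rank by confidence
    tp = fp = 0
    recalls, precisions = [], []
    for _, is_tp in hits:
        tp += 1 if is_tp else 0
        fp += 0 if is_tp else 1
        recalls.append(tp / n_gt)
        precisions.append(tp / (tp + fp))
    # Interpolate: make precision monotonically non-increasing.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recalls, precisions):
        ap += (r - prev_r) * p        # area under the PR curve
        prev_r = r
    return ap

def mean_average_precision(per_class):
    """per_class: {class_name: (scored_hits, n_gt)} -> mAP."""
    aps = [average_precision(h, n) for h, n in per_class.values()]
    return sum(aps) / len(aps)
```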


2021 ◽  
Vol 11 (11) ◽  
pp. 4940
Author(s):  
Jinsoo Kim ◽  
Jeongho Cho

Research on video data has difficulty extracting not only spatial but also temporal features, and human action recognition (HAR) is a representative field that applies convolutional neural networks (CNNs) to video data. Action recognition performance has improved, but owing to model complexity, some limitations to real-time operation persist. Therefore, a lightweight CNN-based single-stream HAR model that can operate in real time is proposed. The proposed model extracts spatial feature maps by applying a CNN to the frames that compose the video and uses the frame change rate of sequential images as time information. Spatial feature maps are weighted-averaged by frame change, transformed into spatiotemporal features, and input into multilayer perceptrons, which have relatively lower complexity than other HAR models; thus, our method has high utility in a single embedded system connected to CCTV. Evaluating action recognition accuracy and data processing speed on the challenging UCF-101 action recognition benchmark showed higher accuracy than a HAR model using long short-term memory with a small number of video frames, and the fast data processing speed confirmed the possibility of real-time operation. In addition, the performance of the proposed weighted-mean-based HAR model was verified by testing it on a Jetson Nano to confirm its suitability for low-cost GPU-based embedded systems.
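The weighted-averaging step can be made concrete: each frame's spatial feature vector is weighted by its frame-change magnitude, and the weighted mean becomes the single spatiotemporal feature fed to the multilayer perceptron. A plain-Python sketch of the idea only, not the authors' implementation; the CNN feature extraction is assumed to have already produced the per-frame vectors:

```python
def spatiotemporal_feature(frame_features, frame_changes):
    """Weighted mean of per-frame feature vectors, weighted by the
    magnitude of change from the previous frame."""
    total = sum(frame_changes)
    if total == 0:                     # static clip: fall back to plain mean
        weights = [1.0 / len(frame_changes)] * len(frame_changes)
    else:
        weights = [c / total for c in frame_changes]
    dim = len(frame_features[0])
    return [sum(w * f[i] for w, f in zip(weights, frame_features))
            for i in range(dim)]
```

Frames with large inter-frame change (fast motion) dominate the pooled feature, which is how temporal information enters the model without a recurrent layer.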

