Prediction of Seoul House Price Index Using Deep Learning Algorithms with Multivariate Time Series Data

2018 ◽  
Vol 8 (2) ◽  
pp. 39-56 ◽  
Author(s):  
Tae Hyeong Lee ◽  
Myung-Jin Jun
2021 ◽  
Vol 13 (3) ◽  
pp. 67
Author(s):  
Eric Hitimana ◽  
Gaurav Bajpai ◽  
Richard Musabe ◽  
Louis Sibomana ◽  
Jayavel Kayalvizhi

Many countries worldwide face challenges in implementing building-level prevention measures against fire disasters. The most critical issues are the localization, identification, and detection of room occupants. The Internet of Things (IoT), combined with machine learning, has been shown to increase building smartness by providing real-time data acquisition through sensors and actuators for prediction mechanisms. This paper proposes the implementation of an IoT framework to capture indoor environmental parameters as occupancy multivariate time-series data. The Long Short-Term Memory (LSTM) deep learning algorithm is applied to infer the presence of human beings. An experiment is conducted in an office room using the multivariate time series as predictors in a regression forecasting problem. The results demonstrate that the developed system can acquire, process, and store environmental information. The collected information was applied to the LSTM algorithm and compared with other machine learning algorithms: Support Vector Machine, Naïve Bayes, and Multilayer Perceptron Feed-Forward Network. The outcomes, based on parametric calibrations, demonstrate that LSTM performs better in the context of the proposed application.
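As a rough illustration of the pipeline described, the sketch below builds a Keras LSTM regressor over windows of multivariate environmental sensor readings. The window size, sensor set, and hyperparameters are placeholder assumptions, not values from the paper.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

TIMESTEPS, N_FEATURES = 24, 4  # e.g. 24 readings x 4 sensors (temp, humidity, CO2, light)

model = Sequential([
    LSTM(64, input_shape=(TIMESTEPS, N_FEATURES)),
    Dense(1),                  # regression output: predicted occupancy level
])
model.compile(optimizer="adam", loss="mse")

# X: (samples, timesteps, features) windows of sensor readings; y: occupancy targets
X = np.random.rand(256, TIMESTEPS, N_FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```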


Author(s):  
Pradeep Lall ◽  
Tony Thomas ◽  
Ken Blecker

Abstract This study focuses on feature vector identification and Remaining Useful Life (RUL) estimation for SAC305 solder alloy PCBs of two different configurations under varying conditions of temperature and vibration. The feature vectors are identified using strain signals acquired from four symmetrical locations on the PCB at regular intervals during vibration. Two different types of experiments are employed to characterize the PCB's dynamic changes with varying temperature and acceleration levels. The strain signals acquired during each of these experiments are compared on both time- and frequency-domain characteristics. Different statistical and frequency-based techniques were used to identify strain signal variations with changes in environmental and loading conditions. Feature vectors for predicting failure at a constant working temperature and load were identified previously; as an extension of that work, the effectiveness of the feature vectors under varying temperature and acceleration levels is investigated. The Remaining Useful Life of the packages was estimated using a deep learning approach based on the Long Short-Term Memory (LSTM) network, which can identify underlying patterns in multivariate time series data that predict the packages' life. The residuals of the autocorrelation function were used as the multivariate time series input to the LSTM network to forecast the packages' life at varying temperature and acceleration levels during vibration.
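One plausible wiring of the described pipeline is sketched below: per-channel autocorrelation features computed over strain windows (statsmodels' acf standing in for the study's exact residual construction, which is not specified here) feeding an LSTM regressor for RUL. All shapes, window sizes, and targets are illustrative assumptions.

```python
import numpy as np
from statsmodels.tsa.stattools import acf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

N_LAGS, WINDOW, CHANNELS = 16, 256, 4  # 4 strain channels, placeholder window/lag sizes

def acf_features(strain, window=WINDOW, nlags=N_LAGS):
    """ACF of each channel over consecutive windows -> (n_windows, nlags * channels)."""
    feats = []
    for start in range(0, strain.shape[0] - window + 1, window):
        seg = strain[start:start + window]
        feats.append(np.concatenate(
            [acf(seg[:, c], nlags=nlags)[1:] for c in range(seg.shape[1])]))
    return np.asarray(feats, dtype="float32")

strain = np.random.randn(4096, CHANNELS)   # stand-in strain-gauge history of one board
X = acf_features(strain)[np.newaxis, ...]  # one specimen's feature sequence over time
y = np.array([[120.0]], dtype="float32")   # stand-in RUL target (e.g. minutes to failure)

model = Sequential([LSTM(32, input_shape=(X.shape[1], X.shape[2])), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)
```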


Author(s):  
B. Sushrith et al.

This paper focuses on predicting, before discharge, which patients will be readmitted to the hospital, by applying recent deep learning algorithms to patients' electronic health records, which form time-series data. To begin the study and analysis of the data, this project deployed conventional supervised ML algorithms such as Logistic Regression, Naïve Bayes, Random Forest, and SVM, and compared their performance on different portions of the dataset. The final model uses deep learning architectures such as RNN and LSTM to improve the prediction results by taking advantage of the time series structure. An additional feature is the use of low-dimensional representations of medical concepts as input to the model. Ultimately, this work tests, validates, and explains the developed system using the MIMIC-III dataset, which contains records of around 38,000 patients and about 61,155 ICU admissions spanning 10 years. This exhaustive dataset is used to train models that provide healthcare workers with reliable information regarding discharge and hospital readmission. Knowing before discharge which patients are likely to be readmitted to the ICU helps a hospital allocate resources properly and reduces the financial risk to patients. To reduce avoidable ICU readmissions, hospitals must be able to recognize patients at higher risk of readmission; such patients can remain in the ICU so that they do not face the risk of being admitted back to the hospital, and the resources that would have been consumed by avoidable readmissions can be reallocated to more critical areas of the hospital. A more effective readmission prediction model can thus play an important role in helping hospitals and ICU doctors identify, before discharge, the patients who are going to be readmitted. To build this system, different ML and deep learning algorithms are used, and predictive models based on large amounts of data are built to predict which patients will be readmitted after discharge.
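A minimal sketch of the described architecture follows: medical-concept codes mapped to low-dimensional embeddings and fed through an LSTM with a sigmoid readmission output. The vocabulary size, sequence length, and dimensions are hypothetical placeholders, not values from the paper.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB, SEQ_LEN, EMB_DIM = 5000, 50, 32  # hypothetical concept vocabulary and record length

model = Sequential([
    Embedding(VOCAB, EMB_DIM),       # low-dimensional medical-concept vectors
    LSTM(64),                        # sequence model over the admission record
    Dense(1, activation="sigmoid"),  # probability of readmission
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stand-in EHR sequences: integer concept codes per patient, binary readmission labels
codes = np.random.randint(0, VOCAB, size=(512, SEQ_LEN))
labels = np.random.randint(0, 2, size=(512, 1))
model.fit(codes, labels, epochs=3, batch_size=64, verbose=0)
```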


2021 ◽  
Vol 11 (20) ◽  
pp. 9373
Author(s):  
Jie Ju ◽  
Fang-Ai Liu

Deep learning models have been widely used in prediction problems in various scenarios and have shown excellent predictive performance. As a deep learning model, the long short-term memory neural network (LSTM) is powerful for predicting time series data. However, as technology has advanced, data collection has become easier, and multivariate time series data have emerged. Multivariate time series data are often characterized by large volume, tight timelines, and many related sequences; in real data sets especially, the behavior of many sequences is affected by changes in other sequences. Interacting factors, mutation information, and other issues seriously impact the prediction accuracy of deep learning models on this type of data. On the other hand, the mutual-influence information between different sequences can be extracted and used as part of the model input to make the predictions more accurate. We therefore propose an ATT-LSTM model. The network applies an attention mechanism to the LSTM to filter the mutual-influence information in the data when predicting multivariate time series, compensating for the network's weakness in processing such data and greatly improving its prediction accuracy. To evaluate the model's accuracy, we compare the ATT-LSTM model with six other models on two real multivariate time series data sets using two evaluation indicators: Mean Absolute Error (MAE) and Root Mean Square Error (RMSE). The experimental results show that the model achieves an excellent performance improvement over the other six models, proving its effectiveness in predicting multivariate time series data.
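The sketch below shows one common way to attach soft attention to LSTM hidden states, in the spirit of the ATT-LSTM described; the authors' exact attention formulation may differ, so this is an assumed variant for illustration, with placeholder shapes and data.

```python
import numpy as np
from tensorflow.keras import layers, Model

TIMESTEPS, N_SERIES = 30, 8  # window length x number of interacting series

inp = layers.Input(shape=(TIMESTEPS, N_SERIES))
h = layers.LSTM(64, return_sequences=True)(inp)   # hidden state per time step
scores = layers.Dense(1)(h)                       # unnormalized relevance per step
alpha = layers.Softmax(axis=1)(scores)            # attention weights over time
context = layers.Dot(axes=1)([alpha, h])          # weighted sum of hidden states
out = layers.Dense(1)(layers.Flatten()(context))  # next-value forecast

model = Model(inp, out)
model.compile(optimizer="adam", loss="mae")       # MAE/RMSE used for evaluation

# Stand-in data: 128 windows of 8 interacting series, one target value each
X = np.random.rand(128, TIMESTEPS, N_SERIES).astype("float32")
y = np.random.rand(128, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```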


2020 ◽  
Vol 34 (04) ◽  
pp. 6845-6852 ◽  
Author(s):  
Xuchao Zhang ◽  
Yifeng Gao ◽  
Jessica Lin ◽  
Chang-Tien Lu

With the advance of sensor technologies, the Multivariate Time Series Classification (MTSC) problem, perhaps one of the most essential problems in the time series data mining domain, has continuously received a significant amount of attention in recent decades. Traditional time series classification approaches based on Bag-of-Patterns or Time Series Shapelets have difficulty dealing with the huge number of feature candidates generated in high-dimensional multivariate data, but show promising performance even when the training set is small. In contrast, deep learning based methods can learn low-dimensional features efficiently but suffer from a shortage of labelled data. In this paper, we propose a novel MTSC model with an attentional prototype network that combines the strengths of both traditional and deep learning based approaches. Specifically, we design a random group permutation method combined with multi-layer convolutional networks to learn low-dimensional features from multivariate time series data. To handle the issue of limited training labels, we propose a novel attentional prototype network that trains the feature representation based on its distance to class prototypes when data labels are inadequate. In addition, we extend our model to a semi-supervised setting by utilizing unlabeled data. Extensive experiments on 18 datasets from the public UEA multivariate time series archive against eight state-of-the-art baseline methods demonstrate the effectiveness of the proposed model.
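The prototype-classification step can be illustrated as below: class prototypes are the mean embeddings of the few labelled examples, and queries are classified by a softmax over negative distances to those prototypes. The paper's convolutional encoder, attention weighting, and random group permutation are omitted here; the embeddings are random stand-ins.

```python
import numpy as np

def prototypes(emb, labels, n_classes):
    """Class prototype = mean embedding of that class's labelled examples."""
    return np.stack([emb[labels == c].mean(axis=0) for c in range(n_classes)])

def classify(query_emb, protos):
    """Softmax over negative Euclidean distances to each class prototype."""
    d = np.linalg.norm(query_emb[:, None, :] - protos[None, :, :], axis=-1)
    logits = -d
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return p / p.sum(axis=1, keepdims=True)

emb = np.random.randn(40, 16)              # stand-in encoder outputs (support set)
labels = np.repeat(np.arange(4), 10)       # 4 classes, 10 labelled examples each
protos = prototypes(emb, labels, 4)
probs = classify(np.random.randn(5, 16), protos)  # class probabilities for 5 queries
```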


2021 ◽  
Author(s):  
Ilan Sousa Figueirêdo ◽  
Tássio Farias Carvalho ◽  
Wenisten José Dantas Silva ◽  
Lílian Lefol Nani Guarieiro ◽  
Erick Giovani Sperandio Nascimento

Abstract Detection of anomalous events in the practical operation of oil and gas (O&G) wells and lines can help to avoid production losses, environmental disasters, and human fatalities, besides decreasing maintenance costs. Supervised machine learning algorithms have been successful in detecting, diagnosing, and forecasting anomalous events in the O&G industry. Nevertheless, these algorithms need a large annotated dataset, and labelling data in real-world scenarios is typically unfeasible because of the exhaustive work required of experts. Since unsupervised machine learning does not require an annotated dataset, this paper performs a comparative performance evaluation of unsupervised learning algorithms to support experts in anomaly detection and pattern recognition in multivariate time-series data. The goal is to allow experts to analyze and label a small set of patterns instead of analyzing large datasets. This paper used the public 3W database of three offshore naturally flowing wells. The experiment used real O&G production data from underground reservoirs with the following anomalous events: (i) spurious closure of the Downhole Safety Valve (DHSV) and (ii) quick restriction in the Production Choke (PCK). Six unsupervised machine learning algorithms were assessed: Cluster-based Algorithm for Anomaly Detection in Time Series Using Mahalanobis Distance (C-AMDATS), Luminol Bitmap, SAX-REPEAT, k-NN, Bootstrap, and Robust Random Cut Forest (RRCF). The comparative evaluation used a set of metrics: accuracy (ACC), precision (PR), recall (REC), specificity (SP), F1-Score (F1), Area Under the Receiver Operating Characteristic Curve (AUC-ROC), and Area Under the Precision-Recall Curve (AUC-PRC). The experiments used the data labels for assessment purposes only. The results revealed that unsupervised learning successfully detected the patterns of interest in the multivariate data without prior annotation, with emphasis on the C-AMDATS algorithm. Thus, unsupervised learning can leverage supervised models through the support given to data annotation.
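A sketch of the evaluation protocol follows: labels are used only to score detector output, and the metric set matches the one listed above. The predictions here are random stand-ins purely to show the computation.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, average_precision_score)

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 1000)    # ground-truth anomaly labels (scoring only)
scores = rng.random(1000)            # detector anomaly scores
y_pred = (scores > 0.5).astype(int)  # thresholded anomaly decisions

tn = np.sum((y_true == 0) & (y_pred == 0))
fp = np.sum((y_true == 0) & (y_pred == 1))
metrics = {
    "ACC": accuracy_score(y_true, y_pred),
    "PR": precision_score(y_true, y_pred),
    "REC": recall_score(y_true, y_pred),
    "SP": tn / (tn + fp),            # specificity: TN / (TN + FP)
    "F1": f1_score(y_true, y_pred),
    "AUC-ROC": roc_auc_score(y_true, scores),
    "AUC-PRC": average_precision_score(y_true, scores),
}
print(metrics)
```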

