Large-Scale Item Categorization in e-Commerce Using Multiple Recurrent Neural Networks

Author(s): Jung-Woo Ha, Hyuna Pyo, Jeonghee Kim
2019, Vol 2019 (1)
Author(s): M. Iswarya, R. Raja, G. Rajchakit, J. Cao, J. Alzabut, ...

Abstract In this work, the exponential stability problem of impulsive recurrent neural networks is investigated, with discrete time delay, continuously distributed delay, and stochastic noise taken into consideration simultaneously. To guarantee the exponential stability of the considered recurrent neural networks, two distinct types of sufficient conditions are derived on the basis of the Lyapunov functional and the coefficients of the given system. In addition, to construct a Lyapunov function for a large-scale system, a novel graph-theoretic approach is adopted that combines the Lyapunov functional with graph theory. In this approach, a global Lyapunov functional is constructed that reflects the topological structure of the given system. A numerical example and simulation figures are presented to show the effectiveness of the proposed work.
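For orientation, a schematic model of this class of systems, impulsive recurrent neural networks with discrete and distributed delays and stochastic noise, is sketched below; this is a generic textbook form with assumed notation, not necessarily the exact system or stability conditions used by the authors.

```latex
% Generic impulsive stochastic recurrent neural network with discrete and
% distributed delays (illustrative form; notation is an assumption).
% Requires amsmath.
\[
\begin{aligned}
dx_i(t) &= \Big[ -c_i x_i(t)
        + \sum_{j=1}^{n} a_{ij} f_j\big(x_j(t)\big)
        + \sum_{j=1}^{n} b_{ij} f_j\big(x_j(t-\tau(t))\big)
        + \sum_{j=1}^{n} d_{ij} \int_{-\infty}^{t} k_j(t-s)\, f_j\big(x_j(s)\big)\, ds \Big]\, dt \\
        &\quad + \sum_{j=1}^{n} \sigma_{ij}\big(x_j(t)\big)\, dw_j(t),
        \qquad t \neq t_k, \\
\Delta x_i(t_k) &= x_i(t_k^+) - x_i(t_k^-) = I_{ik}\big(x_i(t_k^-)\big),
        \qquad k \in \mathbb{N}.
\end{aligned}
\]
```

In this setting, exponential stability in mean square means there exist constants $M \ge 1$ and $\lambda > 0$ such that $\mathbb{E}\|x(t)\|^2 \le M e^{-\lambda t} \sup_{s \le 0} \mathbb{E}\|\phi(s)\|^2$ for every initial function $\phi$.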


2020, Vol 4 (2), pp. 448-466
Author(s): Amrit Kashyap, Shella Keilholz

Large-scale patterns of spontaneous whole-brain activity seen in resting-state functional magnetic resonance imaging (rs-fMRI) are believed to arise in part from neural populations interacting through the structural network (Honey, Kötter, Breakspear, & Sporns, 2007). Generative models that simulate this network activity, called brain network models (BNM), are able to reproduce globally averaged properties of empirical rs-fMRI activity, such as functional connectivity (FC), but perform poorly in reproducing the unique trajectories and state transitions observed over the span of minutes in whole-brain data (Cabral, Kringelbach, & Deco, 2017; Kashyap & Keilholz, 2019). This manuscript demonstrates that recurrent neural networks can fit the BNM to rs-fMRI data in a novel way and predict a large amount of the variance between subsequent rs-fMRI measurements. The simulated data also contain the unique repeating trajectories observed in rs-fMRI, called quasiperiodic patterns (QPPs), which span 20 s, as well as the complex state transitions observed using k-means analysis on windowed FC matrices (Allen et al., 2012; Majeed et al., 2011). Our approach is able to estimate the manifold of rs-fMRI dynamics by training on the generation of subsequent time points, and it can simulate complex resting-state trajectories better than traditional generative approaches.
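As an illustration of the training setup described above (a recurrent network trained to predict the next rs-fMRI time point), here is a minimal sketch; the parcel count, window length, network size, and choice of a GRU are assumptions made for the example and are not taken from the paper.

```python
# Minimal sketch (assumptions): an RNN trained to predict the next rs-fMRI
# time point from a window of preceding parcellated BOLD signals.
import torch
import torch.nn as nn

n_parcels = 66        # assumed number of brain regions (parcellation-dependent)
window = 10           # assumed number of preceding TRs used as context

class NextFramePredictor(nn.Module):
    def __init__(self, n_parcels, hidden=128):
        super().__init__()
        self.rnn = nn.GRU(n_parcels, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_parcels)

    def forward(self, x):                 # x: (batch, window, n_parcels)
        h, _ = self.rnn(x)
        return self.readout(h[:, -1])     # predict the next time point

# Synthetic stand-in for parcellated rs-fMRI data: (time, parcels)
bold = torch.randn(1000, n_parcels)
X = torch.stack([bold[t:t + window] for t in range(len(bold) - window)])
y = bold[window:]

model = NextFramePredictor(n_parcels)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                    # short full-batch demo loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```

Once trained on real parcellated time series, the same model can be rolled out autoregressively to generate simulated trajectories for comparison against the empirical dynamics.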


2018, Vol 26 (11), pp. 2115-2125
Author(s): Han Wang, Kun Xie, Zhichao Lian, Yan Cui, Yaowu Chen, ...

Author(s): Vivek Saraswat, Udayan Ganguly

Abstract Emerging non-volatile memories have been proposed for a wide range of applications, from easing the von Neumann bottleneck to neuromorphic computing. Specifically, scalable RRAMs based on Pr1-xCaxMnO3 (PCMO), which exhibit analog switching, have been demonstrated as an integrating neuron, an analog synapse, and a voltage-controlled oscillator. More recently, the inherent stochasticity of memristors has been proposed for efficient hardware implementations of Boltzmann Machines. However, as the problem size scales, the number of neurons increases, and the stochastic distribution must be controlled tightly over many iterations; this requires parametric control over the stochasticity. Here, we characterize the stochastic Set in PCMO RRAMs. We identify that the Set time distribution depends on the internal state of the device (i.e., its resistance) in addition to the external input (i.e., the voltage pulse). This requires the confluence of seemingly contradictory properties, stochastic switching and deterministic state control, in the same device. Unlike ‘stochastic-everywhere’ filamentary memristors, PCMO RRAMs allow us to leverage (i) the stochastic Set in negative polarity and (ii) the deterministic analog Reset in positive polarity to demonstrate a 100× reduction in the drift of the Set time distribution. The impact on Boltzmann Machines’ performance is analyzed: compared with the “fixed external input stochasticity”, the “state-monitored stochasticity” can solve problems 20× larger in size. State monitoring also tunes out the effect of device-to-device variability on the distributions, providing 10× better performance. In addition to the physical insights, this study establishes the reliable use of the experimental stochasticity of PCMO RRAMs in stochastic recurrent neural networks over many iterations.
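To make the contrast between "fixed external input stochasticity" and "state-monitored stochasticity" concrete, the toy simulation below models a Set event whose probability depends on the device's internal resistance; the functional form, drift rates, and parameters are invented for illustration and are not calibrated to PCMO devices.

```python
# Toy model (assumptions): stochastic Set whose probability depends on the
# internal resistance state. With a fixed external input the state drifts
# across iterations, so the switching distribution drifts too; the
# state-monitored scheme deterministically resets the resistance to a target
# before each stochastic Set, keeping the distribution stable.
import numpy as np

rng = np.random.default_rng(0)

def set_probability(resistance, pulse_v=1.0):
    # Higher resistance -> less likely Set, modeled as a logistic curve (toy).
    return pulse_v / (1.0 + np.exp((resistance - 10.0) / 2.0))

def run(iterations=1000, state_monitored=False, target_r=10.0):
    r = target_r
    flips = []
    for _ in range(iterations):
        if state_monitored:
            r = target_r                       # deterministic analog Reset
        flip = rng.random() < set_probability(r)
        flips.append(flip)
        r += 0.01 if flip else -0.005          # un-monitored state drift (toy)
    return np.mean(flips)

print("fixed-input Set rate:     ", run(state_monitored=False))
print("state-monitored Set rate: ", run(state_monitored=True))
```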


2018, Vol 27 (01), pp. 210-210

Choi S, Lee J, Kang MG, Min H, Chang YS, Yoon S. Large-scale machine learning of media outlets for understanding public reactions to nation-wide viral infection outbreaks. Methods 2017;129:50-9. https://linkinghub.elsevier.com/retrieve/pii/S1046-2023(17)30019-1
Dernoncourt F, Lee JY, Uzuner O, Szolovits P. De-identification of patient notes with recurrent neural networks. J Am Med Inform Assoc 2017;24:596-606. https://academic.oup.com/jamia/article-lookup/doi/10.1093/jamia/ocw156


2020
Author(s): Davide Faranda, Mathieu Vrac, Pascal Yiou, Flavio Maria Emanuele Pons, Adnane Hamid, ...

Recent advances in statistical learning have opened the possibility of forecasting the behavior of chaotic systems using recurrent neural networks. In this letter we investigate the applicability of this framework to geophysical flows, which are known to be intermittent and turbulent. We show that both turbulence and intermittency introduce severe limitations on the applicability of recurrent neural networks, both for short-term forecasts and for the reconstruction of the underlying attractor. We test these ideas on global sea-level pressure data for the past 40 years, taken from the NCEP reanalysis dataset, as a proxy of the atmospheric circulation dynamics. The performance of recurrent neural networks in predicting both short- and long-term behavior drops rapidly when the systems are perturbed with noise. However, we find that good predictability is partially recovered when scale separation is performed via a moving-average filter. We suggest that possible strategies to overcome these limitations should be based on separating the smooth large-scale dynamics from the intermittent/turbulent features.
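The scale-separation step mentioned above can be illustrated with a small sketch: a moving-average filter extracts the smooth large-scale component before a predictor is fitted. The synthetic signal and the simple least-squares autoregressive model (standing in here for a recurrent network) are assumptions made for the example, not the study's data or method.

```python
# Minimal sketch (assumptions): scale separation via a moving-average filter
# before one-step-ahead forecasting, on a synthetic noisy signal rather than
# the NCEP sea-level pressure fields used in the study.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(5000)
slow = np.sin(2 * np.pi * t / 365.0)                 # smooth large-scale component
signal = slow + 0.5 * rng.standard_normal(len(t))    # noisy/intermittent residue (toy)

def moving_average(x, width=31):
    # Simple symmetric moving-average filter.
    return np.convolve(x, np.ones(width) / width, mode="same")

def ar_forecast_rmse(x, order=10):
    # Fit x[t] from the previous `order` samples by least squares, report RMSE.
    X = np.stack([x[i:len(x) - order + i] for i in range(order)], axis=1)
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sqrt(np.mean((X @ coef - y) ** 2))

print("RMSE on raw series:     ", ar_forecast_rmse(signal))
print("RMSE on filtered series:", ar_forecast_rmse(moving_average(signal)))
```

The filtered series is markedly easier to predict than the raw one, which mirrors the qualitative point of the abstract: predictability degrades with noise and is partially recovered after separating the smooth large-scale dynamics.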


2017, Vol 77 (12), pp. 15385-15407
Author(s): Cedric De Boom, Rohan Agrawal, Samantha Hansen, Esh Kumar, Romain Yon, ...
