Time series decomposition over open weather data from Luxembourg

Assaad Moawad
Time series decomposition is a powerful statistical method that splits a signal into several components (usually a trend, a periodic component, and a random component). These components can be used for forecasting, prediction, or extrapolation of missing data. The topic is relatively mature: the main research paper in the field dates back to 1990. In the R language, one of the best-known functions for time series decomposition is the stl function. Read More…
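The post points to R's stl function; as a rough illustration of what such a decomposition computes, here is a minimal additive decomposition in plain Python (a centered moving-average trend, a period-averaged seasonal component, and the residual). The function name and the fixed-period assumption are ours, and the real stl uses loess smoothing rather than these simple averages.

```python
# Minimal additive decomposition sketch: series = trend + seasonal + residual.
# Illustrative only; R's stl() uses loess, not plain moving averages.
def decompose(series, period):
    n = len(series)
    half = period // 2
    # Trend: centered moving average spanning one full period.
    trend = [None] * n
    for i in range(half, n - half):
        window = series[i - half:i + half + 1]
        if period % 2 == 0:
            # Even period: the window has period+1 points; halve the endpoints.
            total = sum(window[1:-1]) + 0.5 * (window[0] + window[-1])
            trend[i] = total / period
        else:
            trend[i] = sum(window) / period
    # Seasonal: average the detrended values at each position in the cycle.
    buckets = [[] for _ in range(period)]
    for i in range(n):
        if trend[i] is not None:
            buckets[i % period].append(series[i] - trend[i])
    means = [sum(b) / len(b) for b in buckets]
    adjust = sum(means) / period  # center the seasonal component around zero
    seasonal = [means[i % period] - adjust for i in range(n)]
    # Residual: whatever the trend and seasonal components do not explain.
    residual = [series[i] - trend[i] - seasonal[i] if trend[i] is not None else None
                for i in range(n)]
    return trend, seasonal, residual
```

On a purely synthetic series built as a linear trend plus a repeating pattern, the residual comes out near zero, which is a quick sanity check of the idea.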


Neural networks and backpropagation explained in a simple way

Assaad Moawad
Any complex system can be abstracted in a simple way, or at least dissected into its basic components. Complexity arises from the accumulation of several simple layers. The goal of this post is to explain how neural networks work using the simplest possible abstraction. We will try to reduce the machine-learning mechanism in neural networks to its basic abstract components. Unlike other posts that explain neural networks, we will use as few mathematical equations and as little programming code as possible, focusing only on the abstract concepts. Read More…
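The post itself deliberately avoids code, but for readers who want to see the mechanism concretely, here is a hedged sketch of a forward pass and backpropagation for a tiny two-layer sigmoid network in plain Python. The network size, learning rate, and the XOR task are illustrative choices of ours, not from the post.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy network: 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output.
class TinyNet:
    def __init__(self, rng):
        self.w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
        self.b1 = [0.0, 0.0]
        self.w2 = [rng.uniform(-1, 1) for _ in range(2)]
        self.b2 = 0.0

    def forward(self, x):
        # Each layer: weighted sum of its inputs, then a nonlinearity.
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def backward(self, x, target, lr=0.5):
        # Chain rule applied to the squared error L = (y - t)^2 / 2,
        # propagated backwards layer by layer.
        dy = (self.y - target) * self.y * (1 - self.y)
        for j in range(2):
            dh = dy * self.w2[j] * self.h[j] * (1 - self.h[j])
            self.w2[j] -= lr * dy * self.h[j]
            for i in range(2):
                self.w1[j][i] -= lr * dh * x[i]
            self.b1[j] -= lr * dh
        self.b2 -= lr * dy

# XOR: a classic task a single layer cannot solve but two layers can.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
net = TinyNet(random.Random(42))

def total_loss():
    return sum((net.forward(x) - t) ** 2 for x, t in data)

before = total_loss()
for _ in range(2000):
    for x, t in data:
        net.forward(x)
        net.backward(x, t)
after = total_loss()
```

The point of the sketch is the shape of the computation: the forward pass stacks simple layers, and backpropagation pushes the error gradient back through them.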


The magic of LSTM neural networks

Assaad Moawad
LSTM neural networks, short for Long Short-Term Memory, are a particular type of recurrent neural network that has received a lot of attention recently within the machine learning community. Put simply, LSTM networks have internal contextual state cells that act as long-term or short-term memory cells. The output of the LSTM network is modulated by the state of these cells. This is an important property when we need the prediction of the network to depend on the historical context of inputs, rather than only on the very last input. Read More…
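As a concrete sketch of those "state cells", here is one step of a single-unit LSTM cell in plain Python. The scalar (single-unit) form and the weight names are our simplification; real implementations use vectors and matrices.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# One step of a single-unit LSTM cell. W is a dict of illustrative
# scalar weights; each gate sees the input x and the previous hidden state.
def lstm_step(x, h_prev, c_prev, W):
    f = sigmoid(W['f_x'] * x + W['f_h'] * h_prev + W['f_b'])    # forget gate
    i = sigmoid(W['i_x'] * x + W['i_h'] * h_prev + W['i_b'])    # input gate
    o = sigmoid(W['o_x'] * x + W['o_h'] * h_prev + W['o_b'])    # output gate
    g = math.tanh(W['g_x'] * x + W['g_h'] * h_prev + W['g_b'])  # candidate value
    c = f * c_prev + i * g       # cell state: the long-term memory
    h = o * math.tanh(c)         # hidden state: output modulated by the cell
    return h, c
```

With the forget gate saturated open and the input gate closed, the cell state carries the previous value through almost unchanged, which is exactly the long-term-memory behaviour the teaser describes.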
