Discussing the article: "Creating Time Series Predictions using LSTM Neural Networks: Normalizing Price and Tokenizing Time"


Check out the new article: Creating Time Series Predictions using LSTM Neural Networks: Normalizing Price and Tokenizing Time.

This article outlines a simple strategy for normalizing market data using the daily range and training a neural network to enhance market predictions. The developed models may be used in conjunction with an existing technical analysis framework or on a standalone basis to assist in predicting overall market direction. The framework outlined in the article can be further refined by any technical analyst to develop models suitable for both manual and automated trading strategies.
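The article's exact normalization scheme isn't reproduced here, but a minimal sketch of the general idea, scaling each close by where it sits inside the day's high-low range, might look like this (the OHLC values below are made-up illustration data):

```python
import numpy as np

# Hypothetical daily bars for a currency pair (illustration only)
high  = np.array([1.1050, 1.1080, 1.1120])
low   = np.array([1.1000, 1.1010, 1.1040])
close = np.array([1.1030, 1.1065, 1.1100])

daily_range = high - low                 # per-day price range
# Position of the close inside the day's range, mapped into [0, 1]:
# 0 means the close sat at the low, 1 means it sat at the high.
norm_close = (close - low) / daily_range
print(norm_close)                        # values between 0 and 1
```

Normalizing this way keeps every day on a comparable scale regardless of absolute price level, which is generally what a neural network needs before training.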

When I started searching on the internet, I stumbled across some articles describing the use of LSTMs for time series predictions. Specifically, I came across a blog post by Christopher Olah, "Understanding LSTM Networks" on colah's blog. In his blog, Olah explains the structure and function of LSTMs, compares them to standard RNNs, and discusses various LSTM variants, such as those with peephole connections or Gated Recurrent Units (GRUs). Olah concludes by highlighting the significant impact of LSTMs on RNN applications and pointing towards future advancements like attention mechanisms.

In essence, traditional neural networks struggle with tasks requiring context from previous inputs due to their lack of memory. RNNs address this by having loops that allow information to persist, but they still face difficulties with long-term dependencies. For example, predicting the next word in a sentence where the relevant context is many words back can be challenging for a standard RNN. Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to handle the long-term dependencies that standard RNNs struggle with.

LSTMs solve this by using a more complex architecture, which includes a cell state and three types of gates (input, forget, and output) that regulate the flow of information. This design allows LSTMs to remember information for long periods, making them highly effective for tasks like language modeling, speech recognition, and image captioning. What I was interested in exploring was whether LSTMs, given their natural ability to remember information over longer periods, could help predict today's price action from prior days with similar price behavior. I came across another helpful article by Adrian Tam, astutely titled "LSTM for Time Series Prediction in PyTorch", that demystified the math and the programming aspects for me with a practical example. I felt confident enough to take on the challenge of applying LSTMs to predict the future price action of any given currency pair.
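To make the gate mechanics concrete, here is a minimal numpy sketch of a single LSTM step, not the article's model, just the standard cell equations with randomly initialized weights fed a tiny made-up "price" sequence:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, b hold stacked weights for the
    input (i), forget (f), cell candidate (g), and output (o) gates."""
    z = W @ x + U @ h_prev + b       # pre-activations, shape (4*H,)
    H = h_prev.shape[0]
    i = sigmoid(z[0:H])              # input gate: how much new info to write
    f = sigmoid(z[H:2*H])            # forget gate: how much old state to keep
    g = np.tanh(z[2*H:3*H])          # candidate cell values
    o = sigmoid(z[3*H:4*H])          # output gate: how much state to expose
    c = f * c_prev + i * g           # new cell state (the long-term memory)
    h = o * np.tanh(c)               # new hidden state
    return h, c

# Toy run: 1 input feature, 2 hidden units, fixed seed for reproducibility
rng = np.random.default_rng(0)
H, D = 2, 1
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for price in [1.0, 1.2, 0.9]:        # a tiny hypothetical price sequence
    h, c = lstm_step(np.array([price]), h, c, W, U, b)
print(h)                             # final hidden state, shape (2,)
```

The key line is `c = f * c_prev + i * g`: because the forget gate can keep `c_prev` almost unchanged across many steps, the cell state acts as the long-term memory that plain RNNs lack. In practice one would use `torch.nn.LSTM` rather than hand-rolling this.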

Author: Shashank Rai