
 

Volatility Trading: Trading The Fear Index VIX




The session began with the host and guest speaker providing an agenda for the webinar, which aimed to enhance participants' understanding of volatility in financial markets. They started by defining volatility and its association with the VIX, also known as the "fear index." The speaker delved into the different types of VIX and VIX-based derivatives, shedding light on their significance in trading. The session also included a practical approach to trading the VIX and concluded with a Q&A session to address any queries from the audience.

To illustrate the concept of volatility, the host used Tesla as an example of a highly volatile stock, explaining how its daily returns fluctuate between -20% and +20%. This level of volatility makes it a risky asset to handle. The host emphasized that merely looking at the price graph of an asset does not provide a clear idea of its volatility; it is the daily returns that offer a better indication of an asset's volatility.
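
The point about daily returns can be sketched in a few lines of Python. The price series below are made-up illustrations, not actual Tesla or index quotes:

```python
import statistics

def daily_returns(prices):
    """Simple percentage returns between consecutive closes."""
    return [(p1 - p0) / p0 for p0, p1 in zip(prices, prices[1:])]

def realized_vol(prices):
    """Sample standard deviation of the daily returns."""
    return statistics.stdev(daily_returns(prices))

# Two invented assets starting and ending near the same price:
# one calm, one choppy. Only the return series reveals the difference.
calm   = [100, 101, 100, 102, 101, 103, 102, 104]
choppy = [100, 115, 92, 110, 88, 118, 90, 104]

print(f"calm vol:   {realized_vol(calm):.4f}")
print(f"choppy vol: {realized_vol(choppy):.4f}")
```

Both price paths drift upward, but the standard deviation of the choppy series' daily returns is an order of magnitude larger, which is exactly the distinction the host draws.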

The video further explored the application of volatility beyond options trading and its usefulness in making decisions about purchasing assets as a whole. The speaker categorized volatility based on the magnitude of an asset's fluctuations, ranging from high to low volatility. A comparison between Tesla and the S&P 500 was made, with the S&P 500 being considerably lower in volatility. Various methods of measuring volatility were discussed, including standard deviation and beta, which provide historical values of volatility. The concept of implied volatility was introduced, representing the market's expectation of an asset's future movements without specifying the direction of those movements.

The webinar then focused on explaining the calculation of the VIX, or volatility index, and its utilization of implied volatility from different types of index options to gauge the potential for sharp changes. The VIX is commonly referred to as the "fear index" and is graphed in relation to the S&P 500. While the VIX typically aims to stay low, unexpected events can cause it to spike, leading to increased fear in the market. The actual calculation of the VIX is conducted by the CBOE, providing traders with the figures they need to track the VIX's journey and its relationship with the underlying index. Overall, the VIX serves as an essential tool for traders seeking to mitigate risk in the market.
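
For reference, the CBOE's published methodology derives a variance from a strip of out-of-the-money index options. A simplified single-expiry sketch looks roughly like the following; the real index interpolates two expiries to a constant 30-day horizon, and the option chain here is invented:

```python
import math

def vix_style_variance(strikes, mid_quotes, forward, k0, t, r=0.0):
    """Single-expiry variance in the spirit of the CBOE formula:
    sigma^2 = 2/T * sum(dK_i / K_i^2 * e^(rT) * Q(K_i))
              - 1/T * (F/K0 - 1)^2
    where Q(K_i) are out-of-the-money option mid prices."""
    total = 0.0
    for i, (k, q) in enumerate(zip(strikes, mid_quotes)):
        # Central strike spacing, one-sided at the ends of the chain.
        if i == 0:
            dk = strikes[1] - strikes[0]
        elif i == len(strikes) - 1:
            dk = strikes[-1] - strikes[-2]
        else:
            dk = (strikes[i + 1] - strikes[i - 1]) / 2.0
        total += dk / k**2 * math.exp(r * t) * q
    return 2.0 / t * total - (forward / k0 - 1.0) ** 2 / t

# Toy out-of-the-money chain (illustrative numbers only), K0 = 100.
strikes = [90, 95, 100, 105, 110]
quotes  = [0.55, 1.60, 3.10, 1.45, 0.50]
var = vix_style_variance(strikes, quotes, forward=100.2, k0=100, t=30/365)
print(f"index level: {100 * math.sqrt(var):.2f}")
```

The published index is 100 times the square root of the interpolated 30-day variance, which is why traders can simply read the figure the CBOE provides.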

The speaker further discussed the relationship between the VIX and the S&P 500, emphasizing that the VIX reflects the market's expectation of volatility in the index's future and how it reacts during times of uncertainty when the S&P 500 experiences declines. The speaker cited examples such as the US-China trade war and the COVID-19 pandemic to illustrate the correlation between the VIX and the S&P 500. While the VIX strives to remain low, unexpected events can lead to a sharp increase in volatility. However, as traders process new information and uncertainty diminishes, volatility also decreases.

The concept of the fear index, or VIX, was introduced as a measure of traders' fear regarding negative news impacting the market. It was highlighted that the VIX is not limited to the S&P 500 but can be applied to other geographical areas, such as the Australian Stock Exchange, Eurozone stocks, and the Hang Seng Index, as well as other asset classes like commodities and currencies. The need for the VIX arises because traders may have expectations of market volatility, but it is not the sole factor in determining trading decisions, since the options Greeks also play a role. The VIX therefore serves as a tool for traders to trade options based on market volatility. Although the VIX itself is not directly tradable, derivatives such as futures and options allow traders to take positions on expected future volatility, facilitating trading strategies.

The different types of VIX futures available for trading were discussed, including standard, near month, next month, far month expiries, and weekly expiries. The video highlighted that while VIX futures can be expensive, there are mini-futures available at one-tenth of the value, providing a more accessible option for traders. Additionally, VIX ETFs (Exchange-Traded Funds) were introduced as an alternative to trading VIX futures. These ETFs derive their value from VIX futures and offer different options based on traders' preferences. Short-term VIX ETFs, such as VIXY, track near month and next month futures, while medium-term VIX ETFs, like VIXM, track medium-term futures. Inverse VIX ETFs, such as SVXY, were also mentioned, as they move in the opposite direction of VIX futures, increasing in value when futures decline. Traders can choose from these various types of VIX futures and ETFs based on their market outlook and trading strategies.

Moving on, the video explored other VIX-based derivatives, including VIX ETFs and VIX ETNs (Exchange-Traded Notes). VIX ETFs were explained to have underlying VIX futures, providing exposure to volatility in the market. On the other hand, VIX ETNs were highlighted as not having an underlying asset. The speaker mentioned the popular VXX as an example of a VIX ETN. It was emphasized that trading VIX-based derivatives comes with risks, and it is crucial for traders to understand these risks before engaging in such trading activities. Testing and backtesting strategies in a paper trading environment were recommended before trading with real capital. ETNs, in particular, carry issuer risk, meaning that if the company issuing the ETNs fails to fulfill its obligations, investors' capital could be at risk. Additionally, VIX futures were noted to have a contango effect that introduces certain risks and considerations for traders.
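
The contango effect mentioned above can be illustrated with a toy roll simulation: when next-month futures persistently trade above the expiring contract, a fund that rolls monthly gives up roughly that premium each cycle. The 5% monthly figure below is purely illustrative:

```python
def simulate_roll_decay(nav, contango_pct, rolls):
    """Each roll, the fund sells the cheaper expiring contract and buys
    the pricier next-month one; if the curve's shape persists, roughly
    the contango premium is lost per cycle."""
    for _ in range(rolls):
        nav *= 1.0 - contango_pct
    return nav

# A persistent 5% monthly contango erodes almost half the NAV in a year.
print(round(simulate_roll_decay(100.0, 0.05, 12), 2))   # 54.04
```

This is the structural headwind behind long-VIX ETFs and ETNs in calm markets, and one reason the speaker stresses understanding the risks before trading these products.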

The speaker delved into the topic of VIX futures convergence as they approach their expiry date. They explained that as the expiration date nears, VIX futures prices tend to converge. It was stressed that being on the right side of the trade before this convergence is crucial for traders involved in VIX futures trading. The video then introduced a simple VIX-based strategy that involves using the VIX to hedge a portfolio during declining times by going long on VIX futures. This strategy was tested and found to yield three times higher returns between 2011 and 2021 when combined with a portfolio of the S&P 500. The importance of backtesting ideas and practicing them in a paper trading environment was emphasized as a means to gain confidence before implementing them in real trading scenarios.
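
A toy version of such a hedging rule can be sketched as follows. The entry condition, weights, and return series are all invented for illustration and do not reproduce the webinar's backtest or its threefold result:

```python
def backtest_with_vix_hedge(spx_returns, vix_returns, hedge_weight=0.10):
    """Toy rule: if yesterday's S&P return was negative, shift a slice
    of capital into a long-VIX instrument for the next day."""
    equity, curve = 1.0, [1.0]
    for i in range(1, len(spx_returns)):
        w = hedge_weight if spx_returns[i - 1] < 0 else 0.0
        equity *= 1 + (1 - w) * spx_returns[i] + w * vix_returns[i]
        curve.append(equity)
    return curve

# A stylized sell-off: the index drops 2% a day, VIX futures jump 6% a day.
spx = [-0.02] * 6
vix = [0.06] * 6
hedged   = backtest_with_vix_hedge(spx, vix)[-1]
unhedged = backtest_with_vix_hedge(spx, vix, hedge_weight=0.0)[-1]
print(hedged > unhedged)   # the hedge cushions the drawdown
```

As the speakers advise, any rule like this should be backtested over real data and rehearsed in a paper trading environment before committing capital.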

The webinar hosts shared information about a course they have developed called "Volatility Trading Strategies for Beginners." The course focuses on teaching traders various methods of measuring volatility, including ATR (Average True Range), standard deviation, VIX, and beta. They emphasized the significance of equipping oneself with the right tools and knowledge to trade without fear of volatility. The hosts mentioned that the course is currently available at a 67% discount for a limited time. Additionally, attendees of the webinar were offered an additional 10% discount on the course using the coupon code VTS10. The hosts also took the opportunity to address some questions from the audience, including inquiries about the focus on the US market when analyzing the VIX and whether the VIX acts as a leading or lagging indicator of price movements.

The speaker further explained the near-instantaneous reaction of the VIX to movements in the S&P 500. On the VIX's range, it was noted that the index annualizes expected 30-day volatility and is quoted on a scale from 0 to 100. The speaker described different phases of the VIX, such as a low-to-medium phase from 10 to 20 and a medium phase from 20 to 25. The speaker acknowledged that herding, the tendency of market participants to act collectively, can impact the VIX. The video also mentioned the availability of futures and options on the India VIX, although liquidity in those contracts is limited due to high capital requirements.

During the Q&A session, the video addressed several questions related to trading volatility and the VIX. One question inquired about the possibility of trading VIX-based derivatives while being based in India. The response indicated that while it is an emerging practice, some trading platforms do allow for trading VIX-based derivatives in India. Another question raised the idea of including sentiment of news as an additional parameter in option pricing models. The speaker explained that the VIX belongs to a different asset class and does not use the same models as other options. However, the video acknowledged that sentiment analysis can play a role in understanding market dynamics. Additionally, the video briefly mentioned UVIX and SVIX as underlying assets that can be treated similarly to other assets when considering trading strategies.

The discussion then turned to the rules of a combined portfolio strategy, which was mentioned earlier in the video. The speaker explained the criteria for entry and exit rules in this strategy. The entry rule focuses on the behavior of the S&P 500, where if it is declining, traders can reserve capital to go long on the VIX. It was noted that the VIX generally rises when the S&P 500 falls. On the other hand, the exit rule considers the behavior of the S&P 500 to determine whether it has transitioned out of a bear market and if the overall economy is performing well, indicating a bull market. Traders were advised to evaluate the conditions of the market before making decisions on entering or exiting trades.
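
Those entry and exit rules might be encoded as a simple signal function. The moving-average proxy for "declining" below is an assumption, since the webinar did not spell out exact conditions:

```python
def hedge_signal(spx_closes, lookback=20):
    """Proxy for the rules described: 'declining' is read as price below
    its moving average; 'recovered' as price back above it."""
    if len(spx_closes) < lookback:
        return "hold"
    ma = sum(spx_closes[-lookback:]) / lookback
    if spx_closes[-1] < ma:
        return "enter_long_vix"   # market declining: reserve capital, go long VIX
    return "exit_vix"             # bear phase over: unwind the hedge

falling = list(range(120, 90, -1))   # steadily declining closes
rising  = list(range(90, 120))       # recovering closes
print(hedge_signal(falling), hedge_signal(rising))
```

In practice the exit side would also incorporate the broader economic conditions the speaker mentions, not just price versus its average.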

The webinar provided detailed insights into volatility trading, with a particular emphasis on the VIX as a key indicator. It covered topics such as understanding volatility, measuring and categorizing volatility, the calculation of the VIX, different types of VIX-based derivatives, and strategies for trading volatility. The hosts also offered a course on volatility trading strategies for beginners, encouraging traders to equip themselves with the necessary knowledge and tools to navigate the market with confidence. The webinar concluded with an interactive Q&A session, addressing various questions from the audience and providing further clarity on the topics discussed.

  • 00:00:00 The host and guest speaker provide an agenda for the session, starting with defining and understanding volatility in the financial markets. The speaker continues to explain why the VIX is referred to as the "fear index" and the different types of VIX and VIX-based derivatives. The session also includes a practical approach to trading the VIX before ending with a Q&A session. The host describes how people associate volatility with unstable chemicals or liquids and explains how it applies to trading.

  • 00:05:00 Tesla is a good example of a highly volatile stock, with daily returns fluctuating between -20% and +20%. This sharp fluctuation makes it a risky asset to handle. However, looking at its price graph alone does not provide a clear idea of how volatile an asset is; it is the daily returns that give a better indication of an asset's volatility.

  • 00:10:00 The video discusses the use of volatility beyond just options trading and how it can be useful in making decisions on whether to buy an asset as a whole. The video explains that volatility can be categorized based on how much an asset fluctuates, ranging from high to low volatility. The S&P 500 is used as a comparison to Tesla, as it is considerably lower in volatility. The video discusses methods used to measure volatility, including standard deviation and beta, which provide historical values of volatility. The concept of implied volatility is also introduced, which is the market's expectation of how much an asset will move in the future but does not provide an idea of which direction that move will be.

  • 00:15:00 A clear understanding of how the VIX, or volatility index, is calculated and how it uses implied volatility of different types of index options to give an idea of how sharp changes may be. The VIX is often referred to as the "fear index" and is graphed in relation to the S&P 500. The VIX typically tries to stay low, but unexpected events can cause it to shoot up, hence the fear aspect. The hard work behind calculating the VIX is done by the CBOE, which gives the figures to traders, allowing them to focus on the VIX's journey and its relation to the underlying index. Overall, the VIX is an important tool for traders looking to mitigate risk in the market.

  • 00:20:00 The speaker discusses the relationship between the VIX, also known as the fear index, and the S&P 500. They explain that the VIX is the market's expectation of how volatile the index will be in the future and how it reacts when the S&P 500 declines due to uncertainty. The speaker uses several examples, such as the US-China trade war and the COVID-19 pandemic, to demonstrate the correlation between the two. They clarify that the VIX tries to stay low but can experience a sharp increase due to unexpected events, resulting in increased volatility. However, as traders process new information, the uncertainty level decreases, and so does the volatility.

  • 00:25:00 The concept of the fear index or VIX is introduced as a measure of how much traders are fearful regarding negative news affecting the market. The VIX is not only applicable to the S&P 500 but may also be applied to other geographical areas such as the Australian Stock Exchange, Eurozone stocks, and the Hang Seng Index and even to other asset classes such as commodities and currencies. The need for the VIX arises because traders may have expectations of market volatility but it won't be the only factor in determining trading decisions since the options Greeks factor in as well. As such, the VIX serves as a tool for traders to trade options based on market volatility, and although VIX has no trading instrument, it has derivatives that enable the estimation of future volatility to facilitate trading. These derivatives include futures and options.

  • 00:30:00 The speaker explains the different types of VIX futures available for trading, which include standard, near month, next month, and far month expiries, as well as weekly expiries. While VIX futures can be expensive, there are mini-futures available at one-tenth of the value. Additionally, VIX ETFs can be used as an alternative and derive their value from VIX futures. Short-term VIX ETFs, such as VIXY, track near month and next month futures, while medium-term VIX ETFs, like VIXM, track medium-term futures. The speaker also mentions inverse VIX ETFs, such as SVXY, which are completely inverse to VIX futures and increase in value when futures decline. Ultimately, traders can use these different types of VIX futures and ETFs depending on their view of the market.

  • 00:35:00 The different types of VIX-based derivatives are discussed, including VIX ETFs and VIX exchange-traded notes (ETNs). VIX ETFs have underlying VIX futures, while VIX ETNs have no underlying. The VXX is an example of a popular VIX ETN. However, it is important to note that there are risks with VIX-based derivatives, and it is essential to understand them before trading. It is advisable to test and backtest strategies before trading with real capital. ETNs come with issuer risk, which means that if the company that issues the ETNs cannot hold its promise, the investor's capital is at stake. Additionally, VIX futures have a contango effect that can lead to risks.

  • 00:40:00 The speaker discusses the convergence of VIX futures prices as they draw closer to their expiry date and the importance of being on the right side of the trade before trading VIX futures. They then explain a simple VIX-based strategy, involving using the VIX to hedge a portfolio during declining times by going long on VIX futures. This strategy was tested in a course on volatility trading and resulted in three times higher returns between 2011 and 2021, using a combined portfolio of the S&P 500 and VIX futures. The speaker emphasizes the need to backtest ideas and try them in a paper trading environment before trading blindly.

  • 00:45:00 The webinar hosts discuss a course they have developed called "Volatility Trading Strategies for Beginners", focused on teaching traders how to measure volatility using various methods such as ATR, standard deviation, VIX, and beta. They emphasize the importance of having the right tools and knowledge to trade without fear of volatility. The course is available at a 67% discount for a limited time and attendees of the webinar receive an additional 10% discount with the coupon code VTS10. The hosts also answer some audience questions, including why they focus on the US market when analyzing VIX and whether VIX is a leading or lagging indicator of price movements.

  • 00:50:00 The speaker explains that the VIX reacts near-instantaneously to the S&P 500. On its range, the index annualizes expected 30-day volatility and is quoted on a scale from 0 to 100. The VIX moves through different phases: 10 to 20 is the low-to-medium phase, and 20 to 25 the medium phase. Additionally, herding can impact the VIX, and futures and options exist for the India VIX, though liquidity there is limited due to high capital requirements.

  • 00:55:00 The video discusses various questions related to trading volatility and the VIX. One question addresses the possibility of trading on VIX-based derivatives while based in India, and the response suggests that some trading platforms allow for this, though it is still an emerging practice. Another question asks if sentiment of news can be included as an additional parameter in option pricing models. The response notes that the VIX is a different asset class and does not use the same models as other options. Additionally, the video discusses the underlying assets of UVIX and SVIX and suggests that they can be treated like other assets to be considered for trading strategies. Finally, a question addresses the rules for the combined portfolio strategy, which involves reserving a portion of capital and reinvesting as the S&P 500 declines.

  • 01:00:00 The speaker explains the criteria for entry and exit rules in a combined portfolio strategy. The entry rule is based on the behavior of the S&P 500; if it's declining, a trader can reserve capital to go long on the VIX. The VIX generally rises as the S&P 500 falls. The exit rule, on the other hand, looks at the behavior of the S&P 500 to determine whether it's out of the bear market and if the economy is doing well (indicating a bull market). The speaker also answers a question about whether the VIX follows the S&P 500 or vice versa, explaining that the VIX derives its value from the S&P 500 and generally follows it, but traders may make decisions based on the VIX levels that can affect the index.
Volatility Trading: Trading The Fear Index VIX
  • 2022.05.10
  • www.youtube.com
00:00 Introduction
02:30 Agenda
03:45 What is volatility
12:17 How do you measure volatility?
18:30 Why is VIX called the fear index?
39:07 Risks related to VIX d...
 

Big Data And The Future Of Retail Investing



Financial markets generate enormous amounts of data each day. In this webinar, the speaker discusses the importance of working with that data in the context of investing and trading and explains how it can be harnessed to suit different investment styles. In the process, he covers how you can cultivate the knowledge and skills needed to thrive and prosper in this field.

00:00 - Introduction

04:00 - Disclaimer

05:44 - Agenda

11:04 - Data

14:31 - Big Data

20:01 - The dawn of data analytics

23:29 - Current trading and investment landscape

23:36 - Classical data analysis approach

27:43 - Modern data analysis

31:29 - Why and how is analytics used in financial markets

37:00 - Types of data

43:58 - Challenges for the retail investors

52:38 - Q&A

Big Data And The Future Of Retail Investing
  • 2022.04.26
  • www.youtube.com
 

Pairs Trading in Brazil and Short Straddles in the US Markets [Algo Trading Projects]




The webinar begins with the host introducing Dr. Luis Guidas, an EPAT alumnus, who presents his project on pairs trading in the Brazilian stock markets. Dr. Guidas is an experienced software developer in the payment card industry and a faculty member teaching compilers and programming languages at the Universidade Federal Fluminense. He has worked extensively on cryptographic algorithms, secure communication protocols, and secure electronic transactions. After completing the EPAT program in July 2021, he is currently the head of quantitative analysis at oCam Brazil.

Dr. Guidas starts by introducing the concept of statistical arbitrage, which involves using statistical models to find asset pairs that neutralize each other's risk. He explains how co-integrated pairs can be used to create a stationary time series with a constant mean and variance. To illustrate this, he uses the example of two ETFs that track the same index, which are almost perfectly co-integrated and create a horizontal spread with a constant mean and variance. He mentions that this process involves a training period and a test period to back-test the strategy.
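
The ETF example can be simulated with synthetic data: two series share one random-walk "index", so each wanders on its own, but a properly hedged spread between them stays near a constant mean. Tickers and parameters here are invented:

```python
import random
import statistics

random.seed(7)

# Common stochastic trend shared by two hypothetical ETFs tracking
# the same index; each adds its own small tracking noise.
index = [100.0]
for _ in range(499):
    index.append(index[-1] + random.gauss(0.0, 1.0))

etf_a = [x + random.gauss(0.0, 0.5) for x in index]
etf_b = [2.0 * x + random.gauss(0.0, 0.5) for x in index]  # 2x exposure

hedge_ratio = 0.5                 # in practice, estimated by regression
spread = [a - hedge_ratio * b for a, b in zip(etf_a, etf_b)]

# The ETFs wander with the index; the spread is (near) stationary.
print(round(statistics.mean(spread), 2), round(statistics.stdev(spread), 2))
print(statistics.stdev(spread) < statistics.stdev(etf_a))   # True here
```

In a real study, a formal cointegration test (e.g. Engle-Granger) over a training window would replace the eyeballed comparison, with a separate test window reserved for the backtest.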

Next, Dr. Guidas delves into the process of pairs trading and the Bollinger Band trading strategy they use. They select tickers and sectors, find cointegrated pairs, and calculate the hedge ratio used to construct the spread. For each pair, they compute the spread and apply a mean-reverting trading strategy, buying when the spread is below the mean and selling when it is above the mean. He also discusses why stop-losses can be counterproductive in mean-reverting algorithms: as the price deviates further from the mean, the probability of it reverting to the mean increases, so a stop-loss tends to close exactly the positions most likely to recover.
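
A minimal sketch of the Bollinger Band signal logic on the spread follows, with an illustrative window and band width rather than the project's tuned values:

```python
import statistics

def bollinger_signals(spread, window=20, k=2.0):
    """+1 = long the spread (buy A, sell hedge_ratio * B),
    -1 = short the spread, 0 = no signal."""
    signals = [0] * min(window, len(spread))
    for i in range(window, len(spread)):
        hist = spread[i - window:i]
        mid = statistics.mean(hist)
        band = k * statistics.stdev(hist)
        if spread[i] < mid - band:
            signals.append(+1)        # spread cheap: bet on reversion up
        elif spread[i] > mid + band:
            signals.append(-1)        # spread rich: bet on reversion down
        else:
            signals.append(0)
    return signals

quiet = [0.0] * 25
print(bollinger_signals(quiet + [-5.0])[-1],   # big dip   -> +1
      bollinger_signals(quiet + [+5.0])[-1])   # big spike -> -1
```

Exits (reversion to the mid-band, or the time stop described next) would be layered on top of these entries in a full backtest.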

The speaker introduces a technique called a stop time, which exits a spread trade after a set number of days if it hasn't converged, helping to cap losses. They provide an example of a Bollinger Band strategy for pairs trading in Brazil, showcasing its profitability over a one-year period. They also note the survivorship bias that can arise from backtesting only companies that exist in the current period. To mitigate this, they incorporated another training period, from 2018 to 2020, which yielded a higher number of pairs due to the emergence of new companies and sectors.
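
The stop-time rule reduces to a small helper; the 30-day cap below is illustrative, as the project's actual limit wasn't stated:

```python
def stop_time_exit(entry_day, converged_day, max_days=30):
    """Exit at convergence or at the time stop, whichever comes first.
    converged_day=None means the spread never closed on its own."""
    if converged_day is None:
        return entry_day + max_days
    return min(converged_day, entry_day + max_days)

print(stop_time_exit(0, 12))    # converged early: exit on day 12
print(stop_time_exit(0, None))  # never converged: forced out on day 30
```

Unlike a price stop-loss, a time stop sidesteps the mean-reversion paradox: it abandons trades whose cointegration may have broken down, rather than selling into the very deviations the strategy expects to revert.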

Dr. Guidas shares insights from their experience with pairs trading in Brazil and discusses their methodology. They simplify the analysis of the spread and determine the ideal simple moving average period length by examining the spread's half-life. They also highlight the challenges of trading in the Brazilian stock market, particularly its limited liquidity, which left only a handful of viable pairs after analyzing the top 100 companies. The speaker presents performance metrics but acknowledges room for improvement, suggesting approaches such as hyperparameter tuning, stationarity checks, and merging small sectors. They recommend reading the literature on the topic, specifically mentioning the books by Dr. Ernest Chan and Dr. Yves Hilpisch.

During the Q&A session, Dr. Guidas answers questions from the audience regarding the strategies presented in the video. He explains that the period of the Bollinger Bands is a hyperparameter that can be set dynamically based on a grid test over the spread's half-life periods. When asked about using Bollinger Bands for straddles and strangles, he suggests seeking insights from derivatives experts, as these are structured operations. He also addresses the issue of non-mean-reverting trades, suggesting that a non-reverting series can be made mean-reverting by taking its first difference. Another question concerns the correlation between the Indice Futuro VINFUT and BOVA11, for which he recommends studying the relationship between the two before making trading decisions.

Following that, Dr. Luis Guidas shares his experience with the EPAT program and how it met his expectations in understanding why technical analysis doesn't always work in trading. He emphasizes the importance of studying and taking courses to gain knowledge and advises against trying to recreate humanity's knowledge alone. The webinar also announces the launch of the first Quantra course in Portuguese, on momentum trading.

Siddharth Bhatia takes the floor to discuss short straddles in the US markets. He explains that a short straddle involves selling a call and put in equal amounts at the money and making a profit if the underlying asset moves less than the sold strike level. While the strategy is touted as an income trading strategy, Bhatia cautions that the potential losses can be much larger than the profits, especially during times of market volatility. He cites instances of firms getting wiped out during periods like the COVID pandemic due to short straddle trades.
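
The payoff Bhatia describes is easy to make concrete; the strike and premiums below are illustrative:

```python
def short_straddle_pnl(spot_at_expiry, strike, call_premium, put_premium):
    """Sell one call and one put at the same (ATM) strike: keep both
    premiums if the underlying pins the strike; losses grow without
    bound as it moves away in either direction."""
    call_payout = max(spot_at_expiry - strike, 0.0)
    put_payout = max(strike - spot_at_expiry, 0.0)
    return call_premium + put_premium - call_payout - put_payout

# Strike 100, 3.0 collected per leg: breakevens sit at 94 and 106.
for s in (94, 100, 106, 130):
    print(s, short_straddle_pnl(s, 100, 3.0, 3.0))
```

The maximum profit is capped at the total premium (6.0 here), while a large move, such as the pandemic-style gap Bhatia cites, produces losses many times that size, which is the asymmetry behind the blow-ups he mentions.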

The speaker shares their own experience with backtesting a short straddle trading strategy using a mechanical approach. They sold 100 units of at-the-money straddle at the beginning of each DTE (Days to Expiry) period and held the positions until expiry without implementing stop losses or nuanced entry and exit points. They conducted the backtesting using two sets of data, one being delta hedged and the other unhedged, and utilized two different versions with 7 DTE and 60 DTE to cover different time periods. They retrieved the necessary data for backtesting through the RATS API and processed it using Python pandas to obtain buy and sell prices. However, the speaker highlights the challenge of creating the data frame, as each line required individual attention to ensure accuracy.

The speaker proceeds to discuss the results of backtesting short straddle trading strategies in both the Brazilian and US markets. They reveal that the strategy performed poorly in both markets, resulting in significant drawdowns and a low Sharpe ratio. While delta hedging helped reduce the standard deviation of the P&L (Profit and Loss), it did not transform losing trades into profitable ones. The speaker notes that stop-loss orders are crucial in this type of trading and mentions academic papers suggesting the use of entry filters based on the VIX index and the term structure of VIX futures. The short straddle strategy is considered profitable but risky, requiring effective management of losses through various methods.

During the Q&A session, the speaker addresses several viewer questions. One question pertains to why positions for the strategy are not hedged at the end of the day. The speaker explains that the common practice is to hedge once a day at the market close as it helps reduce the standard deviation of P&L and minimize long-term volatility. However, they emphasize that hedging techniques are subject to testing and research. The speaker also touches on topics such as calculating CAGR (Compound Annual Growth Rate), transaction costs, and the advantages of holding positions for seven to ten days instead of daily selling in the short straddle strategy. Additionally, they emphasize the importance of previous experience in manual and non-algorithmic trading, as it prepares traders for market volatility and the acceptance of short-term losses.
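
For reference, the CAGR computation mentioned in the Q&A is a one-liner; the equity figures below are invented:

```python
def cagr(start_equity, end_equity, years):
    """Compound annual growth rate: the constant yearly return that
    turns start_equity into end_equity over the period."""
    return (end_equity / start_equity) ** (1.0 / years) - 1.0

# Doubling in 10 years compounds at roughly 7.2% a year.
print(f"{cagr(100_000, 200_000, 10):.2%}")
```

For a backtest, `years` is usually taken as the number of trading days in the test divided by 252, and transaction costs should be deducted from the equity curve before the calculation.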

The speakers continue to field questions from the audience, addressing queries related to pairs trading in Brazil and short straddles in the US markets. One listener asks whether they should take a long straddle if the VIX is around 20, to which the speaker advises against it, noting that it would usually result in a loss, and suggests shorting the index if the VIX is above 20. Another question pertains to reconciling opposing entry strategies when the VIX is above 30. The recommendation is to always be short and disregard the backwardation suggestion. The speakers also receive questions about book recommendations, with one of the speakers highly recommending Euan Sinclair's three books.

The speaker then shares their experience with QuantInsti's EPAT program, highlighting how it helped bridge gaps in their knowledge of coding and algorithmic trading concepts. They stress the importance of becoming a student of the markets, encourage newcomers to open demo accounts and gain experience taking losses, and note that mastering a skill requires delving deeper and taking more courses. They describe the EPAT program as an excellent starting point for those looking to enhance their understanding of the markets, echoing Dr. Luis Guidas' advice on studying and continuously learning from the market.

As the webinar draws to a close, the hosts express their gratitude to Dr. Luiz for sharing his valuable insights on pairs trading in Brazil. They also extend their appreciation to the audience for actively participating in the webinar and providing suggestions for future topics. The hosts acknowledge the challenges involved in launching a course in Portuguese but express their excitement about the numerous developments happening within their community. They encourage the audience to share their feedback through a survey, allowing them to gather valuable input and ideas for future sessions.

With warm appreciation, the hosts bid farewell to Dr. Luiz and the audience, expressing their enthusiasm for upcoming webinars and their commitment to providing valuable knowledge and insights to the trading community. They look forward to exploring new topics, sharing expertise, and fostering a thriving learning environment for all participants.

The webinar offered a comprehensive overview of pairs trading in Brazilian stock markets and the challenges associated with short straddle trading strategies in the US markets. The speakers shared their experiences, strategies, and insights, encouraging continuous learning and research to navigate the dynamic landscape of trading effectively.

  • 00:00:00 The host introduces Dr. Luis Guidas, an EPAT alumnus who presents his project on pairs trading in Brazilian stock markets. Dr. Guidas has extensive experience in software development, specifically in the payment card industry. He is also a faculty member who teaches compilers and programming languages at the Universidade Federal Fluminense. Dr. Guidas has used an innovative problem-solving approach in his software development career and has worked extensively on cryptographic algorithms, secure communication protocols, and secure electronic transactions. He is currently the head of quantitative analysis at oCam Brazil after completing the EPAT program in July 2021.

  • 00:05:00 The speaker introduces the concept of statistical arbitrage, which is a kind of trading where a trader uses statistical models to find asset pairs that neutralize each other's risk. The speaker explains how co-integrated pairs can be used to produce a stationary time series, which has a constant mean and variance. They use the example of two ETFs that track the same index, which are almost perfectly co-integrated and produce a horizontal spread that has a constant mean and variance. The speaker explains that this process involves a training period and a test period and is used to back-test the strategy.

  • 00:10:00 The speaker explains the process of pairs trading and how they use a Bollinger band trading strategy. They select tickers and sectors and find quantitative pairs to get the hedge ratio to combine to make their spread. For each pair, they calculate the spread and use a mean-reverting trading strategy that involves buying when the spread is below the mean and selling when it is above the mean. The speaker also discusses the use of stop-loss in mean-reverting algorithms and why it may not be a good approach as the further the price goes from the mean, the higher the probability that it will go back to the mean.
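The Bollinger band mean-reversion rules can be sketched as follows. The AR(1) series below is a stand-in for a real pair's spread, and the 20-period lookback is an assumption for illustration (the talk suggests tying it to the spread's half-life).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical stationary spread (an AR(1) process as a stand-in).
n = 500
spread = np.zeros(n)
for t in range(1, n):
    spread[t] = 0.9 * spread[t - 1] + rng.normal(0, 1)
spread = pd.Series(spread)

window = 20  # illustrative lookback
mean = spread.rolling(window).mean()
std = spread.rolling(window).std()
upper = mean + 2 * std  # upper Bollinger band
lower = mean - 2 * std  # lower Bollinger band

# Mean-reversion rules: buy the spread when it is below the lower band,
# sell it when it is above the upper band, otherwise do nothing.
signal = pd.Series(0, index=spread.index)
signal[spread < lower] = 1    # long the spread
signal[spread > upper] = -1   # short the spread
```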

  • 00:15:00 The speaker discusses a strategy called stop time, which involves exiting a spread trade after a certain number of days if it doesn't close, which can help in preventing losses. They also share an example of a Bollinger Band strategy for pairs trading in Brazil and how it makes a decent profit over a one-year period. However, due to limited data, the speaker had to use companies that existed in the current time period, which might introduce survivorship bias into the backtest results. Therefore, they also utilized another training period of 2018 to 2020 with fresh data, which resulted in a higher number of pairs due to the emergence of new companies and sectors.

  • 00:20:00 The speaker discusses their experience with pairs trading in Brazil and provides insights into their methodology. They talk about using a simplified approach to analyze the spread and the half-life of the trade to determine the ideal simple moving average period length. They also highlight the challenges faced while trading in the Brazilian stock market due to its liquidity, explaining how only a few pairs survived after analyzing the top 100 companies. The speaker shares some performance metrics but acknowledges that there is always room for improvement and suggests hyper-parameter tuning, stationarity checks, and merging small sectors as possible approaches. They recommend reading literature on the topic, notably the books by Dr. Chan and Dr. Hilpisch.
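The half-life mentioned above is commonly estimated by regressing the spread's daily change on its lagged level (this is an assumption about the method, since the talk doesn't show code): for delta_s[t] = a + b * s[t-1] + noise, the half-life of mean reversion is -ln(2)/b.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a mean-reverting (AR(1)) spread with known persistence phi.
n, phi = 2000, 0.95
spread = np.zeros(n)
for t in range(1, n):
    spread[t] = phi * spread[t - 1] + rng.normal(0, 1)

# Regress the change in the spread on its lagged level:
#   delta_s[t] = a + b * s[t-1] + noise, where b is roughly phi - 1 < 0.
lagged = spread[:-1]
delta = np.diff(spread)
X = np.column_stack([lagged, np.ones(n - 1)])
b, a = np.linalg.lstsq(X, delta, rcond=None)[0]

# Half-life of mean reversion; the speaker uses it to size the SMA window.
half_life = -np.log(2) / b
print(round(half_life, 1))  # theoretical value here: -ln(2)/ln(0.95), about 13.5
```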

  • 00:25:00 The presenter responds to several questions from the audience about the strategies presented in the video. When asked about the period of the Bollinger Bands, he explains that it is a hyperparameter that can be set dynamically based on a grid test of the half-life periods of the spread. In response to whether Bollinger Bands can be used for straddles and strangles, he notes that these are structured operations with derivatives and suggests that working with derivatives experts may provide better insights. He also explains that when trades are no longer mean reverting, he closes the position, and suggests that non-reverting series can often be made reverting by taking their first difference. Finally, when asked about the correlation between the Indice Futuro VINFUT and BOVA11, he recommends studying the relationship between the two and using that information for trading decisions.

  • 00:30:00 The presenter discusses his experience with the EPAT program and how it fulfilled his expectations in understanding why technical analysis does not always work in trading. He recommends studying and taking courses to gain knowledge rather than arrogantly trying to recreate humanity's knowledge alone. The webinar also announces the launch of their first course in Portuguese, on momentum trading.

  • 00:35:00 Siddharth Bhatia discusses short straddles in the US markets. A short straddle involves selling a call and a put in equal amounts at the money, and it makes money if the underlying moves less than the sold volatility level implies. The strategy tends to be profitable on average and is sold as an income trading strategy, but Bhatia warns that the losses are much bigger than the profits, especially during times of market volatility. He cautions that short straddles can lead to huge losses and mentions firms that were wiped out during events like the COVID pandemic.
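The asymmetry Bhatia warns about is easy to see from the expiry payoff of a short straddle. The strike and premium below are made-up numbers for illustration:

```python
# Hypothetical at-the-money short straddle: sell one call and one put
# at strike K, collecting a combined premium. Illustrative numbers only.
K = 100.0      # strike (at the money)
premium = 8.0  # combined premium received for the call plus the put

def short_straddle_pnl(s_t):
    """P&L at expiry: premium kept, minus the intrinsic value paid out."""
    return premium - abs(s_t - K)

# Profit is capped at the premium, while losses grow without bound as
# the underlying moves away from the strike: the asymmetry in question.
for s in [100, 105, 92, 130]:
    print(s, short_straddle_pnl(s))
```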

  • 00:40:00 The speaker talks about a short straddle trade and their experience backtesting it using a mechanical strategy where they sold 100 units of an at-the-money straddle at the start of each DTE period and held until expiry, with no stop losses or nuanced entries or exits. They used two sets, one delta hedged and the other unhedged, and two different versions with 7 DTE and 60 DTE to sample different periods. They used the RATS API to retrieve the data for their backtesting and used Python pandas to process the data to obtain buy and sell prices. The real challenge of the project was creating the data frame, as each line needed individual attention to ensure the data was correct. After the backtesting, they obtained the results, and it is evident that the weekly DTE with no delta hedging incurred large drawdowns.

  • 00:45:00 The speaker discusses the results of backtesting short straddle trading strategies in the US markets. The strategy performed poorly, with a significant drawdown and a low Sharpe ratio. Delta hedging helped reduce the standard deviation of the P&L, but it did not make a losing trade profitable. The speaker notes that stop-loss orders are mandatory for this type of trading, and also mentions academic papers suggesting the use of entry filters based on the VIX index and the term structure of VIX futures. The strategy is considered profitable but risky, requiring losses to be managed through various methods.
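The two performance measures cited, Sharpe ratio and maximum drawdown, can be computed from a daily return series as sketched below (the returns are synthetic and illustrative only):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily returns of a strategy (illustrative only).
daily_returns = rng.normal(0.0005, 0.01, 252)

# Annualized Sharpe ratio: mean over std, scaled by sqrt(252) trading days.
sharpe = daily_returns.mean() / daily_returns.std() * np.sqrt(252)

# Maximum drawdown: worst peak-to-trough drop of the equity curve.
equity = np.cumprod(1 + daily_returns)
running_peak = np.maximum.accumulate(equity)
max_drawdown = ((equity - running_peak) / running_peak).min()

print(round(sharpe, 2), round(max_drawdown, 3))
```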

  • 00:50:00 The speaker addresses several questions from viewers, including why positions for the strategy are not hedged at the end of the day. He explains that the simplest and common way to hedge is to do it once a day at close because it helps reduce the P&L standard deviation and minimize volatility in the long run. However, he mentions that the techniques for hedging are subject to testing and research. The speaker also mentions the calculation of CAGR, transaction costs, and the advantages of holding positions for seven to ten days instead of selling them daily in the short straddle strategy. Additionally, he emphasizes the importance of having previous experience in manual and non-algo trading, as it prepares traders for the market's volatility and acceptance of short-term losses.
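The CAGR mentioned above is a simple compounding formula; here is a minimal sketch with made-up numbers:

```python
# Compound annual growth rate from starting equity, ending equity,
# and the number of years the position was held.
def cagr(start_value, end_value, years):
    return (end_value / start_value) ** (1 / years) - 1

# For example, growing 100 into 150 over 3 years:
print(round(cagr(100, 150, 3), 4))  # about 0.1447, i.e. roughly 14.5% per year
```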

  • 00:55:00 The speakers answer more questions from the audience about pairs trading in Brazil and short straddles in the US markets. One listener asked whether they could take a long straddle if the VIX is around 20, to which the answer was that it would usually result in a loss, and that it's better to short the index if the VIX is above 20. Another question was about how to reconcile opposing entry strategies when entering trades while the VIX is above 30. The recommendation here was to always be short and disregard the backwardation suggestion. The speakers also received questions about book recommendations, with Euan Sinclair's three books being highly recommended by one of the speakers.

  • 01:00:00 The speaker discusses his experience with QuantInsti's EPAT program and how it helped him fill the gaps in his knowledge about coding and algorithmic trading concepts. He emphasizes the importance of studying and being a student of the markets and advises newcomers to open demo accounts and gain experience taking losses in the market. He also mentions that mastering a skill requires going deeper and doing more courses, and that the EPAT program is a great place to start. The speaker echoes Dr. Luis Guidas's advice on the same point.

  • 01:05:00 The hosts thank Dr. Luiz for sharing his experience on pairs trading in Brazil, as well as thanking the audience for participating and suggesting future topics for webinars. The hosts mention the challenge of starting a course in Portuguese but are excited for the many things going on in their community. They encourage the audience to share their feedback through a survey, suggesting topics for future sessions. The hosts express their appreciation and bid farewell to Dr. Luiz and the audience.
Pairs Trading in Brazil and Short Straddles in the US Markets [Algo Trading Projects]
  • 2022.04.12
  • www.youtube.com
This session has project presentations by two of our esteemed EPAT alumni.00:00 Introduction - Project 104:45 Presentation - Pairs Trading In the Brazilian S...
 

Certificate In Sentiment Analysis And Alternative Data For Finance - CSAF™ [FREE INFO SESSION]

The webinar hosts begin by introducing the Certificate in Sentiment Analysis and Alternative Data for Finance (CSAF) program. They highlight that the program is led by two experienced faculty members, Professor Gautam Mitra and Professor Christina Alvin Sayer. The program spans over five months and includes a series of lectures aimed at providing both foundational theory and practical use cases presented by guest lecturers who are professionals in the finance industry.

The hosts provide an overview of the program's modules, starting with the first two modules that focus on the basics of sentiment and sentiment data. Modules 3 and 4 delve into alternative data sources and their relevance for financial prediction and modeling, including satellite and email data, as well as text analysis. The course also covers modeling basics, various financial models, and the application of sentiment data to areas such as risk management, portfolio optimization, and automated trading. Additionally, there is a module specifically dedicated to alternative data, emphasizing the role of AI, machine learning, and quantitative models in sentiment analysis.

To further enrich the webinar, two special guests, Amit Arora and Abhijit Desai, who are CSAF alumni, are introduced. They share their experiences of taking the previous version of the course called EPAT NSA. Amit explains how the practical orientation of the course helped him develop his own trading ideas, leading him to dedicate more time to actual trading, which yielded better-than-expected results. Abhijit emphasizes the importance of commitment, dedication, and curiosity in getting the most out of the course.

The webinar also includes discussions with various individuals who have experienced the CSAF program. They share their challenges and successes in understanding and applying sentiment analysis and alternative data in their trading strategies. The speakers address questions from the audience, covering topics such as combining sentiments and volatility trading, the meaning of alternative data, the importance of certification in investing and trading, the inclusion of sentiment analysis in trading strategies, and real-time notification of news in trading.

Throughout the webinar, the speakers stress the significance of structured learning through certification courses like CSAF to develop a comprehensive perspective and approach. They highlight the importance of understanding financial markets and models in effectively applying sentiment analysis and alternative data. The speakers also emphasize the practical application of knowledge, the use of quantitative frameworks, and the value of case studies in showcasing the use of sentiment data.

The hosts express their gratitude to the audience for participating in the webinar and actively engaging with the information about the CSAF program. They encourage viewers to provide their feedback and questions through a survey and thank the speakers and each other for their contributions to the webinar's success. The hosts express their enjoyment in sharing knowledge and their commitment to fostering a learning environment for all participants.

  • 00:00:00 The webinar hosts introduce the CSAF program, which stands for Certificate in Sentiment Analysis and Alternative Data for Finance. The program is led by two experienced faculty members: Professor Gautam Mitra and Professor Christina Alvin Sayer. The CSAF program provides lectures that span over five months, covering both foundation lectures for presenting theory and use case lectures given by guest lecturers who are professionals in the finance industry. The hosts also mention that there will be a Q&A session at the end of the webinar and introduce two special guests, Amit Arora and Abhijit Desai, who will share their experiences as CSAF alumni.

  • 00:05:00 The speaker describes the Certificate in Sentiment Analysis and Alternative Data for Finance program and its modules, which focus on teaching participants about sentiment, its various types, and the usage of alternative data. The modules are delivered by core faculty members and by guest faculty members, such as Antonio Gerni, who share their practical knowledge of finance and sentiment analysis. The program also includes nine foundation lectures that explain the concepts in greater detail. The lectures are supported by lecture notes, and an exam is taken at the end of the program.

  • 00:10:00 Christina provides an overview of the Certificate in Sentiment Analysis and Alternative Data for Finance (CSAF) program, highlighting key modules in the course. The first two modules focus on teaching the basics of sentiment and sentiment data. Moving on to modules 3 and 4, the course delves into alternative data sources and their relevance for financial prediction and modelling, including satellite and email data and text analysis. The course also covers modelling basics and frameworks, various financial models, and how sentiment data can be applied to risk management, portfolio optimization, and automated trading. Lastly, the course includes a module on alternative data and emphasizes the role of AI, machine learning, and quantitative models in sentiment analysis.

  • 00:15:00 An alumnus named Amit shares his experience of taking the previous version of the course, called EPAT NSA. He joined the course out of interest and did not expect much from it, but the practical orientation of the course helped him develop his own trading ideas. After finishing the course, he moved away from active change management consulting and devoted more time to developing his own ideas. Over the last three months, he has been dedicating most of his time to actual trading, and the results have been better than expected. Another alumnus named Abhijit also shares his experience and emphasizes the importance of commitment, dedication, and curiosity in getting the most out of the course.

  • 00:20:00 Various individuals discuss their experiences with the Certificate in Sentiment Analysis and Alternative Data for Finance (CSAF) course. One individual explains that they were looking for something challenging in terms of algorithm trading and found that sentiment data and news is difficult to analyze and distinguish what knowledge is useful for making money. However, the course helped them to understand Python and develop their own models. The language used for machine learning modules is primarily Python, with some individuals using R as well. The webinar was also recorded and will be shared with registered participants who were unable to attend.

  • 00:25:00 The speakers discuss the primer, which is a set of topic areas necessary to have a background to apply sentiment analysis or some data to trading. It includes information about authorities regarding anomaly prediction or how to do performance measurement. The primer has no set duration as it is given to students before the course starts. Each module, on the other hand, has a duration of about three hours per Saturday lecture, which is backed up by lecture notes. The use case lectures vary in duration from one to two hours and include Q&A sessions with guest faculty members. In response to a viewer's question about whether sentiment analysis is necessary for trading, the speakers explain that sentiment analysis can help in finding sources of alpha or making returns on investments, even if market efficiency ultimately assimilates all sentiment and news.

  • 00:30:00 The speakers discuss how sentiment analysis provides valuable data for trading decisions due to its ability to quickly and quantitatively analyze news items that impact market activities. They note that sentiment analysis has become increasingly important with the abundance of data available from sources like Twitter and other social media outlets. The speakers also address the question of what kind of data sources are generally used for sentiment analysis and mention that news outlets and social media platforms are common sources, but that using this data requires permission from providers. They also touch on the topic of using VADER for sentiment analysis.

  • 00:35:00 The speakers discuss sentiment analysis and natural language processing in regards to financial analysis. They explain how sentiment data, which has already been analyzed and calculated by sentiment providers, can be used in quantitative ways to optimize portfolios and make asset allocation decisions. They also mention major players in the industry such as Bloomberg and Graffiti that provide such data. The speakers caution against using natural language processing solely for trading purposes and stress the importance of understanding financial markets to effectively use data analysis. In response to a question about pursuing a career in data analysis or AI, the speakers emphasize the need to have a strong understanding of financial markets and models in order to apply data analysis effectively.

  • 00:40:00 The speakers answer questions from viewers. The first question is about combining sentiment and volatility trading, and while this isn't directly covered in the course, the instructors provide tools and methods for achieving it. They mention that trading this index, or its equivalent in other markets, is an important topic, but one in the domain of cutting-edge research. The next question asks what is meant by alternative data, which the speakers explain is a new growth area in the market, referring to data produced by market participants that affects the market, such as sentiment data or news data. They add that satellite data, email inboxes, and orders from companies like Amazon or pizza suppliers are all examples of alternative data.

  • 00:45:00 The speakers discuss the importance of certification in investing and trading. While there is value in learning from all sources, structured learning through certification courses is necessary to develop a perspective and approach that unstructured learning cannot provide. However, the certificate itself is not always relied upon by trading companies. They also address a question on the importance of daily political news and other news in trading. While technical knowledge is important, keeping up with current events can give traders a better understanding of market tendencies and help them make more informed decisions.

  • 00:50:00 The speakers discuss the inclusion of sentiment analysis in trading strategies. They explain that while technical analysis and trading are well-known, the effect of news and sentiment is also taken into account in various strategies. Informed traders take news items and analyze them before using discretion to make trades, while noise traders immediately react to news items. They also suggest that combining different models and information, including sentiment analysis, can lead to more informed decisions. When it comes to individual sentiment, the sentiment provider may have a pool of people that are relevant for the market, and it's often useful to filter out financial market professionals for social media sentiment analysis.

  • 00:55:00 The speakers address whether the course covers real-time notification of news and press releases, which is important in automated or systematic trading. They explain that while news arrival is crucial in sentiment analysis and can affect returns quickly, it cannot dominate a trading strategy. The course is application-oriented and practical, but foundational theory is also important to provide a structured way of representing information. The speakers emphasize the use of quantitative frameworks and interesting case studies to highlight the use of sentiment data.

  • 01:00:00 The speakers discuss how academic rigor can apply to trading and how the CSAF course differentiates itself from the EPAT course. The EPAT course covers machine learning and Python skills, but the CSAF course adds additional knowledge in sentiment analysis and alternative data in the context of use cases and case studies. The speakers also answer a final question on how the CSAF course can benefit someone who has already taken the EPAT course, with Amit and Abhijit highlighting that the CSAF course builds on the groundwork provided by the EPAT course and provides additional knowledge and skills to develop profitable trading ideas. The session ends with a reminder to ask any additional questions in the survey and a thank you to the speakers for their time.

  • 01:05:00 The speakers express their gratitude to the audience for attending the information session about the Certificate in Sentiment Analysis and Alternative Data for Finance (CSAF) program. They encourage viewers to voice their questions and concerns about the program and thank everyone for their participation. The speakers end the video by thanking one another for making it successful and express their enjoyment in sharing knowledge with others.
Certificate In Sentiment Analysis And Alternative Data For Finance - CSAF™ [FREE INFO SESSION]
  • 2022.03.29
  • www.youtube.com
00:00 Introduction02:30 CSAF overview by Prof Mitra10:40 Detailed course overview by Prof Christina15:45 Amit Arora sharing his CSAF experience19:20 Abhijit ...
 

How To Set Up Automated Trading

During the presentation, the speaker delves into the advantages of automated trading and the reasons why automation is necessary. They highlight that automated trading allows traders to handle a larger number of assets simultaneously and execute trades based on predefined rules. This approach helps reduce the risk of errors and eliminates emotion-driven trading. The speaker emphasizes that automation simplifies the process by automatically placing orders once the specified rules are satisfied, eliminating any time lag. Additionally, they explain that automation frees up traders' time and resources, enabling them to focus on developing better trading strategies.

The speaker addresses a common misconception about automation completely replacing human intervention. They stress the importance of regularly analyzing the performance of sophisticated automated trading systems to make adjustments to the trading strategy when necessary. They emphasize that automation empowers traders to explore other tasks or assets that they might not have attempted manually. The presentation then moves on to discuss the three essential steps in trading: data acquisition, analysis (which can be rule-based or discretionary), and trade execution.

To automate a part of the trading process, the speaker recommends using data and coding to retrieve historical data for preferred assets. They mention that Google Finance has integrated its API into Google Sheets, allowing users to easily retrieve data by specifying parameters such as the ticker symbol, start and end dates, and data type. This collected data can be utilized to create price graphs, perform calculations (e.g., generating custom indicators or calculating percentage changes), and automate the data collection process, streamlining trading strategies.
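For readers who prefer code to spreadsheets, the calculations the webinar performs in Google Sheets, percentage changes and a simple custom indicator, look like this in pandas. The price list is made up; in practice you would load real historical data from your preferred source:

```python
import pandas as pd

# Hypothetical daily closes (a stand-in for data pulled via GOOGLEFINANCE
# or any other source).
closes = pd.Series([100.0, 102.0, 101.0, 104.0, 103.0, 107.0])

# Daily percentage change, as the webinar computes in Sheets.
pct_change = closes.pct_change() * 100

# A simple custom indicator: a 3-day simple moving average.
sma_3 = closes.rolling(3).mean()

print(pct_change.round(2).tolist())
print(sma_3.tolist())
```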

A demonstration in the video showcases the process of backtesting a trading strategy using the Relative Strength Index (RSI) indicator on past data. The RSI value, ranging from 0 to 100, determines the action taken. If the RSI value is less than 30, indicating that the asset is oversold, it becomes attractive to buyers, prompting them to buy the asset. A value between 30 and 70 suggests no action, while a value above 70 indicates that the asset is overbought, prompting a sell-off. The speaker validates the effectiveness of these rules by automating backtesting on past data, utilizing visual programming on a US equities dataset.
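The 30/70 rules above can be sketched in Python. The RSI implementation below uses simple rolling averages, which is one common variant; Wilder's original formula uses smoothed averages instead:

```python
import pandas as pd

def rsi(closes, period=14):
    """Relative Strength Index using simple rolling averages
    (one common variant; Wilder's original uses smoothed averages)."""
    delta = closes.diff()
    gains = delta.clip(lower=0).rolling(period).mean()
    losses = (-delta.clip(upper=0)).rolling(period).mean()
    rs = gains / losses
    return 100 - 100 / (1 + rs)

def signal(rsi_value):
    """Map an RSI reading to the webinar's rules."""
    if rsi_value < 30:
        return "buy"   # oversold: attractive to buyers
    if rsi_value > 70:
        return "sell"  # overbought: time to exit
    return "hold"      # no action between 30 and 70

print(signal(25), signal(50), signal(75))
```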

The speaker introduces the Blue Shift platform for automated trading, which offers features such as backtesting, paper trading, and live trading. They highlight that the platform provides visual programming options that do not require coding knowledge. The speaker demonstrates setting up a trading strategy using the RSI indicator and explains the conditions for taking long and short positions. Finally, they present the backtest results, which exhibit a 14% return, a Sharpe ratio of 1.22, and a maximum drawdown of minus 13%. Overall, Blue Shift is praised as a user-friendly platform for creating and testing automated trading strategies.

The speaker moves on to discuss the process of implementing an automated trading strategy in live trading. They recommend starting with paper trading, which utilizes real-time data but not real money, to observe the strategy's performance in the current market environment. The speaker guides the audience through the steps of setting up paper trading and transitioning to live trading, including selecting a broker, determining capital allocation, and confirming orders. They stress the significance of regularly monitoring the strategy's performance and making necessary adjustments. The speaker also mentions that previous sessions covering live trading using other platforms are available on their YouTube channel.

Although not all brokers offer APIs for automated trading, the speaker highlights Interactive Brokers as a platform available in most regions that provides API support. They mention that using IBridgePy with Interactive Brokers enables trade automation from anywhere globally, including Singapore. The speaker notes that while obtaining data for NSE stocks is possible, it is essential to find the appropriate ticker symbol and use Yahoo Finance to access the necessary historical data.

The speaker explains that minute-level data is not widely available for free and points out that the data requirements become more demanding at that level. To obtain minute-level data, the speaker suggests opening an account with a broker like Interactive Brokers. However, they mention that depending on the geography and chosen broker, a fee may be required. The speaker briefly mentions the trade frequency function and directs the audience to consult the Blue Shift documentation for more information on creating a trading strategy. They also emphasize the importance of setting stop-loss levels when developing a trading strategy.

Moving on, the speaker discusses the significance of setting appropriate stop-loss levels for different types of assets. They recommend using different stop-loss values based on the volatility of the assets, with higher stop losses for assets that experience significant price fluctuations, such as Tesla. The speaker also notes that determining the ideal values for alpha and beta depends on the trader's goals and the desired timeframe to achieve a specific percentage of profit. Additionally, they respond to questions regarding automating trading in Indian markets, monitoring strategies, and creating option strategies using the platform. Lastly, the speaker underscores the importance of remaining vigilant during unexpected market events and determining whether to pause trading or continue based on the strategy's ability to withstand volatility.
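The idea of scaling the stop-loss to an asset's volatility can be sketched as follows; the scaling factor k and the return series below are assumptions for illustration, not the speaker's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

def volatility_stop(entry_price, daily_returns, k=2.0):
    """Hypothetical volatility-scaled stop: place the stop k standard
    deviations of recent daily returns below the entry price."""
    vol = np.std(daily_returns)  # recent realized volatility
    return entry_price * (1 - k * vol)

calm_asset = rng.normal(0, 0.01, 60)  # roughly 1% daily moves
wild_asset = rng.normal(0, 0.04, 60)  # roughly 4% daily moves, Tesla-like

# The more volatile asset gets a wider stop, as the talk recommends.
print(round(volatility_stop(100, calm_asset), 2))
print(round(volatility_stop(100, wild_asset), 2))
```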

The speaker further expands on automation in trading and how it operates. They explain that automation is available for Indian markets through the Blueshift platform, which facilitates backtesting strategies and live trading through partnerships with various brokers. Emphasizing the significance of having predefined rules in trading, the speaker highlights the value of testing these rules through backtesting and paper trading, which uses virtual money to evaluate strategy performance in the current market conditions. The speaker also mentions that machine learning can be applied in trading and is supported by Blueshift for developing trading strategies.

Addressing the possibility of automated trading on mobile devices, the speaker acknowledges that while mobile-based platforms may not be as feature-rich as web-based platforms, automated trading on mobile phones may become more prevalent as the industry moves towards cloud-based solutions. They suggest that beginners start small and gradually expand their knowledge by learning more and establishing a trading rule or strategy. The speaker highlights that Blue Shift, a learning, backtesting, and trading platform, is completely free and can be utilized to experiment with trading strategies. They also respond to questions regarding the platform's features and mention plans to add more brokers in the future. Finally, the speaker acknowledges a query about auto-trading Bitcoin on any platform.

Regarding broker support for automated trading, the speaker clarifies that not all brokers offer this functionality, and users should verify if their chosen platform supports it. They explain that the industry is increasingly shifting towards automated trading, with the majority of orders being executed with the assistance of automated trading systems. In terms of combining machine learning, neural networks, and AI for algorithmic trading, the speaker describes the process of training and testing data on a machine learning model and leveraging the predicted output for algorithmic trading. Lastly, they address a question from a working professional, noting that automated trading can assist professionals in managing trading activities while minimizing screen time, allowing them to focus on their job's demands.

The speaker reiterates that automating a trading strategy is feasible for working professionals, but it is crucial to periodically review the performance of the automated system as market conditions can change. They suggest that while it is possible to create a trading strategy without learning Python or any coding language using various platforms, advanced strategies may require proficiency in Python or other programming languages. The speaker reassures the audience that learning Python is not as challenging as it may seem and can provide an added advantage. They stress the importance of regularly evaluating performance to modify the strategy accordingly.

Finally, the speaker invites the audience to fill out a survey for any unanswered questions and encourages them to take advantage of a limited-time offer, providing a 70% discount and an additional 25% discount for enrolling in all courses. They express gratitude for the support received and assure the audience of their commitment to organizing more webinars in the future. The speaker asks for suggestions on potential topics to plan better sessions that cater to the audience's interests and needs. Concluding the presentation, the speaker extends warm wishes for a happy Holi and expresses appreciation to all attendees for their participation in the session.

  • 00:00:00 The speaker discusses the benefits of automated trading and why automation is required. With automation, traders can manage a larger number of assets in parallel and execute trades based on predetermined rules, reducing the risk of errors and avoiding emotion-driven trading. The process is simplified as the system places the order automatically once the rules are satisfied, avoiding any time lag. Furthermore, the speaker explains that automation can free up time and resources for traders to focus on developing better trading strategies.

  • 00:05:00 The speaker discusses the misconception about automation completely eliminating human intervention, and emphasizes the importance of regularly analyzing the performance of sophisticated automated trading systems to adjust the trading strategy when necessary. The use of automation enables traders to focus on other tasks or assets that they wouldn't have otherwise tried manually. The speaker then moves on to discussing the three steps in trading, starting with the acquisition of data, followed by analysis, which can be either rule-based or discretionary, and finally, the execution of trades.

  • 00:10:00 If you want to automate part of your trading process, you can use data and coding to retrieve historical data of your favorite assets. Google Finance has integrated their API into Google Sheets, making it easy to retrieve data simply by typing in parameters such as the ticker symbol, start and end dates, and data type. This data can then be used to create price graphs or perform calculations, such as creating your own indicators or calculating percentage changes. With this tool, traders can automate their data collection process and streamline their trading strategies.

  • 00:15:00 The video demonstrates how to backtest a trading strategy by applying the Relative Strength Index (RSI) indicator to past data. The RSI value ranges between 0 and 100, and a different action is taken depending on its value. If the RSI is below 30, the asset is considered oversold and the price looks attractive to buyers, so the asset is bought. If the RSI is between 30 and 70, no action is taken. If the RSI is above 70, the asset is considered overbought, so it is a good time to exit the trade by selling. The effectiveness of these rules is then checked by automating the backtest on a US equities data set using visual programming.

  • 00:20:00 The speaker discusses using the Blueshift platform for automated trading, which allows users to backtest, paper trade, and go live. The platform offers visual programming that doesn't require coding. The speaker demonstrates setting up a trading strategy using the RSI indicator and explains the long and short conditions. Finally, he shows the backtest results: a return of 14 percent, a Sharpe ratio of 1.22, and a maximum drawdown of minus 13 percent. Overall, Blueshift is a user-friendly platform for creating and testing automated trading strategies.

  • 00:25:00 The speaker discusses the process of going live with an automated trading strategy. He recommends starting with paper trading, using real-time data but not real money, to see how the strategy performs in the current market environment. The speaker walks through the process of setting up paper trading and then going live, including selecting a broker, setting capital, and confirming orders. He emphasizes the importance of regularly monitoring the performance of the strategy and adjusting as needed. The speaker also mentions that there are previous sessions available on their YouTube channel covering live trading using other platforms.

  • 00:30:00 While not all brokers offer APIs, Interactive Brokers is a platform that is available almost everywhere and offers an API for automated trading. IBridgePy can be used with Interactive Brokers to automate trades from anywhere in the world, including Singapore. It's important to note that getting data for NSE stocks is also possible, but it's necessary to look up the appropriate ticker symbol and use Yahoo Finance to obtain the necessary historical data.

  • 00:35:00 The speaker explains that minute-level data is not widely available for free and that data requirements become high at that level. He suggests opening an account with a broker like Interactive Brokers to obtain minute-level data but mentions that a fee may be required depending on your geography and chosen broker. The speaker briefly touches on the trade frequency function and recommends the Blueshift documentation for more information on creating a trading strategy. He also clarifies that Blueshift can be used for visual programming or coding and that stop-loss levels should be set when creating a trading strategy.

  • 00:40:00 The speaker discusses the importance of setting appropriate stop-loss levels for different types of assets. He recommends using different stop losses for different assets based on their volatility, with wider stop losses for highly volatile assets like Tesla. The speaker also notes that the ideal values for alpha and beta depend on the trader's goals and the timeframe over which they want to achieve a particular percentage of profit. Additionally, the speaker answers questions about automating trading in Indian markets, monitoring strategies, and the ability to create option strategies with the platform. Finally, the speaker emphasizes the importance of remaining vigilant during unexpected market events and of determining whether to stop trading altogether or continue, based on the strategy's ability to withstand volatility.

  • 00:45:00 The speaker discusses automation in trading and how it works. They explain that automation is available for Indian markets through the Blueshift platform, which allows users to backtest strategies and paper trade or live trade through partnerships with various brokers. The speaker emphasizes the importance of having certain rules in trading and being able to test them through backtesting and paper trading, which uses virtual money to see how the strategy performs in the present market. The speaker also mentions that machine learning can be applied in trading and is supported by Blueshift for trading strategies.

  • 00:50:00 The speaker discusses the possibility of using automated trading on a mobile phone, noting that while mobile-based platforms are not as feature-rich as web-based platforms, automated trading may soon come to mobile phones as everything is moving towards being more cloud-based. The speaker suggests that beginners can start small and build their way up by learning more and having a trading rule or strategy in place. The speaker also mentions that Blueshift, a learning, backtesting, and trading platform, is completely free and can be used to try out trading strategies. Additionally, they address questions about the platform and note that more brokers will be added in the future. Finally, the speaker acknowledges a question about using any platform to auto-trade bitcoin.

  • 00:55:00 The speaker addresses the question of whether automated trading is supported by all brokers and clarifies that not all brokers offer support for automated trading, and users will need to verify if the platform being used supports it. The speaker notes that the industry is largely moving towards automated trading, with the majority of orders being placed with the help of automated trading systems. Regarding combining machine learning, neural networks, and AI for algo trading, the process involves training and testing data on the machine learning model, using the predicted output for algo trading. Lastly, the speaker responds to a question from a working professional and notes that automated trading can be used to help them focus on their job's demands by taking care of trading activities while minimizing screen time.

  • 01:00:00 If the goal is to automate your trading strategy, it is doable even for working professionals. However, it's important to periodically review the performance of the automated system because the scenario can change, and what worked earlier might not work now. While it is possible to create a trading strategy without learning Python or any coding language using various platforms, if you want to fine-tune or try out more advanced strategies, you might have to learn Python or other programming languages. Learning Python can be an added advantage, and you will find that it is not as difficult as people think. In any case, periodic review of performance is essential to modify the strategy accordingly.

  • 01:05:00 The speaker reminds the audience to fill out a survey for any unanswered questions and encourages them to take advantage of the limited time offer for a 70% discount and an additional 25% if enrolling in all courses. They express gratitude for the support and plan to continue webinars, asking for suggestions for future topics to plan better sessions. The speaker ends by wishing everyone a happy Holi and thanking the audience for attending the session.
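The RSI rules summarized in the 00:15:00 segment above can be sketched as a short Python example. This is an illustrative sketch on a synthetic price series, not the platform's actual code; the `rsi` and `rsi_signal` helpers and the prices are hypothetical:

```python
import numpy as np
import pandas as pd

def rsi(prices: pd.Series, period: int = 14) -> pd.Series:
    """RSI from simple rolling averages of gains and losses."""
    delta = prices.diff()
    avg_gain = delta.clip(lower=0).rolling(period).mean()
    avg_loss = (-delta.clip(upper=0)).rolling(period).mean()
    return 100 - 100 / (1 + avg_gain / avg_loss)

def rsi_signal(value: float) -> str:
    """Map an RSI reading to the three rules described in the webinar."""
    if value < 30:
        return "buy"   # oversold territory: price looks attractive
    if value > 70:
        return "sell"  # overbought territory: good time to exit
    return "hold"      # neutral zone: take no action

# Hypothetical oscillating price series, for illustration only
prices = pd.Series(100 + np.cumsum(np.sin(np.arange(40))))
latest = float(rsi(prices).iloc[-1])
print(latest, rsi_signal(latest))
```

In a real system the same signal function would be fed live RSI values and its output routed to order placement.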
How To Set Up Automated Trading
  • 2022.03.17
  • www.youtube.com
Automation is everywhere! We live in a world where you can obtain certain products which are manufactured without human supervision. Automation is transformi...
 

Quantitative Data Analysis Of Cryptocurrencies




In this informative session on quantitative data analysis for cryptocurrencies, the speaker, Udisha Alook, introduces herself as a quant researcher at QuantInsti, specializing in blockchain, Bitcoin, Ethereum, and Ripple. She highlights the importance of conducting due diligence before investing in cryptocurrencies and outlines the agenda for the session.

The speaker begins by providing an overview of cryptocurrencies, emphasizing that they are digital or virtual currencies secured by cryptography and lack a physical form. She explains that cryptocurrencies ensure security through cryptography, operate in a decentralized manner using blockchain technology, and eliminate the risk of double-spending.

Next, the speaker delves into the main topics to be covered in the session. She mentions that the session will explore the top cryptocurrencies, discuss where to obtain data on cryptocurrencies, and provide insights into trading in the cryptocurrency market. The speaker emphasizes that the central focus will be on analyzing data for the top cryptocurrencies.

Moving forward, the speaker introduces QuantInsti, an algorithmic trading education provider, and its offerings. She highlights the Executive Programme in Algorithmic Trading (EPAT), the Certificate in Sentiment Analysis and Alternative Data for Finance (CSAF), and the self-paced courses available under Quantra. Additionally, the speaker introduces Blueshift, a cloud-based platform for strategy development, research, backtesting, paper trading, and live trading.

Returning to the main topic of cryptocurrencies, the speaker discusses the top six cryptocurrencies based on their market capitalization and provides a brief overview of their functionalities. Bitcoin, the first and most widely known cryptocurrency, is mentioned as the only one currently adopted as legal tender by El Salvador. Ethereum, ranked second in terms of market capitalization, is highlighted for introducing smart contract functionality. Ripple, designed as an intermediate mechanism of exchange, is mentioned as the sixth cryptocurrency on the list. The speaker also introduces Binance Coin, which has transitioned to its own blockchain, and Tether and USD Coin, stable coins pegged to the US dollar that offer cryptocurrency functionality with the stability of fiat currencies.

Regarding data sources for cryptocurrencies, the speaker mentions CryptoWatch and CoinAPI as reliable sources of historical crypto data. She also provides a list of major global crypto trading platforms, including Binance, Coinbase, Etoro, Gemini, and Kraken.

Continuing with the session, the speaker compares the prices of various cryptocurrencies and illustrates their performance on a logarithmic scale. Bitcoin emerges as the dominant cryptocurrency in terms of price, followed by Ethereum and Binance Coin. Ripple is noted to have experienced a decline in performance, while stable coins remain stable due to their nature. The speaker further calculates cumulative returns, highlighting that Binance Coin has exhibited the highest returns, followed by Ethereum and Bitcoin. Volatility in the top four cryptocurrencies is described as fluctuating significantly, with spikes occurring during certain periods, whereas stable coins consistently maintain stability.
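The cumulative-return comparison described above can be reproduced with a few lines of pandas. The prices below are invented for illustration; the method, compounding daily percentage changes, is the standard one:

```python
import pandas as pd

# Hypothetical daily closing prices (illustrative numbers, not real data)
closes = pd.DataFrame({
    "BTC":  [100.0, 104.0, 102.0, 110.0],
    "USDT": [1.0, 1.0, 1.0, 1.0],
})

daily_returns = closes.pct_change().dropna()
# Compound the daily returns into a cumulative return series
cumulative = (1 + daily_returns).cumprod() - 1
total = cumulative.iloc[-1]  # total return over the window
print(total)
```

On these numbers the volatile asset ends the window up 10% while the stablecoin's cumulative return stays at zero, mirroring the stability noted above.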

The video then focuses on analyzing the volatility and associated risks of investing in cryptocurrencies. The speaker observes that cryptocurrency returns display high kurtosis, indicating the likelihood of extreme returns, both positive and negative. This is attributed to momentum-based trading, where investors tend to buy when prices are rising and panic sell when prices decline. Box plots of daily returns are presented to demonstrate the presence of numerous outliers, further supporting the notion that cryptocurrencies entail a significant level of risk. Stable coins, however, are noted to exhibit less volatility.
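The high-kurtosis observation can be illustrated with a small pandas sketch; the two return series below are invented to contrast a fat-tailed asset with a steadier one:

```python
import pandas as pd

# Hypothetical daily returns: mostly tiny moves plus two extreme days,
# mimicking the fat tails (high kurtosis) described above
fat_tailed = pd.Series([0.001, -0.002, 0.001, 0.000, -0.001, 0.15, -0.12,
                        0.002, -0.001, 0.001, 0.000, 0.001, -0.002, 0.001])
# A steadier series with no extreme days, for comparison
steady = pd.Series([0.001, -0.001] * 7)

# pandas .kurt() reports excess kurtosis (a normal distribution scores ~0);
# large positive values indicate that extreme returns are likely
print(fat_tailed.kurt(), steady.kurt())
```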

In the subsequent segment, the speaker examines the impact of removing outliers on the median values of popular cryptocurrencies such as Bitcoin, Ethereum, Binance Coin, Ripple, Tether, and USD Coin. Stable coins are highlighted as designed to maintain a value close to one US dollar, making them particularly attractive for many users. Ripple, on the other hand, is distinguished from other cryptocurrencies due to its permissioned blockchain designed for financial institutions. The ongoing SEC case against Ripple's founders is mentioned as a factor that has caused fluctuations and uncertainty for investors.

Moving on, the speaker groups the factors that influence cryptocurrencies into five major categories. These include the law of supply and demand, which impacts the scarcity and value of cryptocurrencies. The perception of value, driven by market sentiment and investor sentiment, also plays a significant role. Technological advancements, such as updates to blockchain protocols and improvements in scalability, can affect the performance of cryptocurrencies. Government regulations and policies, including legal frameworks and regulatory actions, have a considerable impact on the cryptocurrency market. Finally, market sentiment, shaped by media coverage, political events, and overall market trends, can greatly influence cryptocurrency prices.

The speaker explores the influence of media, political events, regulatory changes, and blockchain modifications on cryptocurrency prices. Positive or negative news coverage is highlighted as having a significant impact on cryptocurrency prices, as it can either encourage or deter people from investing. Endorsements of cryptocurrencies by reputable companies or individuals are also noted to increase their reliability and trustworthiness. Political events and regulatory changes, such as economic crises or government interventions, can influence investors' trust in traditional currency and drive them towards cryptocurrencies. The speaker mentions the high correlation between various cryptocurrencies, especially with Bitcoin as the dominant cryptocurrency. However, stable coins are observed to be uncorrelated with traditional cryptocurrencies, making them a unique asset class.
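The correlation pattern described here, with crypto returns moving together while stablecoins stay uncorrelated, can be sketched with simulated data. The return series below are randomly generated stand-ins, not real market data:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500

# Simulated daily returns (illustrative only): ETH tracks BTC with extra
# noise, while the stablecoin barely moves and is independent of both
btc = rng.normal(0, 0.04, n)
eth = 0.8 * btc + rng.normal(0, 0.02, n)
usdt = rng.normal(0, 0.0005, n)

returns = pd.DataFrame({"BTC": btc, "ETH": eth, "USDT": usdt})
corr = returns.corr()
print(corr.round(2))
```

The resulting matrix shows a strong BTC–ETH correlation and a near-zero BTC–USDT correlation, the same qualitative picture the speaker describes.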

The video further discusses the process of exchanging cryptocurrencies for fiat currency. It is explained that most exchanges support the trading of major cryptocurrencies such as Bitcoin and Ethereum. Therefore, it is often necessary to exchange altcoins for one of these top cryptocurrencies before converting them into fiat currency. The video also explores trading strategies suitable for cryptocurrencies, including momentum indicator-based strategies and arbitrage, taking advantage of the high volatility in the market. Coding examples using indicators like the Relative Strength Index, Moving Average Convergence Divergence, and the Awesome Oscillator are presented to illustrate momentum-based strategies.
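As one hedged illustration of such a momentum strategy, the sketch below builds a MACD line from exponential moving averages and holds a long position while it sits above its signal line. The price path is synthetic and the rule is deliberately minimal, not the session's exact code:

```python
import numpy as np
import pandas as pd

def macd(close: pd.Series, fast: int = 12, slow: int = 26, smooth: int = 9):
    """MACD line and signal line built from exponential moving averages."""
    macd_line = (close.ewm(span=fast, adjust=False).mean()
                 - close.ewm(span=slow, adjust=False).mean())
    signal_line = macd_line.ewm(span=smooth, adjust=False).mean()
    return macd_line, signal_line

# Hypothetical price path, for illustration only
close = pd.Series(100 + np.cumsum(np.sin(np.arange(120) / 5.0)))
macd_line, signal_line = macd(close)

# Simple momentum rule: be long (1) while MACD is above its signal line
position = (macd_line > signal_line).astype(int)
print(position.value_counts())
```

The RSI and Awesome Oscillator strategies mentioned above follow the same shape: compute the indicator, then map crossings or threshold breaches to positions.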

Towards the end of the session, the presenter recaps the main points covered and emphasizes the potential of stable coins for portfolio diversification due to their low volatility and lack of correlation with other cryptocurrencies. Additional resources for learning about algorithmic trading and cryptocurrency are provided, including free books and courses, as well as the Blueshift research and trading platform. The speaker mentions the Executive Programme in Algorithmic Trading, tailored for individuals interested in starting their own algorithmic trading desk or pursuing a career in algorithmic trading with mentorship from industry practitioners. The availability of early bird discounts for the program is also highlighted.

In the concluding portion, the speaker addresses several audience questions related to cryptocurrency and blockchain. The long-term viability of cryptocurrencies without regulatory backing is discussed, with the speaker highlighting that some countries have already passed laws regulating them, treating them as long-term investments. The growing acceptance and development of blockchain technology also contribute to people's comfort with cryptocurrencies. The future of decentralized finance (DeFi) is acknowledged as an evolving space with various concepts and types of arbitrage yet to be explored. The speaker emphasizes that crypto trading goes beyond data mining and technical indicators, underscoring the importance of understanding blockchain technology and its applications.

Furthermore, the potential impact of upcoming US regulations on the crypto market is discussed. The speaker acknowledges that the government could regulate blockchain in the US but highlights the challenge of controlling the decentralized nature of the technology. Therefore, while regulatory decisions may impact cryptocurrency prices, complete control over the market may be difficult to achieve. The minimum capital required for crypto trading and the potential use of cryptocurrencies in real-world transactions are also addressed. Finally, the rise of central bank digital currencies (CBDCs) and their potential impact on the decentralized nature of cryptocurrencies are briefly mentioned.

In the closing remarks, the speakers emphasize the increasing exploration of blockchain technology for solving problems such as identity issuance and supply chain management. They anticipate a high demand for blockchain developers in the future due to ongoing development in the field. The advantage of cryptocurrencies, such as their ability to be traded around the clock, is highlighted. The audience is encouraged to provide feedback and pose any unanswered questions for future discussions.

As the session concludes, the speaker summarizes the key takeaways, emphasizing the need for proper data analysis and quantitative techniques to navigate the high volatility of cryptocurrencies. Technical and quantitative analysis, along with backtesting, are highlighted as essential tools to mitigate risk. The speaker also addresses a question regarding the impact of geopolitical interventions on cryptocurrency markets, noting that government decisions do have an impact, but the decentralized nature of cryptocurrencies may lead people to turn to them in situations where trust in traditional currency or government is low. Lastly, the benefits of stable coins are emphasized, as they offer a more stable and predictable value compared to other cryptocurrencies, making them more suitable for everyday transactions.


The session conducted by Udisha Alook provides valuable insights into quantitative data analysis for cryptocurrencies. It emphasizes the importance of due diligence before investing, provides an overview of cryptocurrencies and their functionalities, explores data sources and trading platforms, analyzes price movements and volatility, discusses factors influencing cryptocurrency prices, and addresses audience questions related to regulations, trading strategies, and the future of cryptocurrencies. The session serves as a comprehensive introduction to quantitative analysis in the cryptocurrency market, equipping participants with the knowledge necessary to make informed investment decisions.

  • 00:00:00 The speaker introduces the topic of quantitative data analysis for cryptocurrencies. The session is conducted by Udisha Alook, who works as a quant researcher at Quant Institute and is an expert in blockchain, Bitcoin, Ethereum, and Ripple. The speaker emphasizes the importance of due diligence before investing in cryptocurrencies and explains the agenda for the session, which includes an overview of cryptocurrencies, top cryptocurrencies, and where to get data and trade in cryptocurrencies. The main part of the session focuses on analyzing the data for top cryptocurrencies.

  • 00:05:00 The video introduces QuantInsti, an algorithmic trading education provider, and its various offerings, including the Executive Programme in Algorithmic Trading (EPAT), the Certificate in Sentiment Analysis and Alternative Data for Finance (CSAF), and the self-paced courses under Quantra. Additionally, the video discusses Blueshift, a cloud-based strategy development platform for research, backtesting, paper trading, and live trading. The main topic of the video is cryptocurrencies, defined as digital or virtual currencies secured by cryptography, with no physical form. Cryptocurrencies are secure because they use cryptography, are decentralized through blockchain technology, and avoid double-spending.

  • 00:10:00 The speaker discusses the top six cryptocurrencies in terms of market capitalization and briefly explains their functionalities. Bitcoin is the first cryptocurrency and the only one that has been adopted as legal tender by El Salvador. Ethereum is second to Bitcoin in terms of market capitalization and introduced the smart contract functionality. Ripple, designed as an intermediate mechanism of exchange, is sixth on the list. Binance Coin, issued by the Binance exchange, has moved to its own blockchain. Tether and USD Coin, both stable coins pegged to the US dollar, offer the functionality of cryptocurrencies but the stability of fiat currencies. The speaker also mentions that there are good sources of historical crypto data, such as CryptoWatch and CoinAPI, and lists major global crypto trading platforms as Binance, Coinbase, Etoro, Gemini, and Kraken.

  • 00:15:00 The speaker compares the prices of various cryptocurrencies and shows how they perform on a logarithmic scale. Bitcoin dominates all other cryptocurrencies in terms of prices, followed by Ethereum and Binance Coin. Ripple has not been doing well, and stable coins remain stable due to their nature. The speaker then calculates the cumulative returns and shows that Binance Coin has the highest returns followed by Ethereum and Bitcoin. The volatility of the top four cryptocurrencies has been all over the place, with spikes in some periods, while stable coins remain stable.

  • 00:20:00 The video analyzes the volatility and risk associated with investing in cryptocurrencies. It observes that the returns of cryptocurrencies have high kurtosis, indicating that extreme returns, both positive and negative, can be expected. This is due to momentum-based trading, where investors tend to buy when prices are going up and panic sell when prices go down. The video also shows box plots of daily returns of cryptocurrencies, which have numerous outliers. This historical data suggests that cryptocurrencies are a risky investment, although stable coins are less risky.

  • 00:25:00 The speaker discusses how removing outliers affects the median value of popular cryptocurrencies such as Bitcoin, Ethereum, Binance Coin, Ripple, Tether, and USD Coin. Stable coins are created to maintain their value close to one US dollar, which is the main focus for most of these stable coins. Ripple, on the other hand, is distinguished from other cryptocurrencies because it uses a different kind of blockchain: a permissioned blockchain designed for financial institutions. The speaker also discusses how the ongoing SEC case against Ripple's founders has caused fluctuations and uncertainty for investors. Finally, the speaker groups the factors that affect cryptocurrencies into five major factors: the law of supply and demand, the perception of value, technological advancements, government regulations, and market sentiment.

  • 00:30:00 The influence of media, political events, regulatory changes, and blockchain modifications on cryptocurrency prices are discussed. It is noted that media has a significant impact on cryptocurrency prices as positive news can encourage people to buy while negative press can deter them. Additionally, the endorsement of cryptocurrencies by reputable companies or individuals can increase their reliability and trustworthiness. Political events and regulatory changes, such as the Greek crisis in 2015, can also influence investors' trust in governments and drive them towards cryptocurrency. The correlation between various cryptocurrencies, such as bitcoin and ethereum, is high, as most cryptocurrencies are blockchain-based and draw heavily from bitcoin. Finally, stable coins are observed to be uncorrelated with traditional cryptocurrencies.

  • 00:35:00 The video discusses the process of exchanging cryptocurrencies for fiat currency. Most exchanges only support the exchanging of major cryptocurrencies such as Bitcoin and Ethereum, making it necessary to first exchange altcoins for one of these top cryptocurrencies before exchanging for fiat currency. The video also explores trading strategies that may work well for cryptocurrencies, including momentum indicator-based strategies and arbitrage due to the volatility of cryptocurrencies. The video presents coding for momentum-based strategies using indicators such as the Relative Strength Index, Moving Average Convergence Divergence, and the Awesome Oscillator.

  • 00:40:00 The presenter recaps the main points covered in the video and emphasizes the potential for stable coins to be good candidates for portfolio diversification due to their low volatility and lack of correlation with other cryptocurrencies. The presenter also provides additional resources for those interested in learning more about algorithmic trading and cryptocurrency, including free books and courses, as well as a research and trading platform called Blueshift. The section ends with a discussion of the Executive Programme in Algorithmic Trading, which is designed for individuals who want to start their own algo trading desk or develop a career in algorithmic trading with mentorship from industry practitioners. Early bird discounts are currently available.

  • 00:45:00 The speaker discusses several questions related to cryptocurrency and blockchain. When asked about the long-term viability of cryptocurrencies without regulatory backing, the speaker notes that some countries, such as Malta, have already passed laws regulating them and treating them as long-term investments. Blockchain technology has also grown and gained acceptance in recent years, making people more comfortable with cryptocurrencies. The speaker believes that it may be difficult to control cryptocurrencies, but governments and regulators are taking steps to regulate them. When asked about the future of decentralized finance, the speaker acknowledges that it is catching up, but there are still different types of arbitrage and other concepts to consider. Finally, when asked about crypto trading, the speaker mentions that it is not just about data mining and technical indicators but also involves understanding blockchain technology and its uses.

  • 00:50:00 The speaker discusses the importance of conducting proper data analysis before investing in cryptocurrencies due to their high volatility. She emphasizes the use of technical and quantitative analysis, as well as backtesting, to mitigate risk. The speaker also addresses a question regarding the impact of geopolitical interventions on cryptocurrency markets, emphasizing that government decisions do have an impact, but the decentralized nature of cryptocurrencies means that people may turn to them if their trust in traditional currency or government is lower. Lastly, the speaker discusses the benefits of stable coins, which alleviate some of the volatility associated with cryptocurrencies, making them more useful in daily transactions.

  • 00:55:00 The speaker discusses the potential impact of upcoming US regulations on the crypto market. While it's true that the government could regulate blockchain in the US, it may prove challenging to control the decentralized nature of the technology. As a result, the government's decision on crypto regulation could impact the price but may not necessarily control it completely. The speaker also touches on questions regarding the minimum capital required to trade crypto and the potential for cryptocurrencies to be used in real-world transactions. Finally, the speaker talks about the rise of central bank digital currencies and the possible impact on the decentralized nature of cryptocurrencies.

  • 01:00:00 The speakers discuss the increasing exploration of blockchain technology for solving problems such as identity issuance and supply chain management. They believe that there is still a lot of development and work to be done in the blockchain space and that there will be a good demand for blockchain developers. Cryptocurrencies can be traded around the clock, which is one of their advantages. The speakers also encourage the audience to provide feedback on the session and mention any unanswered questions, which they will aim to answer in the future.
Quantitative Data Analysis Of Cryptocurrencies
  • 2022.02.24
  • www.youtube.com
There has been a lot of buzz about crypto these days.In this webinar, the speaker will,- Explore what cryptocurrencies are,- Touch upon the top cryptocurrenc...
 

Hands-On Introduction To Quantitative Trading | Yale School of Management


In the seminar on introductory quantitative trading, the speaker delves into the creation, evaluation, and deployment of trading algorithms using code examples. The session begins by introducing the concept of quantitative trading, which involves using mathematical and statistical models to identify trading opportunities and execute trades. Various types of quantitative trading strategies are explained, including momentum trading, mean reversion trading systems, mathematical models, high-frequency trading, and news-based trading systems. The speaker emphasizes that algorithms are not only used for trading but also for market-making and taking advantage of price inefficiencies to generate profit.

The basic structure of a quantitative trading system is then explained. It includes data collection, the creation of a trading strategy, backtesting, execution, and risk management. Price, fundamental, economic, and news data are commonly used for trading algorithms. Technical, statistical, and mathematical analysis can be employed to design trading rules for the strategy. Backtesting involves testing the rules on historical data to evaluate their performance. Execution can be manual or automatic, and risk management is crucial for capital allocation and setting risk parameters such as stop loss. The speaker provides live examples of quantitative trading strategies to illustrate these concepts.
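The pipeline described above (data collection, a rule-based strategy, and a backtest) can be sketched end to end in Python. Everything below is a simplified stand-in: the data is a simulated random walk, the 20-bar rule is only a placeholder strategy, and execution and risk management are noted in a comment rather than implemented:

```python
import numpy as np
import pandas as pd

def collect_data() -> pd.DataFrame:
    """Data collection stand-in: hypothetical prices instead of a live feed."""
    rng = np.random.default_rng(1)
    return pd.DataFrame({"close": 100 + np.cumsum(rng.normal(0, 1, 250))})

def strategy(data: pd.DataFrame) -> pd.Series:
    """Rule-based signal: long (1) while price is above its 20-bar mean."""
    return (data["close"] > data["close"].rolling(20).mean()).astype(int)

def backtest(data: pd.DataFrame, signal: pd.Series) -> float:
    """Total return from trading the signal on historical data; yesterday's
    signal is applied to today's return, so there is no look-ahead."""
    rets = data["close"].pct_change().fillna(0) * signal.shift(1).fillna(0)
    return float((1 + rets).prod() - 1)

# Execution and risk management (order routing, stop losses, capital limits)
# would follow the backtest in a live system.
data = collect_data()
total_return = backtest(data, strategy(data))
print(f"{total_return:.1%}")
```

Swapping in real price data and real trading rules changes only the first two functions; the structure stays the same.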

The trend-based strategy is highlighted, and technical indicators such as the exponential moving average (EMA), parabolic SAR, and stochastic oscillator are used to design the algorithm. The Quantra platform is introduced, which offers video tutorials, interactive exercises, and practical exposure without requiring software installation. Python modules are imported to assist in creating the algorithm, and data is imported from a CSV file to define trading rules and monitor strategy performance. The TA-Lib Python module is utilized to set the parameters for the technical indicators, simplifying the design process.
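
For illustration, the EMA recursion that indicator libraries implement can be written out by hand. Seeding with the first price, as below, is one common convention (TA-Lib itself seeds with a simple moving average, so its first values differ slightly):

```python
def ema(prices, span):
    """Exponential moving average with smoothing alpha = 2 / (span + 1)."""
    alpha = 2 / (span + 1)
    out = [prices[0]]                        # seed with the first price
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

print(ema([10, 11, 12, 13], span=3)[-1])  # → 12.125
```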

The instructor explains how to define trading rules and generate trading signals using technical indicators such as EMA, Stochastic fast, and Stochastic slow oscillators. Five trading conditions are outlined for generating buy signals, and trading rules for short positions are also designed. The next step is to backtest the strategy using a Python notebook to assess its practical performance. The plot of strategy returns demonstrates that the algorithm initially incurred losses but gained momentum from 2018, ultimately generating a profit by the end of the testing period. BlueShift, a platform that enables research, construction, and backtesting of algorithms with ease, is introduced.
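
The summary does not reproduce the five exact conditions, so the sketch below uses hypothetical indicator thresholds purely to show the general pattern of combining several conditions into a long/short/flat signal:

```python
# Hypothetical signal logic: all conditions in a group must hold.
# The thresholds (80/20 stochastic bands, etc.) are illustrative only.

def signal(close, ema_fast, ema_slow, stoch_fast, stoch_slow):
    """Return +1 (long), -1 (short), or 0 (no trade)."""
    long_conds = [
        ema_fast > ema_slow,        # trend is up
        close > ema_fast,           # price above the fast EMA
        stoch_fast > stoch_slow,    # momentum turning up
        stoch_fast < 80,            # not yet overbought
    ]
    short_conds = [ema_fast < ema_slow, close < ema_fast,
                   stoch_fast < stoch_slow, stoch_fast > 20]
    if all(long_conds):
        return 1
    if all(short_conds):
        return -1
    return 0

print(signal(close=105, ema_fast=104, ema_slow=102,
             stoch_fast=60, stoch_slow=55))  # → 1 (buy signal)
```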

A demonstration of backtesting on Bank of America stock using the BlueShift platform follows. The platform provides data maintenance and a simple line of code for importing data into Python. Indicators and trading rules are defined, and trades are executed automatically based on the fulfillment of long and short conditions. The backtest is conducted from January 2020 to October 2021 with a capital of $10,000, and the performance is compared to the S&P 500 benchmark. The results reveal a 113% return on investment. Detailed backtest results can be obtained to analyze monthly returns, trades executed, and margin used, facilitating better trading decisions.

The speaker demonstrates how to access comprehensive backtest results on the BlueShift platform, including visual representations of performance metrics such as algorithm returns and monthly returns heat maps. The positions taken by the algorithm are analyzed, and key metrics such as total profit from long and short sides are examined. Risk parameters and order limits can be configured before deploying the strategy in real-time, either through paper trading or with real capital.
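
One metric such dashboards report, maximum drawdown, can be computed directly from an equity curve; the values below are illustrative:

```python
def max_drawdown(equity):
    """Largest peak-to-trough decline of an equity curve, as a fraction."""
    peak, worst = equity[0], 0.0
    for value in equity:
        peak = max(peak, value)                  # running high-water mark
        worst = max(worst, (peak - value) / peak)
    return worst

print(max_drawdown([100, 120, 90, 110, 130]))  # → 0.25 (25% drawdown)
```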

The process of selecting a broker and specifying capital and algorithm parameters for paper trading using the BlueShift trading platform is explained. Users can choose from various options such as Alpaca for US equities, OANDA for forex, and Master Trust for trading in Indian markets. The speaker demonstrates how BlueShift is used to specify the risk matrix with a drawdown limit of 30% and order and size limits of 1,000 and 10,000, respectively. Users have the flexibility to opt for auto-execution or the one-click confirmation method based on their preference. Once the user clicks on confirm, the algorithm starts running, and BlueShift establishes a connection with the Alpaca paper trading account. The dashboard continuously updates trading capital, trades, positions, and other relevant information in real-time.

The speaker highlights two products essential for quantitative trading: Quandl and BlueShift. Quandl is utilized to obtain data from various sources, including stock prices, cryptocurrencies, news, and social media. The course explains how to access fundamental reports or extract social media data into trading systems using APIs. BlueShift, the second product, is used for designing and testing strategies, employing econometric models and time series analysis. The course provides examples and code for various trading strategies such as mean reversion trading strategies, momentum trading strategies, and day trading strategies. Additionally, the course covers "Portfolio Management using Machine Learning: Hierarchical Risk Parity" to facilitate portfolio management and risk control using machine learning methods. BlueShift enables backtesting of trading strategies on a wide range of datasets.

The availability of different datasets for practicing quantitative trading is discussed, encompassing US equities, cryptocurrencies, forex, Indian equities, and property data. Cloud-based and desktop-based deployments are explained, with cloud-based execution being handled by the broker. Desktop-based integration can be achieved using IBridgePy software, which connects to brokers like Interactive Brokers or eTrade. The students attending the session are offered a code for a 60% discount on all courses available on the Quantra website. The website offers courses suitable for beginners, intermediate traders, and advanced traders, covering a wide range of concepts such as neural networks, natural language processing (NLP), momentum strategies, options, futures, and pairs trading.

  • 00:00:00 A seminar on introductory quantitative trading is discussed, covering the creation, evaluation, and deployment of trading algorithms using code examples. The session introduces the concepts of quantitative trading, including the use of mathematical and statistical models to identify trading opportunities and execute trades. Various types of quantitative trading strategies are explained, such as momentum trading, mean reversion trading systems, mathematical models, high-frequency trading, and news-based trading systems. Lastly, it is noted that algorithms are also used in market-making and to take advantage of inefficiencies in prices to make profit.

  • 00:05:00 The speaker explains the basic structure of a quantitative trading system, which includes data collection, creation of a trading strategy, backtesting, execution, and risk management. The most commonly used data for trading algorithms are price, fundamental, economic, and news data. Technical, statistical, and mathematical analysis can be used to design trading rules for the strategy. In backtesting, the rules are tested on historical data to evaluate their performance. The execution can be manual or automatic, and risk management helps with capital allocation and setting risk parameters like stop loss. The speaker also provides live examples of quantitative trading strategies.

  • 00:10:00 The speaker discusses the trend-based strategy used in quantitative trading and how it can be designed using technical indicators such as the exponential moving average, parabolic SAR, and stochastic oscillator on the Quantra platform. The platform offers video tutorials, interactive exercises, and practical exposure without requiring the user to install any software. The speaker imports Python modules to help create the algorithm and imports data from a CSV file which is used to define trading rules and monitor strategy performance. The technical indicator parameters are set using the TA-Lib Python module, which eases the design of these indicators.

  • 00:15:00 The instructor explains how to define trading rules and generate trading signals using technical indicators like the EMA, Stochastic fast, and Stochastic slow oscillators. They outline five trading conditions that need to be met to generate a buy signal and also design trading rules for short positions. The next step is to backtest the strategy to see how well it performs in practice, which they do using a Python notebook. The plot of strategy returns shows that the algorithm made a loss at the beginning of 2017 but picked up from 2018 and generated a profit by the end of the testing period. They also introduce BlueShift, a platform that allows users to research, construct, and backtest algorithms with just a click of a button.

  • 00:20:00 We see a demonstration of backtesting on the Bank of America stock using the BlueShift platform. The platform provides data maintenance and a simple line of code for importing data into Python. Indicators and trading rules are defined and trades are taken automatically based on long and short conditions being met. The backtest is run from January 2020 to October 2021 with a capital of $10,000 and performance is compared to the S&P 500 benchmark. The results show a 113% return on investment. A more in-depth backtest can be run to obtain details about monthly returns, trades taken, and margin used, allowing for better trading decisions.

  • 00:25:00 The speaker demonstrates how to access the full backtest results on the BlueShift platform, including visual representations of performance metrics such as algorithm returns and monthly returns heat maps. They also explain how to analyze the positions taken by the algorithm and examine key metrics such as total profit made from long and short sides. The speaker then shows how to configure risk parameters and order limits before deploying the strategy in real-time, either through paper trading or with real capital.

  • 00:30:00 The speaker explains how to select a broker and specify the capital and algorithm parameters for paper trading using the BlueShift trading platform. The user can select from various options such as Alpaca for US equities, OANDA for forex, and Master Trust for trading in Indian markets. The speaker demonstrates how to use BlueShift to specify the risk matrix with a drawdown limit of 30% and order and size limits of 1,000 and 10,000, respectively. Users have the option of auto-execution or the one-click confirmation method based on their preference. The algorithm starts running once the user clicks on confirm and BlueShift connects to the Alpaca paper trading account. The dashboard displays the trading capital, trades, and positions, updating in real time.

  • 00:35:00 The speaker discusses the two products to be used for quantitative trading, Quandl and BlueShift. Quandl is utilized to obtain data from various sources ranging from stock prices and cryptocurrency to news and social media. The course describes how to access fundamental reports or extract social media data into trading systems using APIs. The second product, BlueShift, is for designing and testing the strategies, using econometric models and time series analysis. The course offers examples and code for various trading strategies like mean reversion trading strategies, momentum trading strategies, and day trading strategies. Additionally, to perform portfolio management and control risk, they offer "Portfolio Management using Machine Learning: Hierarchical Risk Parity" using machine learning methods. BlueShift enables backtesting of the trading strategies on a wide range of datasets.

  • 00:40:00 The speaker discusses the different datasets available for practicing quantitative trading, including US equities, cryptocurrencies, forex, Indian equities, and property data. There are two types of deployments available, cloud-based and desktop-based, with cloud-based execution being taken care of by the broker. The desktop-based integration can be done using IBridgePy software to connect to brokers like Interactive Brokers or eTrade. The students attending the session are given a code for a 60% discount on all courses available on the Quantra website, which offers courses suitable for beginner, intermediate, and advanced traders and covers a wide range of concepts like neural networks, NLP, momentum strategies, options, futures, and pairs trading.
Hands-On Introduction To Quantitative Trading | Yale School of Management
  • 2022.02.18
  • www.youtube.com
This is a 60-min session that introduces you to the world of quantitative trading. It covers the components of quantitative trading and explains the process ...
 

Predict Daily Stock Prices And Automate A Day Trading Strategy



In the introductory webinar, the host introduces the main topic of the session, which is predicting daily stock prices and automating a day trading strategy. The session includes two project presentations. The first presentation is by Renato Otto from the UK, who discusses predicting daily stock prices using a random forest classifier, technical indicators, and sentiment data. Renato Otto is introduced as an experienced individual involved in the development of software and tools for quantitative analysis and systematic identification of market manipulation in the UK energy market.

Renato Otto shares the motivation behind completing the project, explaining that it was an opportunity to consolidate his knowledge in Python programming, data engineering, and machine learning into an end-to-end project. The project aimed to improve his skills and explore the power of machine learning and natural language processing in trading. Additionally, the goal was to create something reusable for others to use in their own analysis or strategy implementations. The project involves nine steps, starting with defining the analysis details in a dictionary and initializing a pipeline. The program then runs to obtain the dataset required for backtesting calculations. The presenter emphasizes the importance of testing the program's usability and ensuring the reliability of the final figures.

The speaker explains the methods involved in backtesting a day trading strategy. They discuss the back-test strategy class, which consists of various methods for data pre-processing, model training and testing, and strategy performance analysis. The output of the backtesting process includes tables and plots that show return on investment, Sharpe ratio, maximum drawdown, and other relevant parameters. While backtesting helps determine the potential profitability of the strategy, the speaker cautions that it simplifies certain aspects that may not hold true in live trading. The speaker mentions the latest improvement to the program, which involves updating the parameters to reflect real trading conditions, including transaction fees and account size.
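
Two of the reported metrics, return on investment and the Sharpe ratio, have straightforward definitions. The sketch below uses illustrative numbers, not the project's actual results:

```python
from statistics import mean, stdev

def sharpe_ratio(daily_returns, periods=252):
    """Annualised Sharpe ratio of daily returns (risk-free rate taken as 0)."""
    return mean(daily_returns) / stdev(daily_returns) * periods ** 0.5

def roi(equity):
    """Return on investment over the whole equity curve."""
    return equity[-1] / equity[0] - 1

print(roi([10_000, 15_000]))  # → 0.5, i.e. a 50% return
```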

During the presentation, the speaker also discusses the challenges faced during the development of the program. One challenge was implementing an interactive menu that prompts users to input data, which required extra thinking and development effort. However, the speaker states that it was worth it as it made the program more user-friendly. Other challenges included finding solutions for performance metrics computation and maintaining a work-life balance. To overcome these challenges, the presenter recommends strategies such as drawing diagrams, writing comments as a stepping stone to code, taking breaks, conducting online searches, and consolidating knowledge. The presenter also highlights the achievements gained through the project, such as consolidating knowledge in quantitative finance and programming skills, gaining confidence in managing a project from start to finish, and demonstrating the power of machine learning in predicting stock prices.

The speaker discusses their plans for future projects after completing the current one. They mention their intention to study new strategies with different assets, expand their knowledge through their blog and interactions with other enthusiasts, research new strategies and machine learning models, and eventually implement profitable strategies in live trading. The speaker shares their contact information for further questions or inquiries about the project. The audience asks several questions, including the number of late nights spent on the project and whether the program can be used for cryptocurrency trading.

Regarding the data used for the project, the creator explains that they trained the model using daily Tesla prices since the inception of the company in 2009. The training process took five months, and the model was tested for a couple of years. In terms of risk reduction, the creator mentions that there isn't much that can be done on a machine learning model to reduce risk, but they assessed a reasonable amount of trades to ensure that most of them were profitable. The creator also answers questions about the time frame for predicting prices and the need for a high-powered PC for training the model.
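
As a hedged illustration of the data-preparation step for such a classifier (the actual project also used technical indicators and sentiment data, which are omitted here), lagged daily returns can serve as features and the next day's direction as the label:

```python
# Hypothetical feature/label construction for a daily direction classifier.
# The resulting X, y pairs would be fed to e.g. a random forest classifier.

def make_dataset(prices, n_lags=3):
    """Features: the last n_lags daily returns; label: 1 if the next return is positive."""
    returns = [prices[i] / prices[i - 1] - 1 for i in range(1, len(prices))]
    X, y = [], []
    for i in range(n_lags, len(returns)):
        X.append(returns[i - n_lags:i])          # trailing window of returns
        y.append(1 if returns[i] > 0 else 0)     # 1 = day closed up
    return X, y

X, y = make_dataset([100, 102, 101, 103, 104, 103, 105])
print(len(X), y)  # → 3 [1, 0, 1]
```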

The speaker explains the process of training a model and discusses the advantages of algorithmic trading over discretionary systems. They mention that it is possible to train a model using a computer without a GPU, although it may take several hours to arrive at a working model. However, they advise against relying on this approach regularly. When discussing the benefits of algorithmic trading, the speaker emphasizes the statistical confidence in most trades being profitable, making it more lucrative compared to discretionary trading. Lastly, the speaker expresses their expectations from the EPAT program, stating that it provided them with the fundamentals to understand algorithmic trading and the necessary tools to choose their specialization.

Next, the second speaker, Usual Agrawal from India, is introduced as a quantitative trader and business owner. Agrawal shares their experience of trading in the Indian markets for the past four years and the challenges they faced while managing their business alongside full-time trading. To overcome these challenges, Agrawal decided to automate their trading setups with the help of the EPAT course and the unconditional support from the QuantInsti team. In their presentation, Agrawal showcases their fully automated trading setup called "Intraday Straddles," which combines uncorrelated setups to generate decent returns with minimum drawdowns. They discuss their approach to data collection, backtesting, front testing, deployment, and performance evaluation of their trading strategy.

During the presentation, the speaker dives into the details of the data, systems, and parameters used to backtest their day trading strategy. Their strategy involves creating straddles and strangles for the Nifty and Bank Nifty futures and options data using a one-minute timeframe. The speaker used two years' worth of data from March 2019 to March 2021, which covered both a low volatility period and the COVID-19 pandemic. They explain the different classes utilized for backtesting and the parameters tested, including variations in stop loss levels. Finally, the speaker presents the results of the backtesting process.
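
For readers unfamiliar with the instrument: a short straddle sells a call and a put at the same strike and keeps the premium as long as the index stays near that strike. The sketch below shows the payoff at expiry with purely illustrative numbers (not figures from the presentation):

```python
def short_straddle_pnl(spot_at_expiry, strike, premium_received):
    """PnL per unit of a short straddle: premium kept minus in-the-money payout."""
    call_payout = max(spot_at_expiry - strike, 0)
    put_payout = max(strike - spot_at_expiry, 0)
    return premium_received - call_payout - put_payout

# Profitable while the index stays near the strike ...
print(short_straddle_pnl(spot_at_expiry=17_950, strike=18_000,
                         premium_received=250))  # → 200
# ... but losses grow with any sharp move, which is why stop loss levels are tested.
print(short_straddle_pnl(spot_at_expiry=18_400, strike=18_000,
                         premium_received=250))  # → -150
```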

The presenter proceeds to discuss the outcomes of their backtesting and front testing of the day trading strategy. During the backtesting phase, they achieved a net return of 3.15 lakhs, equivalent to a 52.9% annual return. The hit ratio was calculated both normally and normalized, with the latter providing a more realistic picture. The Sharpe ratio was determined to be 3.78, and the equity curve received good support from a three-month simple moving average. However, during the front testing phase, the strategy did not perform as expected, earning only 70,000 rupees in 11 months, which corresponds to a 25% annual return. The equity curve remained flat, indicating that the strategy may not be performing well currently and requires further analysis. The presenter also shares the key challenges faced and lessons learned throughout the project, with major difficulties arising during data collection.

The speaker discusses some of the challenges encountered while developing the day trading strategy. One major obstacle was obtaining reliable intraday options data, which necessitated purchasing it from third-party vendors. Another challenge was the potential sampling bias due to focusing solely on the last two years of data, which might not accurately represent the overall performance of the strategy. Additionally, the speaker notes an overcrowding effect in the market, with many traders employing similar strategies. The speaker explains their decision to develop the strategy independently, allowing for custom adjustments. Finally, ongoing assessments of the strategy and efforts to diversify it for improved efficiency are highlighted.

The speaker addresses audience questions, including whether the program is executed manually or automated using cloud platforms, and how they selected the stocks for selling straddles and the typical stop-loss distance relative to the premium. The strategy applies only to the Nifty index and the Bank Nifty index due to liquidity issues, and the speaker cleans the data through trial and error, rectifying format changes and removing days with data errors.

The speaker answers two additional questions related to their day trading strategy. They discuss the stop loss percentage used for testing and the challenges they faced in programming without a background in computer engineering. They explain how they overcame these challenges with the help of the EPAT program and mentorship from QuantInsti. Furthermore, the speaker offers advice to aspiring quants and algorithmic traders, emphasizing the importance of exercising caution and implementing proper risk management when applying any trading strategy in practice.

The speaker highlights the significance of diversifying trading strategies and how it can help navigate drawdown phases in one strategy while others continue to perform well. They emphasize the need for thorough testing and spending time with each strategy to learn its nuances and effectively combine them. It is important to note that the information shared during the session is not intended as trading advice.

The host concludes the webinar by expressing gratitude to the speaker, Agrawal, for sharing their project and experiences. They inform the audience that the session recording will be available on their YouTube channel and that participants will receive an email containing necessary codes and GitHub links related to the discussed strategies. The host looks forward to hosting more interesting sessions in the upcoming months, further enriching the knowledge and understanding of the audience.

The webinar provided valuable insights into predicting daily stock prices and automating day trading strategies. The first presentation by Renato Otto focused on predicting stock prices using a random forest classifier, technical indicators, and sentiment data. The second presentation by Usual Agrawal showcased their fully automated trading setup, "Intraday Straddles," which combined uncorrelated setups to generate returns with minimum drawdowns. Both presenters shared their challenges, achievements, and learnings, offering valuable lessons to the audience. The webinar served as a platform to explore the power of machine learning and natural language processing in trading and provided a glimpse into the exciting world of algorithmic trading.

  • 00:00:00 The host introduces the topic of the webinar which is to predict daily stock prices and automate a day trading strategy. Two project presentations will be given, the first being on predicting daily stock prices with random forest classifier technical indicators and sentiment data, presented by Renato Otto from the UK, and the second being on how to automate an option day trading strategy, presented by Usual Agrawal from India. The host introduces Renato Otto and gives a brief background on him, including his experience and involvement in the development of software and tools for quantitative analysis and systematic identification of market manipulation in the UK energy market.

  • 00:05:00 The presenter discusses their motivation behind completing a project that involves predicting daily stock prices and automating a day trading strategy. They wanted to consolidate their knowledge in Python programming, data engineering, and machine learning into an end-to-end project that would improve their skills and explore the power of machine learning and natural language processing in trading. Additionally, they aimed to build something reusable for others to use in their own analysis or implementing strategies. The program involves nine steps, starting with providing details in a dictionary to define the analysis, followed by initializing a pipeline and running the program to obtain the data set for backtesting calculations. The presenter touches on the importance of testing the program's usability and ensuring that the figures at the end are reliable.

  • 00:10:00 The speaker explains the different methods involved in back-testing a day trading strategy. The back-test strategy class consists of several methods that can pre-process data, train and test models, and analyze the performance of the strategy. The output comprises tables and plots showing the return on investment, the Sharpe ratio, and the maximum drawdown, among other parameters. While the back-testing method is useful for determining the potential profitability of the strategy, the speaker cautions that it makes several simplifications that may not hold true for live trading. The latest improvement to the program includes updating the parameters to include transaction fees and account size to reflect real trading conditions.

  • 00:15:00 The presenter discusses the challenges that he encountered while developing the program for predicting daily stock prices and automating day trading. One of the challenges was the complexity of implementing an interactive menu that prompts users to input data. This required extra thinking and development, but it was worth it in the end because the program is user-friendly. Other challenges included finding solutions for performance metrics computation and maintaining work-life balance. To overcome these challenges, the presenter recommends drawing diagrams, writing comments as a stepping stone to actual code, taking breaks, Googling problems, and consolidating knowledge. The presenter also discusses the achievements gained through this project, such as consolidating knowledge in quantitative finance and programming skills, gaining confidence in managing a project from start to finish, and demonstrating how machine learning can be powerful in predicting the next day's stock price.

  • 00:20:00 The speaker discusses his plans for future projects after completing the current project on predicting daily stock prices and automating a day trading strategy. He mentions studying new strategies with different assets, expanding knowledge with other enthusiasts through his blog, researching new strategies and machine learning models, and eventually implementing profitable strategies in a live trading setting. Additionally, the speaker shares his contact information for those who want to ask questions or learn more about the project. The audience also asks several questions, including how many late nights the speaker had during the project and whether the program can be used in cryptocurrency.

  • 00:25:00 The creator used daily Tesla prices since the inception of the company in 2009 to train the model. The training process took five months, and the model was tested for a couple of years. Regarding reducing risk, the creator mentioned that there isn't much one can do on a machine learning model to reduce risk. Still, they assessed a reasonable or acceptable amount of trades to ensure that most of them were profitable. The creator also answered questions relating to the time frame for predicting prices and the need for a high-powered PC for training the model.

  • 00:30:00 The speaker discusses the process of training a model and the advantages of algorithmic trading over discretionary systems. He explains that it is possible to train a model using a computer without a GPU, and it may take several hours to arrive at a model that works. He notes that it is feasible to do this once, but not recommended for regular use. When asked about the benefits of algorithmic trading, the speaker states that there is a statistical confidence in most trades being profitable, making it more profitable than discretionary trading. Finally, the speaker shares his expectations from the EPAT program, stating that it provided him with the fundamentals to understand algo trading and the instruments to choose his specialization.

  • 00:35:00 The second speaker of the video, Usual Agrawal, is introduced as a quantitative trader and business owner from India. Agrawal has been trading in the Indian markets for the past four years and faced difficulties managing his business while trading full-time. This led him to automate his trading setups with the help of the EPAT course and unconditional support from the QuantInsti team. During the second presentation, Agrawal showcases his fully automated trading setup "Intraday Straddles," which combines uncorrelated setups to generate decent returns with minimum drawdowns. He also describes his approach to data collection, backtesting, front testing, deployment, and performance evaluation of his trading strategy.

  • 00:40:00 The speaker discusses the data, systems, and parameters used to backtest a basic day trading strategy that involves creating straddles and strangles for the Nifty and Bank Nifty futures and options data using a one-minute timeframe. The speaker used two years' worth of data from March 2019 to March 2021, which included both a low volatility period and the COVID-19 pandemic. The speaker then goes on to explain the different classes used for backtesting and the parameters tested, including varying stop loss levels. Finally, the speaker presents the results of the backtesting.

  • 00:45:00 The presenter discusses the results of their backtesting and front testing of a day trading strategy. In the backtesting phase, they earned a net return of 3.15 lakhs, which translates to a 52.9% annual return. The hit ratio was calculated both normally and normalized, with the latter giving a more realistic picture. The Sharpe ratio was 3.78 and the equity curve had good support from a three-month simple moving average. However, during the front testing phase, the strategy did not perform as expected, earning only 70,000 rupees in 11 months, which comes to a 25% annual return. The equity curve was flat, indicating that the strategy may not be performing well currently and needs to be analyzed. The presenter also shares their key challenges and learnings during this project, with major problems arising during data collection.

  • 00:50:00 The speaker discusses some of the challenges faced while developing a day trading strategy. One major issue was obtaining reliable intraday options data, which required purchasing it from third-party vendors. Another challenge was sampling bias, as the analysis only focused on the last two years of data, which may not accurately represent the overall performance of the strategy. Additionally, the speaker noted an overcrowding effect in the market as many traders are employing similar strategies. The speaker then shares the reason for choosing to develop the strategy independently, which allowed for custom adjustments. Finally, the speaker discusses ongoing assessments of the strategy and efforts to diversify it for greater efficiency.

  • 00:55:00 The speaker answers audience questions, including whether the program is executed manually or automated using cloud platforms, and how they chose which stock to sell straddle and how far the typical stop-loss was relative to the premium. The strategy only applies to the Nifty index and the Bank Nifty index due to liquidity issues, and the speaker cleans the data through trial and error, rectifying format changes and removing days with data errors.

  • 01:00:00 The speaker answers two questions about their day trading strategy, including the stop loss percentage they used for testing and the challenges they faced in programming without a background in computer engineering. They discuss how they overcame these challenges with the help of the EPAT program and mentorship from QuantInsti. The speaker also gives advice to aspiring quants and algorithmic traders, emphasizing that although the strategy presented may seem simple, it is important to exercise caution and proper risk management when applying it in practice.

  • 01:05:00 The speaker discusses the importance of diversifying trading strategies and how it can help when one strategy is in a drawdown phase while the others are performing well. He emphasizes the need to test and spend time with the strategies in order to learn and combine them in a way that works best. He provides a disclaimer that this is not trading advice, and expresses gratitude to Vishal for sharing their project and experiences. The session recording will be available on their YouTube channel and participants will receive an email with necessary codes and GitHub links. The host looks forward to more interesting sessions in the upcoming months.
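The backtest metrics quoted in this summary (hit ratio, Sharpe ratio) can be computed from a daily returns series roughly as follows. This is a generic sketch, not the presenter's code, and the randomly generated sample data is purely illustrative.

```python
import numpy as np

def hit_ratio(pnl):
    """Fraction of trading days with positive P&L."""
    pnl = np.asarray(pnl, dtype=float)
    return (pnl > 0).mean()

def sharpe_ratio(daily_returns, trading_days=252):
    """Annualized Sharpe ratio from daily returns (risk-free rate assumed 0)."""
    r = np.asarray(daily_returns, dtype=float)
    return np.sqrt(trading_days) * r.mean() / r.std(ddof=1)

# Illustrative daily returns of a hypothetical strategy
rng = np.random.default_rng(0)
rets = rng.normal(0.001, 0.01, 252)
print(round(hit_ratio(rets), 2), round(sharpe_ratio(rets), 2))
```

A "normalized" hit ratio, as the presenter mentions, would typically weight wins and losses by their size rather than counting days, which gives a more realistic picture when losses are larger than gains.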
Predict Daily Stock Prices And Automate A Day Trading Strategy
  • 2022.02.08
  • www.youtube.com
This session has project presentations by two of our esteemed EPAT alumni. First on “Predict daily stock prices with random forest classifier, technical indi...
 

Implementing Pricing Model and Dynamic Asset Allocation: Algo Trading Project Webinar




During the webinar, the presenter introduces the first speaker, Evgeny Teshkin, a senior quantitative analyst from Russia. Teshkin presents his project on implementing a pricing model using Kalman filtering adaptive to market regimes. He explains that the project serves as an educational example of how to use online machine learning techniques in strategy development.

Teshkin emphasizes the advantages of online learning techniques, which enable deeper automation and real-time trading, making it more efficient than traditional model retraining. The main objective of his project is to create trading strategies that improve simple sector investing, with a specific focus on the big tech sector of the USA stock market, including companies like Facebook, Apple, Netflix, Google, Amazon, and Microsoft.

The speaker goes on to discuss the approach he used to implement a pricing model and dynamic asset allocation for his algo trading project. He explains that he employed statistical and quantitative techniques for long-only positions, selecting entry and exit points, and determining undervalued or overvalued prices relative to other stocks in the sector.

To achieve this, Teshkin utilized various models such as linear regression, principal component analysis (PCA), and the Kalman filter. These models helped calculate residuals and find optimal coefficients for the statistical linear spread between correlated stocks within the sector. He highlights the importance of relative value and explains that the online learning approach used a look-back window of one year, taking inputs such as the stock price and a sector index into account.
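The Kalman-filter component of this approach can be sketched as a recursive, online estimate of the coefficient of a linear spread between two correlated stocks. This is a minimal illustration of the general technique, not Teshkin's actual model; the noise parameters `q` and `r` and the simulated price data are assumptions.

```python
import numpy as np

def kalman_hedge_ratio(x, y, q=1e-5, r=1e-2):
    """Recursively estimate beta in y_t = beta_t * x_t + noise,
    with beta following a random walk (state noise q, observation noise r)."""
    beta, p = 0.0, 1.0              # state estimate and its variance
    betas, residuals = [], []
    for xt, yt in zip(x, y):
        p += q                       # predict: variance grows by state noise
        k = p * xt / (xt * xt * p + r)   # Kalman gain
        e = yt - beta * xt           # innovation: the pricing residual
        beta += k * e                # update the spread coefficient
        p *= (1 - k * xt)            # update the state variance
        betas.append(beta)
        residuals.append(e)
    return np.array(betas), np.array(residuals)

# Illustrative: y tracks x with a true coefficient of 1.5
rng = np.random.default_rng(1)
x = rng.normal(1.0, 0.1, 500)
y = 1.5 * x + rng.normal(0, 0.05, 500)
betas, res = kalman_hedge_ratio(x, y)
print(round(float(betas[-1]), 2))   # converges near 1.5
```

Because the update is recursive, each new price tick refines the coefficient without retraining on the whole history, which is the "deeper automation" advantage of online learning the speaker describes.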

The speaker delves into the different models he employed to address data analysis problems in his algo trading project. He mentions using techniques such as the extraction of orthonormal non-correlated components of variance, the Kalman filter, and hidden Markov models. He explains how these models were incorporated into his approach and provides resources for further learning. Additionally, he discusses the results of his project and shares some tricks he utilized to increase potentially profitable positions.

Next, the speaker discusses how he managed to beat the market by buying and selling stocks based on simple end-of-day quotes and deltas. He explains that the risks associated with this strategy were overcome by using multiple entries and exits determined by online relative price techniques. He explores the concept of stock relative pricing for determining entries and exits, along with the use of online machine learning to build automated real-time pricing models.

The speaker encourages the audience to explore their project online, offering the opportunity to download the code and contact them for further questions. They also mention that the webinar will be recorded and made available on their YouTube channel, along with the presentation file and relevant links. During the session, the speaker engages with the audience, answering questions about their participation in algo trading competitions and clarifying whether the results presented were from actual trading or just backtesting.

Following the presentation, the webinar presenter addresses several questions from viewers regarding the algo trading project. They cover topics such as the use of linear regression for optimal correlation, the performance of the buy and hold strategy compared to the optimized trading strategy, and the inclusion of hidden states in the statistical model. The presenter provides insightful responses, expanding on the project details and explaining the decision-making behind their approach.

The webinar then moves on to the introduction of the next project, which focuses on dynamic asset allocation using neural networks. The speaker explains that their project aims to build an automated system for the "buy today sell tomorrow" strategy on banking stocks with minimal manual intervention. They discuss the model development, strategy implementation, and risk management aspects of their project, emphasizing the use of deep learning models trained on historical data for Nifty Bank stocks.

The speaker elaborates on the strategy, which involves combining the outputs from different models to determine the expected return for each stock. Based on these ratios, funds are distributed into respective stocks. The risk management part of the project deals with issues such as transaction cost and automation. The speaker emphasizes the importance of effectively managing risk in the trading algorithm.

Moving on, the speaker provides further insights into the strategy, risk management, and challenges faced during the development of the trading algorithm. They explain the implementation of a convergent architecture for both the probabilistic return model and the return model. The strategy involves calculating the expected return for each stock and dividing it by the return volatility to obtain a ratio. Available funds are then allocated proportionately to stocks with positive ratios, while portfolios are sold proportionately to expected losses. The algorithm is continuously updated, and stop-loss mechanisms are applied to mitigate risk. The speaker acknowledges challenges in automating the updating process and mentions the absence of a market microstructure strategy to determine optimal buying or selling prices.
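The allocation rule described above — divide each stock's expected return by its return volatility and distribute available cash in proportion to the positive ratios — can be sketched as follows. The function and ticker names are illustrative, not taken from the project.

```python
def allocate(expected_returns, volatilities, cash):
    """Distribute cash in proportion to positive expected-return/volatility
    ratios; stocks with non-positive ratios receive nothing."""
    ratios = {s: er / volatilities[s]
              for s, er in expected_returns.items() if volatilities[s] > 0}
    positive = {s: r for s, r in ratios.items() if r > 0}
    total = sum(positive.values())
    if total == 0:
        return {s: 0.0 for s in expected_returns}
    return {s: cash * positive.get(s, 0.0) / total for s in expected_returns}

weights = allocate({"HDFCBANK": 0.02, "ICICIBANK": 0.01, "SBIN": -0.01},
                   {"HDFCBANK": 0.02, "ICICIBANK": 0.01, "SBIN": 0.03},
                   cash=100_000)
print(weights)  # funds split between the two positive-ratio stocks
```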

The speaker proceeds to discuss the results of their backtesting efforts and the selection of a 20-day combination as the most appropriate for their model. They also mention upcoming steps in the project, including the integration of textual news scores for banking stocks and the development of an Android app-based solution for further automation. The audience has the opportunity to ask questions, leading to discussions on topics such as backtesting results and the use of stop-loss mechanisms in the model. The speaker shares that the backtesting returns have been decent, providing approximately 5% returns over the test period. They also mention a beta testing phase that yielded a return close to 10% over the past six months.

In response to an audience question about the implementation of a stop loss, the speaker explains that they have incorporated a stop loss of five percent of the amount invested in each stock. When a stock's loss reaches five percent of the investment, it is automatically removed from the portfolio, limiting the maximum loss on any stock to five percent. The speaker further addresses inquiries regarding the performance of dynamic asset allocation compared to a simple buy and hold strategy. They highlight that benchmarking against the Nifty Bank showed reasonable performance, close to five percent returns. The speaker also explains their decision to focus on the banking sector due to its reflection of overall market conditions and mentions that their background in machine learning facilitated their upskilling for the project.
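The per-stock stop-loss rule can be sketched as a simple filter over open positions. The data structure and ticker names here are illustrative assumptions, not the project's actual implementation.

```python
def apply_stop_loss(positions, stop_pct=0.05):
    """Remove any position whose loss reaches stop_pct of the amount
    invested in it, mirroring the 5% per-stock stop loss described above.
    Each position maps a symbol to (invested_amount, current_value)."""
    kept, stopped = {}, []
    for symbol, (invested, value) in positions.items():
        if invested - value >= stop_pct * invested:
            stopped.append(symbol)      # loss >= 5% of investment: exit
        else:
            kept[symbol] = (invested, value)
    return kept, stopped

kept, stopped = apply_stop_loss({
    "AXISBANK": (10_000, 9_400),    # down 6%: stopped out
    "KOTAKBANK": (10_000, 9_800),   # down 2%: kept
})
print(stopped)  # ['AXISBANK']
```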

Following the project presentations, a participant shares their positive experience with EPAT, emphasizing its value in terms of theoretical learning and practical implementation. They express appreciation for gaining a mathematical understanding of options and futures pricing and commend the program's support system and dedicated performance manager, who provided valuable guidance. Although the course was challenging, the participant believes it was essential for personal and professional growth. They encourage aspiring traders to explore and expand their knowledge beyond their current strengths, as they will gradually become adept in trading operations.

In the final part, the speakers stress the significance of applying the acquired knowledge in real-life scenarios as quickly as possible. They recommend utilizing the EPAT course for daily trading experiments, facilitating continuous learning and growth. The webinar concludes with gratitude extended to the speakers and audience, along with a request for topic suggestions for future webinars.

  • 00:00:00 The webinar presenter introduces the first speaker, Evgeny Teshkin, a senior quantitative analyst from Russia, who presents his project on implementing a pricing model using Kalman filtering adaptive to market regimes. Teshkin explains that the project is an educational example of how to use online machine learning techniques in strategy development. He emphasizes that the online learning technique allows for deeper automation and real-time trading, which is more efficient than traditional model retraining. The objective of the project is to create trading strategies that improve simple sector investing, focusing on the big tech sector of the US stock market, with companies such as Facebook, Apple, Netflix, Google, Amazon, and Microsoft.

  • 00:05:00 The speaker explains their approach to implementing a pricing model and dynamic asset allocation for an algo trading project. The approach involved using statistical or quantitative techniques for long-only positions, picking entry and exit points, and determining undervalued or overvalued prices relative to other stocks in the sector. The speaker used linear regression, principal component analysis, and Kalman filter models to calculate residuals and find optimal coefficients for the statistical linear spread between correlated stocks of the sector. The key point was the relative value of the stock, and the online learning had a look-back window of one year using inputs such as the stock price and a sector index.

  • 00:10:00 The speaker discusses different models he used for solving data analysis problems for his algo trading project, including the extraction of orthonormal non-correlated components of variance, the Kalman filter, and hidden Markov models. He explains how he uses these models in his approach and provides resources for further learning. Additionally, he discusses the results of his project and the tricks he used to increase potentially profitable positions.

  • 00:15:00 The speaker discusses the approach used to beat the market by buying and selling stocks based on simple end-of-day quotes and deltas. They also explain how they have managed to overcome the risks associated with this strategy by using multiple entries and exits determined by online relative price techniques. The concept of using stock relative pricing for determining entries and exits is explored, as well as the use of online machine learning to build automated real-time pricing models. The speaker encourages the audience to check out their project online and to feel free to download their code and contact them for further questions. The webinar will be recorded and made available on the YouTube channel, along with the presentation file and links. The speaker also responds to questions from the audience about their participation in algo trading competitions and whether the results presented were from actual trading or just backtesting.

  • 00:20:00 The webinar presenter answers several questions from viewers about their algo trading project. One viewer asked about the use of linear regression for optimal correlation with the target variable, with the presenter explaining that the inputs for the regression model were simply price deltas for other stocks. Another viewer asked why the buy and hold strategy seemed to work the best, to which the presenter responded that while it may provide the most total profit, the goal of the project was to improve risk-adjusted performance, and the risk-adjusted return was actually higher for the optimized trading strategy. The presenter also addressed a question about hidden states in the statistical model used in the project.

  • 00:25:00 The speaker explains the states and features he used for his analysis in developing an algo trading project. He chose two to three states as market regimes, calculated from parameters such as price deltas for the sector ETF and VIX deltas as observable market indicators. The features he used were simple, such as price deltas and their moving averages, and he also extracted the first and second components from these deltas for linear regression. In terms of selecting principal components for PCA, the strategy was to use the first and at least one other component, as they explain most of the variation in the sector. The speaker also mentions that while predicting volatility is another area to explore, this project focused on predicting price to improve trading risk.

  • 00:30:00 The presenter answers a couple of questions from the audience. One question is about whether the pricing model has been backtested on other instruments such as crypto or forex, to which the presenter explains that they have not yet, but that the concepts could be applied to a variety of financial instruments. Another question is about whether trading futures is easier to forecast than stocks using machine learning, and the presenter explains that it depends on the model, but that the principles are the same and recommends keeping it simple to avoid overfitting. The presenter then introduces the next project, which is about dynamic asset allocation using neural networks.

  • 00:35:00 The presenter discusses his project on "dynamic asset allocation using neural networks" aimed at building an automated system for the "buy today sell tomorrow" strategy on banking stocks with minimal manual intervention. The solution consists of model development, strategy, and risk management parts. Model development involves developing a set of three deep learning models, including a probabilistic model and two return-based models, by training them on five years of data for 12 Nifty Bank stocks. The strategy involves combining the output from these models to arrive at an expected return for the stock and then distributing funds into respective stocks based on ratios. Lastly, the risk management part includes dealing with issues like transaction cost and automation.

  • 00:40:00 The speaker explains the strategy, risk management, and challenges they faced in developing their trading algorithm. They used a convergent architecture to build both their probabilistic return model and their return model. The strategy involved calculating the expected return for each stock and dividing it by the return volatility to get a ratio. They then distributed their available cash in proportion to the positive ratios and sold the portfolios proportionately to the expected losses. The algorithm was updated dynamically, and they applied stop-losses to stocks. One challenge was automating the updating process, and another was not having a market microstructure strategy to suggest the best price to buy or sell.

  • 00:45:00 The speaker discusses the results of their backtesting and how they arrived at using a 20-day combination as the most appropriate for their model. They also mention upcoming steps such as integrating textual news scores for banking stocks and further automating the model into an Android app-based solution. The speaker also answers questions from the audience, including questions about the backtesting results and the use of stop loss in their model. The backtesting returns have been decent, giving around 5% returns over the test period, and the beta testing has given a return of close to 10% in the past six months.

  • 00:50:00 The speaker explains that they implemented a stop loss of five percent of the amount invested in each stock. When a stock loses five percent of what is invested in it, it is removed from the portfolio to limit the maximum loss to five percent for any stock. The speaker then answers questions about whether the dynamic asset allocation performs better than simple buy and hold, and they explain that they benchmarked it against the Nifty Bank and found that it performs reasonably well, close to five percent. The speaker also explains that they did not use hyperparameter tuning for the neural network, and they chose the topic of the project to combine deep learning and trading, focusing on the banking sector as the market reflects the condition of banks. They also mention that their background in machine learning helped them upskill for the project.

  • 00:55:00 A participant shares their positive experience with EPAT, stating that it has been useful in terms of both theoretical learning and practical implementation. They note that it has helped them gain a mathematical understanding of how options and futures are priced. The participant also praises the program's support system and the dedicated performance manager who helped monitor their progress. While they found the course challenging, they believe that it was important for their growth as a trader and professional. Aspiring traders are encouraged to explore and not limit themselves to their current strengths, as they will eventually get the hang of how things operate.

  • 01:00:00 The speakers emphasize the value of practical knowledge over theoretical knowledge and urge participants to apply what they have learned in real life as fast as possible. They recommend using the EPAT course for everyday experiments with trading to help participants grow by implementing and learning more. The webinar concludes with a thank-you to the speakers and audience and a request for topic suggestions for future webinars.
Implementing Pricing Model and Dynamic Asset Allocation: Algo Trading Project Webinar
  • 2021.11.16
  • www.youtube.com
This session has project presentations by two of our esteemed EPAT alumni. First on “Implementing pricing (or market-making) model using Kalman filtering ada...
 

Applying machine learning in trading by Ishan Shah and Rekhit Pachanekar | Algo Trading Week Day 7




Ishan Shah and Rekhit Pachanekar, the presenters of the webinar, begin by introducing themselves and expressing their excitement for the final day of the algo trading week. They announce the winners of the algo trading competition and commend their achievements. They mention that the focus of the day's presentation will be on machine learning and its applications in trading. They also inform the audience that there will be a Q&A session at the end of the presentation.

Rekhit Pachanekar takes the lead in starting the webinar and dives into the basics of machine learning. He uses image recognition as an example to explain how machine learning allows algorithms to learn from data and make decisions without extensive programming. He then discusses the role of machine learning in trading and investment, particularly in creating personalized investment portfolios based on various data points such as salary, profession, and region. Machine learning also helps assign weights to assets in a portfolio and assists in developing trading strategies. Pachanekar highlights the speed and data analysis capabilities of machine learning, which are utilized by hedge funds, pension funds, and mutual funds for investing and trading decisions.

Moving forward, Ishan Shah and Rekhit Pachanekar delve into the seven steps involved in building a machine learning model for trading. They emphasize that even individual retail traders can leverage machine learning technology to create their own trading strategies. The first step they discuss is defining the problem statement, which can range from a general desire for positive returns to more specific goals like determining the right time to invest in a particular stock such as JP Morgan. The second step involves acquiring good quality data, ensuring there are no missing or duplicate values and no outliers. The presenters stress the significance of data quality in constructing an accurate machine learning model.

Shah and Pachanekar proceed to explain the process of selecting input and output variables for a machine learning model in trading. They highlight the output variable, or target variable, which represents the future return on a stock. They mention that a signal variable is assigned a value of 1 when future returns are predicted to be positive and 0 when they are predicted to be negative. The input variables, or features, must possess predictive power and meet the stationarity requirement, meaning they exhibit a constant mean and variance over time. They emphasize that variables like open, low, high, and close are not stationary and cannot be used as input features.
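The labelling step described above — a signal of 1 when the future return is positive, 0 otherwise — can be sketched with pandas. The column names and price series are illustrative:

```python
import pandas as pd

# Illustrative daily close prices
prices = pd.DataFrame({"close": [100, 101, 100.5, 102, 101, 103]})

# Future one-day return: tomorrow's close relative to today's
prices["future_return"] = prices["close"].pct_change().shift(-1)

# Signal: 1 when the future return is positive, else 0
prices["signal"] = (prices["future_return"] > 0).astype(int)

print(prices["signal"].tolist())  # [1, 0, 1, 0, 1, 0]
```

The `shift(-1)` is what makes this a prediction target rather than a description of the past: each row's label is the return realized on the following day.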

Next, the presenters discuss the selection of input features for their machine learning model in trading. They explain the need for stationary input features and achieve this by using percentage-change values for different time periods. They also stress the importance of avoiding correlation among input variables and demonstrate the use of a correlation heat map to identify and eliminate highly correlated features. The final selection of input features includes percentage-change values for different time periods, RSI (Relative Strength Index), and correlation. Before using the model for live trading, they split the dataset into training and testing sets to evaluate its performance.
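This feature-selection step can be sketched as follows: percentage changes over several look-back periods serve as (approximately) stationary features, and one of any highly correlated pair is dropped. The 0.7 threshold and the simulated price series are assumptions for illustration, not the presenters' values.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
close = pd.Series(100 + np.cumsum(rng.normal(0, 1, 300)))

# Percentage changes over several look-back periods are roughly stationary,
# unlike the raw close price itself
features = pd.DataFrame({
    f"pct_change_{n}": close.pct_change(n) for n in (1, 5, 15)
}).dropna()

# Drop one of any pair of features whose absolute correlation is too high
corr = features.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.7).any()]
features = features.drop(columns=to_drop)
print(list(features.columns))
```

A correlation heat map (e.g. via seaborn's `heatmap` on `features.corr()`) is the visual counterpart of the `to_drop` computation above.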

The importance of ensuring the quality and relevance of data sets used in machine learning models is emphasized by the speakers. They introduce the concept of decision trees and inquire about attendees' personal decision-making processes when it comes to buying stocks or assets, mentioning responses ranging from technical indicators to recommendations from friends. They assert the need to establish a mental model for decision-making based on personal experiences when using such features. They introduce random forests as a way to overcome issues of overfitting and explain the use of binary trees as the foundation of decision trees.

Shah and Pachanekar explain how machine learning algorithms, specifically decision trees, can be utilized to create rules for trading. These rules, incorporating technical indicators like ADX (Average Directional Index) and RSI, enable traders to make decisions based on predefined conditions. To ensure that these rules are not solely based on luck, the presenters introduce the concept of a random forest. They explain that a random forest combines multiple decision trees to create a more generalized and reliable trading strategy. By randomly selecting a subset of features for each tree, the random forest reduces the chances of overfitting and provides more accurate predictions. The presenters discuss various parameters required for the random forest algorithm, including the number of estimators, maximum features, and maximum depth of the tree.
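The three random-forest parameters mentioned — the number of estimators, the maximum features considered per split, and the maximum tree depth — map directly onto scikit-learn's `RandomForestClassifier`. The feature matrix, labels, and parameter values below are illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative feature matrix (e.g. percentage changes, RSI) and 0/1 signals
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0).astype(int)

# The three parameters discussed above: number of trees, features per split,
# and maximum depth (limiting depth guards against overfitting)
model = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                               max_depth=3, random_state=42)
model.fit(X[:400], y[:400])               # train on the first 400 rows
accuracy = model.score(X[400:], y[400:])  # evaluate on unseen rows
print(round(accuracy, 2))
```

Note the chronological split: for time-series data the test rows come strictly after the training rows, rather than being shuffled randomly.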

Moving on, the presenters delve into the implementation of a random forest classifier for applying machine learning in trading. They emphasize the importance of controlling the depth of the decision tree and randomly selecting features to avoid overfitting and ensure consistent outputs. The random forest classifier learns rules from input features and expected outputs, which are then used to make predictions on unseen data. They also mention that the performance of the model can be measured using various metrics.

The presenters then discuss the significance of evaluating the effectiveness of a machine learning model before making real-money investments based on its recommendations. They introduce the concept of accuracy, which involves verifying whether the model's predictions align with the actual market outcomes. They highlight that the accuracy of a model typically ranges from 50% to 60% and caution that a high accuracy rate does not guarantee good results. They suggest using a confusion matrix to compare actual versus predicted labels and calculate performance metrics such as precision, recall, and F1 score to assess the model's performance.
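These evaluation metrics can be computed from actual and predicted labels with scikit-learn; the signal vectors below are made-up examples, not the presenters' results:

```python
from sklearn.metrics import (accuracy_score, confusion_matrix,
                             f1_score, precision_score, recall_score)

# Illustrative actual vs predicted signals (1 = long, 0 = no position)
actual    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]

print(confusion_matrix(actual, predicted))  # rows: actual, columns: predicted
print(accuracy_score(actual, predicted))    # overall accuracy
print(precision_score(actual, predicted))   # of predicted longs, how many were right
print(recall_score(actual, predicted))      # of actual up-moves, how many were caught
print(f1_score(actual, predicted))          # harmonic mean of precision and recall
```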

The presenters then examine the model's accuracy in detail, conducting a poll before revealing that it comes to 60%. However, when checked label-wise, the accuracy for the long signal drops to 33%. This raises the question of whether an increase in overall accuracy will result in a profitable trading model. The presenters emphasize that accuracy is a crucial factor in determining the effectiveness of a model in predicting the market. They point out that a high overall accuracy does not necessarily lead to profitability and that other factors need to be considered.
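A toy example of this pitfall, with made-up signals rather than the presenters' data: overall accuracy comes out at 60%, yet accuracy measured only on the actual long days is 33%.

```python
def label_accuracy(actual, predicted, label):
    """Accuracy measured only on samples whose true label equals `label`."""
    pairs = [(a, p) for a, p in zip(actual, predicted) if a == label]
    return sum(a == p for a, p in pairs) / len(pairs)

# Illustrative signals: overall accuracy looks fine, long accuracy does not
actual    = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
predicted = [1, 0, 0, 0, 0, 0, 0, 0, 1, 1]

overall = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
print(overall)                                         # 0.6
print(round(label_accuracy(actual, predicted, 1), 2))  # 0.33
```

Because most days carry the majority label, a model can score well overall while being nearly useless on the minority signal that actually triggers trades.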

Shah and Pachanekar then shift their focus to discussing different metrics used to evaluate the performance of a trading model, including precision, recall, and the F1 score. They note that while recall can help overcome issues with imbalanced data, it can be an unreliable metric when used on its own. Instead, they recommend using a combination of precision and recall to calculate the F1 score, which provides a more comprehensive evaluation of the model's performance. They highlight the importance of backtesting the model to ensure its effectiveness in real-world trading scenarios and caution against overfitting the model.

The presenters address the concerns of overfitting in real-world settings and suggest strategies to handle it based on the specific machine learning model used. They stress the significance of understanding the model's parameters, limiting the number of features, and working on different hyperparameters for each type of machine learning model. They emphasize the importance of using real-world data without manipulation. Additionally, they discuss the applications of machine learning in trading beyond generating signals, such as its potential in risk management. They also touch upon the use of clustering algorithms to identify profitable opportunities in the market.

Ishan Shah and Rekhit Pachanekar conclude the webinar by discussing the advantages of using machine learning in trading, particularly in deciphering complex patterns that may be challenging for humans to identify. They suggest using machine learning as a complementary tool in the alpha identification process. The session ends with the presenters expressing their gratitude to the speakers and participants of the Algo Trading Week, and they invite any unanswered questions to be submitted through the survey.

  • 00:00:00 The presenters, Ishan Shah and Rekhit Pachanekar, introduce themselves and discuss the final day of the algo trading week. They highlight the winners of the algo trading competition and introduce the two speakers for the day. They mention that the presentation will focus on machine learning and that there will be a Q&A session at the end. Rekhit Pachanekar will begin the webinar and then pass it off to Ishan Shah.

  • 00:05:00 The video introduces the basics of machine learning, using image recognition as an example. Machine learning allows algorithms to learn from data and make decisions, unlike conventional computer programs that require extensive programming. The video then explains the role of machine learning in trading and investment, particularly in creating investment portfolios for individuals based on data such as salary, profession, region, etc. Machine learning also assigns weights to assets in a portfolio and assists in creating trading strategies. Hedge funds, pension funds, and mutual funds utilize machine learning's speed and ability to analyze large amounts of data for investing and trading decisions.

  • 00:10:00 The presenters discuss the seven steps to build a machine learning (ML) model for trading and how even individual retail traders can utilize ML technology to create their own trading strategies. The first step involves defining the problem statement which can be as simple as wanting to make positive returns, but with further refining, it can become more specific such as determining the right time to invest in a particular stock like JP Morgan. The second step is to obtain good quality data and ensure that there are no missing or duplicate values, as well as no outliers in the data. The presenters emphasize the importance of data quality in building an accurate ML model.

  • 00:15:00 Ishan Shah and Rekhit Pachanekar explain the process of selecting input and output variables for a machine learning model in trading. The output variable, or target variable, is the future return on a stock, and a signal variable is assigned a value of 1 when future returns are predicted to be positive and 0 when they are predicted to be negative. The input variables, or features, must have predictive power and meet the stationarity requirement, meaning they have a constant mean and variance, with values that swing back and forth around the mean like a pendulum. The open, low, high, and close variables are not stationary, so they cannot be used as input features.

  • 00:20:00 The speakers discuss the process of selecting input features for their machine learning model in trading. They note that the model requires stationary input features, which they achieve by taking percentage-change values for various time periods. They also emphasize the importance of avoiding correlation among input variables and use a correlation heat map to remove features that are highly correlated. The final selection of input features includes percentage-change values for different time periods, RSI, and correlation. Before using the model for live trading, they split their data set into training and testing sets to evaluate the performance of the model.

  • 00:25:00 The speakers discuss the importance of ensuring the quality and relevance of data sets used in machine learning models before determining which model to use. They also introduce the concept of decision trees and ask attendees how they personally decide on whether or not to buy a particular stock or asset, with responses ranging from technical indicators to recommendations from friends. The speakers state that it is important to establish a mental model for decision-making based on personal experiences when using such features. They introduce the concept of random forests and the use of binary trees as a basis for decision trees.

  • 00:30:00 The speakers explain how to use machine learning algorithms, specifically a decision tree, to create rules for trading. These rules, which could include technical indicators like ADX and RSI, allow traders to make decisions based on predefined conditions. To ensure that these rules are not created based solely on luck, the speakers introduce the concept of a random forest, which uses multiple decision trees to create a more generalized and reliable trading strategy. By selecting a subset of features randomly for each tree, the random forest reduces the chances of overfitting and provides a more accurate prediction. The speakers discuss the various parameters required for the random forest algorithm, including the number of estimators, maximum features, and the maximum depth of the tree.

  • 00:35:00 The speakers discuss the parameters and code involved in implementing a random forest classifier to apply machine learning in trading. They explain the importance of controlling the depth of the decision tree and randomly selecting features to avoid overfitting and ensure consistent outputs. The random forest classifier requires input features and expected outputs to learn rules and create decision trees which are then used to make predictions on unseen data. The performance of the model can be measured using various metrics.
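
A minimal sketch of the classifier setup described above, using synthetic features in place of real market data; the parameter values (100 trees, square-root feature sampling, depth 3) are illustrative, not the speakers'. The chronological (unshuffled) split reflects the train/test separation mentioned earlier for time-ordered data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stationary features and up/down labels (illustrative only).
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0).astype(int)

# shuffle=False keeps the split chronological for time-ordered data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False)

# The three parameters called out above: number of trees, features tried
# per split, and maximum tree depth.
model = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                               max_depth=3, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)
print(model.score(X_test, y_test))
```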

  • 00:40:00 The presenters discuss the importance of evaluating the effectiveness of a machine learning model before investing real money based on its recommendations. They introduce the concept of accuracy, which involves verifying whether the model's predictions match what actually happened in the market. They highlight that the accuracy of such a model typically ranges from 50% to 60% and that a high accuracy rate does not necessarily guarantee good results. To assess a model's performance, the presenters suggest using a confusion matrix to compare actual versus predicted labels and to calculate metrics such as precision, recall, and F1 score.

  • 00:45:00 The speakers examine the model's accuracy in detail, polling the audience before revealing the figure. The overall accuracy works out to 60%, but when checked label-wise, accuracy on the long signal drops to 33%. This raises the question of whether higher accuracy will actually translate into a profitable trading model: accuracy measures how well the model predicts the market, yet a high overall figure does not necessarily lead to profitability.
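
The 60%-overall versus 33%-per-label situation can be reproduced with a small set of hypothetical labels (1 = long signal, 0 = no trade; these ten values are invented to match the figures quoted, not taken from the webinar):

```python
from sklearn.metrics import accuracy_score, confusion_matrix

# Hypothetical labels for ten test days: 1 = long signal, 0 = no trade.
actual    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
predicted = [1, 0, 1, 0, 0, 0, 0, 0, 1, 0]

# Overall accuracy: fraction of predictions matching the market.
print(accuracy_score(actual, predicted))     # 0.6

# Rows are actual labels, columns are predicted labels.
print(confusion_matrix(actual, predicted))   # [[5 2]
                                             #  [2 1]]
```

The bottom row reads [2, 1]: of three actual long days, only one was predicted, i.e. 33% label-wise accuracy despite 60% overall.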

  • 00:50:00 Shah and Pachanekar discuss the different metrics used to evaluate the performance of a trading model, such as precision, recall, and F1 score. They note that while recall can help overcome issues with imbalanced data, it can also be an unreliable metric on its own. Instead, they recommend using a combination of precision and recall to calculate the F1 score. This score can easily be constructed using a confusion matrix, and a high F1 score indicates a model that is worthy of trading. They also discuss the importance of backtesting the model to ensure that it performs well in practice and caution against overfitting the model.
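
The three metrics can be computed directly from the labels; the ten values below are hypothetical, chosen so the imbalance the speakers warn about is visible (few long signals among mostly no-trade days):

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical labels: 1 = long signal, 0 = no trade (illustrative only).
actual    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
predicted = [1, 0, 1, 0, 0, 0, 0, 0, 1, 0]

precision = precision_score(actual, predicted)  # TP / (TP + FP) = 1/3
recall = recall_score(actual, predicted)        # TP / (TP + FN) = 1/3
f1 = f1_score(actual, predicted)                # harmonic mean of the two
print(precision, recall, f1)
```

Here both precision and recall on the long label are low, so the F1 score is low too, flagging a model that a high overall accuracy alone would not have exposed.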

  • 00:55:00 Models can overfit, meaning they fit the training data too closely and may not work well on new data. Over-optimization, by contrast, results from repeatedly backtesting and tweaking a trading strategy to get the desired outcome; this can yield one special case that works well on the training and testing data but fails on live data. To guard against both, the speakers recommend building robust models that work across multiple asset classes, using risk-management tools such as stop-loss mechanisms, and avoiding overfitting and over-optimization during backtesting.

  • 01:00:00 Overfitting occurs when the model fits the training dataset too closely, typically indicated by a near-100% accuracy rate on the training data alongside much weaker results on unseen data. Underfitting happens when the model cannot learn from the data as expected, evidenced by a very low accuracy rate on both sets. Comparing training and test accuracy is therefore a simple way to quantify which problem a model has.
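
The train-versus-test diagnostic above can be seen in a few lines of code (synthetic noisy data; the model and depth choice are illustrative): an unconstrained tree memorizes the training set yet scores noticeably worse out of sample, the signature of overfitting.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Noisy synthetic data: the label depends on one feature plus strong noise,
# so a perfect training score is only achievable by memorization.
rng = np.random.default_rng(3)
X = rng.normal(size=(400, 4))
y = (X[:, 0] + rng.normal(0, 1.0, 400) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# An unbounded tree overfits; a depth-limited one generalizes better.
deep = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
shallow = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)

# A wide gap between training and test accuracy signals overfitting;
# low accuracy on both would signal underfitting.
print("deep:   ", deep.score(X_train, y_train), deep.score(X_test, y_test))
print("shallow:", shallow.score(X_train, y_train), shallow.score(X_test, y_test))
```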

  • 01:05:00 The speakers address the concern of overfitting in real-world settings and suggest ways to handle it depending on the specific model used. They emphasize the importance of understanding the model's parameters, limiting the number of features, and tuning the hyperparameters appropriate to each type of machine learning model. They also state that working with real-world data, without manipulating it, is essential. Additionally, they discuss the applications of machine learning in trading, remarking that it goes well beyond generating signals and has plenty of room in risk management. Lastly, they touch upon discovering alpha signals with machine learning models by using clustering algorithms to identify profitable pockets in the market.
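
One sketch of the clustering idea, under the assumption of a made-up two-feature description of market days (daily return and volatility; the regimes and their parameters are invented): k-means separates a calm regime from a panic regime, and inspecting each cluster's mean return hints at which pocket of the market is worth trading.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical market days described by (daily return, volatility).
rng = np.random.default_rng(4)
calm = rng.normal([0.001, 0.01], [0.002, 0.002], size=(100, 2))
panic = rng.normal([-0.02, 0.05], [0.01, 0.01], size=(100, 2))
days = np.vstack([calm, panic])

# K-means groups days into regimes without being told the labels; a trader
# can then inspect each cluster's average return.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(days)
for label in (0, 1):
    print(label, days[kmeans.labels_ == label][:, 0].mean())
```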

  • 01:10:00 Ishan Shah and Rekhit Pachanekar discuss the advantages of using machine learning in trading, particularly in deciphering complex patterns that humans may struggle to identify. Machine learning can produce more sustainable and robust alphas that decay over a longer period of time rather than immediately. They suggest using machine learning as a complement to the alpha identification process. The session ends with a thank you to the speakers and participants of the Algo Trading Week, and an invitation to ask any unanswered questions in the survey.
Applying machine learning in trading by Ishan Shah and Rekhit Pachanekar | Algo Trading Week Day 7
  • 2021.09.30