High frequency trading

 
The foreign exchange (FX) spot markets are well suited to high frequency trading. They are highly liquid, allow leverage, and trade 24 hours a day, 5 days a week. This paper documents and tests the stylized facts known about high-frequency FX markets. It then postulates a high frequency trading system on the basis of these stylized facts. Benchmarking confirms the robustness of the approach, demonstrating the role algorithmic trading has to play in higher frequency trading environments.
 

The SEC's Former Top "HFT Expert" Joins HFT Titan Citadel


Last April, we commented on the most blatant (pre) revolving door we had ever seen at the SEC (and there have been many): the departure of the SEC's head HFT investigator, Gregg Berman, who during his tenure at the agency (whose alleged purpose is to keep the "market" fair, efficient and unmanipulated) did everything in his power to draw attention away from HFTs. He did that, for example, by blaming Waddell and Reed for the May 2010 flash crash. This is what Berman, whose full title was the SEC's "Associate Director of the Office of Analytics and Research in the Division of Trading and Markets" said in the final version of the agency's Flash Crash report:

At 2:32 p.m., against this backdrop of unusually high volatility and thinning liquidity, a large fundamental trader (a mutual fund complex) [ZH: Waddell and Reed] initiated a sell program to sell a total of 75,000 E-Mini contracts (valued at approximately $4.1 billion) as a hedge to an existing equity position.

Several years later, when the HFT lobby made a coordinated push to eliminate human spoofers (which algos were apparently helpless against without regulatory intervention), the SEC changed its story entirely and blamed the flash crash on one solitary trader, Navinder Sarao. By then the SEC had lost all credibility. It had also lost Gregg Berman, who six months after quitting the SEC ended up taking a nondescript job at EY, where he joined the Financial Services Organization (FSO) of Ernst & Young LLP as a Principal focusing on market risk and data analytics.

We, for one, were surprised: having expended so much energy to cater to the HFT lobby, we were confident Berman would end up collecting a 7-figure paycheck from one of the world's most prominent high frequency frontrunning parasite firms. As a reminder, this is what we predicted when the creator of Midas, and Eric Hunsader's archnemesis, quit the SEC:

Gregg will find a hospitable and well-paid position after spending 6 years defending the well-paying HFTs lobby. In all likelihood after taking a 2-4 month break from the industry, he will pull a Bart Chilton, and will join either HFT powerhouse Virtu, perennial accumulator of former government staffers, Goldman Sachs, or - most likely - the NY Fed's shadow trading desk and the world's most leveraged hedge fund, Citadel itself. Because for every quo there is a (s)quid.

In retrospect, Berman's detour into E&Y ended up being just that: an attempt to mask his true career intentions by taking a less-than-one-year "sabbatical" from his true calling: getting compensated by the very HFT industry he had done everything in his power to reward generously during his tenure at the SEC.

Well, as it turns out, we were right after all, because lo and behold, as the WSJ first reported, Gregg Berman is now director of market-structure research at the world's most levered hedge fund, HFT powerhouse and massive electronic market-making firm: Citadel, which also happens to be the entity through which the NY Fed intervenes in the market.

And just like that, all is well again in this corrupt world, in which the market "regulators" pretend to protect the little guy, when in reality they cater only to the most criminal actors, in the simple hope of landing a job there one day and getting paid in one year what they would make in ten at the SEC or any other government agency.

 

Forex prices to get even faster! EBS set to launch ultra fast data service

EBS says it is about to launch what it calls its "Ultra" data service, a high-speed feed

  • "EBS Live Ultra" service
  • It will update the EBS snapshot of the market five times as often as the existing EBS Live feed
  • EBS Chief Strategy Officer Tim Cartledge says launch is "imminent"
  • To be priced the same as the current EBS Live feed
I've posted before on the increasing presence of high speed algos in the FX price-making market. While these firms are not the exclusive driver of speedier feeds such as this 'Ultra' feed, you can expect faster and faster speeds as more of them enter.

Rival firm Currenex is expected to launch a competing product ('NOW') soon as well.

-
Electronic Broking Services (EBS) is an electronic trading platform
  • Used by market-making banks & institutions to trade forex 
  • It's an anonymous matching platform
 
on my own:
Any book with HFT applied to mt4?

I don't think it is possible to do HFT from MT4; the platform's broker-routed execution is far too slow for that.

 
The objective of this paper is to calculate, model and forecast realized volatility, using high frequency stock market index data. The approach taken differs from the existing literature in several aspects. First, it is shown that the decay of the serial dependence of high frequency returns with the sampling frequency is consistent with an ARMA process under temporal aggregation. This finding has important implications for the modelling of high frequency returns and the optimal choice of sampling frequency when calculating realized volatility. Second, motivated by the outcome of several test statistics for long memory in realized volatility, it is found that the realized volatility series can be modelled as an ARFIMA process. Significant exogenous regressors include lagged returns and contemporaneous trading volume. Finally, the ARFIMA model's forecasting performance is assessed in a simulation study. Although it outperforms representative GARCH models, the simplicity and flexibility of GARCH may outweigh the modest gain in forecasting performance of the more complex and data-intensive ARFIMA model.
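As a minimal illustration of the quantity being modelled, realized variance over a session can be computed as the sum of squared intraday log-returns. This is a pure-Python sketch on simulated data; the sampling interval, volatility level and price path are assumptions for illustration, not the paper's dataset:

```python
import math
import random

def realized_variance(prices):
    """Sum of squared intraday log-returns, i.e. the realized variance."""
    rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    return sum(r * r for r in rets)

# Hypothetical example: one session of 5-minute prices with i.i.d.
# Gaussian log-returns of per-interval standard deviation `sigma`.
random.seed(7)
sigma = 0.001          # assumed per-interval volatility
n = 78                 # 5-minute intervals in a 6.5-hour session
prices = [100.0]
for _ in range(n):
    prices.append(prices[-1] * math.exp(random.gauss(0.0, sigma)))

rv = realized_variance(prices)   # estimates n * sigma**2
daily_vol = math.sqrt(rv)        # realized volatility for the session
```

In practice the choice of sampling interval trades off microstructure noise against estimation variance, which is exactly the issue the temporal-aggregation result above speaks to.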
 

Bundesbank says HFT market makers typically pull out during periods of high volatility


A report from Germany's central bank, the Bundesbank, studied data from Bund and DAX futures markets

These are the two most liquid German investment instruments in which HFT makes up a significant portion of trading activity
The BUBA divided high-frequency firms into two broad types:
  • those that trade actively on news
  • those that act as market-makers
The first type was particularly active during periods of high market volatility, and therefore contributed to that volatility, while the second group tended to withdraw from markets during periods of high market stress, i.e. it stopped making markets precisely when they were needed most.
(Sounds to me like the BUBA is confusing a market maker with a charitable institution, but so be it)

The Wall Street Journal and Bloomberg both have more.
 

Automated high-frequency trading has grown tremendously in the past 20 years and is responsible for about half of all trading activity at stock exchanges worldwide. Geography is central to the rise of high-frequency trading due to a market design of “continuous trading” that allows traders to engage in arbitrage based upon informational advantages built into the socio-technical assemblages that make up current capital markets. Enormous investments have been made in creating transmission technologies and optimizing computer architectures, all in an effort to shave milliseconds off order travel time (or latency) within and between markets. We show that as a result of the built spatial configuration of capital markets, “public” is no longer synonymous with “equal” information. High-frequency trading increases information inequalities between market participants.
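The milliseconds at stake are easy to quantify from first principles. A back-of-the-envelope sketch, where the distance and propagation speeds are rough assumptions rather than measured route data:

```python
# One-way propagation delay between Chicago and New York, comparing
# fiber (light travels at roughly 2/3 c in glass) with a line-of-sight
# microwave link (close to c in air). The distance is an approximate
# great-circle figure, not an actual cable route.
C_KM_PER_S = 299_792.0      # speed of light in vacuum, km/s
FIBER_FRACTION = 2.0 / 3.0  # assumed fiber propagation speed as a fraction of c
DISTANCE_KM = 1_200.0       # approximate Chicago-New York distance

fiber_ms = DISTANCE_KM / (C_KM_PER_S * FIBER_FRACTION) * 1_000
microwave_ms = DISTANCE_KM / C_KM_PER_S * 1_000
advantage_ms = fiber_ms - microwave_ms   # what the faster medium buys, one way
```

Even under these idealized numbers the microwave path saves on the order of two milliseconds one way, which is the kind of edge the infrastructure spending described above is chasing.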



 

Quantitative tools have been widely adopted in order to extract information from the mass of available financial data. Mathematics, statistics and computer algorithms have never been more important to financial practitioners. Investment banks develop equilibrium models to evaluate financial instruments; mutual funds apply time series analysis to identify the risks in their portfolios; and hedge funds hope to extract market signals and statistical arbitrage from noisy market data. The rise of quantitative finance in the last decade relies on the development of computing techniques that make processing large datasets possible. As more data has become available at higher frequencies, more research in quantitative finance has shifted to the microstructure of financial markets. High frequency data is a typical example of big data, characterized by the 3V’s: velocity, variety and volume. In addition, the signal-to-noise ratio in financial time series is usually very small, and high frequency datasets are more likely to contain extreme values, jumps and errors than low frequency ones. Specific data processing techniques and quantitative models are therefore elaborately designed to extract information from financial data efficiently. In this chapter, we present quantitative data analysis approaches in finance. First, we review the development of quantitative finance in the past decade. Then we discuss the characteristics of high frequency data and the challenges it brings. Quantitative data analysis consists of two basic steps: (i) data cleaning and aggregation; (ii) data modeling. We review the mathematical tools and computing technologies behind both steps. The valuable information extracted from raw data is represented by a group of statistics; the most widely used in finance are expected return and volatility, which are the fundamentals of modern portfolio theory.
We further introduce some simple portfolio optimization strategies as an example of the application of financial data analysis. Big data has already changed the financial industry fundamentally, while quantitative tools for addressing massive financial data still have a long way to go. The adoption of advanced statistics, information theory, machine learning and faster computing algorithms is inevitable if we are to predict complicated financial markets. These topics are briefly discussed in the later part of this chapter.
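To make the "expected return and volatility" building blocks concrete, here is a minimal two-asset example: estimating the two statistics from return series and forming the minimum-variance portfolio. The return numbers and the correlation are invented for illustration, not taken from any dataset in the chapter:

```python
import math

def mean(xs):
    """Sample mean of a return series."""
    return sum(xs) / len(xs)

def vol(xs):
    """Sample standard deviation of a return series."""
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def min_variance_weight(s1, s2, rho):
    """Weight on asset 1 in the two-asset minimum-variance portfolio."""
    cov = rho * s1 * s2
    return (s2 ** 2 - cov) / (s1 ** 2 + s2 ** 2 - 2 * cov)

# Invented daily return series for two assets (asset 1 is more volatile).
r1 = [0.012, -0.008, 0.015, -0.004, 0.007, -0.011, 0.009]
r2 = [0.004, -0.002, 0.006, -0.001, 0.003, -0.003, 0.002]

s1, s2 = vol(r1), vol(r2)
rho = 0.3                      # assumed correlation for the example
w1 = min_variance_weight(s1, s2, rho)
w2 = 1.0 - w1
port_var = (w1 * s1) ** 2 + (w2 * s2) ** 2 + 2 * w1 * w2 * rho * s1 * s2
```

By construction the resulting portfolio variance is no larger than that of the less volatile asset alone, which is the basic diversification argument of modern portfolio theory.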




 
Given a stationary point process, an intensity burst is defined as a short time period during which the number of counts is larger than the typical count rate. It might signal a local non-stationarity or the presence of an external perturbation to the system. In this paper we propose a novel procedure for the detection of intensity bursts within the Hawkes process framework. By using a model selection scheme we show that our procedure can be used to detect intensity bursts when both their occurrence time and their total number are unknown. Moreover, the initial time of the burst can be determined with a precision given by the typical inter-event time. We apply our methodology to mid-price changes in FX markets, showing that these bursts are frequent and that only a relatively small fraction is associated with news arrival. We show lead-lag relations in intensity burst occurrence across different FX rates and we discuss their relation with price jumps.
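For intuition about the framework: a Hawkes process with exponential kernel has conditional intensity λ(t) = μ + Σ_{t_i&lt;t} α·exp(−β(t − t_i)), and can be simulated by Ogata's thinning method. The sketch below uses arbitrary illustrative parameters, not fitted FX values; it shows how self-excitation clusters events, the endogenous clustering that a burst-detection procedure must distinguish from a genuine external perturbation:

```python
import math
import random

def intensity(t, events, mu, alpha, beta):
    """Conditional intensity of an exponential-kernel Hawkes process."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Ogata's thinning algorithm (requires alpha < beta for stationarity)."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # Intensity just after t, plus one jump size alpha, upper-bounds
        # the (decaying) intensity until the next accepted event.
        lam_bar = intensity(t, events, mu, alpha, beta) + alpha
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            return events
        if rng.random() * lam_bar <= intensity(t, events, mu, alpha, beta):
            events.append(t)

events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=200.0)
# Long-run event rate is mu / (1 - alpha/beta) = 0.5 / (1 - 0.8/1.5),
# roughly 1.07 events per unit time here.
```

A burst detector then asks whether the observed local count rate exceeds what this self-exciting baseline alone can plausibly generate.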
 
Using high frequency data from the London Stock Exchange (LSE), we investigate the relationship between informed trading and the price impact of block trades on an intraday and inter-day basis. The price impact of block trades is stronger during the first hour of trading; this is consistent with the hypothesis that information accumulates overnight during non-trading hours. Furthermore, private information is gradually incorporated into prices despite heightened trading frequency. The evidence suggests that informed traders exploit superior information across trading days, and that stocks with lower transparency exhibit stronger information diffusion effects when traded in blocks; thus, informed block trading facilitates price discovery.
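A minimal version of the measurement behind such results: the impact of a block trade can be proxied by the log mid-quote change from just before the trade to some horizon after it. This sketch uses invented quote data and illustrates the general idea only, not the paper's exact methodology:

```python
import bisect
import math

def block_impact(quote_times, mids, trade_time, horizon):
    """Log mid-quote move from the last quote strictly before `trade_time`
    to the last quote at or before `trade_time + horizon`."""
    i = bisect.bisect_left(quote_times, trade_time) - 1
    j = bisect.bisect_right(quote_times, trade_time + horizon) - 1
    return math.log(mids[j] / mids[i])

# Invented intraday mid-quotes (time in seconds, price in pence).
quote_times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
mids        = [100.0, 100.0, 100.5, 100.6, 100.6, 100.7]

impact = block_impact(quote_times, mids, trade_time=1.5, horizon=2.0)
# A buyer-initiated block followed by rising mids gives a positive impact.
```

Signing the move by trade direction and averaging over many blocks, grouped by time of day, is what lets one compare intraday impact patterns like the first-hour effect described above.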