A new factor model
consisting of the market factor, an investment factor, and a
return-on-equity factor is a good start to understanding the
cross-section of expected stock returns. Firms will invest a lot when
their profitability is high and the cost of capital is low. As such,
controlling for profitability, investment should be negatively
correlated with expected returns, and controlling for investment,
profitability should be positively correlated with expected returns. The
new three-factor model reduces the magnitude of the abnormal returns of a wide range of anomaly-based trading strategies, often to insignificance. The model's
performance, combined with its economic intuition, suggests that it can
be used to obtain expected return estimates in practice.
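As a worked illustration of how such a model is used to evaluate an anomaly strategy, here is a minimal sketch on simulated data (the factor names mkt, inv, roe and all numbers are hypothetical): regress the strategy's excess returns on the three factor returns and test whether the intercept, the abnormal return or alpha, is distinguishable from zero.

```python
import numpy as np

def factor_alpha(strategy_excess, factors):
    """OLS time-series regression of strategy excess returns on factor
    returns; returns the intercept (alpha) and its t-statistic."""
    T = len(strategy_excess)
    X = np.column_stack([np.ones(T), factors])   # intercept + factor loadings
    beta, *_ = np.linalg.lstsq(X, strategy_excess, rcond=None)
    resid = strategy_excess - X @ beta
    sigma2 = resid @ resid / (T - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)        # classical OLS covariance
    return beta[0], beta[0] / np.sqrt(cov[0, 0])

# Simulated monthly data: columns are [mkt, inv, roe] factor returns.
rng = np.random.default_rng(0)
factors = rng.normal(0.005, 0.03, size=(240, 3))
strategy = factors @ np.array([1.0, 0.4, 0.6]) + rng.normal(0, 0.01, 240)
alpha, t_alpha = factor_alpha(strategy, factors)
print(f"alpha = {alpha:.4%} per month, t = {t_alpha:.2f}")
```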
We study the performance of mean-variance optimized (MVO) equity portfolios for retail investors in various markets in the U.S. and around the world. Actively managed equity mutual funds have relatively high fees and tend to underperform their benchmark. Index funds such as ETFs still charge appreciable fees, yet only deliver the performance of the benchmark. We find that MVO portfolios are relatively easy for a retail investor to manage, and that they tend to outperform their benchmark or, at worst, match its performance, even after adjusting for risk. Moreover, we show that the performance of these portfolios is not particularly sensitive to the frequency at which they are rebalanced, so that, in the limit, an investor might have to rebalance her portfolio only once per year. This last finding translates into very low trading costs, even for retail investors. Thus, we conclude that MVO portfolios offer an easy, cheap way for retail investors to invest in the world’s equity markets.
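A minimal sketch of the core computation, assuming the textbook tangency (maximum-Sharpe) portfolio rather than the paper's exact procedure; mu and Sigma stand in for estimated expected returns and the return covariance matrix:

```python
import numpy as np

def tangency_weights(mu, Sigma, rf=0.0):
    """Textbook tangency portfolio: w proportional to inv(Sigma) @ (mu - rf),
    normalized to sum to one. A sketch, not the paper's exact procedure."""
    raw = np.linalg.solve(Sigma, mu - rf)
    return raw / raw.sum()

# Toy example: three assets with assumed annual moments.
mu = np.array([0.07, 0.05, 0.09])
Sigma = np.array([[0.04, 0.01, 0.02],
                  [0.01, 0.03, 0.01],
                  [0.02, 0.01, 0.06]])
w = tangency_weights(mu, Sigma, rf=0.02)
print("weights:", np.round(w, 3))
# Annual rebalancing: recompute w once per year from updated estimates.
```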
Implementations of the Standard Initial Margin Model (SIMM) and the Sensitivity Based Approach (SBA) in the Fundamental Review of the Trading
Book (FRTB) both call for the calculation of sensitivities with
respect to a standardised set of risk factors. Since standard factors
are generally collinear and pricing functions are possibly rough,
finding sensitivities qualifies as a mathematically ill-posed problem
for which analytical derivatives do not provide a robust solution.
Numerical instabilities are particularly problematic since they hamper
reconciliation and make collateral optimisation strategies inefficient.
In this article, we introduce a method for calculating sensitivities based on ridge regressions, which keeps sensitivities small and stable. We find that adding a drift term and FX cross-gammas significantly improves the accuracy of the P&L explain achieved in the SIMM methodology. The method implies rigorous upper bounds on errors in P&L explain, on the basis of which we adjust Initial Margin conservatively in order to pass back-testing benchmarks.
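A minimal sketch of the core idea, under illustrative assumptions: given a scenario matrix X of standardised factor moves (nearly collinear by construction) and a vector y of corresponding P&L, the ridge estimate shrinks sensitivities toward zero, stabilising them where ordinary least squares blows up. The penalty lam and the toy data are placeholders, not the article's calibration.

```python
import numpy as np

def ridge_sensitivities(X, y, lam=1e-4):
    """Ridge estimate s = inv(X'X + lam*I) @ X'y of risk-factor
    sensitivities. The penalty keeps sensitivities small and stable
    when the scenario matrix X is nearly collinear."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

# Toy example: two nearly collinear standardised risk factors.
rng = np.random.default_rng(1)
f1 = rng.normal(size=500)
f2 = f1 + 1e-3 * rng.normal(size=500)          # almost a copy of f1
X = np.column_stack([f1, f2])
y = 2.0 * f1 + rng.normal(0, 0.1, size=500)    # P&L driven by f1 only
print("OLS:  ", np.linalg.lstsq(X, y, rcond=None)[0])  # large, unstable
print("ridge:", ridge_sensitivities(X, y, lam=1.0))    # small, stable
```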
Inspired by visualization techniques à la Feynman, we introduce Stochastic Flow Diagrams (SFDs), a new mathematical approach that represents a complex dynamic system as a single weighted digraph. This topological representation provides a way to visualize what would otherwise be a morass of difference equations. SFDs model the propagation and reverberation that follow a shock. For example, reverberation explains how a shock to a financial system can initiate a sequence of events that leads to a crash long after the occurrence of the shock. SFDs can simulate systems in stable, steady, or explosive states. SFDs add topology to the statistical and econometric toolkit. We believe that SFDs will help policy makers, investors, and researchers better communicate and discuss the complexity of dynamic systems.
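One way to read this (an interpretation for illustration, not the paper's formal definition) is as a linear dynamic system on a weighted digraph, x(t+1) = W x(t): a shock injected at one node propagates along the edges, and the spectral radius of W separates stable, steady, and explosive regimes. A minimal sketch:

```python
import numpy as np

# A weighted digraph as an adjacency matrix W: W[i, j] is the weight
# with which node j's state feeds into node i at the next step.
W = np.array([[0.0, 0.5, 0.0],
              [0.3, 0.0, 0.4],
              [0.0, 0.6, 0.0]])

# Spectral radius < 1: shocks die out (stable); = 1: steady; > 1: explosive.
print("spectral radius:", max(abs(np.linalg.eigvals(W))))

x = np.array([1.0, 0.0, 0.0])      # unit shock to node 0
for t in range(10):
    x = W @ x                      # propagation and reverberation
    print(t + 1, np.round(x, 4))
```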
Academics
and practitioners have extensively studied Value-at-Risk (VaR) to
propose a unique risk management technique that generates accurate VaR
estimations for long and short trading
positions and for all types of financial assets. However, they have not yet succeeded, as the testing frameworks for the proposed models have not been widely accepted. A two-stage backtesting procedure is
proposed to select a model
that not only forecasts VaR but also predicts the losses beyond VaR.
Numerous conditional volatility models that capture the main
characteristics of asset returns (asymmetric and leptokurtic
unconditional distribution of returns, power transformation and
fractional integration of the conditional variance) under four
distributional assumptions (normal, GED, Student-t, and skewed
Student-t) have been estimated to find the best model for three financial markets, long and short trading
positions, and two confidence levels. By following this procedure, the
risk manager can significantly reduce the number of competing models
that accurately predict both the VaR and the Expected Shortfall (ES)
measures.
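As an illustration of what a first-stage VaR backtest looks like, here is a sketch of Kupiec's proportion-of-failures test (a standard unconditional-coverage test, not necessarily the paper's exact procedure), together with a simple second-stage diagnostic comparing average losses beyond VaR with the model's ES forecast:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, T, p):
    """Kupiec proportion-of-failures test: are the observed VaR
    violations consistent with nominal coverage p? Returns the
    likelihood-ratio statistic and its chi-squared(1) p-value."""
    x, pi = violations, violations / T
    pi = min(max(pi, 1e-12), 1 - 1e-12)          # guard logs at 0 and 1
    lr = -2 * ((T - x) * np.log((1 - p) / (1 - pi)) + x * np.log(p / pi))
    return lr, chi2.sf(lr, df=1)

def es_diagnostic(losses, var, es_forecast):
    """Stage two (a simple diagnostic, not a formal test): mean loss
    beyond VaR versus the model's Expected Shortfall forecast."""
    tail = losses[losses > var]
    return (tail.mean() if tail.size else np.nan), es_forecast

# Example: 2 violations in 250 trading days at the 99% level.
lr, pval = kupiec_pof(violations=2, T=250, p=0.01)
print(f"LR = {lr:.2f}, p-value = {pval:.3f}")
```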
In this thesis, problems in the realm of high frequency trading
and optimal market making are established and solved in both single
asset and multiple asset economies. For an agent that is averse to
holding large inventories for long periods of time, optimal high frequency trading
strategies are derived via stochastic control theory and solving the
corresponding Hamilton-Jacobi-Bellman equations. These strategies are
analyzed and it is shown that both inventory control and accounting for
adverse selection play critical roles in the success of an algorithmic trading strategy.
In the single asset problem, a market maker actively modifies her limit quotes in an economy with asymmetric information. She attempts to keep her inventory small and posts her limit orders in the limit order book at a depth that mitigates her adverse selection risk, while not posting so deep in the book as to generate no trade flow. In addition to this behaviour, a profit-maximizing investor trading in multiple assets also seeks out statistical arbitrage opportunities and acts aggressively via the submission of market orders when it is deemed optimal to do so.
Throughout this thesis, numerical and practical considerations are made a priority. Full scale calibration and estimation methods are given in detail, as well as dimensional reductions for large scale numerical procedures, where appropriate. The bridge from abstract mathematical theory to practical real-time implementation is made complete as an entire chapter is dedicated to applications on real data.
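For readers unfamiliar with this literature, the best-known closed-form solution of such an HJB problem is the Avellaneda-Stoikov (2008) single-asset model, sketched below. The thesis's own models differ (asymmetric information, multiple assets), so this is only the canonical baseline, with illustrative parameter values:

```python
import math

def as_quotes(s, q, t, T, gamma, sigma, kappa):
    """Avellaneda-Stoikov closed-form quotes: the reservation price
    shifts against inventory q, and the spread widens with risk
    aversion gamma, volatility sigma, and time remaining (T - t)."""
    tau = T - t
    r = s - q * gamma * sigma**2 * tau                     # reservation price
    spread = gamma * sigma**2 * tau + (2 / gamma) * math.log(1 + gamma / kappa)
    return r - spread / 2, r + spread / 2                  # bid, ask

# Long 5 units mid-session: both quotes skew downward to shed inventory.
bid, ask = as_quotes(s=100.0, q=5, t=0.5, T=1.0, gamma=0.1, sigma=2.0, kappa=1.5)
print(f"bid = {bid:.3f}, ask = {ask:.3f}")
```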
A lot of useful information, but you can know all the theory and still not have success in trading. The hardest thing is knowing how to use all this knowledge, and even then you need some luck and intuition.
Breakout detection with energy stats, from the Twitter developers:
https://blog.twitter.com/2014/breakout-detection-in-the-wild
http://www.slideshare.net/kuma0177/velocity-ny-2014v5-39160794
Technical
Analysis (TA) is a security analysis methodology based on the study of
past market data. Although it has been criticized by academics and the
profitability of many related strategies has been statistically
rejected, TA remains highly popular among practitioners and retail
investors, in particular. We analyze the role of TA for retail investors
trading structured products (knock-outs and warrants) on the Stuttgart Stock Exchange. We find a 35% increase in trading activity on days of chart pattern trading signals and an 11% increase for moving average signals. The increase in activity typically reverses on the following trading days. Furthermore, we identify trading characteristics of round-trip trades and find that trades associated with TA trading signals differ from other trades. First, we find significantly higher raw returns in
TA-related trades while leverage levels at purchase as well as holding
duration appear to be lower. Second, the shape of the realized return
distribution of trades in accordance with TA signals is distinct from
their peer groups. Specifically, realized returns are significantly less
left-skewed (more right-skewed). In this regard, retail investors using
TA methods might be less prone to the disposition effect due to the system-based trading
approach. If we assume a general gambling intention with respect to the
considered products, then TA-related trades tend to reach this goal
more effectively.
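To make "moving average signals" concrete, here is a minimal sketch of a crossover signal detector; the 20/50-day windows are illustrative, not those used in the study:

```python
import numpy as np

def ma_crossover_signals(prices, short=20, long=50):
    """Days on which the short moving average crosses above (buy, +1)
    or below (sell, -1) the long moving average. Window lengths are
    illustrative placeholders."""
    p = np.asarray(prices, dtype=float)
    ma = lambda n: np.convolve(p, np.ones(n) / n, mode="valid")
    ma_s, ma_l = ma(short)[long - short:], ma(long)   # align the two series
    diff = np.sign(ma_s - ma_l)
    cross = np.diff(diff)                 # +2 = up-cross, -2 = down-cross
    signals = np.zeros_like(diff)
    signals[1:][cross > 0] = 1
    signals[1:][cross < 0] = -1
    return signals                        # aligned with prices[long-1:]
```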
We test a Wall Street investment strategy, pairs trading,
with daily data over 1962-2002. Stocks are matched into pairs with
minimum distance between normalized historical prices. A simple trading
rule yields average annualized excess returns of up to 11 percent for
self-financing portfolios of pairs. The profits typically exceed conservative transaction-cost estimates. Bootstrap results suggest that the pairs effect differs from previously documented reversal profits. Robustness of the excess returns indicates that pairs trading profits from temporary mispricing of close substitutes. We link the
profitability to the presence of a common factor in the returns,
different from conventional risk measures.
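A minimal sketch of the matching step on simulated data: normalize each price series to a cumulative-return index starting at one, then pair each stock with the partner minimizing the sum of squared deviations over the formation period (a greedy nearest-partner version of the procedure described above):

```python
import numpy as np

def match_pairs(prices):
    """Pair stocks by minimum distance between normalized price paths:
    each column is scaled to start at 1.0, and each stock is matched
    with the partner having the smallest sum of squared deviations."""
    norm = prices / prices[0]                     # every path starts at 1.0
    n = norm.shape[1]
    dist = np.full((n, n), np.inf)                # inf diagonal: no self-match
    for i in range(n):
        for j in range(i + 1, n):
            d = np.sum((norm[:, i] - norm[:, j]) ** 2)
            dist[i, j] = dist[j, i] = d
    return [(i, int(np.argmin(dist[i]))) for i in range(n)]

# Toy example: 4 simulated price paths over a 252-day formation period.
rng = np.random.default_rng(2)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=(252, 4)), axis=0))
print(match_pairs(prices))
```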
Exponentials of squared returns in Gaussian densities, with their consequently thin tails, are replaced by the absolute return to form Laplacian and exponentially tilted Laplacian densities at unit time. Scaling provides densities at other maturities. Stochastic processes with these marginals are identified. In addition to a specific local volatility model, the densities are consistent with the difference of compound exponential processes taken at log time and scaled by the square root of time. The underlying process has a single parameter, the constant variance rate of the process. Delta hedging using Laplacian and asymmetric Laplacian implied volatilities is developed and compared with Black-Merton-Scholes implied volatility hedging. The hedging strategies are implemented for stylized businesses represented by dynamic volatility indexes. The Laplacian hedge is seen to be smoother for the skew trade. It also performs better through the financial crisis for the sale of strangles. The Laplacian and Gaussian models are then synthesized as special cases of a model allowing for other powers between unity and the square. Numerous hedging strategies may be run using different powers and biases in the probability of an up move. Adapted strategies that select the best performer on past quarterly data can dominate fixed strategies. Adapted hedging strategies can effectively reduce drawdowns in the marked-to-market value of businesses trading options.
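A minimal sketch of the density substitution, under the added assumption that the Laplace scale is chosen to match the Gaussian variance at each maturity (b = sigma * sqrt(t/2), since a Laplace density has variance 2b**2); the fatter Laplace tails are visible several standard deviations out:

```python
import numpy as np

def gaussian_pdf(x, sigma, t):
    """Gaussian density at maturity t with variance sigma**2 * t."""
    v = sigma**2 * t
    return np.exp(-x**2 / (2 * v)) / np.sqrt(2 * np.pi * v)

def laplace_pdf(x, sigma, t):
    """Laplace density with square-root-of-time scaling; b is chosen so
    the variance (2 * b**2) equals sigma**2 * t, matching the Gaussian."""
    b = sigma * np.sqrt(t / 2)
    return np.exp(-np.abs(x) / b) / (2 * b)

x = np.array([0.0, 1.0, 3.0, 5.0])
print("Gaussian:", gaussian_pdf(x, sigma=1.0, t=1.0))
print("Laplace: ", laplace_pdf(x, sigma=1.0, t=1.0))
# At |x| = 5 the Laplace tail is orders of magnitude heavier.
```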