Market Predictability

 
This is the specification for the Power Trading Agent Competition for 2012 (Power TAC 2012). Power TAC is a competitive simulation that models a “liberalized” retail electrical energy market, where competing business entities or “brokers” offer energy services to customers through tariff contracts, and must then serve those customers by trading in a wholesale market. Brokers are challenged to maximize their profits by buying and selling energy in the wholesale and retail markets, subject to fixed costs and constraints. Costs include fees for publication and withdrawal of tariffs, and distribution fees for transporting energy to their contracted customers. Costs are also incurred whenever there is an imbalance between a broker’s total contracted energy supply and demand within a given timeslot. The simulation environment models a wholesale market, a regulated distribution utility, and a population of energy customers, situated in a real location on Earth during a specific period for which weather data is available. The wholesale market is a relatively simple call market, similar to many existing wholesale electric power markets, such as Nord Pool in Scandinavia or the FERC markets in North America; unlike the FERC markets, however, we model a single region and therefore do not model locational marginal pricing. Customer models include households and a variety of commercial and industrial entities, many of which have production capacity (such as solar panels or wind turbines) as well as electric vehicles. All have “real-time” metering to support allocation of their hourly supply and demand to their subscribed brokers, and all are approximate utility maximizers with respect to tariff selection, although the factors making up their utility functions may include aversion to change and complexity that can retard uptake of marginally better tariff offers. The distribution utility models the regulated natural monopoly that owns the regional distribution network, and is responsible for maintenance of its infrastructure and for real-time balancing of supply and demand. The balancing process is a market-based mechanism that uses economic incentives to encourage brokers to achieve balance within their portfolios of tariff subscribers and wholesale market positions, in the face of stochastic customer behaviors and weather-dependent renewable energy sources. The broker with the highest bank balance at the end of the simulation wins.
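To make the cost structure above concrete, here is a minimal Python sketch of a broker's per-timeslot cash flow. The function name, the flat per-kWh fees and the simple imbalance charge are illustrative assumptions, not the competition's actual accounting rules.

```python
# Minimal sketch of a broker's per-timeslot cash flow under the cost structure
# described above. All fee and price values are hypothetical placeholders,
# not Power TAC's actual parameters.

def timeslot_cash_flow(tariff_kwh, tariff_price,        # energy sold to customers, retail price per kWh
                       wholesale_kwh, wholesale_price,   # energy bought in the wholesale call market
                       distribution_fee=0.01,            # per-kWh fee for using the distribution network
                       imbalance_penalty=0.05):          # per-kWh charge on supply/demand imbalance
    revenue = tariff_kwh * tariff_price
    wholesale_cost = wholesale_kwh * wholesale_price
    distribution_cost = tariff_kwh * distribution_fee
    # Imbalance: energy delivered to customers minus energy procured in this timeslot.
    imbalance_cost = abs(tariff_kwh - wholesale_kwh) * imbalance_penalty
    return revenue - wholesale_cost - distribution_cost - imbalance_cost

# Example: the broker sold 1000 kWh at 0.12/kWh but procured only 950 kWh at 0.06/kWh.
print(timeslot_cash_flow(1000, 0.12, 950, 0.06))
```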
 
We provide a practical and technical overview of volatility trading strategies:
1) Insights into the design and back-testing of systematic volatility strategies
2) Understanding of the risk-reward trade-off and potential pitfalls of volatility strategies

We focus on systematic and rule-based trading strategies that can be marketed as an investable index or a proprietary strategy:
1) Delta-hedged strategies for capturing the volatility and skew risk-premiums
2) Strategies without delta-hedging: CBOE and customized option buy-write indices

We overview important implementation aspects:
1) Measuring historical realized volatility (a minimal computation sketch follows this list)
2) Forecasting the expected realized volatility
3) Measuring and forecasting implied and realized skew
4) Computing option delta consistently with empirical dynamics
5) Analysis of transaction costs
6) Managing the tail-risk of short volatility strategies
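As a minimal illustration of item 1 above, the following sketch computes annualized close-to-close realized volatility from daily prices. The estimator choice and the synthetic data are illustrative assumptions; the paper's own measurement, skew and delta computations are more involved.

```python
import numpy as np

def realized_volatility(prices, trading_days=252):
    """Annualized close-to-close realized volatility from daily prices.

    A simple estimator for illustration; range-based (Parkinson, Garman-Klass)
    or intraday estimators are common refinements in practice.
    """
    prices = np.asarray(prices, dtype=float)
    log_returns = np.diff(np.log(prices))
    return np.sqrt(trading_days) * log_returns.std(ddof=1)

# Hypothetical example: a short random-walk price path.
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 60)))
print(f"realized vol: {realized_volatility(prices):.2%}")
```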
 
We introduce Kinetic Component Analysis (KCA), a state-space application that extracts the signal from a series of noisy measurements by applying a Kalman filter to a Taylor expansion of a stochastic process. We show that KCA presents several advantages compared to other popular noise-reduction methods such as the Fast Fourier Transform (FFT) or Locally Weighted Scatterplot Smoothing (LOWESS): First, KCA provides band estimates in addition to point estimates. Second, KCA further decomposes the signal into three hidden components, which can be intuitively associated with position, velocity and acceleration. Third, KCA is more robust in forecasting applications. Fourth, KCA is a forward-looking state-space approach, resilient to structural changes. We believe that this type of decomposition is particularly useful in the analysis of trend-following, momentum and mean-reversion of financial prices.
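The sketch below illustrates the state-space machinery behind KCA under simplifying assumptions: the transition matrix follows a second-order Taylor expansion over a position/velocity/acceleration state, the noise covariances are fixed placeholder values rather than fitted ones, and the filter is a plain Kalman filter rather than the authors' full estimation procedure.

```python
import numpy as np

def kca_filter(z, dt=1.0, q=1e-4, r=1.0):
    """Kalman filter over a [position, velocity, acceleration] hidden state.

    q and r are hypothetical process/measurement noise scales (KCA proper
    would fit them, e.g. by maximum likelihood). Returns the filtered states
    and their standard errors, which provide the band estimates.
    """
    F = np.array([[1.0, dt, 0.5 * dt**2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])      # Taylor-expansion transition matrix
    H = np.array([[1.0, 0.0, 0.0]])      # only the position (price) is observed
    Q = q * np.eye(3)                    # process noise covariance
    R = np.array([[r]])                  # measurement noise covariance

    x = np.array([z[0], 0.0, 0.0])       # initial state
    P = np.eye(3)                        # initial state covariance
    states, bands = [], []
    for obs in z:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([obs]) - H @ x)
        P = (np.eye(3) - K @ H) @ P
        states.append(x.copy())
        bands.append(np.sqrt(np.diag(P)))
    return np.array(states), np.array(bands)

# Hypothetical example: noisy observations of a smoothly accelerating trend.
t = np.arange(200, dtype=float)
z = 0.001 * t**2 + np.random.default_rng(1).normal(0, 0.5, t.size)
states, bands = kca_filter(z)
print(states[-1])   # last [position, velocity, acceleration] estimate
```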

An instrument exhibits financial inertia when its price acceleration is not significantly greater than zero for long periods of time. Our empirical analysis of 19 of the most liquid futures worldwide confirms the presence of strong inertia across all asset classes. We also argue that KCA can be useful to market makers, liquidity providers and faders for the calculation of their trading ranges.
 
This paper outlines a financial statement analysis for use in equity valuation. Standard profitability analysis is incorporated and extended, and is complemented with an analysis of growth. The perspective is one of forecasting payoffs to equities. So financial statement analysis is presented first as a matter of pro forma analysis of the future, with forecasted ratios viewed as building blocks of forecasts of payoffs. The analysis of current financial statements is then seen as a matter of identifying current ratios as predictors of the future ratios that drive equity payoffs. The financial statement analysis is hierarchical, with ratios lower in the ordering identified as finer information about those higher up. To provide historical benchmarks for forecasting, typical values for ratios are documented for the period 1963-1996, along with their cross-sectional variation and correlation. And, again with a view to forecasting, the time-series behavior of many of the ratios is also described and their typical "long-run, steady-state" levels are documented.
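As a toy illustration of using a current ratio to predict a future ratio, the sketch below fits a simple partial-adjustment model that pulls a profitability ratio toward an estimated long-run level. The functional form and the example numbers are assumptions for illustration, not the paper's analysis.

```python
import numpy as np

def fit_partial_adjustment(ratio_series):
    """Fit ratio_{t+1} - ratio_t = kappa * (long_run - ratio_t) + noise by OLS.

    A toy mean-reversion specification for forecasting a ratio toward a
    "long-run, steady-state" level; assumed here purely for illustration.
    """
    x = np.asarray(ratio_series, dtype=float)
    dx, lag = np.diff(x), x[:-1]
    # OLS of dx on [1, lag]: dx = a + b * lag, so kappa = -b and long_run = a / kappa.
    A = np.column_stack([np.ones_like(lag), lag])
    a, b = np.linalg.lstsq(A, dx, rcond=None)[0]
    kappa = -b
    long_run = a / kappa if kappa != 0 else np.nan
    return kappa, long_run

def forecast_next(ratio_now, kappa, long_run):
    return ratio_now + kappa * (long_run - ratio_now)

# Hypothetical return-on-equity history for one firm.
roe = [0.22, 0.19, 0.17, 0.16, 0.15, 0.14, 0.14, 0.13]
kappa, long_run = fit_partial_adjustment(roe)
print(forecast_next(roe[-1], kappa, long_run))
```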
 
This paper is an investigation into the determinants of asymmetries in stock returns. We develop a series of cross-sectional regression specifications which attempt to forecast skewness in the daily returns of individual stocks. Negative skewness is most pronounced in stocks that have experienced: 1) an increase in trading volume relative to trend over the prior six months; and 2) positive returns over the prior thirty-six months. The first finding is consistent with the model of Hong and Stein (1999), which predicts that negative asymmetries are more likely to occur when there are large differences of opinion among investors. The latter finding fits with a number of theories, most notably Blanchard and Watson's (1982) rendition of stock-price bubbles. Analogous results also obtain when we attempt to forecast the skewness of the aggregate stock market, though our statistical power in this case is limited.
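For intuition on the variables involved, here is a toy sketch that computes the skewness of daily stock returns and runs a cross-sectional OLS of future skewness on simplified stand-ins for the two predictors named above (detrended turnover and past returns). The variable construction and the synthetic data are assumptions, not the paper's specifications.

```python
import numpy as np

def daily_skewness(returns):
    """Sample skewness of one stock's daily returns."""
    r = np.asarray(returns, dtype=float)
    z = (r - r.mean()) / r.std(ddof=0)
    return (z ** 3).mean()

def cross_sectional_forecast(skew_next, detrended_turnover, past_return):
    """OLS of future skewness on simplified stand-ins for the paper's predictors."""
    X = np.column_stack([np.ones(len(skew_next)), detrended_turnover, past_return])
    coefs, *_ = np.linalg.lstsq(X, skew_next, rcond=None)
    return coefs  # [intercept, turnover loading, past-return loading]

# Hypothetical cross-section: 300 stocks, ~126 future trading days each, where the
# skewness of future returns depends (by construction) on the two predictors.
rng = np.random.default_rng(2)
turnover = rng.normal(0, 1, 300)       # stand-in for turnover relative to trend
past_ret = rng.normal(0.1, 0.3, 300)   # stand-in for the prior 36-month return
skew_next = np.array([
    daily_skewness(rng.normal(0, 0.02, 126) - 0.005 * (tu + pr) * rng.exponential(1.0, 126))
    for tu, pr in zip(turnover, past_ret)
])
print(cross_sectional_forecast(skew_next, turnover, past_ret))
```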
 
We hypothesize that insiders strategically choose disclosure policies and the timing of their equity trades to maximize trading profits, subject to the litigation costs associated with disclosure and insider trading. Accounting for endogeneity between disclosures and trading, we find that when managers plan to purchase shares, they increase the number of bad news forecasts to reduce the purchase price. In addition, this relation is stronger for trades initiated by chief executive officers than those initiated by other executives. Confirming this strategic behavior, we find that managers successfully time their trades around bad news forecasts, buying fewer shares beforehand and more afterwards. We do not find that managers adjust their forecasting activity when they are selling shares, consistent with higher litigation concerns associated with insider sales. Overall, our evidence suggests that insiders do exploit voluntary disclosure opportunities for personal gain, but only selectively, when litigation risk is sufficiently low.
 
Financial market volatility is an important input for investment, option pricing and financial market regulation. In this review article, we compare the volatility forecasting findings in 93 papers published or written in the last two decades. This article is written for general readers in economics, and its emphasis is on forecasting instead of modelling. We separate the literature into two main streams: the first consists of research papers that formulate volatility forecasts based on historical price information only, while the second includes research papers that make use of the volatility implied in option prices.

This paper also provides volatility definitions, insights into problematic issues of forecast evaluation, the effect of data frequency on volatility forecast accuracy, the measurement of "actual" volatility, and the confounding effect of extreme values on volatility forecasting performance. We compare volatility forecasting results across different asset classes and markets in different geographical regions. Suggestions are made for future research.
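As a concrete example of the forecast-evaluation step discussed here, the sketch below scores a standard RiskMetrics-style EWMA volatility forecast against a crude realized-volatility proxy using common loss functions. The model, the proxy and the data are illustrative choices, not drawn from the surveyed papers.

```python
import numpy as np

def ewma_vol_forecast(returns, lam=0.94):
    """One-step-ahead EWMA (RiskMetrics-style) volatility forecasts."""
    r = np.asarray(returns, dtype=float)
    var = np.empty_like(r)
    var[0] = r[:20].var()                               # seed with an initial sample variance
    for t in range(1, r.size):
        var[t] = lam * var[t - 1] + (1 - lam) * r[t - 1] ** 2
    return np.sqrt(var)

def evaluate(forecast_vol, realized_proxy):
    """Common loss functions used to score volatility forecasts."""
    err = forecast_vol - realized_proxy
    return {"RMSE": np.sqrt((err ** 2).mean()), "MAE": np.abs(err).mean()}

# Hypothetical daily returns; |r_t| serves as a (noisy) realized-volatility proxy.
rng = np.random.default_rng(3)
r = rng.normal(0, 0.01, 1000)
print(evaluate(ewma_vol_forecast(r), np.abs(r)))
```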
 
Prediction markets are markets for contracts that yield payments based on the outcome of an uncertain future event, such as a presidential election. Using these markets as forecasting tools could substantially improve decision making in the private and public sectors.

We argue that U.S. regulators should lower barriers to the creation and design of prediction markets by creating a safe harbor for certain types of small stakes markets. We believe our proposed change has the potential to stimulate innovation in the design and use of prediction markets throughout the economy, and in the process to provide information that will benefit the private sector and government alike.
 
We use Bayesian model averaging to analyze the sample evidence on return predictability in the presence of uncertainty about the return forecasting model. The analysis reveals in-sample and out-of-sample predictability, and shows that the out-of-sample performance of the Bayesian approach is superior to that of model selection criteria. Our exercises find that term premium and market risk premium are relatively robust predictors. Moreover, small-cap value stocks appear more predictable than large-cap growth stocks. We also investigate the implications of model uncertainty from investment management perspectives. The analysis shows that model uncertainty is more important than estimation risk. Finally, asset allocations in the presence of estimation risk exhibit sensitivity to whether model uncertainty is incorporated or ignored.
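A stripped-down version of the model-averaging idea is sketched below: predictive regressions are fit by OLS for every subset of a small predictor set, posterior model weights are approximated from the BIC under equal prior model probabilities, and the individual forecasts are averaged. This is a simplification of the paper's Bayesian framework, with synthetic data and hypothetical predictors.

```python
import numpy as np
from itertools import combinations

def bma_forecast(y, X, x_next):
    """Bayesian model averaging over all subsets of a small predictor set.

    Each regression of returns on a predictor subset is fit by OLS (predictors
    are assumed already lagged appropriately by the caller); posterior model
    weights are approximated from the BIC with equal prior model probabilities.
    Returns the weight-averaged one-step-ahead forecast.
    """
    n, k = X.shape
    forecasts, log_weights = [], []
    for size in range(k + 1):
        for cols in combinations(range(k), size):
            Z = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            sigma2 = ((y - Z @ beta) ** 2).mean()
            bic = n * np.log(sigma2) + Z.shape[1] * np.log(n)
            z_next = np.concatenate([[1.0], x_next[list(cols)]])
            forecasts.append(float(z_next @ beta))
            log_weights.append(-0.5 * bic)
    log_w = np.array(log_weights)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return float(np.dot(w, forecasts))

# Hypothetical monthly data: excess returns driven by the first of two predictors
# (think of the term premium and the market risk premium as stand-ins).
rng = np.random.default_rng(4)
X = rng.normal(0, 1, (240, 2))
y = 0.3 * X[:, 0] + rng.normal(0, 1, 240)
print(bma_forecast(y, X, x_next=np.array([0.5, -0.2])))
```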
 
In this paper I explore the informational content of the cross-currency flow maintained by large custodian banks, with the objective of designing a statistical arbitrage trading system that could exploit such information. An initial simple test of one-step-ahead forecasts for the JPYUSD FX pair using lagged I-Flow data series shows that such forecasts do not measure up to those of a simple AR(1) model of the FX pair time series itself. I then introduce 15-day moving standard deviation variables based on the I-Flow time series, with a 5-day lag, into the one-period I-Flow forecasting model and find a considerable improvement over the original model, though it still does not better the simple AR(1) regression model of the FX time series. With the information from the initial tests at hand, I move on to explore the possibility of designing a system to forecast the swings observed in the 15-day moving standard deviation series of the JPYUSD FX pair. A partial dynamic equilibrium regression system involving a transformation of the individual I-Flow series analogous to that of the FX pair series (a 15-day moving standard deviation) is then used to capture long-run stable 5-period-ahead forecasts of the 15-day moving standard deviation swings in the JPYUSD FX pair. Finally, I propose two models to exploit the volatility swing forecasting system: the first involves volatility trades on the FX pair to exploit an expected upswing in its short-term volatility, and the second overlays the swing forecasting system on traditional trend forecasts based on technical rules to capture profitable long/short trades of the FX pair. Data between May 2007 and May 2009 were employed for the exercise.
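The following sketch mirrors the benchmark comparison described in the abstract under heavy simplifications: an AR(1)-style regression on the FX level versus an augmented regression that adds a 15-day moving standard deviation of a flow series lagged by 5 days. The I-Flow data are proprietary, so the series here are synthetic and the alignment choices are assumptions.

```python
import numpy as np

def ols_fit(y, X):
    """OLS with an intercept; returns the coefficient vector [const, betas...]."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

# Hypothetical series standing in for the JPYUSD rate and an I-Flow measure.
rng = np.random.default_rng(5)
fx = 100.0 + np.cumsum(rng.normal(0, 0.3, 600))
flow = rng.normal(0, 1.0, 600)

window, lag = 15, 5
X_rows, y_rows = [], []
for t in range(window + lag - 1, len(fx)):
    # 15-day moving standard deviation of the flow series, lagged by 5 days.
    flow_vol_lagged = flow[t - lag - window + 1:t - lag + 1].std(ddof=1)
    X_rows.append([fx[t - 1], flow_vol_lagged])
    y_rows.append(fx[t])
X, y = np.array(X_rows), np.array(y_rows)

beta_ar1 = ols_fit(y, X[:, :1])   # AR(1) benchmark: yesterday's FX level only
beta_aug = ols_fit(y, X)          # augmented model: add the lagged flow volatility
print("AR(1):", beta_ar1, " augmented:", beta_aug)
```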