Modeling Finance versus Physics

What does it take to model and digitize the activities of fund management? 
The activities of fund management consist of five rational processes: screening, ranking, timing, weighting, and validation. Screening is a process of filtering, and ranking is a process of sorting. Timing and weighting use estimation techniques to compute the optimal timing and weighting conditions based on historical data and an investment objective, possibly in combination with an assumed propagation model for risks and rewards. Validation is the process of back testing the expectation values of risks and rewards over a preset validation period. Ranking may involve mean reversion or trend following. Estimation techniques are search engines that optimize an objective function providing a best fit between historical data and timed and weighted impulse responses. These techniques are the domain of experts in signal processing, econometrics, statistics, and mathematics. Risks may also have an irrational component. It is up to compliance authorities and legislation to draw the line between irrational and rational battles in the financial markets.
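As a minimal sketch, the five processes chain into a pipeline. All function names, thresholds, and the simulated data below are hypothetical illustrations, not the DigiFundManager implementation; timing is reduced here to a preset rebalancing moment:

    # Minimal sketch of the five processes chained into a pipeline.
    # All names, thresholds, and the simulated data are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    prices  = 20 * np.cumprod(1 + 0.01 * rng.standard_normal((250, 50)), axis=0)
    volumes = rng.integers(10_000, 1_000_000, size=(250, 50)).astype(float)
    returns = np.diff(prices, axis=0) / prices[:-1]

    def screen(min_dollar_volume=1e6, min_price=5.0):
        # Screening = filtering: minimum liquidity and non-Penny conditions.
        dollar_volume = prices[-1] * volumes[-1]
        return np.where((dollar_volume >= min_dollar_volume)
                        & (prices[-1] >= min_price))[0]

    def rank(candidates, lookback=20):
        # Ranking = sorting: mean reversion puts the most oversold first.
        move = prices[-1, candidates] / prices[-lookback, candidates] - 1.0
        return candidates[np.argsort(move)]

    def weight(selected):
        # Weighting: a simple inverse-volatility rule stands in for a full
        # estimation technique that would optimize an investment objective.
        w = 1.0 / returns[:, selected].std(axis=0)
        return w / w.sum()

    def validate(selected, w):
        # Validation: back test the expectation values of reward and risk.
        pnl = returns[:, selected] @ w
        return pnl.mean(), pnl.std()

    top = rank(screen())[:10]      # screening, then ranking
    w = weight(top)                # weighting (timing: preset rebalance moments)
    mu, sigma = validate(top, w)
    print(f"expected daily reward {mu:.4%}, risk (spread) {sigma:.4%}")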

A center of academic and practical knowledge of quantitative investing.
FQS Capital appears to be a center of academic and practical knowledge of quantitative investing. The CEO and CIO of this fund of hedge funds is Robert J. Frey, a Research Professor in quantitative investing and hedge funds at Stony Brook University, NY, USA. FQS Capital distinguishes between three steps in quantitative investing: (1) screening, (2) rigorous due diligence revealing the complete spectrum of result drivers, and (3) portfolio allocation and optimization. At first glance, these three processes appear to be similar to our processes of screening, ranking, and weighting, which JPM defined as the three main activities of fund management. A detailed comparison is not possible, as FQS Capital considers these processes proprietary. However, when one starts to apply scientifically objective statistical methods to the movements of share prices or other investments, one almost naturally arrives, as a digital fund manager, at digitizing screening and ranking and at optimizing the timing and weighting of stocks in portfolios. It is then an almost obvious conclusion that the optimization is performed using estimation techniques. From a mathematical point of view, it does not make much of a difference whether one works with portfolios of funds or with portfolios of single stocks. We wonder why, in the current literature, little attention is paid to the validation of the resulting risks and rewards. Perhaps the reason for this scant attention is a proprietary assumption of propagation-like models for risks and rewards that do not allow for a back test beforehand. DigiFundManager allows for a back test over, say, 30 years beforehand, to validate a quantitative strategy under the condition that one uses Market-On-Close orders. That holds for both small and large portfolios of up to a thousand stocks and more. In addition, from a mathematical point of view, it does not make much of a difference whether one trades frequently (High-Frequency, HF, trading) or infrequently (Low-Frequency, LF, trading).

Are financial markets truly unpredictable or are the causal relations from physics applicable?
In physics, a change in motion is thought to be caused by some kind of force. Random walks are used as simplified models of the so-called Brownian motion and of the diffusion of atoms and molecules. In 1905, Einstein assumed for the first time that such random atomic and molecular movements are governed by the classical equations of motion of physics, linking cause and effect to randomized linear impulse responses. Propagation models are mathematical solutions to these equations. In physics, propagation implies the spreading in time and space of dependent variables based on causality. The financial scientific literature reveals that it may be tempting to apply the models of physics linking cause and effect to the field of finance in an effort to explain the movements of share prices in the financial markets. In our view, that is inappropriate, as causality takes out the fundamental unpredictability of the financial markets. When the distributions of the statistical results of the financial markets are assumed to be time-invariant and Normal (Gaussian), one gets pretty close to stating that one knows at least part of the future behavior of the markets. The ratio of the mean of such a distribution to its spread can then be taken as a reward-to-risk measure: the Sharpe ratio, named after William F. Sharpe, a Nobel laureate in economics. It is well known that the Normal distribution is at variance with the empirical results of the financial markets. We instead impose the extra condition of ergodicity on our estimation technique. Ergodic behavior implies that a time average can be replaced by an ensemble average without any further assumptions on distributions and propagation models. In the fields of economics and econometrics, supply-side and/or demand-side conditions may be imposed on the estimation techniques.
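For concreteness, here is a small sketch, on simulated Gaussian returns (an illustration only; empirical market returns deviate from this Normal assumption), of the two notions just mentioned: the Sharpe ratio, and the ergodicity condition that a time average may replace an ensemble average:

    # Sketch: Sharpe ratio and an ergodicity check on simulated returns.
    import numpy as np

    rng = np.random.default_rng(1)
    ensemble = 0.0004 + 0.01 * rng.standard_normal((1000, 252))  # 1000 paths, 252 days

    # Sharpe ratio: mean (excess) return divided by its spread, annualized.
    daily = ensemble[0]
    sharpe = np.sqrt(252) * daily.mean() / daily.std()

    # Ergodicity: the time average along one path should match the
    # ensemble average across paths at a single instant.
    time_avg = ensemble[0].mean()      # one path, averaged over time
    ens_avg  = ensemble[:, 0].mean()   # one instant, averaged over paths
    print(f"Sharpe {sharpe:.2f}; time avg {time_avg:.5f} vs ensemble avg {ens_avg:.5f}")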

Does fundamental analysis give you a competitive edge?
Besides assumptions regarding the risk behavior of the financial markets, financial experts often give their Discounted Cashflow (DCF) projections for individual stocks or assets, based on whatever information, insight, or breaking news they have. These future projections are called analyst estimates. Such projections, or changes thereof, can also be imposed on the ranking or estimation techniques as additional conditions. These additional conditions influence the future dynamics of the markets. Hence, the investor who first knows and understands their consequences appears to be at an advantage.
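As a worked example of such a projection, with made-up cash flows, discount rate, and terminal growth (analyst estimates would supply the real inputs), a DCF value is simply the sum of the discounted projected cash flows:

    # Sketch of a Discounted Cashflow (DCF) projection; all numbers made up.
    cash_flows = [5.0, 5.5, 6.0, 6.5, 7.0]  # projected yearly cash flows per share
    r = 0.08                                 # assumed discount rate
    g = 0.02                                 # assumed growth beyond year 5

    pv = sum(cf / (1 + r) ** (t + 1) for t, cf in enumerate(cash_flows))
    terminal = cash_flows[-1] * (1 + g) / (r - g)   # Gordon growth value at year 5
    pv += terminal / (1 + r) ** len(cash_flows)
    print(f"DCF value per share: {pv:.2f}")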

Does technical analysis give you a competitive edge?
Besides assumptions regarding the propagation of general risk behavior and the future behavior resulting from fundamental analysis (DCF) of individual assets, some patterns of technical indicators are believed to possibly result in profitable trades. Such trading systems may require a time resolution of less than a quarter of a second, and that kind of playing field is not level for just any participant. In addition, the trend-following patterns of technical analysis have yet to be proven reliable on objective scientific grounds.

Does algorithmic trading give you a competitive edge?
Algorithmic trading digitizes the search for the best deal for the asset transfer at the time of the trade. The time resolution may get as small as a microsecond and becomes an important part of the trade. That makes this playing field not level for just any participant. In practice, this implies that the cost of rebalancing a calculated and optimized portfolio may vary per trading algorithm. The outcome of such trading algorithms may be different at different times for the same weighted portfolios. Therefore, the results of such algorithms are difficult to back test, and hence to validate. We prefer to use Market-On-Close orders (see below), which have a predictable cost structure in our field of Low-Frequency trading, but we recognize that in the field of HF trading, algorithmic trading may give a competitive edge on the costs of the trades.
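The predictability is simple arithmetic: with MOC orders and a flat broker fee, the cost of a rebalance is known in advance. A sketch with made-up fee and portfolio figures:

    # Sketch: predictable rebalancing cost with Market-On-Close orders.
    # The fee schedule and portfolio figures are made-up illustrations.
    fee_per_trade = 1.00      # assumed flat broker fee per order, in dollars
    portfolio_size = 50       # number of positions
    turnover = 0.60           # fraction of positions replaced per rebalance
    investment = 100_000.0

    trades = 2 * turnover * portfolio_size   # each replacement = one sell + one buy
    cost = trades * fee_per_trade
    print(f"cost per rebalance: ${cost:.2f} = {cost / investment:.4%} of capital")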

Which market orders allow you to validate your game plan?
Whether, in the long term, such additional information puts the informed investor at an advantage can be revealed by systematic back testing, provided the trades can be validated in hindsight. In principle, validation is possible with Market-On-Close (MOC) orders, and the screening for a minimum liquidity can be set so that trading conditions approach those of Perfect Competition. Such orders appear to warrant a level competitive field. As long as the additional information is imposed on the estimation techniques after it has become common knowledge, in our view it will not make much of a difference: in that case, the additional information is already accounted for by the historical exchange data that the estimation techniques use. As long as the additional information has not spread universally, investors who know about it may be at an advantage. Drawing the line between foreknowledge and universal knowledge is the field of compliance officers.

Are the weighted averages of the past the expectation values of the future?
As in the field of insurance, the weighted averages of the past are the expectation values of the future. To calculate the expectation values in the financial markets, no distribution functions are required, nor is any set of rules governing the dynamics of the financial markets. Only the clean historical exchange data are needed to provide the best fit between these data and the optimally weighted and timed portfolios. These portfolios are optimally tuned to one’s own investment objective in a computational search to balance risks and rewards. As a first approximation, one may expand the weighted average as a linear superposition of the weighted financial results of each asset, assuming ergodicity. Using one’s investment objective as the objective function, the estimation technique then searches for the weighting factors of the optimal portfolios at any given time. The length of the back test (validation) can be taken as a measure of confidence. Screening is often done for a minimum trading liquidity, for a reasonable growth potential, and for non-Penny stocks. For ranking, one may sort the stocks or assets in terms of decreasing likelihood to increase in price, using a given set of correlation times; such a ranking essentially uses a process of mean reversion. The spread on the averages is usually a measure of risk, for which one can take the maximum drawdown.
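A minimal sketch, on simulated data with illustrative weights, of the two quantities just described: the portfolio result as a linear superposition of weighted asset results, and the maximum drawdown as the risk measure:

    # Sketch: linear superposition of weighted asset results and maximum
    # drawdown as risk measure. Data and weights are illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)
    asset_returns = 0.0003 + 0.01 * rng.standard_normal((252, 4))  # daily, 4 assets
    weights = np.array([0.4, 0.3, 0.2, 0.1])                       # sum to one

    # First approximation: the portfolio result is the weighted sum of assets.
    portfolio = asset_returns @ weights

    # Maximum drawdown: the largest peak-to-trough loss of the equity curve.
    equity = np.cumprod(1 + portfolio)
    drawdown = 1 - equity / np.maximum.accumulate(equity)
    print(f"expected daily reward {portfolio.mean():.4%}, "
          f"max drawdown {drawdown.max():.2%}")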

So what does it take to model and digitize the activities of fund management?
You need a quant, a web designer, a software programmer, and a system manager. The so-called quant is the one who understands the five processes of a fund manager, knows how to model them, and puts the models into software code. The web designer develops the web layout of the user interface of the program, as all modern applications should interact through a browser. Usually another software programmer writes the event-driven, cross-platform code of the application. The system manager takes care of the hosting and of the setup and maintenance of the various server environments and the Content Management System. The use of Market-On-Close orders and the condition of Perfect Competition allow for reliable validations in foresight and hindsight.

Don’t banks have a public responsibility not to juggle the financial system into instability?
When banks want their quants to play with non-linear models and large positions, it is known from a mathematical point of view that the optimization routines become prone to instabilities and local optima. Hence, when big players take up non-linear modelling, the outcome of the game may be an unstable equilibrium.
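The effect is easy to demonstrate even in one dimension: on a non-convex objective, a local search converges to different optima from different starting points (a toy illustration, not a model of bank positions):

    # Toy illustration: a non-linear objective with two local minima.
    from scipy.optimize import minimize

    def f(x):
        # Non-convex: a double well, tilted so the two minima differ.
        return (x[0] ** 2 - 4) ** 2 + x[0]

    for start in (-3.0, 3.0):
        res = minimize(f, x0=[start])
        print(f"start {start:+.1f} -> x = {res.x[0]:+.3f}, f = {res.fun:.3f}")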

Each private investor has his own set of choices and circumstances resulting in his own investment fund.
The five rational processes of fund management allow for the existence and computation of so-called optimal portfolios, each one different for each given investment amount, portfolio size, point in time, and set of correlation times. These portfolios are optimally tuned to the investment objective of balancing risks and rewards and to one’s own screening and ranking conditions, given the quality of the data provider and the height of the broker fees. Practically every individual set of choices and circumstances results in its own unique set of optimal portfolios that can be validated over a preset number of years. It is a transparent process with nothing hidden as proprietary.

Our findings after extensively back testing a variety of game plans can be summarized as follows:

  • The more you want to invest, the fewer stocks are available, and the more difficult it is to increase your rewards.
  • The less you want to invest, the faster your broker costs will eat up your rewards.
  • This also holds for more frequent trading and for larger portfolio sizes.
  • However, larger portfolio sizes offer you a chance to better spread and hedge your risks.
  • For smaller and mid-sized investments, there often exist game plans with better-fitting risks and rewards than the programs offered by professional parties, which let you participate in larger funds that accommodate some kind of averaged risk-and-reward profile.
  • In this field of opposing forces, you have the choice of ending up in the straitjacket of a common denominator or of quantifying your own choices and letting them bloom optimally.

What is the available software?
We know of four software packages on the market that allow one to set up one’s own game plan. The first three, SmartQuant, QuantShare, and DLPAL, are based on technical indicators and focus on intraday or High-Frequency (HF) trading of individual assets. The fourth, DigiFundManager, is based on selecting and trading optimal portfolios, uses only the clean historical Market-On-Close exchange data for its computations, and works with preset holding periods of multiples of weeks (Low-Frequency trading). Each optimal portfolio can be uniquely tuned to one’s personal choices, one’s own circumstances, and one’s investment objectives in terms of balancing risks and rewards. The complete set of optimal portfolios is validated over a preset validation period. These four packages appear to us to be transparent in their concepts and modelling. We have not seen evidence that High-Frequency trading outperforms Low-Frequency trading. That is understandable from the point of view that trend following appears to be less effective than mean reversion, and that the latter has little to do with the frequency of trading.


Jan G. Dil and Nico C. J. A. van Hijningen,

August 21, 2018.

Copyright © 2019 EnterErgodics. All Rights Reserved.