Modeling Finance versus Physics

What does it take to model and digitize the activities of fund management? 
The activities of fund management consist of five rational processes: screening, ranking, timing, weighting, and validation. Screening is a process of filtering, and ranking is a process of sorting; ranking may involve mean reversion or trend following. Timing and weighting use estimation or regression techniques to compute the optimal timing and weighting conditions from historical exchange data, given an investment objective. These two processes may be combined with assumptions on a probability distribution function and on a propagation model for asset prices. Validation is the process of back testing the expectation values of risks and rewards over a preset validation period; it is also called "In-Sample testing". Estimation techniques are search engines that optimize an objective function to provide a best fit between the historical data and the timed and weighted impulse responses. These techniques are the domain of experts in signal processing, econometrics, statistics, and mathematics. Risks may also have an irrational component; it is up to the compliance authorities and legislation to draw the line between irrational and rational battles in the financial markets.
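
How these five processes can be chained in software is sketched below in Python. The function names, the 20-day liquidity window, the 60-day mean-reversion lookback, and the equal weights are our own illustrative assumptions; in a real application the timing and weighting step would be replaced by the numerical optimization discussed later in this article.

    # Minimal sketch of the five processes, assuming a dict that maps each
    # ticker to a pandas DataFrame of daily "close" and "volume" data.
    # All names and thresholds are illustrative assumptions.
    import pandas as pd

    def screen(universe, min_dollar_volume=1e6):
        """Screening: keep only assets with sufficient trading liquidity."""
        return [t for t, df in universe.items()
                if (df["close"] * df["volume"]).tail(20).mean() >= min_dollar_volume]

    def rank(universe, tickers, lookback=60):
        """Ranking: sort by the z-score of the last price against its own
        moving average, a simple mean-reversion criterion."""
        def zscore(df):
            window = df["close"].tail(lookback)
            return (window.iloc[-1] - window.mean()) / window.std()
        return sorted(tickers, key=lambda t: zscore(universe[t]))

    def weight_and_time(ranked, portfolio_size=10):
        """Timing and weighting: reduced here to equal weights on the top of
        the ranking; in practice these come out of a numerical optimization."""
        return {t: 1.0 / portfolio_size for t in ranked[:portfolio_size]}

    def validate(universe, weights, start, end):
        """Validation (in-sample back test): weighted sum of asset returns."""
        rets = pd.DataFrame({t: universe[t]["close"].loc[start:end].pct_change()
                             for t in weights}).dropna()
        return (rets * pd.Series(weights)).sum(axis=1)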

The five assumptions of quantitative finance
Quantitative finance is also called mathematical finance. The field studies the statistics of historical market prices and is divided into two segments: (I) derivatives pricing and (II) risk and portfolio management. A practitioner of the field is called a quant. He may invoke one or more of the following assumptions:

  1. Share prices and trading volumes fully reflect all available information up to each trade
  2. Share prices are established in perfect competition
  3. The time average of an individual fluctuation equals an ensemble average at that time
  4. Price fluctuations follow a time-invariant probability distribution function
  5. Each individual fluctuation satisfies a time-invariant equation of motion

The assumptions Einstein made in his 1905 theory of Brownian motion are similar to assumptions (4) and (5) for price fluctuations in the financial markets.

Do causal relations like those in Brownian motion govern price fluctuations on the exchanges?
Brownian motion shows the causal relations between force, mass, and acceleration. Einstein was the first to clarify these relations by stating that each individual fluctuation satisfies the diffusion equation. Physics is the science of causal motions: forces are considered the causes of motion. In classical mechanics, each individual motion is deterministic. Price movements on the financial markets are fundamentally unpredictable, so causality cannot play a role in that environment.
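
For reference, the one-dimensional diffusion equation that Einstein used can be written (in LaTeX notation) as follows, where f(x,t) is the density of particles at position x and time t, and D is the diffusion coefficient:

    \frac{\partial f(x,t)}{\partial t} = D \, \frac{\partial^2 f(x,t)}{\partial x^2}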

How do the five assumptions simplify the five processes of portfolio management?
Only five rational processes determine the result of investment management. The mathematical modeling of these processes is relatively simple. Only weighting is somewhat complex, as the optimization spans as many dimensions as the portfolio size. The five assumptions further simplify the mathematics of successful investing:

The Efficient Market Hypothesis (EMH)
Assumption (1) is called the Efficient Market Hypothesis (EMH). It implies that assumptions (2)-(5) add no new information when statistical techniques are applied to the historical price data. The only output such assumptions can give is the quantification of so-called nuisance parameters, which characterize these assumptions and are obtained by fitting the historical data to them. A nuisance parameter is any fitted parameter not of immediate interest. Adding nuisance parameters may increase the occurrence of overfitting and of suboptimal behavior.

Trading in Perfect Competition
Assumption (2) implies that each individual trade does not influence the share price. In turn, this implies that each individual trade involves an investment amount that is very small, say at most 1%, relative to the daily dollar volume of the traded asset.
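
A minimal sketch of such a screen, assuming daily close prices and volumes and a 1% threshold; the 20-day averaging window and the function name are illustrative assumptions:

    # Perfect-competition screen: accept an asset only if the planned trade
    # stays below a small fraction (here 1%) of its average daily dollar volume.
    def passes_perfect_competition(trade_amount, close_prices, volumes,
                                   max_fraction=0.01, lookback_days=20):
        dollar_volumes = [p * v for p, v in zip(close_prices[-lookback_days:],
                                                volumes[-lookback_days:])]
        avg_dollar_volume = sum(dollar_volumes) / len(dollar_volumes)
        return trade_amount <= max_fraction * avg_dollar_volume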

The five processes of portfolio management with four parameters of immediate interest
Portfolio management uses the processes of screening, ranking, timing, weighting, and validation. The parameters of the first four processes are of immediate interest. With the EMH governing the historical exchange data, these four processes in principle need only the exchange data. In the first two processes, screening and ranking, the quant configures the parameters governing these processes. He may use assumption (3) to introduce a time-invariant ranking process; this third assumption reflects ergodic behavior of the statistics. In the processes of weighting and timing, he calculates the weighting and timing parameters in a numerical optimization with his investment objective as the objective function.

Weighting according to Markowitz and risk factors according to Fama-French
The mathematical formulation of weighting was first introduced by Markowitz in 1952. He proposed to minimize the variance of the weighted portfolio results and assumed a Normal probability distribution function for these results; standard regression then gives the weighting factors. We instead minimize the Risk/Reward ratio, with the maximum drawdown acting as Risk and the sum of weighted annualized results as Reward. Assumptions (4) and (5) are not needed in that approach. By minimizing the Risk/Reward ratio, you calculate optimal portfolios, weighted such that your investment objective is achieved with minimized risk and maximized reward. The extension from one risk factor to a limited number of risk factors was introduced by Fama and French in 1993. This approach adds nuisance parameters but no new information to the optimization of the weighting factors.
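
A minimal sketch, under our own assumptions (weekly return data, long-only weights summing to one, and scipy's SLSQP optimizer), of how such a Risk/Reward ratio can be computed and minimized; it illustrates the idea rather than reproducing any particular implementation:

    # Sketch: choose weights by minimizing maximum drawdown (Risk) divided by
    # the weighted annualized result (Reward); data layout and optimizer are
    # illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    def max_drawdown(equity_curve):
        """Largest peak-to-trough decline of a cumulative equity curve."""
        peaks = np.maximum.accumulate(equity_curve)
        return np.max((peaks - equity_curve) / peaks)

    def risk_reward_ratio(weights, returns, periods_per_year=52):
        """Risk = maximum drawdown; Reward = annualized weighted return."""
        portfolio_returns = returns @ weights          # returns: (periods, assets)
        equity = np.cumprod(1.0 + portfolio_returns)
        reward = np.mean(portfolio_returns) * periods_per_year
        return max_drawdown(equity) / max(reward, 1e-9)

    def optimal_weights(returns):
        """Long-only weights that sum to one and minimize Risk/Reward."""
        n = returns.shape[1]
        start = np.full(n, 1.0 / n)
        constraints = {"type": "eq", "fun": lambda w: np.sum(w) - 1.0}
        bounds = [(0.0, 1.0)] * n
        result = minimize(risk_reward_ratio, start, args=(returns,),
                          bounds=bounds, constraints=constraints, method="SLSQP")
        return result.x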

Machine learning (ML), natural language processing (NLP), and artificial intelligence (AI)
ML, NLP, and AI are techniques that parametrize assumption (5). As long as price fluctuations remain fundamentally unpredictable, in line with assumption (1), employing these techniques results in fitting nuisance parameters that reveal no new information. In Einstein’s formulation of Brownian motion, the connection between past, present, and future of each individual fluctuation was governed by the diffusion equation, with the diffusion coefficient as the fundamental parameter. Equivalent parameters for price movements are nuisance parameters, because for portfolio management only the parameters of screening, ranking, weighting, and timing are of immediate interest.

Does fundamental analysis give you a competitive edge?
Financial experts often give Discounted Cash Flow (DCF) projections for individual stocks or assets based on whatever information, insight, or breaking news they have. These future projections are called analyst estimates. Such projections, or changes thereof, can be imposed on the ranking process as additional conditions. These additional conditions influence the future dynamics of the markets. Hence, the investor who knows and understands these consequences first appears to be at an advantage.
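
As an illustration of what such a DCF projection amounts to, here is a minimal sketch with made-up cash flows, discount rate, and terminal growth rate; analysts' actual models are far more detailed:

    # Minimal DCF sketch: present value of projected cash flows plus a
    # discounted terminal value; all numbers below are made up.
    def discounted_cash_flow(cash_flows, discount_rate, terminal_growth=0.02):
        pv = sum(cf / (1.0 + discount_rate) ** t
                 for t, cf in enumerate(cash_flows, start=1))
        terminal = cash_flows[-1] * (1.0 + terminal_growth) / (discount_rate - terminal_growth)
        return pv + terminal / (1.0 + discount_rate) ** len(cash_flows)

    # Example: five years of projected free cash flow, discounted at 8%.
    print(discounted_cash_flow([100, 110, 120, 130, 140], 0.08))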

Does technical analysis give you a competitive edge?
Besides DCF projections, some patterns of technical indicators are believed to signal profitable trades. Such trading systems may require a time resolution of less than a quarter of a second, and that kind of playing field is not level for every participant. In addition, the trend-following patterns of technical analysis have yet to be proven reliable on objective scientific grounds.

Does algorithmic trading give you a competitive edge?
Algorithmic trading digitizes the search for the best deal for the asset transfer at the time of the trade. The time resolution may get as small as a microsecond and becomes an important part of the trade. That makes this playing field not level for every participant. In practice, this implies that the cost of rebalancing a calculated and optimized portfolio may vary per trading algorithm. The outcome of such trading algorithms may differ at different times for the same weighted portfolios. Therefore, the results of such algorithms are difficult to back test, and hence to validate. We prefer to use Market-On-Close orders (see below), which have a predictable cost structure in our field of Low-Frequency trading. We recognize that in the field of HF trading, algorithmic trading may give a competitive edge on the cost of the trades.

Which market orders allow you to validate your game plan?
Whether such additional information puts the informed investor at an advantage in the long term can be revealed by systematic back testing, provided the trades can be validated in hindsight. In principle, validation is possible with Market-On-Close (MOC) orders, and the screening for a minimum liquidity can be set so that trading conditions approach those of Perfect Competition. Such orders appear to warrant a level playing field. As long as the additional information is imposed on the estimation techniques after it has become common knowledge, in our view it will not make much of a difference: in that case, the additional information is already reflected in the historical exchange data used by the estimation techniques. As long as the additional information has not spread universally, investors who know about it may be at an advantage. The line between foreknowledge and universal knowledge is the domain of compliance officers.

Are the weighted averages of the past the expectation values of the future?
As in the field of insurance, weighted averages of the past are the expectation values of the future. To calculate these expectation values in the financial markets, no distribution functions are required, nor is any set of rules governing the dynamics of the financial markets. Only the clean historical exchange data are needed to provide the best fit between these data and the optimally weighted and timed portfolios. These optimal portfolios are tuned to one’s own investment objective in a computational search that balances risks and rewards. As a first approximation, one may expand the weighted average as a linear superposition of the weighted financial results of each asset, assuming ergodicity. Using one’s investment objective as the objective function, the estimation then searches for the weighting factors of the resulting optimal portfolios at any given time. The length of the back test (validation) can be taken as a measure of confidence.

Screening is often done for a minimum trading liquidity, for a reasonable growth potential, and for the exclusion of penny stocks. For ranking, one may sort the stocks or assets in order of decreasing likelihood to increase in price, using a given set of correlation times. Such a ranking process essentially relies on mean reversion. The spread around the averages is usually taken as the measure of risk, for which one can use the maximum drawdown.
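
A minimal sketch of the linear superposition described above, assuming ergodicity and a matrix of historical per-period returns; the function names and the use of the standard deviation as a simple spread proxy are our own illustrative choices (the stricter risk measure used in this article is the maximum drawdown):

    # Sketch: under ergodicity, the expected portfolio result is approximated by
    # a linear superposition of each asset's historical average result.
    import numpy as np

    def expected_portfolio_return(historical_returns, weights):
        """historical_returns: shape (periods, assets); weights: shape (assets,)."""
        asset_averages = historical_returns.mean(axis=0)   # time average per asset
        return float(asset_averages @ weights)             # linear superposition

    def spread_as_risk(historical_returns, weights):
        """Spread of the weighted results (standard deviation) as a risk proxy."""
        portfolio_returns = historical_returns @ weights
        return float(portfolio_returns.std())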

So what does it take to model and digitize the activities of fund management?
You need a web designer, a software programmer, and a system manager. The web designer develops the web layout of the user interface of the program, as all modern applications should interact through a browser. The software programmer writes the code of the application (the five processes) and manages the cross-platform events: the so-called quant is the one who understands the five processes of a fund manager, knows how to model these processes, and puts the models into software code, while usually another software programmer writes the event-driven cross-platform code. The system manager takes care of the hosting and of the setup and maintenance of the various server environments and the Content Management System. The use of Market-On-Close orders and the condition of Perfect Competition allow for reliable validations in foresight and hindsight.

Don’t banks have a public responsibility not to juggle the financial system into instabilities?
When banks have their quants play with non-linear models and large positions, it is known from a mathematical point of view that the optimization routines become prone to instabilities and local optima. Hence, when big players start playing with non-linear modelling, the outcome of the game may be an unstable equilibrium.
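
The point about local optima can be illustrated with a toy non-linear objective (our own example, not any bank's model): the same standard optimizer converges to different answers depending on where it starts.

    # Illustration of local optima in non-linear optimization: identical calls
    # from different starting points end up in different minima.
    import numpy as np
    from scipy.optimize import minimize

    def nonlinear_objective(x):
        return np.sin(3.0 * x[0]) + 0.1 * x[0] ** 2   # several local minima

    for start in (-3.0, 0.0, 3.0):
        result = minimize(nonlinear_objective, [start])
        print(f"start {start:+.1f} -> x = {result.x[0]:+.3f}, f = {result.fun:.3f}")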

Each private investor has his own set of choices and circumstances resulting in his own investment fund.
The five rational processes of fund management allow for the existence and computation of so-called optimal portfolios, each one different for each given investment amount, portfolio size, time, and set of correlation times. These portfolios are optimally tuned to investment objectives of balancing risks and rewards and to one’s own screening and ranking conditions, given the quality of the data provider and the level of the broker fees. Practically every individual set of choices and circumstances results in its own unique set of optimal portfolios that can be validated over a preset number of years. It is a transparent process with nothing hidden behind proprietary walls.

Our findings after extensive back testing of a variety of game plans can be summarized as follows:

  • The more you want to invest, the fewer stocks are available, and the more difficult it is to increase your rewards.
  • The less you want to invest, the faster your broker costs will eat up your rewards.
  • This holds also for more frequent trading and larger portfolio sizes.
  • However, larger portfolio sizes offer you a chance to better spread and hedge your risks.
  • For smaller and mid-sized investments, there often exist game plans whose expected risks and rewards fit better than the programs offered by professional parties, which let you participate in larger funds that accommodate some kind of averaged risk-and-reward profile.
  • In this field of opposing forces, you have the choice of ending up in a straitjacket of a common denominator or quantifying your own choices to let them optimally bloom.

What is the available software?
We know of four software packages on the market that allow one to set up one’s own game plan. The first three, SmartQuant, QuantShare, and DLPAL, are based on technical indicators and focus on intraday or High-Frequency (HF) trading of individual assets. The last one, DigiFundManager, is based on selecting and trading optimal portfolios, uses only the clean historical Market-On-Close exchange data for its computations, and works with preset holding periods of multiples of weeks (Low-Frequency trading). Each optimal portfolio can be uniquely tuned to one’s personal choices, one’s own circumstances, and one’s investment objectives in terms of balancing risks and rewards. The complete set of optimal portfolios is validated over a preset validation period. These four packages appear to us to be transparent in their concepts and modelling. We have not seen evidence that High-Frequency trading should outperform Low-Frequency trading. That is understandable from the point of view that trend following appears to be less effective than mean reversion, and that the latter has little to do with the frequency of trading.

Jan G. Dil and Nico C. J. A. van Hijningen,
February 26, 2019.

Copyright © 2019 EnterErgodics. All Rights Reserved.