Wednesday, May 27, 2009

If I Were a Computer

If there were a several-hundred-billion-dollar capital fund and you were charged with investing it, you would hire the smartest people you could get your hands on, provide them with a frankly insane budget, and give them huge incentives to perform. Even with a lot of bright people who can easily employ fundamental analysis and technical analysis to make a lot of money, the tendency will still be to try to get ahead of the game. After all, who gets paid to invest several hundred billion dollars with the same primitive tools used by any multi-billion-dollar hedge fund manager? If you were paid to be one of these people, three tools would be made available to you with no expense spared: the best financial market data that can be purchased, every research report written, and a really big computer along with some programmers and engineers to help you run it.

The details of this post are rather daunting. You don't need to understand them down to the gory details in order to see the point of this exercise. Focus on what is implied simply by the fact that the mathematics exists, rather than on the actual content of the formulas mentioned.

Now for some plot spoilers:

What you're looking at is called a control diagram. Whenever there's a really big system with lots of feedback responses that need to be modeled, engineers will draw up a control diagram and use it to write the equations of the system. Looking at that big column of blocks with the arrows going in a circle, does something seem familiar? (This illustration is from Covert Newspaper Interception.)

Here's a brief rundown of the diagram:
  • Emerging financial and economic data (basically any information, including insider information, that will affect the results of fundamental models) flows in at the far left.
  • This data gets processed by fundamental analysts, who are the vector for that data to start affecting the supply and demand for a stock.
  • This result flows into the summing junction, which is exactly what it sounds like, just a way of indicating that the arrow coming out of the right side is the sum of everything going in.
  • The summed signal goes back into all of the technical analysis branches, which produce new buy and sell signals.
  • The result that pops out at the right is dependent on both the recycled data that's already present in the system and the new data that is introduced by fundamental traders.
Now, skipping ahead to the ending and the epilogue, ask yourself why these diagrams are called control diagrams. If the hair isn't standing up on the back of your neck, it should be. The very act of writing the diagram this way shows that it's theoretically possible to write software that can, to a large degree, not only predict but control the demand for a stock by adding an external signal that either causes or dampens feedback cycles.
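The loop in the diagram can be sketched as a toy discrete-time simulation. Everything here is an illustrative assumption (the single momentum indicator, the constant 0.3, the ten-tick horizon), not the model the diagram actually encodes; the point is only that demand at each tick is the sum of a fundamental input, recycled technical feedback, and an optional external control signal.

```python
# Toy sketch of the control diagram as a discrete-time feedback loop.
# All names and numbers are illustrative assumptions, not the real model.

def step(price_history, fundamental_signal, k_momentum=0.3, control=0.0):
    """One tick: demand = fundamental input + technical feedback + control."""
    momentum = price_history[-1] - price_history[-2]  # a simple technical signal
    demand = fundamental_signal + k_momentum * momentum + control
    return price_history[-1] + demand

prices = [100.0, 101.0]  # an initial move that the feedback can feed on
for _ in range(10):
    prices.append(step(prices, fundamental_signal=0.0))
# With k_momentum < 1 and no new fundamentals, the feedback decays on its own;
# a nonzero `control` input could sustain or dampen the cycle instead.
```

Feeding a positive `control` each tick would keep the cycle alive; a negative one would kill it faster, which is exactly the "causes or dampens feedback cycles" lever described above.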

It gets better. The task of figuring out how influential technicals and fundamentals are is readily handled by the use of empirical constants k1, k2, k3...kn: multiply each indicator by its constant, or even by a simple empirical formula, all while remaining well within the reach of the computational power of a simple desktop computer.

How is this possible? Well, let's imagine that a set of n traders, each with capital mi, is watching a particular indicator of a particular stock, and that each individual trader is influenced by that indicator in some arbitrary amount. The weight for each individual trader is just some constant factor, like 0.10 for instance, or maybe 0.40 if a trader really likes a certain indicator. The sum over all of these traders, weighted by their individual capital and sensitivity to the indicator, is just a constant times that indicator. One simple number multiplied by one other number can, to a very large degree, represent the aggregate trading influence of the movements of a particular indicator.

In a nutshell, that last paragraph says that you can, to a sufficient degree of accuracy, simplify the stock market to be, instead of n traders with n wallets and n propensities to trade because of a particular indicator, just one big trader with a big wallet and a wallet-weighted average propensity (wallet-weighted average since the amount of money a trader has influences how much their propensity gets expressed in the aggregate).
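That aggregation argument can be checked in a few lines. The capitals and propensities below are made-up numbers; the point is that summing every trader's weighted response is algebraically identical to multiplying the indicator by one precomputed constant k.

```python
# Collapsing n traders into one aggregate constant (illustrative numbers:
# m_i is each trader's capital, w_i their propensity to act on the indicator).

capitals = [1e6, 5e5, 2e6]          # m_i
propensities = [0.10, 0.40, 0.25]   # w_i

indicator = 1.5  # current reading of some technical indicator

# Summing every trader's individually weighted response...
total_response = sum(m * w * indicator for m, w in zip(capitals, propensities))

# ...equals one constant k times the indicator: the "one big trader".
k = sum(m * w for m, w in zip(capitals, propensities))
assert abs(total_response - k * indicator) < 1e-6
```

Note that k bakes in the wallet-weighting automatically: a trader with twice the capital contributes twice as much to the constant.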

If you do this for every single indicator imaginable, you can represent the entire feedback portion of the stock market as one gigantic linear function. Solving for these k constants isn't easy just yet. And we still have to decide how to empirically represent the fundamentals.
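In other words, the whole feedback side reduces to a dot product: each indicator times its fitted k constant, summed. The indicator names and values below are placeholders, assumed purely for illustration.

```python
# The feedback portion of the market as one linear function of the indicators.
# Indicator names and all k values here are made-up placeholders.

k = {"momentum": 0.8, "rsi": -0.3, "macd": 0.5}           # fitted constants
indicators = {"momentum": 1.2, "rsi": 0.4, "macd": -0.1}  # current readings

# Aggregate technical demand is just the weighted sum of the indicators.
feedback_demand = sum(k[name] * indicators[name] for name in k)
```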

To start, let's assume that, by and large, you can hire a group of true-to-their-name analysts who have access to enough information that their assessment of the value and future earnings of a stock is a reasonably accurate depiction of that of all traders who are following the stock. They don't have to be supremely accurate, since the bulk of traders trade multiple stocks and don't have time to cover every one of them perfectly themselves. The idea is that you want some indication with sensitivity to changes in fundamentals that can be tweaked by a constant or simple function. Hiring traders, or polling traders to get an idea of what other traders are doing, is a perfectly valid course of action. Again, you're making a reasonable assumption that n traders with n models and n data sets can be approximated as one average trader who uses a plausibly average data set and a plausibly average financial model. The accuracy of this approximation is again not that important. That your panel of analysts behaves for the most part just like real traders is all that matters.

Now, we could be heroes and plug and chug these constants all day until the model starts working, or we could just throw the thing into what's called an artificial neural network. That's just a neat type of software that is very good at signal analysis and at recognizing patterns in data. It would simply tweak the constants until the model, at any given time in the past, predicted any later time with the highest accuracy. The result won't be perfect, but what eventually gets spat out the other end is the set of k constants that brings the model most in line with the data. Not only that, using statistical analysis it would be possible to estimate the accuracy of future predictions, so the software could tell you the best times to pay attention to the result.
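The neural network above is just one way to tune the constants; as a minimal stand-in for the same idea, here is a one-constant least-squares fit against synthetic history (all data below is made up, and the single-k case is a deliberate simplification of the full k1...kn problem).

```python
# Stand-in for the constant-fitting step: a closed-form least-squares fit
# of one k against synthetic history. Real tuning would cover many constants
# and could use a neural network, as the post suggests.

history_x = [1.0, 2.0, 3.0, 4.0]      # past readings of one indicator
history_y = [0.52, 0.98, 1.49, 2.03]  # next-period demand actually observed

# Minimizing the squared error of y ~ k * x gives k = sum(x*y) / sum(x*x).
k = (sum(x * y for x, y in zip(history_x, history_y))
     / sum(x * x for x in history_x))
# k comes out near 0.5, the constant implicit in this synthetic data.
```

The residuals of such a fit are also what would drive the accuracy estimate: the tighter the historical fit, the more the model's next prediction deserves attention.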

And it gets better still. By controlling feedback, a sufficiently large fund could create feedback cycles stable enough to absorb the fund's own trades, and it has absolutely every reason in the world to do so. It's conceivable that maximizing the return on investment for such a fund might even involve calculating which feedback cycles to generate at which points in order to maximize how much of the fund's own trading activity the market's volatility could accommodate. Given the size of some overgrown funds, it couldn't be more obvious that the incentive is there.

This post has set the stage for the dark side of the stock market. Get ready to feel more skeptical of the market as a whole than you're used to. At the same time, rest assured that you can be on a level playing field as long as you let the underlying implications work for you. The posts following this one will be almost entirely focused on this dark side and how to safely navigate it. Consider this the halfway point for this blog.
