Slope of Hope Blog Posts

Slope initially began as a blog, so this is where most of the website’s content resides. Here we have tens of thousands of posts dating back over a decade, listed in reverse chronological order.

Nathaniel’s Shorting Teenage Vampires (by Nathaniel Goodwin)

Monday was one of those days that make me crazy. Am I the only one who thinks vampires suck? Don't get me wrong, I love creepy stuff; zombies are cool, werewolves are badass, and Frankenstein rules. However, nothing pisses me off more than a bunch of young, good-looking vampires scoring with hot chicks and living forever. WTF? Screw them.

If there were a 3X bear ETF for vampires, I would have been all in a month ago. I've been shorting HOTT (which sells dumb vampire-licensed merchandise) since October. If you want creepy, look at this chart.

[Chart: HOTT]
Last week I also started to short TWX, mainly because I saw a good risk-reward setup, but also because they own HBO, and HBO runs that show "True Blood". In at $32.27, stop at $32.84. Vampires suck ass; I'm going to try to short them back to the grave.

[Chart: TWX]

Neural Networks – Part One (by nummy)

In light of Tim’s recent post on old-school video games, the advancement in computing continues to amaze me and will probably keep amazing at the same rate for many decades. Or not … because although Moore’s “Law” has been a good enough approximation so far, it has to hit a ceiling eventually; no system stays in perpetual motion forever, because eventually you succumb to the laws of entropy. Anyway, I’d like to do a series of posts on neural networks. In this first post I’ll cover a bit of history and how we can apply neural networks to a financial model that can efficiently estimate market movements.

Neural networks emerged in the late 1800s and early 1900s as a concept for explaining the functional behavior of the human mind. Around 1950, Friedrich Hayek introduced the idea of spontaneous order, suggesting that the brain’s behavior arises from decentralized networks of simple building blocks we can call neurons. In the 1980s, the implementation of neural networks in computation experienced a boom with the rediscovery of a computable backpropagation algorithm.

[Diagram: a biological neuron]

The mind can essentially be thought of as a network that processes multiple input variables in parallel, and a neural network modeled in a computer does the same. Imagine a model of the market that can take into account multiple input variables and learn from itself. One of the strengths of NNs is their ability to find patterns and irregularities, and to learn from the interdependence and correlation of all the inputs. By analogy with a biological neuron, we can create an artificial neuron with a mathematical model.

[Diagram: an artificial neuron]

This is the basic building block of a neural network: the single neuron. As with a biological neuron, where stimuli arrive through connections from other neurons’ axon terminals, the model has n inputs. They carry weights w_1 through w_n and transfer signals (or stimuli) u_1 through u_n into the neuron. The weights can be thought of as the “importance” assigned to each input. The neuron sums the weighted inputs and passes the result through an activation function (in this case a step function), which triggers an output z. In other words, z = f(w_1 u_1 + w_2 u_2 + … + w_n u_n), where f is the activation function.
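To make that concrete, here is a minimal sketch of a single neuron in Python (my own experiments are in MATLAB, but a NumPy version keeps the example self-contained); the weights, inputs, and threshold are made-up illustrative values:

    import numpy as np

    # A single artificial neuron: weighted sum of n inputs, then a step
    # activation that fires (outputs 1) only at or above a threshold.
    def neuron(u, w, threshold=0.0):
        s = np.dot(w, u)                       # sum of weighted inputs
        return 1.0 if s >= threshold else 0.0  # step function -> output z

    u = np.array([0.5, -0.2, 0.9])  # stimuli u_1..u_3 (made-up values)
    w = np.array([0.8, 0.1, 0.4])   # "importance" weights w_1..w_3
    z = neuron(u, w)                # z = 1.0 here, since the sum is 0.74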

Neurons can be organized into various network structures, but for our purposes of financial modeling I will only consider feed-forward networks and recurrent networks. The feed-forward name just describes the flow of information: it is fed forward from the input(s) to the output(s). Recurrent networks, by contrast, allow feedback loops, so an output can be fed back in as an input. A rough sketch of the feed-forward case follows below.
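Here is one pass through a tiny feed-forward network in Python: three inputs into a hidden layer of four neurons, then a single output. The weight matrices are random placeholders, not trained values, so treat this purely as an illustration of the forward flow:

    import numpy as np

    def step(x):
        return (x >= 0).astype(float)  # step activation, elementwise

    rng = np.random.default_rng(0)
    W_hidden = rng.normal(size=(4, 3))  # 3 inputs -> 4 hidden neurons
    W_out = rng.normal(size=(1, 4))     # 4 hidden neurons -> 1 output

    u = np.array([0.5, -0.2, 0.9])  # input stimuli
    hidden = step(W_hidden @ u)     # information flows forward...
    z = step(W_out @ hidden)        # ...from the inputs to the output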

[Diagram: a simple neural network]

Let’s consider a simple NN model where we would like the output to be the next day’s SPX close. We consider the following three inputs to the model:

-Previous day’s EUR/USD exchange rate

-Previous day’s LIBOR rate

-Previous day’s SPX close

The hidden layer allows the network to learn not only how the inputs affect the output, but how each input affects the other inputs. This model would be recurrent, since the previous SPX data is fed back in as an input. Using MATLAB, you can set up models like this and run a NN training session where it analyzes past data and tests its forecast performance. Suppose we have 1000 data points for each input; that means 1000 daily closes of the EUR/USD, LIBOR, and SPX. We can set aside the first 70% of the data points for the network to “train” itself and compute all the weights (or importances) of each input. The remaining 30% of the data points can be used to backtest predictions against actual outcomes. The network can analyze its error rate and re-train itself if the computed weights are not optimal.
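Something along these lines, as a sketch: scikit-learn’s MLPRegressor stands in for the MATLAB toolbox, and the three input series are synthetic random-walk placeholders, since the real data obviously can’t be reproduced inline. (In practice you would also normalize the inputs before training; that is left out to keep the sketch short.)

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n = 1000

    # Synthetic random-walk stand-ins for the three real input series
    eurusd = 1.40 + np.cumsum(rng.normal(scale=0.005, size=n))
    libor  = 0.25 + np.cumsum(rng.normal(scale=0.002, size=n))
    spx    = 1000 + np.cumsum(rng.normal(scale=10.0,  size=n))

    # Previous day's values predict the next day's SPX close
    X = np.column_stack([eurusd[:-1], libor[:-1], spx[:-1]])
    y = spx[1:]

    split = int(0.7 * len(X))               # first 70% to train on
    model = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000)
    model.fit(X[:split], y[:split])         # "training" computes the weights
    errors = y[split:] - model.predict(X[split:])  # backtest on the last 30%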

[Figure: NN training results: forecast (upper left), 2-day forecast (upper right), forecast error (lower left), full dataset (lower right)]

Here is a model I have been using; it takes about 2-3 minutes to compute on my iMac (Intel Core 2 @ 2.4GHz, 2GB RAM). The model is a simple one, with past SPX data as the only input. The dataset used is on the lower-right chart: SPX daily close data since the 1970s (maybe 1979? I forget exactly what year the data starts). The first 80% of the data was used for the NN training, and the remaining 20% was used to forecast (upper-left chart). The forecasting error is shown on the lower-left chart, and the 2-day forecast made by the NN on 11/13/2009 is on the upper-right chart. The model was saying we may get a slight drop the next day, but after that the trend was to turn upward soon. I chose to plot only two days of output because the further out in time you go, the exponentially worse your error becomes. Ideally, you would re-train the network after each day so it stays up to date with the latest data.
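For what it’s worth, the feedback idea can be sketched in a few lines. Continuing the three-input Python example above, a hypothetical 2-day forecast would feed each prediction back in as the next day’s SPX input (EUR/USD and LIBOR are simply held at their last known values here, which is a simplifying assumption):

    last = X[-1].copy()
    forecasts = []
    for day in range(2):
        pred = model.predict(last.reshape(1, -1))[0]
        forecasts.append(pred)  # day-ahead forecast
        last[2] = pred          # predicted SPX close becomes tomorrow's input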

You can imagine a much more complex model with multiple outputs (SPX, Dow, Nasdaq, FTSE) and many more inputs (exchange rates, earnings, emerging-market indices, volume, etc.). Other researchers have even used common technical indicators as inputs to the NN model (MACD, RSI, moving averages, etc.). Adding inputs and hidden layers can greatly increase the forecasting accuracy of the model, but the computation required grows exponentially, and you can start to see why the “little guy” is screwed from the start. The only way to profit on 95%+ of trading days with this kind of computational edge is to be an institution with large amounts of capital, a sick artificial NN model running on huge server farms, and direct fiber-optic hookups into the NYSE for ultra-low latency. I am quite sure I don’t need to name names on this one.

In a David vs. Goliath attempt, I want to create a NN model with help from fellow Slopers. Maybe we can have some suggestions and discussion on which inputs ultimately affect the output the most. I don’t want to model something that would take days to compute on my iMac, so maybe you can list the top 5 things you think affect SPX daily performance the most. Excluding the LDI (we love you Lester), here is my guess as to which inputs could be most influential on the outcome of SPX each day:

-LIBOR rate

-U.S. 20+ Year Treasury Bond Index (TLT)

-TED Spread

-Price of Gold

-Price of Oil

Goldfinger

The rampant optimism surrounding gold is startling. I, personally, am terrified of the stuff, and I only trade it these days for quickie day trades (like my DZZ purchase early today, closed for a spiffy little profit later). But those are "Scoot 'n' Shoot" trades. I, for one, am not about to try to figure out when gold tops out.

All the same, the ink being spilled over how gold is so wonderful is a sight to behold:

[Image: gold news headlines]

The fever around the metal is having obvious effects on GLD:

[Chart: GLD]

And, most interesting of all to me, it is garnering most of the mindshare too (with the exception of item #1; by the way, isn't that supposed to be spelled Bulltard?).

[Image: most-discussed market topics]