Interpretation: Subscripts on decision variables are getting cheaper!
We have witnessed a 1,000-fold improvement in peak flops (floating-point operations per second) every ten years for the past three decades. For those unfamiliar with Moore's Law: Gordon Moore, fellow UC Berkeley grad and former CEO of Intel, predicted in his 1965 paper a doubling in the number of transistors on a computer chip every year, a rate he revised in 1975 to every two years. Combined with faster clock speeds, that trajectory has carried chip performance to the brink of exascale computing (that's ten to the eighteenth power, or a quintillion, flops) and billion-way concurrency!
As a college freshman in 1969, armed with a slide rule, I never imagined that this level of computing capacity would exist in my lifetime -- not in my wildest dreams. Allow me to share a personal story that illustrates the impact of high performance analytics (HPA) on decision-makers and problem-solvers. I hope it fosters a deeper appreciation of how this technological advance will change the way business leaders gain the knowledge they need to develop and execute strategies and make key decisions. HPA will surely help them to meet or exceed their corporate goals.
Balance sheet analytics in the '80s
In 1985, as a balance sheet management analyst, I developed strategies to engineer a target balance sheet over an 18-month planning horizon. A primary tool was a large-scale financial optimization system that pulled a half gigabyte of data from all of the bank's transaction systems (commercial loans; swaps, collars, and caps booked by the investment bank; Eurodollar placements and takings; treasuries; agencies; term repos; reverses; other capital markets securities; consumer certificates of deposit; jumbo and liability management CDs; financial futures; and so on). It also accepted interest rate forecasts for all key market indices, supplied by the bank's economics unit, and risk preferences based on executive management's risk appetite.
The objective function maximized net interest income (NII) plus realized capital gains/losses plus capital appreciation/depreciation. I will not go into the constraint descriptions, but they were considerable. Since the model was temporal, cash flows had to be preserved, while purchases and sales of securities were permitted only during the first six months of the eighteen-month horizon. There were also nonlinear risk constraints that were varied to generate an efficient frontier of risk-return trade-offs. Strategy choice was a function of the resulting pay-off matrix under different economic scenarios and the corporate risk appetite (the tangency of the A/L Management Committee's indifference curve with the efficient frontier).
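The temporal structure described above -- cash flows preserved in every period, trading allowed only in the first six months -- can be sketched in a few lines. This is a toy check with hypothetical numbers, not the actual system's formulation:

```python
# Minimal sketch of the temporal structure (all values hypothetical):
# cash must balance in every period, and purchases/sales are permitted
# only in the first 6 of 18 months.
HORIZON, TRADING_WINDOW = 18, 6

def check_cash_balance(opening_cash, inflows, purchases, sales):
    """Verify period-by-period cash preservation for one instrument class."""
    cash = opening_cash
    for t in range(HORIZON):
        if t >= TRADING_WINDOW:
            assert purchases[t] == 0 and sales[t] == 0, "no trading after month 6"
        cash += inflows[t] + sales[t] - purchases[t]
        assert cash >= 0, f"cash shortfall in month {t}"
    return cash

# Example: coupon inflows of 1.0 each month, one early purchase, one early sale.
inflows = [1.0] * HORIZON
purchases = [5.0] + [0.0] * (HORIZON - 1)
sales = [0.0, 0.0, 3.0] + [0.0] * (HORIZON - 3)
print(check_cash_balance(10.0, inflows, purchases, sales))  # closing cash: 26.0
```

In the real system these balance conditions were equality constraints in the LP matrix rather than a post-hoc check, one row per period per instrument class.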
The problem size, expressed in terms of the matrix generated from the modeling language and data inputs for the optimizer, was 30 thousand rows by 15 thousand columns, with a non-zero coefficient density of 0.55 percent. It took 50 minutes to generate the matrix and 10 minutes to solve it on an IBM 3033 mainframe running OS/MVS in batch mode. In those days, great care was taken to manage problem sizes that could otherwise chew up a lot of CPU cycles on expensive computing platforms and pose unacceptably long run-times.
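As a quick sanity check on those dimensions, the quoted sparsity implies roughly 2.5 million non-zero coefficients:

```python
# Arithmetic on the problem size quoted above: rows x columns x density
# gives the number of non-zero coefficients the optimizer handled.
rows, cols = 30_000, 15_000
density = 0.0055  # 0.55 percent
nonzeros = rows * cols * density
print(f"{nonzeros:,.0f} non-zero coefficients")  # about 2.5 million
```

At even 8 bytes per coefficient, plus row/column indices, the sparse matrix alone ran to tens of megabytes -- a substantial object for a mid-1980s mainframe.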
Due to the long processing times, we were compelled to make the models as simple as possible. For example, an instrument was defined by the combination of security type and maturity, rather than by the category of security with maturity as a separate, second dimension. This cut down the number of decision variables, but it also limited our ability to interrogate the model and consider maturity structure independently of the category of security. We made many other compromises in the problem formulation that made the application more challenging to work with in many respects. The spillover effects included difficulties in data management, constraint specification, infeasibility tracing, model documentation, problem modification, and verification (of both the problem specification and the optimal solution).

Despite those and many other barriers, we managed to develop some great balance sheet strategies, and the few basis points of improvement we achieved annually for a super-regional bank with $32 billion in assets more than covered our technology investment, staffing costs, and overhead by a factor of two (that's an ROI exceeding 100%). We verified the value added against both a benchmark "do nothing" strategy and a naive approach based on past performance. We always asked whether the juice was worth the squeeze -- and before continuing with my balance sheet formulation story, let me digress for a short tale about the bank's trading operation.
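The collapsed-subscript compromise is easy to see in miniature. In this sketch (holdings are hypothetical), the flat encoding bakes maturity into the instrument label, while the two-subscript encoding lets you slice along either axis:

```python
# 1980s style: one flat subscript per instrument -- fewer decision
# variables, but maturity structure is fused into the label and cannot
# be queried independently of security category.
flat = {"treasury_3m": 40.0, "treasury_2y": 25.0,
        "agency_3m": 10.0, "agency_2y": 25.0}

# Richer style: (category, maturity) as separate subscripts -- more
# variables, but the model can be interrogated along either dimension.
indexed = {("treasury", "3m"): 40.0, ("treasury", "2y"): 25.0,
           ("agency", "3m"): 10.0, ("agency", "2y"): 25.0}

# Total maturing within three months, regardless of category -- a one-line
# query with separate subscripts, a string-parsing exercise without them.
short_dated = sum(v for (cat, mat), v in indexed.items() if mat == "3m")
print(short_dated)  # 50.0
```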
Regarding the bank's trading book: I recall the chairman coming down to my office one afternoon. He shut the door and told me he wondered whether our bond trading activities were really delivering for the bank. He asked me to run a simulation in which we turned over the bond portfolio every two years (i.e., replaced 4 1/6 percent of the portfolio every month) over a five-year period, based on purchases at the historical Fed auction prices. He wanted the results on his desk the next morning. I reported to the CEO the next morning, accompanied by my manager, the bank's chief economist. The answer confirmed that our traders were consistently beating the market by a statistically significant, and financially material, margin that was well worth the costs of technology and performance-based compensation.
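The mechanics of that benchmark are simple to sketch. Here the book is modeled as 24 equal cohorts; each month the oldest cohort rolls into a new purchase at that month's auction yield, so the whole book turns over in two years (1/24, i.e. 4 1/6 percent, per month). All yields are hypothetical, and this is only the naive side of the comparison -- the traders' actual results came from the bank's own records:

```python
from collections import deque

# Naive benchmark: 24 equal cohorts; each month the oldest cohort matures
# and is replaced at that month's (hypothetical) Fed auction yield.
def naive_turnover_income(auction_yields, book_value=1_000.0):
    cohort = book_value / 24
    book = deque([auction_yields[0]] * 24, maxlen=24)  # seed the book
    income = 0.0
    for y in auction_yields:
        book.append(y)                          # oldest cohort rolls over
        income += cohort * sum(book) / 12       # one month's interest income
    return income

yields = [0.09] * 60  # five flat years at 9% as a sanity check
print(round(naive_turnover_income(yields), 2))  # 450.0 = 1000 * 9% * 5 years
```

The `deque` with `maxlen=24` drops the oldest cohort automatically on each `append`, which is exactly the monthly replacement rule the chairman specified.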
High performance analytics (HPA) in our current decade
- Additional subscripts better capture the problem & facilitate solution analysis
You may wonder why a modeler might include the legal location of the entity holding a security as a dimension in the framework. Well, it turns out there are different tax treatments for various securities in different -- yes, even neighboring -- states in the US. If you consider cross-border holdings, then geopolitical risk and foreign exchange risk come into play. Euro-denominated securities could be put on a USD-equivalent basis, but if they are still denominated in euros when a market disruption or failure occurs, the USD cash-equivalent value may change. Sure, on the tax treatment issue, you could handle it at the ETL, or data input, stage by putting all securities on a pre-tax or after-tax equivalent basis. But then you could not perform "what-if" simulations or post-optimality/parametric analysis on an optimization problem that is memory resident with billion-way concurrency; instead, you would need to reload "big data."
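A tiny sketch makes the point. All securities, rates, and exemption rules below are hypothetical; what matters is that the state subscript survives into the model, so a tax what-if is one in-memory change rather than a rerun of the ETL:

```python
# Keeping legal location as an explicit subscript: after-tax income stays
# a *function* of the tax-rate assumptions, so what-ifs need no data reload.
pre_tax_income = {("muni_bond", "NY"): 1.20, ("muni_bond", "NJ"): 1.15,
                  ("treasury",  "NY"): 1.40, ("treasury",  "NJ"): 1.40}
state_tax = {"NY": 0.085, "NJ": 0.0625}   # hypothetical state rates
STATE_EXEMPT = {"muni_bond"}              # munis exempt from state tax here

def after_tax(rates):
    """After-tax income per (security, state), keeping both subscripts."""
    return {(sec, st): inc if sec in STATE_EXEMPT else inc * (1 - rates[st])
            for (sec, st), inc in pre_tax_income.items()}

base = after_tax(state_tax)
# What-if: NJ raises its rate to 9% -- one dictionary edit, no ETL rerun.
shock = after_tax({**state_tax, "NJ": 0.09})
print(base[("treasury", "NY")], shock[("treasury", "NJ")])
```

Had the ETL pre-baked the tax treatment into a single after-tax number per security, the `shock` scenario would require reloading and restating the entire input dataset.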
A model with additional subscripts:
- can consume massive volumes of data
- is closer to the business reality
- can encompass a vast array of possibilities
- can surface whole families of solutions and their associated trade-offs
- can identify and portray the connectedness of solutions
- fosters a far deeper understanding of the solution and its sensitivity to model assumptions and uncontrollable forces
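The "families of solutions" point is the efficient-frontier idea from my balance sheet story: sweep the risk budget and record the best achievable income at each level, rather than reporting a single answer. A toy illustration with three instrument classes (all yields and risk weights hypothetical; the real system used a large-scale optimizer, but brute force suffices here):

```python
# Trace a tiny efficient frontier: best income for each risk budget,
# fully investing 100 units across three hypothetical instrument classes.
YIELDS = (0.082, 0.095, 0.110)   # low-, mid-, high-yield classes
RISK   = (0.25, 0.50, 1.00)      # risk weight per unit invested

def best_income(risk_cap, funds=100):
    """Best income with all funds invested and weighted risk <= risk_cap."""
    best = 0.0
    for a in range(funds + 1):
        for b in range(funds - a + 1):
            c = funds - a - b
            if RISK[0] * a + RISK[1] * b + RISK[2] * c <= risk_cap:
                best = max(best, YIELDS[0] * a + YIELDS[1] * b + YIELDS[2] * c)
    return best

frontier = {cap: round(best_income(cap), 3) for cap in (25, 50, 75, 100)}
print(frontier)  # {25: 8.2, 50: 9.5, 75: 10.25, 100: 11.0}
```

Each point on the frontier is a different optimal portfolio, and the gaps between points quantify the price of risk aversion -- exactly the trade-off the A/L committee's indifference curve was matched against.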