STOCHASTIC ANALYTICS: increasing confidence in business decisions
With the increasing complexity of the energy supply chain and markets, it is becoming imperative for businesses to make decisions faster and with more confidence, in the face of increasing uncertainty. In this article, Tomas Simovic and Rashed Haq discuss how today’s technology and advanced models enable firms not merely to improve decision-making, but to reengineer it entirely. They also explain how, by solving the right problem, using the right tools, approximating business problems sensibly and leveraging visualization, companies can deliver valuable insights to their business users.
Whether in financial markets or commodity trading, decision makers are faced with decisions involving a large number of variables and data points, many of them uncertain. These decisions are often about the most efficient use of assets, such as using financial products as collateral, managing a large oil or gas supply chain, balancing renewable portfolios, scheduling or routing fleets, managing hydro generation or gas storage, etc. The uncertainty can come from market prices and spreads or in volumes such as customer demand, weather, technical disruptions, etc. As margins decline and the markets grow more integrated, fast-paced and increasingly competitive, decisions must be made more quickly and with greater confidence, even in the face of increasing uncertainty. Relying on just experience or basic analytics may work some of the time. However, there are times when a robust decision-making framework can have a greater positive business impact.
Skepticism about advanced analytics lingers among business users, due in large part to past difficulties. A decade ago, most successful advanced decision aid tools were built as monolithic bespoke systems, involving teams of PhDs, partnerships with the academic world, big budgets and long project lead times. Oftentimes, the real business requirements were not well understood by the quantitative experts. And even if business requirements were understood, the process usually took so long that by the time the decision aid tool was finished, the business had moved on. As such, only large companies with big research budgets and patient management could afford to go down this road.
The situation has improved in recent years. Computing power has taken a quantum leap while costs have fallen exponentially, driven by cloud computing and big data technologies. Moreover, practical advances in numerical optimization have made it possible to harness this computing power to solve real-world problems. This know-how has been packaged into an increasing number of highly usable tools, some licensed and others open source. Software vendors have become adept at selecting the best PhDs and partnering with leading researchers to provide extremely powerful numerical optimization tools at a fraction of the cost of custom development. Therefore, instead of requiring large IT projects, decision aid tools can be assembled from a variety of building blocks: a calculation engine from one vendor, a modeling language from another source and a GUI from a third party. This leaves more time to gather business requirements, roll out prototypes, interact with business users and manage project costs.
MATHEMATICAL OPTIMIZATION: CONCEPTS
It is important to situate decision-making under uncertainty in the analytics space. Advanced analytics are often divided into three categories: descriptive, predictive and prescriptive. Descriptive analytics answers the question: “What has happened in the past?” Predictive analytics answers the question: “What is likely to happen in the future?” or alternatively “What is the set of possible futures?” Finally, prescriptive analytics provides business users with suggestions on “What should we do?” or “What is the set of possible decisions and their implications?” Decision-making under uncertainty includes all three categories: descriptive analysis informs predictive methods, which are used to forecast the possible values of uncertain data points. Prescriptive methods compute the right decision to make given these forecasts and any constraints, while maximizing or minimizing some objective such as profit or cost.
PREDICTIVE METHODS IN COMMODITIES
The most likely uncertain data points a business decision maker in commodities will face are prices, customer demand and weather (e.g., hydro inflows, renewable production). Prices are notoriously hard to predict. However, this is not a new problem for most commodity trading companies: it is very likely that forward curves of some quality are computed to value portfolios and calculate risk metrics. Risk departments are also quite familiar with Monte Carlo simulation methods that allow the computation of a large number of possible price paths. Predicting demand or weather might be a newer problem, but if enough historical data is available, forecasts can be obtained using statistical tools such as principal component analysis.
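The Monte Carlo price simulation mentioned above can be sketched in a few lines. The example below simulates geometric Brownian motion paths, a common baseline model for price uncertainty; all parameters (starting price, drift, volatility) are illustrative and not drawn from the article.

```python
import numpy as np

def simulate_price_paths(s0, mu, sigma, days, n_paths, seed=0):
    """Simulate daily price paths under geometric Brownian motion.

    Illustrative sketch only: real forward-curve and spot models are
    usually richer (mean reversion, seasonality, jumps).
    """
    rng = np.random.default_rng(seed)
    dt = 1.0 / 252.0  # one trading day as a fraction of a year
    # Random daily log-returns for every path and every day
    shocks = rng.normal((mu - 0.5 * sigma**2) * dt,
                        sigma * np.sqrt(dt),
                        size=(n_paths, days))
    log_paths = np.cumsum(shocks, axis=1)
    # Prepend the (deterministic) starting point and exponentiate
    return s0 * np.exp(np.hstack([np.zeros((n_paths, 1)), log_paths]))

# 10,000 possible one-year price paths starting at 60
paths = simulate_price_paths(s0=60.0, mu=0.02, sigma=0.35, days=252, n_paths=10_000)
```

Each row of `paths` is one possible future; feeding a large set of such paths into a valuation or optimization model is exactly how risk departments turn price uncertainty into scenario data.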
Prescriptive analytics is an emerging area, garnering increasing interest from both business users and software vendors. The branch of applied mathematics that models business problems using decision variables, constraints and an objective function to maximize or minimize is called mathematical optimization. When all data is “certain” or “deterministic” and all constraints and decision variables are linear and real valued, a problem can be modeled as a “linear program.” If mutually exclusive possibilities exist in the business problem, some variables may have to be restricted to binary or integer values, and the problem becomes a “mixed integer linear program.” Finally, if some data points are uncertain or stochastic and the optimization criterion is expected profit or cost, the uncertainty can be modeled as a set of scenarios and the formulation becomes a “stochastic program.” An important point is that, in some business problems, not all decisions have to be made at once, and new information arriving over time can improve the current best decision; depending on how many decision points there are, the formulation is called a “single-stage,” “two-stage” or “multi-stage” stochastic program.
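To make the linear program concrete, here is a minimal toy example (all coefficients hypothetical) using SciPy's `linprog` as an off-the-shelf calculation engine: choose production volumes of two products to maximize margin subject to two capacity constraints.

```python
from scipy.optimize import linprog

# Illustrative deterministic linear program (all numbers hypothetical):
#   maximize  40*x + 30*y          (margin from two products)
#   subject to  x +  y <= 100      (storage capacity)
#              2x +  y <= 150      (processing hours)
#              x, y >= 0
# linprog minimizes, so the objective coefficients are negated.
result = linprog(c=[-40, -30],
                 A_ub=[[1, 1], [2, 1]],
                 b_ub=[100, 150],
                 bounds=[(0, None), (0, None)])

x, y = result.x
max_margin = -result.fun  # undo the sign flip to recover the margin
```

A two-stage stochastic program can be written as one larger problem of the same form by duplicating the second-stage variables and constraints once per scenario and weighting each scenario's contribution to the objective by its probability, which is why the computational burden grows so quickly with the number of scenarios.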
As more elements are added to a mathematical optimization model, going from a linear program to a multi-stage stochastic program, the computational burden increases very quickly with the size of the problem. Before committing to a full-fledged stochastic model, it makes sense to perform sensitivity analysis on the input data: some types of data may influence the result more than others. Another commonly used method that allows business users to deal with uncertainty while restricting the modeling and computation effort to a deterministic model is “scenario analysis”: the kinds of data to which the model result is most sensitive are modeled as multiple scenarios, the model is run once per scenario, and the results are compared.
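Scenario analysis of this kind amounts to a loop over deterministic runs. The sketch below uses a deliberately simple, hypothetical gas-storage decision whose deterministic optimum has a closed form, so the whole pattern fits in a few lines.

```python
# Scenario analysis: rerun one deterministic model per price scenario
# and compare the recommended decisions (all numbers hypothetical).
STORAGE_VOLUME = 1_000   # MWh available to withdraw
VARIABLE_COST = 22.0     # EUR/MWh withdrawal cost

def deterministic_plan(price):
    """Optimal withdrawal when the price is treated as certain:
    withdraw everything if the margin is positive, else nothing."""
    withdraw = STORAGE_VOLUME if price > VARIABLE_COST else 0
    return withdraw, withdraw * (price - VARIABLE_COST)

scenarios = {"low": 18.0, "base": 25.0, "high": 40.0}
plans = {name: deterministic_plan(p) for name, p in scenarios.items()}
# "low" recommends withdrawing nothing; "base" and "high" both
# recommend full withdrawal -- a fragile, all-or-nothing answer that
# flips completely as the price scenario crosses the cost threshold.
```

The comparison across scenarios is exactly where the fragility of deterministic solutions, discussed next, becomes visible to a business user.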
While sensitivity analysis and scenario analysis are inexpensive ways to work with uncertain data, in some cases they give the decision maker imperfect insight. First, deterministic solutions can be fragile: a small change in input data may produce a significant change in the recommended business decision. Second, “optionality,” or solution flexibility, can carry a lot of value in some business problems; deterministic solutions assign zero value to this flexibility and thus may not look like smart choices to a savvy business user. Finally, in many business problems, information arrives at different points in time, and not every decision has to be made at once and then set in stone. When in doubt, calculating metrics such as the “expected value of perfect information” helps determine whether going down the stochastic route is a worthwhile investment compared with sticking to a deterministic model.
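The expected value of perfect information (EVPI) has a simple worked form: it is the gap between what you could earn if each scenario were revealed before deciding (“wait-and-see”) and the best you can do committing to one decision up front (“here-and-now”). The payoffs and probabilities below are hypothetical.

```python
# EVPI on a toy two-decision, two-scenario problem (numbers hypothetical).
probs = {"high": 0.5, "low": 0.5}
payoff = {  # payoff[decision][scenario]
    "build_big":   {"high": 100.0, "low": -20.0},
    "build_small": {"high": 40.0,  "low": 30.0},
}

# Here-and-now: commit to one decision before the scenario is revealed,
# so pick the decision with the best probability-weighted payoff.
expected = {d: sum(probs[s] * payoff[d][s] for s in probs) for d in payoff}
here_and_now = max(expected.values())

# Wait-and-see: pick the best decision in each scenario, then average.
wait_and_see = sum(probs[s] * max(payoff[d][s] for d in payoff)
                   for s in probs)

evpi = wait_and_see - here_and_now
```

Here `here_and_now` is 40 (build big), `wait_and_see` is 65, so EVPI is 25: no forecast, however good, can be worth more than 25 in this problem. If that ceiling is small relative to the cost of building a stochastic model, the deterministic model is probably good enough.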
EFFECTIVE IMPLEMENTATION OF QUANTITATIVE DECISION TOOLS
The question now is how to actually turn all of the concepts above into a working application that provides valuable insights to business users. The many elements of success can be grouped in the following categories: solving the right problem, using the right tools, approximating business problems and leveraging visualization.
In the past it was impossible to collect, analyze and act on data centrally in a timely manner, leading to siloed organizations with decision-making fragmented across business units. Today, technology can give a more holistic view of company operations and examine possible solutions and outcomes far beyond the cognitive reach of any individual. Therefore, instead of simply automating or streamlining the current decision-making process, it makes sense to examine to what extent it can be reengineered. For example, a large European utility company once had several layers of decision-making to schedule complicated river chains. Once the central dispatching model was able to respect more of the complicated physical constraints, eliminating the need for elaborate manual schedule modifications, the organizational structure could be streamlined.
Using the right tools is the second element required for success. One important lesson to remember is that, at least with the first version of an optimization application, custom development of the actual calculation engines should be kept to a minimum. There is a wide variety of calculation engines and it is extremely unlikely that a couple of months of custom development can beat these packages even if done by extremely gifted quants.
Given the increasing availability of high-quality implementations of sophisticated algorithms, if projects are to be successful, the main emphasis of the initial stages of a decision aid application project must be business modeling. The question that should be addressed is the following: “Given the size, type, mathematical characteristics of the problem and the availability of third party calculation engines, what is the right mathematical approximation of the business problem, representing most of the important problem characteristics that can still be solved in a reasonable computation time?” Edison once said “the real measure of success is the number of experiments you can run in 24 hours.”
Rapid prototyping is therefore crucial and using the right tools, such as algebraic modeling languages, is essential for a project to succeed. Rapid prototyping is not only important to find a good mathematical formulation for a given problem, it is also essential to refine the actual problem statement. Most business users cannot be expected to sign off on a particular constraint or set of constraints expressed in mathematical equations. To elicit detailed problem properties, it is usually much better to show business users increasingly complex solutions from a prototype model and validate whether those solutions are acceptable. It may be hard for them to express exactly how they expect the model to work, but they are usually quite vocal about what constitutes an unacceptable solution. This interaction around the rapid prototyping process can also dispel the initial skepticism a business user might have toward the ability of a quantitative tool to satisfactorily address the business domain where her expertise lies. It will also prove that decision aid tools are not there to replace jobs but to enhance the decision-making skill of a business domain expert.
An important element of the success of a decision aid application is the proper visualization of results. Complicated mathematical models can generate a large set of numbers and understanding these numbers quickly and finding business insights is not a trivial task. Intuitive visualizations of these results allow a wider audience in a typical company to leverage the results of quantitative tools in their daily tasks.
CURRENT AND FUTURE APPLICATIONS
There is a wide variation in the quantitative sophistication of companies across regions and industries, but large power utilities seem to be some of the most sophisticated users of advanced mathematical optimization tools. This is due to such factors as the need for centralized dispatching of the generation fleet, a long-time focus on efficient asset utilization and experience successfully executing technically complicated projects.
The applications currently in use can be divided into strategic decision aid applications and tactical or operational applications. Strategic decision aid applications are used to support investment decisions, calculate fundamental prices or simulate policy choices. Tactical applications are used mainly in the daily operation of physical asset portfolios. For example, most sophisticated operators of hydro storage use multi-stage stochastic optimization models to manage the water levels of their hydro assets. These models inform trading and hedging decisions, the scheduling of maintenance outages and flood prevention.
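A minimal sketch of the multi-stage idea, reduced to two stages and a brute-force grid search rather than a real stochastic programming solver: today's reservoir release must be chosen before tomorrow's price is known, and a turbine limit creates value in releasing some water early. All parameters are hypothetical.

```python
# Two-stage stochastic sketch for a single reservoir (numbers hypothetical):
# choose today's release before tomorrow's price is revealed, then release
# as much of the remainder as the turbine allows.
VOLUME = 100.0                 # MWh of stored water
MAX_RELEASE = 70.0             # MWh/day turbine limit
PRICE_TODAY = 30.0
PRICE_TOMORROW = {"low": (0.5, 20.0), "high": (0.5, 50.0)}  # (prob, price)

def expected_profit(release_today):
    remaining = VOLUME - release_today
    profit_today = release_today * PRICE_TODAY
    # Second stage: once tomorrow's price is revealed, release as much
    # of the remaining water as the turbine limit permits.
    future = sum(p * min(remaining, MAX_RELEASE) * price
                 for p, price in PRICE_TOMORROW.values())
    return profit_today + future

# Enumerate today's release on an integer grid (a real model would use
# a stochastic programming or dynamic programming solver instead).
best_today = max(range(0, 71), key=expected_profit)
```

The grid search finds that releasing 30 MWh today is optimal: holding everything back would strand water behind tomorrow's 70 MWh turbine limit, which is exactly the kind of counterintuitive constraint interaction real hydro scheduling models capture at scale.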
So where are the next real applications of stochastic optimization going to come from? They are likely going to come mainly from the tactical decision aid applications as business users move from Excel-based tools and heuristic decision-making to more sophisticated quantitative tools. The applications with the biggest ROI will appear in areas with the following characteristics:
- Physical assets with a large number of complicated physical constraints: quantitative tools are more likely to find interesting counterintuitive solutions if the underlying problem is very complex
- Large systems/portfolios and centralized decision making: the larger the portfolio the business decision is affecting, the larger the savings/additional profit in absolute terms
- A certain level of sophistication in the IT landscape: high data availability and quality is a precondition to successfully deploy advanced decision aid applications
Many large oil and gas companies still have some way to go to reach the level of sophistication in tactical decision-making of the most advanced power utilities. Different parts of operations, such as production, storage, transportation, refining and trading, can be represented in a single model and optimized globally, at least for a single geographical region. Taking full advantage of decision-aid tools to streamline their operations can be a major lever to cut costs to restore profitability if oil and gas prices stay at their current low levels. Merchant traders that have acquired physical assets in recent years would also benefit from integrating their physical operations with trading activities more tightly through optimization models.
For more information on this topic, see the companion article, “ANALYTICS STRATEGY: creating a roadmap for success.”
Rashed Haq is Vice President and Lead for Analytics & Optimization for Commodities at Sapient Global Markets. Based in Houston, Rashed specializes in trading, supply logistics and risk management. He advises oil, gas and power companies on addressing their most complex challenges in business operations through innovative capabilities, processes and solutions.