You can still register for the 38th annual International Symposium on Forecasting, being held June 17-20 in Boulder, Colorado, in the foothills of the Rockies.
Beyond the daily keynotes and over 150 presentations of new research, a separate Practitioner Track features talks on applying forecasting research to organizational challenges. There will also be an update on the M4 Forecasting Competition, which ends this month.
As always, SAS will have a large presence at the event. Please stop by the SAS exhibit table, and check out these sessions:
Sunday June 17
9:00am-12:00pm Workshop: Large-Scale Automatic Forecasting with SAS Visual Forecasting (Chip Wells)
This workshop introduces the TSMODEL procedure, released in 2017 and included in SAS Visual Forecasting and SAS Econometrics.
1:00pm-4:00pm Workshop: Implementing Large-Scale Forecasting Systems (Michele Trovero & Michal Kurcewicz)
This practice-oriented workshop presents challenges in implementing large-scale forecasting systems, and discusses how modern in-memory architecture can address those challenges.
Tuesday June 19
2:00pm-2:20pm What Management Must Know About Forecasting (Mike Gilliland)
Why are business forecasts so frequently wrong, and what can we do about it? This presentation explores fundamental issues in forecasting that may not be apparent to the managers and executives who oversee or rely on the forecasting process. It will expose common worst practices that politicize the forecasting process or add cost and complexity without improving forecasting results. This material can help educate management on the limitations of forecasting and make them aware of alternative approaches to their business problems.
Wednesday June 20
11:40am-12:00pm Build Modeling Hierarchies for Large Scale Time Series (Yue Li)
Hierarchical forecasting has proven to be an effective way to improve forecast accuracy for large time series data with hierarchical structure, thanks to its ability to pool information at different aggregation levels to reduce noise and enhance signal. The method requires a good hierarchy; in practice, however, the hierarchy is usually supplied manually by users based on a planning structure, which might not be the best one for modeling purposes. This presentation describes a data-driven framework to derive a hierarchy that not only organizes the data in an easy-to-interpret way, but also improves forecast accuracy from the modeling perspective.
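To make the "pooling information at different aggregation levels" idea concrete, here is a minimal sketch of one common hierarchical scheme (aggregate noisy store-level series to regions, forecast the smoother regional series, then disaggregate top-down by historical shares). The data, store/region names, and the placeholder mean-forecast model are all illustrative assumptions, not material from the talk:

```python
import numpy as np

# Hypothetical toy data: 52 weeks of sales for four stores in two regions.
rng = np.random.default_rng(0)
stores = {
    ("east", "store1"): rng.poisson(20, 52).astype(float),
    ("east", "store2"): rng.poisson(25, 52).astype(float),
    ("west", "store3"): rng.poisson(5, 52).astype(float),
    ("west", "store4"): rng.poisson(8, 52).astype(float),
}

def naive_forecast(series):
    """Placeholder model: forecast the next value as the historical mean."""
    return series.mean()

# Middle level: aggregate stores into regions first, then forecast the
# smoother regional series -- this is the noise-reducing pooling step.
regions = {}
for (region, _), series in stores.items():
    regions[region] = regions.get(region, 0) + series
regional_forecasts = {r: naive_forecast(s) for r, s in regions.items()}

# Top-down disaggregation: split each regional forecast back to its
# stores in proportion to their historical share of regional sales.
reconciled = {}
for (region, store), series in stores.items():
    share = series.sum() / regions[region].sum()
    reconciled[(region, store)] = regional_forecasts[region] * share

print(reconciled)
```

By construction the reconciled store forecasts sum exactly to each regional forecast, which is the coherence property a good hierarchy is meant to provide; the talk's contribution is choosing that hierarchy from the data rather than from a planning structure.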
12:00pm-12:20pm Scalable Cloud-Based Time Series Analysis and Forecasting (Thiago Quirino, Michael Leonard & Ed Blair)
Many organizations need to process large numbers of time series for analysis, decomposition, forecasting, monitoring, and data mining. This paper presents a resilient, distributed, optimized generic time series analysis scripting environment for cloud computing. The framework comes equipped with capabilities such as automatic forecast model generation, automatic variable and event selection, and automatic model selection. It also provides advanced support for time series analysis (time domain and frequency domain), time series decomposition, time series modeling, signal analysis and anomaly detection (for IoT), and temporal data mining. This paper describes the scripting language that supports cloud-based time series analysis and provides examples that use this technique.
2:00pm-2:20pm Outcome Prediction in the Practice of Law (Mark K. Osbeck, University of Michigan Law School & Mike Gilliland)
Lawyers are best known for their role as client advocates -- arguing their clients' interests in legal proceedings (and, if necessary, in the media). But an equally important role is that of advisor -- analyzing the client's various options and the likely outcome of each. Outcome prediction, therefore, is an essential lawyering skill. So how do lawyers make their predictions? And can outcome prediction be improved with data and analytics? This presentation looks at traditional methods of outcome prediction used by lawyers, and examines the prospect of using data science to make better predictions.
2:00pm-2:20pm IoT Driven Analytics Will Define the Future Electric Utility (Bradley Lawson)
Electric utilities are facing a rapidly changing landscape fueled in large part by the falling price of batteries. Lower priced batteries will allow individual homes and businesses to couple batteries with their solar panels, but the biggest effect will come from the growth of electric vehicles (EVs). In many areas of the US and the world, one new EV charger can add as much demand as a new customer. Fast chargers that recharge EVs in just a few hours can add three times that demand. Too many EVs in a localized area could overload equipment, reducing reliability and requiring costly equipment upgrades.
Analytics and the Internet of Things (IoT) will become partners in creating systems that enable the customer and utility to continually communicate their status and options. This will give customers maximum flexibility to charge their EVs without increasing the utility's costs. With a two-way flow of information, the utility will be able to take advantage of excess energy from EVs to maintain grid stability and reduce costs. Robust communication would allow utilities and customers to coordinate demand response to lower utility demand at critical times. Historically, such actions were limited to simply disconnecting a water heater or air conditioner for a few hours. Advanced analytics coupled with robust communications will allow complex actions, such as instructing a home thermostat to change settings.
This paper reviews the individual systems already in place, their integration, and the analytics needed for a system that helps keep utility costs low while giving customers maximum flexibility.