Business Forecasting book review in JBF

Book Review in Journal of Business Forecasting

The Summer 2016 issue of the Journal of Business Forecasting includes a book review of Business Forecasting: Practical Problems and Solutions. The review is by Simon Clarke, Group Director of Forecasting at The Coca-Cola Company.

You may be familiar with Clarke's many previous contributions to the forecasting literature, which include:

  • Managing the Introduction of a Structured Forecast Process: Transformational Lessons from Coca-Cola Enterprises Inc., Foresight #4 (June 2006).

and several entertaining "Joe and Simon Sez" articles co-authored with Joe Smith:

  • Who Should Own the Business Forecasting Function?, Foresight #20 (Winter 2011).
  • Forecasting Tools: Have They Upgraded the Forecasting Process?, Foresight #22 (Summer 2011).
  • Our Best Worst Forecasting Mistakes, Foresight #25 (Spring 2012).
  • Fostering Communication that Builds Trust, Foresight #28 (Winter 2013).

Attend the Foresight Practitioner Conference and Get a Free Book

Remember, if you attend the upcoming Foresight Practitioner Conference (October 5-6 in Raleigh, NC), you will receive a free copy of the new book. I will be joined by my co-editors Udo Sglavo (Sr. Director of Advanced Analytics R&D at SAS) and Len Tashman (Editor-in-Chief of Foresight) for a book signing on October 6, 7 - 8 a.m.

Last day -- Foresight Practitioner Conference $200 registration discount

Companies launch initiatives to upgrade or improve their sales & operations planning and demand planning processes all the time, but many fail to deliver the results they should. Has your forecasting operation fallen short of expectations? Do you struggle with "best practices" that seem incapable of producing accurate, useful results?

Join your professional peers at the upcoming Foresight Practitioner Conference entitled "Worst Practices in Forecasting: Today's Mistakes to Tomorrow's Breakthroughs." This 1.5-day event will take place in Raleigh, North Carolina, October 5-6, 2016. Register today to take advantage of the early bird savings offer expiring on Monday, August 15. And at the event, be sure to meet the authors and receive a free signed copy of the new book, Business Forecasting: Practical Problems and Solutions.

Invited speakers will share how they and others have uncovered and eliminated bad habits and worst practices in their organizations, for potentially dramatic improvements in forecasting performance. Some of the topics to be addressed include:

  • Use and Abuse of Judgmental Overrides
  • Improper Practices in Inventory Optimization
  • Avoiding Dangers in Sales Force Input to Forecasts
  • Pitfalls in Forecast Accuracy Measurement
  • Worst Practices in S&OP and Demand Planning
  • Worst Practices in Forecasting Software Implementation

Registration will close in just over a month, and $200 in registration savings expires on Monday, August 15. Register today to reserve your seat and save!

Visit the conference web site for registration and complete conference information: https://forecasters.org/foresight/2016-conference/

Any questions? Contact:

Stacey Hilliard, Marketing Director
Foresight: The International Journal of Applied Forecasting
staceyhilliard@forecasters.org
+1 (781) 308-3334

This conference is being produced by the editorial team at Foresight: The International Journal of Applied Forecasting. Foresight is published four times a year by the non-profit International Institute of Forecasters (IIF), an unbiased, non-commercial organization, dedicated to the generation, distribution, and use of knowledge on forecasting in a wide range of fields.

Next-generation demand management

Announcing New Book by Charlie Chase: Next-Generation Demand Management

My colleague Charlie Chase has just published his latest book, Next-Generation Demand Management. It will be available August 29, and can be pre-ordered now on amazon.com. It will also be available for purchase at the SAS Bookstore, along with Charlie's other books.

From the description:

Next-Generation Demand Management gives readers a better framework for building the foundation proven to fuel growth. Written by an international thought leader and practitioner in business forecasting, this next generation demand management framework radically improves the traditional supply chain demand forecasting and planning function. It moves beyond the typical demand-driven approach to a holistic view of the supply chain by identifying the commercial organization as a critical component of the unconstrained demand forecast. By doing so, it elevates demand generation, revenue, and profitability, along with customer service levels, and reduces inventory costs, waste, and working capital. Along the detailed road map, you gain the insight and tools for enhancing the skills and behaviors of your people, integrating horizontal processes, improving the accuracy of your forecasts with predictive analytics, and simplifying the entire process with scalable technology. Culled from the author’s extensive experience, this everyday guide covers only the theory directly applicable to situations encountered in the real world, which makes it completely relevant and easy to put to use right away. Take control of the big picture by:

  • Identifying the most important skills the top people in demand management possess.
  • Developing the internal structures and practices used by organizations leading the way in demand management.
  • Fully taking advantage of big data and new technologies in order to master predictive and descriptive analytics.

Whether you’re currently in the field or plan to be soon, the revealingly illustrative examples of companies applying covered concepts in the real world enable you to implement strategies more easily the first time. Watch your revenues soar with more accurate predictions of how demand will impact your supply chain with Next-Generation Demand Management.

Meet Charlie at IBF Orlando

Meet Charlie (and me) at the Institute of Business Forecasting conference in Orlando, October 26-28.

2016 SAS/IIF forecasting research grant

For the fourteenth year, the International Institute of Forecasters, in collaboration with SAS®, is proud to announce financial support for research on how to improve forecasting methods and business forecasting practice. The award for the 2016-2017 year will be two $5,000 grants, one in Business Applications and one in Methodology.

Criteria for the award of the grant will include likely impact on forecasting methods and business applications. Consideration will be given to new researchers in the field and whether supplementary funding is possible.

Applications must include:

  • Description of the project (max. 4 pages)
  • Letter of support from the home institution where the researcher is based
  • Brief CV (max. 4 pages)
  • Budget and work plan for the project

The deadline for applications is September 30, 2016. 

For a complete overview of the requirements for the award, click here.

All applications or inquiries should be sent to the IIF Business Director (pamstroud@forecasters.org).

The IIF-SAS grant was created in 2002 by the IIF, with financial support from the SAS Institute, in order to promote research on forecasting principles and practice. The fund provides US $10,000 per year, which is divided to support research in the two basic aspects of forecasting: the development of theoretical results and new methods, and practical applications with real-world comparisons.

Previous grant recipients and their research are listed on the IIF website.

The online SAS forecasting community

The online SAS Support Communities are a vibrant source of information and interaction for over 90,000 registered participants. Here you can ask (and answer) questions, grow (and share) your SAS expertise, and explore these collection points for other resources (like hot tips, articles, blogs, and events) relating to the community topic.

SAS Forecasting and Econometrics is one of over 30 such communities currently available on the SAS support pages.

This site should be visited and bookmarked by every user of SAS forecasting software. It contains answers on over 500 subjects, making it a great first stop when you have a coding or modeling question. (See my colleague Chris Hemedinger's informative blog post about "How SAS Support Communities can expedite your tech support experience.")

Kinds of Forecasting and Econometrics Subjects

To give you a flavor of the subject matter in the community: most questions deal specifically with coding and modeling within SAS forecasting software. However, the community is also helpful for addressing broader questions on forecasting process, and for other forecasting-related announcements.

Where to Begin

Lurking in your chosen community is perfectly acceptable. But I would encourage you to overcome any shyness, and to go ahead and post your questions (and answers).

Communities function best when there are lots of active participants, and the SAS communities are active. Chris Hemedinger cited recent statistics that 62% of questions get their first response within 60 minutes of posting! And 92% get responses within the first day.

Community moderators blast any unanswered questions to a corps of volunteer responders from SAS R&D, professional services, product management, and marketing. So there is a very good chance any questions you have will be answered promptly, and correctly, by an expert in the subject matter.

Guest blogger: Len Tashman previews summer issue of Foresight

Editor Len Tashman's Preview of the Summer Issue of Foresight

Clarity and effectiveness of communication are key to success – so says a recent survey by the National Association for Business Economics (NABE), which reported that industry leaders and hiring managers considered communication skills to be the single most important attribute for the promotion of business economists. While we are not aware of a similar survey among demand forecasters and planners, we do have ample reason to believe that the importance of communication skills cannot be overstated – this according to the perspectives of numerous Foresight authors from the world of industry and commerce.

Now, Niels van Hove, forecasting practitioner, consultant, and behavioral coach, argues persuasively for the establishment of an effective communications process as a prime element in the implementation of a firm’s strategy. In his article, An S&OP Communication Plan: The Final Step in Support of Company Strategy, Niels explains that

[t]o effectively support strategy execution, S&OP communication needs to do more than fix operational issues. Formal horizontal communication—information flow across functions at the same level in the business hierarchy—is no longer sufficient; S&OP has to support two-way vertical and informal communication in support of a strategy, not just to inform employees but to align, engage, energize, and refocus them.

Now a resident of Melbourne, Australia, Niels is the subject of our Forecaster in the Field interview on page 11.

Perhaps the greatest concern we have today about future employment prospects is over the impact of exponential advances in artificial intelligence and robotics. Ira Sohn, Foresight’s Editor for Strategic Forecasting, says Step Aside, Climate Change – Get Ready for Mass Unemployment. Ira surveys opinion on this issue with emphasis on two important new books published in 2015: Rise of the Robots: Technology and the Threat of a Jobless Future by Martin Ford, and Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence by Jerry Kaplan. These strategic thinkers paint a troubling picture for job markets, and recommend policies for adaptation that fundamentally alter the prevailing ethic of “work for your pay.”

Bridging the Gap Between Academia and Business

The feature section of this 42nd issue of Foresight addresses the very apparent disconnect between (a) the needs of business forecasters and planners, and (b) the research undertaken and published by academics in the field. Our Spring 2016 issue concluded with an “editorial” by Sujit Singh, Forecasting: Academia vs. Business, which we reprint here for your convenience. Sujit’s piece has attracted considerable attention, and in this issue we now present commentaries from both sides of the aisle, which probe further into the basis of the disconnect and what can be done to help bridge the gap.

If there is one point of near universal agreement, it is that while considerable fault lies with academia’s inattention to the realities of the business world, companies put up their own formidable obstacles, as John Boylan and Aris Syntetos see it in their lead commentary, It Takes Two to Tango.

Some persuasive examples of how academics have begun to address issues of importance in industry were offered at the recent International Symposium on Forecasting in Santander, Spain. The presentations in the Forecasting in Practice Track demonstrated how recent research offers great promise for advancing the tools of the trade as well as enhancing the drive to improve forecast accuracy and other key performance metrics.

Here were the nine presentations in the Forecasting in Practice Track:

  • Trust in Forecasting
  • Building a Demand Planning Function
  • Forecasting, Supply Chain, and Financial Performance
  • Simplicity in Forecasting
  • Forecasting and Inventory Control: Mind the Gap
  • Demand Planning for Military Operations
  • Challenges in Strategic Pharmaceutical Forecasting
  • Beyond Exponential Smoothing: How to Proceed when Extrapolation Doesn’t Work
  • Forecasting Temporal Hierarchies

Santander is one of Spain’s enticing alternatives to the Cote d’Azur. Situated on the Bay of Biscay, bordered by Bilbao and the Basque country to the east and the Picos de Europa mountains and Cantabrian coast to the west, Santander graciously offered up the Palacio de la Magdalena as the conference venue. Built a little over a century ago by the local government of Santander to provide a seasonal residence for the royal family of Spain, the Magdalena Palace was converted to host educational courses in the 1930s and now serves as a major meeting and conference facility.

Further evidence of progress in bridging the research gap between academia and business comes with the upcoming Foresight Practitioner Conference, Worst Practices in Forecasting: Today’s Mistakes to Tomorrow’s Breakthroughs. Held in partnership with and at the venue of the Institute for Advanced Analytics at North Carolina State University in Raleigh, the presentations apply many lessons derived from academic research and business experience that we hope will motivate business forecasters to move forward and away from dysfunctional practices.

For details on the speakers and venue, please see the announcement in this issue and go to https://forecasters.org/foresight/2016-conference/ to register. Please note that Foresight readers receive an attractive discount on the registration fee. Hope to see you there.

Farewell to Rob Dhuyvetter

Rob, a member of the Foresight Advisory Board since its start, has seen his duties at the Simplot Food Group shift away from forecasting and demand planning. We wish him the best in his new assignment.

Paul Goodwin on misbehaving forecasting agents

Paul Goodwin is Professor Emeritus of Management Science at the University of Bath, and one of the speakers at this fall's Foresight Practitioner Conference (October 5-6 in Raleigh, NC). His topic will be "Use and Abuse of Judgmental Overrides to Statistical Forecasts" -- an area in which he has contributed much of the important research.

Paul also authored or co-authored four articles in the recently published Business Forecasting: Practical Problems and Solutions.

In the Spring 2016 issue of Foresight, Paul has a new article dealing with "Misbehaving Agents."

The Principal and the Agent

An agent is a person (or organization) that is paid to act on behalf of a principal (another person or organization).

For example, you pay a real-estate agent to sell your house. But questions can arise about whether the agent is acting in the best interests of the principal. Is the agent going for a quick sale (and a quick commission), minimizing the effort spent on the sale? Or is the agent trying to maximize the selling price, even if that requires more time and effort?

The Principal-Agent Problem in Forecasting

The principal-agent problem is well recognized in forecasting. Forecasting process participants (the agents being paid by the organization to develop the forecast) have a variety of interests. Those personal interests often compete with the organization's interest in receiving an honest and accurate forecast of the future. For a familiar example, consider the role of sales people as agents in the forecasting process.

In a June 26 post on SupplyChainShaman.com, industry analyst Lora Cecere stated:

Collaborative sales forecasting input leads to increased bias and error. My advice for the global supply chain leader is not waste your time asking the sales team to forecast.

I came to a similar conclusion in my Foresight article (Fall 2014), "Role of the Sales Force in Forecasting":

In the spirit of "economy of process," unless there is solid evidence that input from the sales force has improved the forecast (to a degree commensurate with the cost of engaging them), we are wasting their time -- and squandering company resources that could better be spent generating revenue.

Even assuming that sales people have special knowledge of future customer demand, there is still a question of their motivation to provide honest input to the forecast. If their sales quota is based on the forecast, it is in the salesperson's interest to purposely forecast low (to get an easier-to-achieve quota). Goodwin cites a similar example from his earlier research*, where a marketing department deliberately forecasted low so they would "look good" to senior management by consistently beating the forecast.

[Note: In his taxonomy of common forecasting misbehaviors, John Mello refers to deliberate under-forecasting as sandbagging. For the rest of his taxonomy, see "The Impact of Sales Forecast Game Playing on Supply Chains," Foresight (Spring 2009), 13-22, which also appears in Business Forecasting: Practical Problems and Solutions.]

Not a New Problem

Even back in 1957 (when future President Gerald R. Ford was still watching a lot of baseball on the radio), the problem of forecasting agents was recognized. James H. Lorie's article "Two important problems in sales forecasting" (The Journal of Business Vol. 30, No. 3 (July 1957), pp. 172-179) is so good I had to blog about it three times (Part 1, Part 2, Part 3).

Goodwin looks at other types of forecasting-related mischief, and the reasons behind the misbehaviors. For example, new information may make it advisable to revise a forecast -- but an agent may decline to do so, to avoid appearing wishy-washy or incompetent. Maintaining the appearance of competence is also behind unreasonably high probabilities assigned to forecasts (common among political pundits, who get more air time the bolder and more confident their predictions).

Another interesting type of misbehavior, and a sure way to draw attention to yourself, is anti-herding (providing forecasts deliberately different from everyone else's). Ain't misbehavin' fun?

Solutions for Misbehaving Forecasting Agents

Metrics like Forecast Value Added can help you identify forecasting participants that are just making the forecast worse.
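
Here's a rough sketch of the FVA idea, purely for illustration (not any particular tool's implementation). It assumes MAPE as the error metric and a naive forecast as the benchmark, with made-up data and hypothetical step names; a negative FVA means that step is making the forecast worse.

```python
# Sketch of Forecast Value Added (FVA): compare each process step's error
# against a naive benchmark. Data and step names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "actual":      [100, 120,  90, 110],
    "naive":       [105, 100, 118,  92],   # e.g., last period's actual
    "statistical": [102, 115,  95, 108],   # system-generated forecast
    "override":    [110, 130, 100, 120],   # after judgmental adjustments
})

def mape(col):
    return ((df[col] - df["actual"]).abs() / df["actual"]).mean() * 100

naive_mape = mape("naive")
for step in ["statistical", "override"]:
    fva = naive_mape - mape(step)          # positive = value added vs. naive
    print(f"{step:12s} MAPE = {mape(step):5.1f}%   FVA vs. naive = {fva:+5.1f} pts")
```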

Goodwin suggests that forecast users (like planners and upper management) could be educated to accept the high level of uncertainty that is often unavoidable in forecasting. And it would help for reward systems to be designed to encourage honesty and accuracy (something attempted in the Gonik** system for sales force compensation, developed in the 1970s for IBM Brazil).


*Goodwin, P. (1998). Enhancing Judgmental Sales Forecasting: The Role of Laboratory Research. In Wright, G. & Goodwin, P. (eds.), Forecasting with Judgement. Chichester: Wiley, 91-111.

**Gonik, J. (1978). Tie Salesmen's Bonuses to Their Forecasts, Harvard Business Review, May-June 1978, 116-122.

Using information criteria to select a forecasting model (Part 2)

Let's continue now to Nikolaos Kourentzes' blog post on How to choose a forecast for your time series.

Using a Validation Sample

Nikos first discusses the fairly common approach of using a validation or "hold out" sample.

The idea is to build your model based on a subset of the historical data (the "fitting" set), and then test its forecasting performance over the historical data that has been held out (the "validation" set). For example, if you have four years of monthly sales data, you could build models using the oldest 36 months, and then test their performance over the most recent 12 months.

You might recognize this approach from the recent BFD blog Rob Hyndman on measuring forecast accuracy. Hyndman uses the terminology (which may be more familiar to data miners) of "training data" and "test data." He suggested that when there is enough history, about 20% of the observations (the most recent history) should be held out for the test data. The test data should be at least as large as the forecasting horizon (so hold out 12 months if you need to forecast one year into the future).

Hyndman uses this diagram to show the history divided into training and test data. The unknown future (that we are trying to forecast) is to the right of the arrow:

Training data and Test data

Nikos works through a good example of this approach, comparing an exponential smoothing model and an ARIMA model. Each model is built using only the "fitting" set (the oldest history), and generates forecasts for the time periods covered in the validation set.

How accurately the competing models forecast the validation set can help you decide which type of model is more appropriate for the time series. You could then use the "winning" model to forecast the unknown future periods. An obvious drawback is that your forecasting model has only used the older fitting data, essentially sacrificing the more recent data in the validation set.

An alternative, once you've used the above approach to determine which type of model is best, is to rebuild the same type of model based on the full history (fitting + validation sets).
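
To make the mechanics concrete, here's a rough sketch of the hold-out approach in Python -- purely illustrative, not Nikos' code, and the same thing can of course be done in SAS. It assumes 48 months of made-up history, holds out the most recent 12 months, compares an exponential smoothing model against an ARIMA model on the validation set, and then refits the winning model type on the full history.

```python
# Hold-out (validation sample) model comparison, then refit on full history.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical 48 months of sales history
y = pd.Series(np.random.default_rng(1).gamma(5, 20, 48),
              index=pd.date_range("2012-01-01", periods=48, freq="MS"))

fit_set, valid_set = y[:36], y[36:]        # oldest 36 months / newest 12 months
h = len(valid_set)

# Fit each candidate on the fitting set only, forecast the validation window
ets = ExponentialSmoothing(fit_set, trend="add").fit()
arima = ARIMA(fit_set, order=(1, 1, 1)).fit()

mae = lambda f: np.mean(np.abs(valid_set.values - f))
scores = {"ETS": mae(ets.forecast(h).values),
          "ARIMA": mae(arima.forecast(h).values)}
winner = min(scores, key=scores.get)
print(scores, "->", winner)

# Rebuild the winning model type on the full history before forecasting the future
final = (ExponentialSmoothing(y, trend="add") if winner == "ETS"
         else ARIMA(y, order=(1, 1, 1))).fit()
print(final.forecast(12))
```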

Using Information Criteria

Information criteria (IC) provide an entirely different way of evaluating model performance. It is well recognized that more complex models can be constructed to fit the history better. In fact, it is always possible to create a model that fits a time series perfectly. But our job as forecasters isn't to fit models to history -- it is to generate reasonably good forecasts of the future.

As we saw in the previous BFD post, overly complex models may "overfit" the history, and actually generate very inappropriate forecasts.

Cubic model

An example of overfitting the model to history

Measures like Akaike's Information Criterion (AIC) help us avoid overfitting by balancing goodness of fit with model complexity (penalizing more complex models). Nikos provides a thorough example showing how the AIC works. As he points out, "The model with the smallest AIC will be the model that fits best to the data, with the least complexity and therefore less chance of overfitting."

Another benefit of the AIC is that it uses the full time series history; there is no need for separate fitting and validation sets. A drawback is that the AIC cannot be used to compare models from different model families (so you could not do the exponential smoothing vs. ARIMA comparison shown above). There is plenty of literature on the AIC, so you can find more details before employing it.
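
For the record, AIC = 2k - 2·ln(L), where k is the number of estimated parameters and L is the model's maximized likelihood -- so a better fit lowers the AIC while every extra parameter raises it. Here's a rough sketch of AIC-based selection among a few exponential smoothing variants (Python again, with made-up data, purely for illustration); note that all candidates come from the same model family, per the caveat above.

```python
# In-sample model selection by AIC: fit every candidate on the FULL history,
# let the complexity penalty do the work, and keep the smallest AIC.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

y = pd.Series(np.random.default_rng(2).gamma(5, 20, 48),
              index=pd.date_range("2012-01-01", periods=48, freq="MS"))

candidates = {
    "simple":         dict(trend=None, seasonal=None),
    "trend":          dict(trend="add", seasonal=None),
    "trend+seasonal": dict(trend="add", seasonal="add", seasonal_periods=12),
}

aics = {name: ExponentialSmoothing(y, **spec).fit().aic
        for name, spec in candidates.items()}
best = min(aics, key=aics.get)
print(aics, "->", best)
```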

Combining Forecasts

Nikos ends his post with a great piece of advice on combining models. Instead of struggling to pick a single best model for a given time series, why not just take an average of the forecasts from several "appropriate" models, and use that as your forecast?

There is growing evidence that combining forecasts can be effective at reducing forecast errors, while being less sensitive to the limitations of a single model. SAS® Forecast Server is one of the few commercial packages that readily allows you to combine forecasts from multiple models.
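
To see how little machinery the basic idea requires, here's a rough sketch (in Python, for illustration only -- not the SAS Forecast Server implementation) that averages the forecasts of an exponential smoothing model and an ARIMA model with equal weights, using made-up data.

```python
# Simple forecast combination: equal-weight average of two model forecasts.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.arima.model import ARIMA

y = pd.Series(np.random.default_rng(3).gamma(5, 20, 48),
              index=pd.date_range("2012-01-01", periods=48, freq="MS"))
h = 12

forecasts = pd.DataFrame({
    "ets": ExponentialSmoothing(y, trend="add").fit().forecast(h),
    "arima": ARIMA(y, order=(1, 1, 1)).fit().forecast(h),
})
forecasts["combined"] = forecasts.mean(axis=1)   # simple equal-weight average
print(forecasts)
```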

Using information criteria to select a forecasting model (Part 1)

Dr. Nikolaos Kourentzes, forecasting professor with great hair

Nikolaos Kourentzes is an Associate Professor at Lancaster University, and a member of the Lancaster Centre for Forecasting. In addition to having a great head of hair for a forecasting professor, Nikos has a great head for explaining fundamental forecasting concepts.

In his recent blog post on How to choose a forecast for your time series, Nikos discusses the familiar validation (or "hold out") sample method, and the less familiar approach of using information criteria (IC). Using an IC helps you avoid the common problem of "overfitting" a model to the history. We can see what this means by a quick review of the time series forecasting problem.

The Time Series Forecasting Problem

A time series is a sequence of data points taken at equally spaced intervals. In business forecasting, we usually deal with things like weekly unit sales, or monthly revenue. Given the historical time series data, the forecaster's job is to predict the values for future data points. For example, you may have three years of weekly sales history for product XYZ, and need to forecast the next 52 weeks of sales for use in production, inventory, and revenue planning.

There is much discussion on the proper way to select a forecasting model. It may seem obvious to just choose a model that best fits the historical data. But is that such a smart thing?

Unfortunately, having a great fit to history is no guarantee that a model will be any good at forecasting the future. Let's look at a simple example to illustrate the point:

Suppose you have four weeks of sales: 5, 6, 4, and 7 units. What model should you choose to forecast the next several weeks? A simple way to start is with an average of the four historical data points:

Weekly mean model

Model 1: Weekly Mean

We see this model forecasts 5.5 units per week into the future, and there is an 18% error in the fit to history. Can we do better? Let's try a linear regression to capture any trend in the data:

Linear regression model

Model 2: Linear Regression

This model forecasts increasing sales, and the historical fit error has been reduced to 15%. But can we do better? Let's now try a quadratic model:

Quadratic model

Model 3: Quadratic

The quadratic model cuts the historical fit error in half to 8%, and it is also showing a big increase in future sales. But we can do even better with a cubic model:

Cubic model

Model 4: Cubic

It is always possible to find a model that fits your time series history perfectly. We have done so here with a cubic model. But even though the historical fit is perfect, is this an appropriate model for forecasting the future? It doesn't appear to be. This is a case of "overfitting" the model to the history.
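
If you want to reproduce the flavor of this toy example yourself, here is a rough sketch using numpy's polynomial fitting. The error metric below is a simple in-sample MAE, so the numbers won't match the percentages quoted above exactly, but the pattern is the same: the fit improves as the model gets more complex, while the cubic's extrapolation goes haywire.

```python
# Polynomials of increasing degree fit to the four observations 5, 6, 4, 7.
# The degree-3 (cubic) fit passes through every point exactly, yet its
# forecasts for the following weeks are wild -- classic overfitting.
import numpy as np

t = np.arange(1, 5)              # weeks 1..4
y = np.array([5, 6, 4, 7])       # observed sales
future = np.arange(5, 9)         # weeks 5..8 to forecast

for degree, label in [(0, "mean"), (1, "linear"), (2, "quadratic"), (3, "cubic")]:
    coefs = np.polyfit(t, y, degree)                 # least-squares polynomial fit
    fit_err = np.abs(y - np.polyval(coefs, t)).mean()
    print(f"{label:9s} in-sample MAE={fit_err:4.2f}  "
          f"forecast={np.round(np.polyval(coefs, future), 1)}")
```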

In this example, only the first two models, the ones with the worst fit to history, appear to be reasonable choices for forecasting the near-term future. While fit to history is a relevant consideration when selecting a forecasting model, it should not be the sole consideration.

In the next post we'll look at Nikos' discussion of validation (hold out) samples, and at how the use of information criteria can help you avoid overfitting.

Extending SAS Forecast Server Client

SAS® Forecast Server was released in 2005, and remains our flagship forecasting offering. It provides large-scale automatic forecasting, creating an appropriate customized forecasting model for each point in an organization's forecasting hierarchy.

The original SAS® Forecast Studio GUI lets users point and click their way to loading data, defining the hierarchy, specifying the roles of variables, identifying events, automatically creating the models, and generating the forecasts. (This 5-minute SAS Forecast Server demonstration illustrates the process, including reviewing the generated forecasts, making manual overrides, and reconciling the adjusted forecasts.)

Since the initial release in 2005, two additional interfaces have been developed to provide more capabilities to forecasters:

SAS® Time Series Studio

Forecasting immediately brings to mind the development of complex models and generation of forecast values, but equally important in the forecasting process is the analytic step prior to generating forecasts – understanding the structure of your time series data.  SAS Time Series Studio is a GUI for the interactive exploration and analysis of large volumes of time series data prior to forecasting. (This Analytics2012 presentation illustrates the importance of understanding the structure of your time series data in generating forecasts.)

SAS Time Series Studio provides forecast analysts with tools for identifying data problems, including outlier detection and management, missing values, and date ID issues. In addition, basic and advanced time series characterization, segmentation of the data into subsets and structural manipulation of the collective time series (hierarchy exploration) all contribute to faster forecast implementation and better modeling due to increased understanding of the data.

SAS® Forecast Server Client

Forecast Server Client workflow diagram

Forecasters may need access to their system anywhere and anytime, and now they have it with SAS Forecast Server Client. This web interface, released last year, provides a standard forecasting workflow to address the majority of users. It also integrates capabilities from the Forecast Studio and Time Series Studio GUIs.

As an example, SAS Forecast Server Client supports rules-based segmentation of time series to facilitate different modeling strategies per segment. And it provides enhanced tracking of forecasting performance. (Learn more in this time series segmentation white paper and video.)
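
To illustrate the general idea of rules-based segmentation -- this is not the SAS Forecast Server Client implementation, and the rules, thresholds, and strategy names below are invented for illustration -- a sketch might classify each series by intermittency and variability, then assign a modeling strategy per segment.

```python
# Generic illustration of rules-based time series segmentation (hypothetical
# rules and thresholds): classify each series, assign a strategy per segment.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
series = {f"item_{i}": rng.gamma(2, 10 * (i + 1), 36) for i in range(5)}  # toy histories
series["item_0"][rng.random(36) < 0.5] = 0   # make one series intermittent for the demo

def segment(y):
    y = np.asarray(y, dtype=float)
    cv = y.std() / y.mean() if y.mean() else np.inf     # coefficient of variation
    zero_share = (y == 0).mean()                        # intermittency
    if zero_share > 0.3:
        return "intermittent", "Croston-style method"
    if cv < 0.5:
        return "smooth", "exponential smoothing"
    return "volatile", "robust/combined models"

rules = pd.DataFrame(
    [(name, *segment(y)) for name, y in series.items()],
    columns=["series", "segment", "modeling strategy"],
)
print(rules)
```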

One of the great benefits of SAS forecasting software is that it is just one part of the overall SAS system. SAS gives users a powerful programming language, and all sorts of tools for data management, visualization, analysis, reporting -- and pretty much anything else.

It is a common practice for SAS Forecast Server users to first set up their forecasting projects through the GUI. This generates SAS code that users can access, tweak, and schedule into a batch forecasting process. All items can be forecast automatically, with little interaction from the forecast analyst. But the analyst may want to put extra effort into particularly high value forecasts, to squeeze out every last percentage point of forecast accuracy. They can do this using a wealth of SAS procedures, and by writing their own SAS code.

Extending SAS® Forecast Server Client

In the most recent technical paper from SAS forecasting R&D, Alex Chien (Director, Advanced Analytics R&D) has written on Extending SAS® Forecast Server Client. The paper shows how to write plug-ins that extend the capabilities of SAS Forecast Server and how to use Lua as the interface between SAS Forecast Server Client and the plug-ins, and it gives code examples for both a segmentation strategy plug-in and a modeling strategy plug-in.

You can think of a plug-in as a macro with a list of arguments that control how the macro operates on data or instructions. The new LUA procedure (available since the third maintenance release of SAS 9.4) provides a standard Lua interface between SAS Forecast Server and the plug-ins. (Find more information about programming in Lua at https://www.lua.org/pil/contents.html.)

This new paper will be of keen interest to SAS forecasting customers. It is full of code examples (in both Lua and SAS), showing step by step how customers can enhance their SAS Forecast Server implementation by creating their own plug-ins.

  • About this blog

    Michael Gilliland is a longtime business forecasting practitioner and currently Product Marketing Manager for SAS Forecasting. He initiated The Business Forecasting Deal to help expose the seamy underbelly of the forecasting practice, and to provide practical solutions to its most vexing problems.

    Mike is also the author of The Business Forecasting Deal, and co-editor of Business Forecasting: Practical Problems and Solutions. He also edits the Forecasting Practice section of Foresight: The International Journal of Applied Forecasting.