We ended last time having selected a cluster of surrogate products -- a subset of the original selection of like-items sharing the same attributes as the new product. Judgment has been used throughout the process so far: in specifying the relevant attributes, and in filtering the original candidate pool of
Seems like we've been here before. It is January, so time again to announce the Fortune 100 Best Companies to Work For in the US. This year SAS is at #2, our fifth straight year in the top 3, over which our average rank has been 1.8. We've covered this topic
If you need an excuse to get out of the office and perhaps learn a thing or two this fall, here are three upcoming events: Foresight Practitioner Conference: S&OP and Collaborative Forecasting (Columbus, OH, September 25-26) From the campus of Ohio State University, Foresight's editor Len Tashman and S&OP column
SAS Forecast Server (release 12.3) is now shipping, and includes the new SAS Time Series Studio GUI. Time Series Studio (TSS) was released as "experimental" last August in 12.1, and is now in production. TSS provides an additional interface in Forecast Server for time series data mining, exploration, and data preparation.
Mercifully, we have reached the final installment of Q&A from the June 20 Foresight-SAS webinar, "Forecast Value Added: A Reality Check on Forecasting Practices." As a reminder, a recording of the webinar is available for on-demand review, and the Foresight article (upon which the webinar was based) is available for free
Q: Is the MAPE of the naive forecast the basis for understanding the forecastability of the behavior? Or are there other more in-depth ways to measure the forecastability of a behavior? The MAPE of the naive forecast indicates the worst accuracy you should be able to achieve in forecasting the behavior. You can
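To make that benchmark concrete, here is a minimal sketch of computing the MAPE of the one-step naive (random walk) forecast, where each period's forecast is simply the prior period's actual. The demand numbers are purely hypothetical, made up for illustration.

```python
def naive_mape(series):
    """MAPE (%) of the naive forecast: each forecast = prior period's actual."""
    # Pair each actual with the preceding actual (its naive forecast),
    # then average the absolute percentage errors.
    errors = [abs(actual - forecast) / abs(actual)
              for forecast, actual in zip(series[:-1], series[1:])]
    return 100.0 * sum(errors) / len(errors)

demand = [100, 110, 95, 105, 120, 115]  # hypothetical monthly demand
print(round(naive_mape(demand), 1))  # about 10.3 (%)
```

Any forecasting process that cannot beat this number on the same history is arguably not adding value for that series.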
With this Q&A Part 3, we are about halfway through the questions submitted during the FVA webinar. We did over 15 minutes of live Q&A at the end of the webinar, and covered many of the submitted questions at that time; however, I always prefer to issue complete written responses to
Q: Could you send me the presentation? With audio if possible. If you'd like a pdf of the slides, email me directly: mike.gilliland@sas.com For the audio, the webinar recording is available for free on-demand review: FVA: A Reality Check on Forecasting Practices Q: Can we get the case study referred here
As promised in yesterday's Foresight-SAS sponsored webinar on "Forecast Value Added: A Reality Check on Forecasting Practices," here is Part 1 of my written responses to the more than 25 questions that were submitted during the event. (Note: It may take a week or so to get through all of them.)
If an organization is spending time and money to have a forecasting process, is it not reasonable to expect the process to make the forecast more accurate and less biased (or at least not make it any worse!)? But how would we ever know what the process is accomplishing? To
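One way to answer that question is the Forecast Value Added (FVA) comparison: measure the accuracy of the process forecast against a naive benchmark on the same periods. The sketch below uses MAPE as the accuracy metric; all of the numbers are hypothetical, invented for illustration.

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error (%), paired by period."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actuals, forecasts)) / len(actuals)

actuals = [100, 110, 95, 105]
naive   = [98, 100, 110, 95]    # naive forecast: prior period's actual
process = [102, 108, 99, 104]   # output of the full forecasting process

# FVA = benchmark error minus process error:
# positive means the process improved on doing nothing sophisticated;
# negative means the process made the forecast worse.
fva = mape(actuals, naive) - mape(actuals, process)
print(round(fva, 2))
```

In this made-up case FVA is positive, so the process earned its keep; a negative FVA would suggest the organization could do as well (or better) with the naive forecast alone.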