Time series segmentation with SAS Forecast Server Client (Part 1)

My colleague Jessica Curtis is a Solutions Architect at SAS, specializing in forecasting. In this two-part contribution, Jessica tells us what time series segmentation is, why we use it, and how it adds value to your forecasting process.

Guest Blogger Jessica Curtis on Time Series Segmentation

Jessica Curtis

The large-scale forecasting challenge affects organizations across many different industries.

Perhaps you're a retailer forecasting sales to determine the right assortment of SKUs in your stores, a hotel forecasting demand for room occupancy, or a wholesaler forecasting the right level of inventory to keep your customers in stock. You could be a media company, forecasting power consumption in your data centers; an airline, forecasting passenger and freight traffic; or a telecom company, forecasting call center demand for labor capacity planning.

With any large-scale forecasting challenge, the most important first step is to understand the structure of your time series data.

The Ideal Time Series

In a perfect world, all your time series would look something like this: high-volume data with a long history; a steady, repeatable, and predictable pattern; and very few (if any) missing values.

Ideal time series

In this ideal forecasting world, good forecasting results are easily achieved through an automated forecasting engine and one forecast modeling strategy. Unfortunately, real world time series data is not always so clean.

The Real Time Series

In reality, there is much more diversity in the patterns of our time series data.

Types of time series patterns

There are stable time series, such as laundry detergent sales, and seasonal time series, such as the demand for grills. There could be time series with a level shift in demand; for example, when a market expands to new regions or sales channels. There are also series with limited history, such as brand-new items incorporated into your offering. There are time series with sparse or intermittent demand, such as spare parts for cars. And there could be items that only sell at certain times of the year; for example, Halloween costumes.

Given this range of patterns, a single modeling strategy applied to all these different types of time series will not give you the best forecasts. So the question becomes: how do I manage all the different patterns in my time series data?

Time Series Segmentation

The answer is time series segmentation. Time series segmentation is the method of dividing time series data into distinct types or segments based on the underlying properties of the time series. It is one of the most important first steps in the forecasting process. Segmenting first allows you to apply a customized forecast modeling strategy to each time series segment.
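
To make the segmentation idea concrete, here is a minimal rule-based sketch in Python (an illustration only, not the logic used by SAS Forecast Server Client; the thresholds, column names, and input file are assumptions):

```python
import pandas as pd

def segment(series: pd.Series, min_history=104, zero_share=0.3, high_cv=1.0):
    """Assign a demand series to a segment based on simple properties."""
    if series.count() < min_history:        # e.g., less than two years of weekly data
        return "new / limited history"
    if (series == 0).mean() > zero_share:   # many zero-demand periods
        return "intermittent"
    cv = series.std() / series.mean()       # volatility relative to the average level
    return "volatile" if cv > high_cv else "stable"

# Hypothetical input: one column of weekly demand per item.
demand = pd.read_csv("weekly_demand.csv", index_col="week")
segments = demand.apply(segment)            # one segment label per item
print(segments.value_counts())
```

Each segment can then be routed to its own modeling strategy; for example, intermittent-demand models for the intermittent segment and richer trend/seasonal models for the stable, high-volume series.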

We'll continue this discussion tomorrow...


Time Series Segmentation Recorded Webinar

If you are eager to see how the story ends, watch the 15-minute webinar Time Series Segmentation by Jessica, which includes a 6-minute demonstration of the new SAS Forecast Server Client by SAS forecasting product manager Laura Ryan.






Introducing the new SAS Forecast Server web client

Faithful readers,

I've spent the last several weeks neglecting this blog to spend my weekdays launching the 14.1 release of SAS® Advanced Analytics (which includes my products: SAS Forecast Server, SAS Forecasting for Desktop, SAS/ETS, SAS/STAT, SAS/OR, SAS/IML, SAS/QC, and SAS Simulation Studio, among many others).

And I've spent my weekends helping renovate Dragonfly Farm, a 1927 farmhouse (and 26-acre compound) in Seagrove, NC.


1927 Farmhouse

Kitchen Demo

General Store

Workshop & Bunkhouse

Seagrove is recognized as the pottery center of North Carolina (thanks to the exceptional clay in its soil), and is known for being so far out in the country that there is virtually no internet or even cell phone service available. (Hence, my inability to contribute BFD posts on the weekends.)

Of course, there is currently no plumbing or electricity in the farmhouse itself. But there is a cozy two-seat outhouse for those who just can't leave their friend's behind.


On Site Facilities

Inside Outhouse

Seating for Two

But enough of the excuses...

The BFD is Back -- With a Vengeance

Starting tomorrow, we'll take a daily look at some exciting capabilities available through SAS Forecast Server's new web interface, Forecast Server Client (FSC).

FSC allows users to access the power of SAS forecasting from anywhere through a web browser. The web client provides a structured workflow (see the screen shot below) to help guide even novice users through the process of generating statistical forecasts.

Forecast Server Client

FSC incorporates many of the functions of the existing Forecast Studio and Time Series Studio (Part 1, Part 2) interfaces, while adding functionality for demand classification, time series segmentation, and targeted modeling. It also supports multistage modeling, sophisticated forecast accuracy tracking across iterations, and parallel execution of modeling tasks (improving performance).

In this week's blog series, we'll look specifically at three of the new capabilities:

  • Time series segmentation (guest post by Jessica Curtis, SAS Solutions Architect)
  • Multistage modeling (guest post by Pu Wang, SAS R&D)
  • Forecast accuracy tracking across iterations (guest post by Evan Anderson, SAS R&D)

To get yourself started, watch this brief (15-minute) on-demand webinar Time Series Segmentation by Jessica Curtis, Solutions Architect at SAS. Jessica illustrates the different types of demand patterns you'll commonly encounter, and shows how segmenting them lets you apply the appropriate type of model to each pattern.

The webinar includes a 6-minute demonstration of Forecast Server Client by Laura Ryan, product manager for SAS forecasting.



Announcing SAS/IIF research grants for 2015/16

SAS/IIF Grant to Promote Research on Forecasting

For the thirteenth year, the International Institute of Forecasters, in collaboration with SAS, is proud to announce financial support for research on how to improve forecasting methods and business forecasting practice. The award for the 2015-2016 year will be two $5,000 grants. The deadline for applications is September 30, 2015. (Full description and requirements for the award)

Applications must include:

  • Description of the project (max. 4 pages)
  • Letter of support from the home institution where the researcher is based
  • Brief (max. 4 pages) c.v.
  • Budget and work plan for the project

All applications or inquiries should be sent to the IIF Business Director (forecasters@forecasters.org).

The IIF-SAS grant was created in 2002 by the IIF, with financial support from the SAS® Institute, in order to promote research on forecasting principles and practice. The fund provides US $10,000 per year, divided to support research in the two basic aspects of forecasting: development of theoretical results and new methods, and practical applications with real-world comparisons.

Find previous grant recipients and links to their published findings on the IIF website announcement.



Never be left stranded with a two-bin inventory control system (Part 2)

The Two-Bin Inventory Control System

Toilet paper dispenser: a two-bin inventory control system designed to never leave you stranded

So what is a two-bin inventory control system? Whether you realize it or not, you are probably very familiar with the concept. In fact, if you are reading this on a tablet or smartphone, you may be sitting next to one right now (see photo).

According to Investopedia.com, the two-bin system is mainly used for small or "low value" items. (While we're not here to debate the value of toilet paper to western society, this blogger, for one, believes that Investopedia seriously underestimates it.) Here's how a two-bin system works:

  • Two identically sized bins are used to hold the item in inventory.
  • When the first bin is emptied, an order is triggered to refill it.
  • Meanwhile, the second (full) bin is used to supply demand until the replenishment order is filled.

The size of the bins is determined by factors like the rate and variability of demand and the desired service level (i.e., the percent of time demand is fulfilled). You want each bin to hold enough inventory to cover demand until the empty bin is replenished.

Instead of needing to hire a toilet paper forecaster to generate daily (or even hourly) demand forecasts for each stall, you only need to know things like the average rate of consumption (squares per hour), variability of consumption, and perhaps maximum rate of consumption due to special circumstances (a particular menu item in the cafeteria, or a stomach virus going around). With this information, you can determine the roll size (number of squares) and frequency of monitoring the bins, in order to maintain the desired service level.
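
As a rough illustration (a sketch under assumed numbers, not a formula from this post), the bin size for a target service level could be estimated with a simple normal approximation of demand over the replenishment window:

```python
from math import sqrt
from statistics import NormalDist

def bin_size(mean_rate, sd_rate, replenish_hours, service_level=0.99):
    """Size one bin to cover demand until the empty bin is refilled.

    mean_rate       -- average consumption (e.g., squares per hour)
    sd_rate         -- standard deviation of hourly consumption
    replenish_hours -- hours between emptying a bin and its refill
    service_level   -- desired probability of not running out
    """
    expected = mean_rate * replenish_hours        # expected demand in the window
    z = NormalDist().inv_cdf(service_level)       # safety factor for the service level
    safety = z * sd_rate * sqrt(replenish_hours)  # buffer for demand variability
    return expected + safety

# Hypothetical numbers: 40 squares/hour on average, sd of 15, refilled within 8 hours.
print(round(bin_size(40, 15, 8)))   # roll size (in squares) for a 99% service level
```

The same calculation scales the bin up or down as the consumption rate, its variability, or the monitoring frequency changes.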

What Do You Do When You're Stranded and There's Nothing on the Roll?

While fulfilling demand 100% of the time is a pleasant thought, it isn't always possible. But in multi-stall restroom facilities, the user may have the option of pulling supply from next door, as we learned in "The Stall" on Seinfeld. This is analogous to a transfer of inventory between stores -- perhaps an awkward and costly action, but effective to mitigate the impact of stock outages.

Brother, can you spare a square?


Never be left stranded with a two-bin inventory system (Part 1)

As discussed in the last BFD post, sometimes a difficult and expensive problem doesn't have to be solved -- it can simply be avoided.

When the teetering boulder threatened the baby below the cliff, we removed the baby and no longer had to worry about propping up the boulder. When it seems a retailer has to forecast every item, every week, at every store -- maybe they don't. Instead of generating what can be millions of item / store / week forecasts, which probably aren't going to be very accurate anyway, it may be sufficient to forecast at the item / DC level and use inventory management practices to stock the stores.

The idea is to not solve problems you don't have to solve, and to not rely on forecasts when you don't have to.

Generating millions of forecasts is not a problem with current technology such as SAS Forecast Server, or SAS Forecast Server + SAS Grid Manager for extremely large-scale problems. (See this story about generating millions of forecasts at Wyndham Exchange & Rentals.) But demand patterns at highly granular levels like item / store / week tend to be very erratic and not necessarily "forecastable" to the degree of accuracy desired.

At a SAS Global Forum presentation last month, Curry Hilton of the rapidly growing hobby farm retailer Tractor Supply Company reported that half their store / item combinations had 11 or fewer non-zero sales weeks per year. (So those store / item combinations sold zero on 41 or more weeks of the year!) Another retailer I'm familiar with has stated that half of their items sell less than one unit per week per store.

Generating store / item forecasts for some items can make sense -- when volumes are high enough and fairly regular, or there are big promotional spikes upcoming, or the items are of particularly high value. But often we don't have to solve that difficult forecasting problem, and can instead rely on something simple like a two-bin inventory control system to drive store inventory replenishment.

We'll investigate the two-bin system in Part 2, using an everyday example.


How to solve a difficult forecasting problem

The Boulder, The Cliff, and The Baby

Imagine you are faced with this very urgent problem: A large boulder is teetering on the edge of a cliff, at the bottom of which sits a baby, at risk of being crushed. How do you solve this problem?

One solution for the teetering boulder is this:

Use large ropes and cables to fasten the boulder to the top of the cliff, buying some time while you build a large infrastructure of concrete and metal to support the boulder from below.

Sounds great -- the boulder is secured and the baby is now safe. But is this really such a great solution? What if instead we just did this:

Remove the baby from the bottom of the cliff.

While this does not solve the issue of the teetering boulder, it has done something better. It has made the teetering boulder irrelevant -- no longer a problem that needs to be solved! (Once the baby is in a safe spot, who cares if the boulder falls?)

An Example from Forecasting

Too often, in dealing with our urgent business forecasting problems, we go for the first type of costly and time-consuming solution. Sometimes it may not be obvious that there are alternative approaches. Or sometimes we may have hired an unscrupulous consultant who will (of course) suggest a costly and time-consuming answer.

Consider the apparent problem of generating highly granular forecasts, such as by customer/item for a manufacturer, or store/item for a retailer. There can be millions of time series at this most granular level. It may appear that we need to forecast all of them. So we buy terabytes of storage and the fastest processors to be able to model and forecast each of these millions of series. But did we really have to do all this? Is this approach really going to give us the best answer to the ultimate business problem, which is meeting customer demand in a cost effective manner?

A manufacturer who fulfills customer demand out of a network of distribution centers (DCs) probably doesn't have to care about the individual demands of individual customers. As long as forecasts at the DC/item level are accurate enough to keep an appropriate level of inventory in each DC, who cares what individual customers are demanding? Unless a customer dominates the demand for an item at the DC (consuming a high percentage of the DC volume in that item), there may be no good reason to try to forecast that customer/item combination.

You could make a similar argument for replenishable items in retail, where multiple stores are supplied from a central warehouse. Instead of forecasting each store/item combination every week, just forecast the total demand for each item at the warehouse level, and use inventory policies (min/max, 2-bin, etc.) to replenish the store shelves. As long as you have forecasted well enough at the warehouse/item level, you should not have to worry about store/item forecasting.
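
A minimal sketch of that rollup (the file name, column names, and the seasonal-naive stand-in forecast are illustrative assumptions, not a recommended model):

```python
import pandas as pd

# Assumed layout: one row per store / item / week, with a 'units' column.
demand = pd.read_csv("store_item_weekly_demand.csv", parse_dates=["week"])

# Roll the granular store/item series up to one series per item:
# the total demand on the warehouse that supplies those stores.
warehouse = (demand
             .groupby(["item", "week"], as_index=False)["units"].sum()
             .sort_values(["item", "week"]))

# Forecast each item at the warehouse level; a seasonal-naive forecast
# (same week last year) stands in here for a proper statistical model.
warehouse["forecast"] = warehouse.groupby("item")["units"].shift(52)
```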

The point is: don't waste time solving a costly and difficult problem (like customer/item or store/item forecasts) if it doesn't need to be solved. Not only are highly granular forecasts likely to be quite inaccurate (especially compared to an aggregated intermediate level like DC/item), but they take considerably more time and resources to generate.

So make your life easy. Whenever possible, eliminate the need to do forecasting.


Online Forecast Value Added course May 7-8

Chip Wells, PhD

The SAS Business Knowledge Series now offers an online version of the "Forecast Value Added Analysis" course, taught via live web in two afternoon sessions, May 7-8. The instructor is my colleague Chip Wells, who expanded our original half-day FVA workshop with new material, examples, and exercises based on his experience across the chemical, energy, financial, healthcare, and transportation industries. I attended Chip's initial "test teach," and the course looks great!

Chip holds a PhD in Economics from North Carolina State University, and is co-author of the book Applied Data Mining for Forecasting Using SAS. In addition to teaching forecasting and data mining courses for the SAS Education Division, Chip also serves as an Analytic Consultant for the SAS Advanced Analytics Lab.

From the course description:

Forecast Value Added (FVA) is the change in a forecasting performance metric (such as MAPE or bias) that can be attributed to a particular step or participant in the forecasting process. FVA analysis is used to identify those process activities that are failing to make the forecast better (or may even be making it worse). This course provides step-by-step guidelines for conducting FVA analysis to identify and eliminate the waste, inefficiency, and worst practices in your forecasting process. The result can be better forecasts, with fewer resources and less management time spent on forecasting.

Learn how to:

  • map your forecasting process
  • gather, organize, and store required data
  • determine the volatility and forecastability of your demand patterns
  • visualize the data
  • create the "comet chart" relating forecast accuracy to volatility
  • analyze the data using simple methods from statistical process control
  • identify worst practices and other non-value adding activities
  • report FVA results
  • communicate results to management
  • eliminate wasted efforts and streamline your forecasting process
  • set reasonable forecasting performance objectives and expectations.

Who should attend
Forecasters, demand planners, and business analysts in any industry, as well as managers overseeing the business forecasting function

The live web course will be held Thursday and Friday, May 7-8, from 1:00 - 4:30pm EDT each afternoon. The cost is $825 USD, and attendees earn 1.7 EPTO. (Register by adding the course to your cart at the bottom of the page. See also the Course Outline tab for additional information.)

This course does not require knowledge of SAS software, but you should know very basic statistical concepts like mean and standard deviation, randomness, and variability.
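
To make the FVA calculation itself concrete, here is a minimal sketch (not course material; the step names and numbers are hypothetical). The FVA of a process step is the change in a metric such as MAPE relative to the step, or naïve benchmark, that precedes it:

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)) / len(actuals)

# Hypothetical weekly actuals and the forecasts produced at each process step.
actuals     = [120, 135, 128, 150, 142]
naive       = [118, 120, 135, 128, 150]   # "no change" benchmark (prior actuals)
statistical = [122, 130, 131, 146, 140]   # output of the statistical model
final       = [130, 140, 138, 155, 150]   # after judgmental overrides

# FVA of a step = MAPE of the preceding step minus MAPE of this step;
# positive means the step made the forecast better, negative means it made it worse.
fva_statistical = mape(actuals, naive) - mape(actuals, statistical)
fva_overrides   = mape(actuals, statistical) - mape(actuals, final)
print(f"Statistical model FVA: {fva_statistical:+.1f} MAPE points")
print(f"Override step FVA:     {fva_overrides:+.1f} MAPE points")
```

In this made-up example the statistical model adds value over the naïve benchmark, while the overrides subtract it, which is exactly the kind of finding FVA analysis is designed to surface.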


FVA interview with Steve Morlidge

Steve Morlidge

The Institute of Business Forecasting's FVA blog series continued on March 2 with my interview of Steve Morlidge of CatchBull. Steve's research (and his articles in Foresight) has been a frequent subject of BFD blog posts over the last couple of years (e.g., The Avoidability of Forecast Error (4 parts), Q&A with Steve Morlidge of CatchBull (4 parts), and Forecast Quality in the Supply Chain (2 parts)). I asked Steve about the application of FVA analysis with his clients.

Steve notes that his goal is to measure forecasting performance compared to the alternative of simple replenishment based on prior-period actual demand (i.e., compared to using the naïve random walk, aka "no change," model). He advocates use of the Relative Absolute Error (RAE), which is the ratio of the absolute error of the forecast to the absolute error of the naïve forecast.

His findings, which are in some sense astounding, are that most businesses struggle to beat performance of the naïve forecast by more than 10-15%, and that 40-50% of low-level forecasts are actually worse than the naïve (so negative FVA!). His studies, reported in more detail in a series of Foresight articles, illustrate the generally poor state of execution of the business forecasting process.
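
As a small numerical illustration of RAE (the numbers are made up), a value below 1.0 means the forecast beat the naïve "no change" benchmark, and a value above 1.0 means it did worse:

```python
def rae(actuals, forecasts):
    """Relative Absolute Error: total absolute forecast error divided by the
    total absolute error of the naive ("no change") forecast."""
    naive = actuals[:-1]                                   # forecast for period t is the actual at t-1
    fcst_err  = sum(abs(a - f) for a, f in zip(actuals[1:], forecasts))
    naive_err = sum(abs(a - n) for a, n in zip(actuals[1:], naive))
    return fcst_err / naive_err

actuals   = [100, 110, 105, 120, 115, 125]   # periods 1..6
forecasts = [104, 108, 112, 118, 121]        # forecasts for periods 2..6
print(round(rae(actuals, forecasts), 2))     # ~0.53, comfortably better than the naive model
```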

The IBF interview includes two innovative reports Steve has developed to help visualize the state of forecasting at his clients. He argues that a value-added perspective on performance measurement works better to expose the problem to management and to engage their efforts in making process improvements.

Read the full Steve Morlidge interview, and others in the FVA series, at www.demand-planning.com.

April Interview: Erin Marchant from Moen

Coming soon on the IBF blog site is my interview with Erin Marchant, Senior Analyst in Demand Management at Moen. Erin and I crossed paths last fall at the APICS conference, and her group is now in the early stages of an FVA analysis project.

In the interview Erin provides valuable guidance on engaging your business unit partners, and avoiding an "us versus them" situation. As she points out, FVA is a measure of the forecasting process, and is not intended as a means to embarrass individual process participants. [Note, however, that I myself am not above embarrassing participants who bias the forecast and contaminate the process for their own personal agendas.]


Guest blogger: Len Tashman previews Spring issue of Foresight

Editor Len Tashman's Preview of the Spring issue of Foresight

Len Tashman

The Special Feature article of this 37th issue of Foresight, "From Sales and Operations Planning to Business Integration," comes about through a rare but effective collaboration between an academic and a practitioner. The coauthors are Mark Moon, head of the Department of Marketing and Supply Chain Management at the University of Tennessee, Knoxville, and author of Demand and Supply Integration: The Key to World-Class Demand Forecasting, and Pete Alle, Vice President of Supply Chain for Oberweis Dairy and overseer of global process improvement and innovation across that company's supply chain.

The article identifies the failings of organizational implementations of S&OP, including (a) planning horizons that are too short-term, (b) a process driven by the supply chain, and (c) a retrospective rather than forward-looking agenda in the monthly planning meetings.

To effectively drive business integration, the authors argue, changes may be required in organizational structure, processes, and culture, with leadership commitments, employee incentive structures, and training comprising critical components.

Next up is our section on Strategic Forecasting, which looks outside our immediate planning fences. Businesses too often have only a short-term focus in their planning and operations, taking a reactive posture to crises in their supply chains while failing to consider and manage events that present important risks and opportunities. So argue Tom Goldsby, Chad Autry, and John Bell, the authors of Thinking Big! Incorporating Macrotrends into Supply Chain Planning and Execution. They examine four key macrotrends and describe their potentially profound effects on corporate supply chains. These macrotrends are (1) global population growth and shifts, (2) interconnectedness and social leveling, (3) climate change and sustainability, and (4) resource scarcity and conflicts.

Forecasts of global population growth are a lynchpin for strategic planning, but it’s a questionable assumption that even official forecasts can be considered reliable. In The United Nations Probabilistic Population Projections: An Introduction to Demographic Forecasting with Uncertainty, noted population-forecasting experts Leontine Alkema, Patrick Gerland, Adrian E. Raftery, and John R. Wilmoth describe the state of the art here in providing margins of error around the best point forecasts. Their examples show just how wide the area of uncertainty is, providing a cautionary note to long-range forecasters and strategic planners.

Have Corporate Prediction Markets Had Their Heyday? Despite great acclaim for the potential of prediction markets as an efficient tool for aggregating individual judgments, corporate adoption has been limited. To find out why, Thomas Wolfram examined existing research and interviewed several dozen key business executives. He reports that problems with overall acceptance of the corporate prediction market (CPM) often stem from a lack of trust by management as well as a greater business focus these days on big data and social media content. Hope for the CPM persists, though – if certain obstacles can be finessed.

Intermittent demands have been a bugaboo for corporate forecasters, and numerous articles have been published – several in Foresight – that show how to make the best of a very difficult methodological challenge. An important aspect of the problem is the basis upon which we choose a “best” forecasting method. In his new article Measuring the Quality of Intermittent-Demand Forecasts: It’s Worse than We’ve Thought, Steve Morlidge clearly demonstrates that we must rethink the use of our most common accuracy metrics for selecting that optimal forecast method. The problem is acute, he writes, because many software applications use these metrics for performance evaluation and method selection; in doing so, they potentially provide us with poor feedback and inferior models, resulting in harmful consequences for inventory management.

Steve evaluates various metrics that have been offered to assess the performance of intermittent-demand models and proposes a bias-adjusted error metric that may be superior to its predecessors.

Foresight Cover


Our book review in this issue examines a new text that seeks to integrate demand forecasting with inventory management: Demand Forecasting for Inventory Control by Nick T. Thomopoulos, Professor Emeritus of the Illinois Institute of Technology. Reviewer Stephan Kolassa, Foresight’s Associate Editor, presents the high hopes he had for the book and goes on to frankly discuss his likes and misgivings.

This Spring 2015 issue concludes with our latest Forecaster in the Field interview, introducing Fotios Petropoulos, Foresight’s new Editor for Forecasting Support Systems. Fotios will be speaking at the upcoming International Symposium on Forecasting (ISF) on ways forward for our forecasting support systems, part of Foresight’s…

Forecasting in Practice Track

The 35th annual ISF will be held in Riverside, California from June 21-24. Riverside was rated America’s eighth coolest city in a Forbes survey last year.

For the past three and a half decades, the ISF has been the principal forum for presentation of new research on forecasting methods and practices, bringing together researchers and practitioners from more than 30 countries around the world. View the 2015 program and its speakers at www.forecasters.org/isf/.

An integral part of this year’s ISF is the Forecasting in Practice track. The nine speakers in this track will examine forecasting and planning processes, and offer proposals for upgrading our forecasting performance:

  • Sustaining a Collaborative Forecasting Process (Simon Clarke, Coca-Cola Refreshments)
  • Improving Forecast Quality in Practice (Robert Fildes, Lancaster University)
  • Managing Supply Chains in a Digital Interconnected Economy (Ram Ganeshan, College of William and Mary)
  • Role of the Sales Force in Forecasting (Michael Gilliland, SAS Institute, Inc.)
  • Macro Trends and Their Implications for Supply Chain Management (Thomas Goldsby, The Ohio State University)
  • Harnessing your Judgment to Achieve Greater Forecast Accuracy (Paul Goodwin, University of Bath)
  • Strategies for Demand and Supply Integration (John Mello, Arkansas State University)
  • Managing the Performance of a Forecasting Process (Steve Morlidge, Satori Partners)
  • Forecasting Support Systems: Ways Forward (Fotios Petropoulos, Cardiff University)

Guest blogger: Dr. Chaman Jain previews Winter issue of Journal of Business Forecasting

JBF Special Issue on Predictive Business Analytics

Dr. Chaman Jain

Dr. Chaman Jain, professor at St. John's University and editor-in-chief of the Journal of Business Forecasting, provides his preview of the Winter 2014-15 issue:

Predictive Business Analytics, the practice of extracting information from existing data to determine patterns, relationships, and future outcomes, is not new; it has been used in practice for many years. What is new, however, is the massive amount of data now available (so-called Big Data) that can better support decision making and improve planning and forecasting performance. Additionally, we now have access to technology that can store, process, and analyze large amounts of data. The analysis modules found in these tools have algorithms that can handle even the most complex of problems. Above all, there is a growing awareness among businesses, data scientists, and those with data responsibilities that there is a wealth of hidden information that should be tapped for better decision making.

To make the most of Predictive Business Analytics, we need to prioritize which problems we're trying to solve, which data and signals we should collect, where they are located, and how they can be retrieved and compiled. Once these steps have been taken, the data must be filtered and cleansed in preparation for analysis with a predictive analytics software tool, which can be MS Excel, an open-source application, or one of the many other applications available in the marketplace.

Predictive Analytics and finding solutions from Big Data have no boundaries. They are applicable to nearly every industry and discipline. Ying Liu demonstrates how this is currently being used in the retail industry, as well as in the education, healthcare, banking, and entertainment industries. Macy's, for example, checks the competitive prices of 10,000 articles on a daily basis to develop its counter-price strategy. Other authors in this issue talk mostly about its application in the areas of Demand Planning and Forecasting. For example, Charles Chase illustrates how POS data can be used in sensing and shaping demand to improve sales and profits, while John Gallucci demonstrates how it can be used in managing new product launches.

Journal of Business Forecasting

Gregory L. Schlegel discusses not only how Predictive Analytics can be used to optimize the supply chain, but also how it can manage risks. He talks about how credit card and insurance companies have reduced their overall risk by using predictive analytics. In addition, he provides case studies of companies in the CPG/Grocery, High-Tech, and Automobile industries that are using them to drastically improve their bottom line. He talks about how Volvo Car Corporation is targeting zero accidents and zero deaths/injuries by gaining insight from data downloaded each time their automobiles come to dealers for servicing. Using this data, Volvo wants to determine if there are defective parts that require correction, and/or if there are issues with driver behavior, all needing to be addressed.

Eric Wilson and Mark Demers explain that with Predictive Analytics, businesses are looking to apply a more inductive analysis approach, not in lieu of but in addition to, a deductive analysis. In inductive analysis, we set hypotheses and theories, and then look for strong evidence of the truth. In other words, we search for what the data would tell us if they could talk. This is in contrast with deductive reasoning where we draw conclusions from the past.

Allan Milliken outlines the processes that Demand Planners should follow to take full advantage of the power of Predictive Analytics. It can help classify products that are forecastable and those that are not, enabling planners to arrive at a consistent policy regarding which products should be built to order and which built to stock. It can also flag exceptions, thereby enabling established corrective actions to be applied before it is too late. Larry Lapide demonstrates how Demand Planners can extract demand signals from the myriad data gathered from various sources, including a vast amount of electronic data drawn from the Internet. Mark Lawless discusses Predictive Analytics as a tool to peek into the minds of consumers to determine what types of products and services they are likely to purchase.

It is true that Predictive Analytics is not revolutionary, but its applications can certainly be considered innovative. The significant impact it can have will continue to change the way businesses manage demand, production, procurement, logistics, and more. There is an overwhelming belief that companies that can identify the right data and leverage it into actionable information are more likely to improve decisions, act in a timely manner, and have a true competitive advantage. Advancing your organization's analytics capability can be the great difference maker between companies that are performing well and those that are not.

Your comments on this special issue on Predictive Business Analytics & Big Data are welcome. I look forward to hearing from you.

Happy Forecasting!

Chaman L. Jain, Editor

St. John’s University | Email: Jainc@Stjohns.edu

IBF's Predictive Business Analytics Conference

Just a reminder that the Institute of Business Forecasting's first Predictive Business Analytics Forecasting & Planning Conference will be held in Atlanta, April 22-24. Speakers include my colleague Charlie Chase and Greg Schlegel, both of whom contributed articles to the special issue of the JBF.



  • About this blog

    Michael Gilliland is a longtime business forecasting practitioner and currently Product Marketing Manager for SAS Forecasting. He initiated The Business Forecasting Deal to help expose the seamy underbelly of the forecasting practice, and to provide practical solutions to its most vexing problems.