The Analytic Hospitality Executive in 2014: A Review

Well, Analytic Hospitality Executives, the year has once again flown by, and here we sit just before the holidays looking back on 2014, and figuring out what it all means for 2015. I traveled even more than usual this year (if that’s even possible), spending a significant amount of time with our hospitality and travel colleagues in Asia Pacific, which was absolutely fantastic. It’s refreshing to see that while the scenarios might be slightly different, hospitality and travel companies around the world are realizing the value of their data, and the need for powerful analytics to unlock that value.

The industry really seems to be accepting and embracing that the volume, variety and velocity of data today are creating a big data challenge (and opportunity), particularly when it comes to non-traditional formats like text or web data. Companies that are able to use analytics to turn all of this big data into meaningful, actionable insights, especially those that improve the guest experience, will be the ones that win. I’ve had many conversations around the globe on these topics this year, and expect to have many more in the years to come. Our blog posts this year have attempted to help our analytic hospitality executives understand where and how to identify these opportunities. Here is a review with links in case you missed anything, or wanted a refresher.

Right from the beginning of 2014, we started hearing the term “personalization” around the hospitality industry, with most major hospitality companies announcing strategic initiatives to develop a more relevant and engaging relationship with their guests. We kicked off the year with a couple of prediction blogs, one from the research faculty at the Cornell Center for Hospitality Research, our blog partners, and one from me, “The 14 actions for hotels in 2014”. The CHR researchers talked extensively about the role of technology and data in improving the guest experience – reflecting the growing conversations about personalization. In my blog, I provided a set of “to do” items to help organizations prepare themselves for the data, analytics and business process change required to set themselves on the path to personalization.

Our blog covered several broad topics through the year, including trends in digital marketing, pricing and revenue management, and text analytics. Underlying every topic was the importance of data management to address the growing sources and impacts of big data in hospitality. Natalie spoke about how “Multi-channel marketing in hospitality starts with data management”, emphasizing the importance of data storage, data integration, data quality and data governance as the underpinnings of a successful marketing program. I posted a two-part series on Big Data and Big Analytics as a “Big Opportunity” for hotels: Part 1 and Part 2. In those two posts, I defined the technology enablers that facilitate the management of big data and the execution of big analytics. I also advocated for the “Responsible use of big data” and described how to thoroughly evaluate new data sources to ensure you aren’t adding data overhead with no value. Alex touched on the opportunities to leverage big data in revenue management in his video post “Can big data help revenue management”. In her recent post about Web Analytics for Hospitality, Natalie and special guest Suneel Grover advocated for better data and better integration – reinforcing the need to incorporate the online and offline profiles of guests for a true 360-degree view.

Modernizing marketing, especially moving towards digital marketing, was a big topic in the industry and on our blog as well. Natalie interviewed Bill Carroll about challenges and opportunities in hospitality marketing, to provide a broad perspective on the topic. We then featured Bruce Swann, Customer Intelligence Solutions Manager at SAS, in a couple of more detailed posts on elements of the hospitality marketing problem, including his interview in “What is the future of email in hospitality marketing”, where he discussed the relevance of email in hospitality marketing and provided advice about how marketers can remain relevant with email and stay ahead of the competition. He also provided advice for marketers on how to move beyond campaign management to leverage omni-channel marketing, marketing optimization and digital intelligence. Natalie described how to apply advanced analytics techniques to digital marketing in her two posts “Forecasting digital outcomes for hospitality” and “Streamlining tough choices: Applying an optimization approach to hospitality marketing”.

We talked about opportunities for hotels to improve pricing strategies, including several features on the latest research into reputation management and pricing. I reviewed “What research tells us so far” in the arena of reviews, ratings and hotels. I also reported the latest results from my “Pricing in a Social World” research, co-authored with Dr. Breffni Noone from Penn State, including a blog summary of our project on how business travelers buy, and a video summary of the research results. Natalie described how hotels can assess their reputation through text analytics, and the importance of using text analytics to capture, comprehend and act on the voice of the customer, summarizing her thoughts in a video blog.

Natalie provided an interesting perspective on what hotels can learn from casinos in a multi-part series: “Focus on knowing your guests” and “Focus on the service environment”, which concluded with Alex describing what hotels could learn from casinos’ focus on non-rooms pricing. I also reminded our analytic hospitality executives that analytics are not just for innovation, but can help hotels maintain and improve their core services as well, in my post “Be thankful for great hospitality and customer service”. Alex provided a summary from the revenue management sessions at INFORMS, which described some of the latest research in pricing and revenue management analytics presented there.

So, what can you expect from us next year? Well, we’re going to focus our attention on a few broad themes, including customer experience, big data and big analytics, and analytical data visualization. Within these themes we’ll continue to provide clear definitions and helpful tips and strategies for leveraging analytics to drive value in your organization. We look forward to continuing the conversation with you next year! On behalf of the entire team at the Analytic Hospitality Executive, we hope you have a great holiday season, and best wishes for a safe and happy new year!

Forecasting and Optimizing Digital Media Mix for Hospitality: Let’s get practical!

Last week, I spoke to Suneel Grover, senior solutions architect for digital intelligence at SAS, about how better data and integration can drive improvements in web analytics. This week I wanted to see how this could be applied for a hospitality company. We decided to tackle a frequently debated topic in marketing: how digital media mix can impact a firm’s set of business objectives. I asked Suneel Grover to walk me through how he would answer this question using methods that could be used and understood by a business user. Suneel used SAS Visual Analytics to show me how he would optimize digital media mix to increase website traffic by 50% in a forecasted time period.

Step One: Forecast future visitors

Suneel began by accessing digital clickstream data at a level of aggregation specific to summarizing individual session behavior (not simply overall clicks or page views). This approach provides a granular data view of individuals visiting a digital property, allowing for flexibility in ad hoc analysis. For example, if a guest visited a hotel website 8 times over the last month, we would have a record of those 8 visits assigned to that individual. This kind of data is not available with typical web analytics reporting tools, but can be accessed when an organization modifies its digital data collection methodology to transform third-party web analytics data into its own first-party data source. When it becomes your own data, you are free to aggregate, modify and manipulate it to support a variety of downstream analytic goals – from reporting and BI to predictive analytics and data mining. Ultimately, the intention is to liberate yourself from the analytic limitations of traditional web analysis and increase your organization’s level of digital intelligence.

Suneel created a forecast in Visual Analytics by first plotting historical web traffic. In this case, we produced a line chart of visitor (session) counts over time, indicating how many individual visitors came to the site between certain dates.

Next, he created a visual of the forecast, in this instance choosing to forecast the next 14 days. To obtain the most accurate prediction, the technology uses a champion/challenger modeling approach that leverages six different forecasting algorithms. After applying the math, the software assesses which algorithm produces the most accurate answer (based on a statistical test) and displays the champion forecast with a 95% confidence interval. To this he added seven underlying factors (i.e., traffic referral sources such as display ads, social, search, email, etc.) to help identify the important digital media channels and further improve the forecast model’s accuracy. Both Organic and Paid Search were identified as statistically significant referral traffic sources with respect to the forecast, and as a result are featured in the visualization shown in Figure 1.
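For readers who like to see the mechanics, the champion/challenger idea can be sketched in a few lines of Python. This is an illustrative stand-in, not the six algorithms Visual Analytics actually runs: three toy models (naive, moving average, linear trend) compete on a holdout window, and whichever has the lowest MAPE becomes the champion.

```python
# Illustrative champion/challenger forecast selection (a sketch, not
# the SAS implementation): fit several simple models, score each on a
# holdout window, and keep the one with the lowest MAPE.
import numpy as np

def naive(train, h):           # repeat the last observed value
    return np.full(h, train[-1])

def moving_average(train, h):  # repeat the mean of the last 7 days
    return np.full(h, train[-7:].mean())

def linear_trend(train, h):    # extrapolate a least-squares trend line
    t = np.arange(len(train))
    slope, intercept = np.polyfit(t, train, 1)
    future_t = np.arange(len(train), len(train) + h)
    return intercept + slope * future_t

def mape(actual, predicted):   # mean absolute percentage error
    return np.mean(np.abs((actual - predicted) / actual)) * 100

def champion_forecast(series, horizon, holdout=14):
    train, test = series[:-holdout], series[-holdout:]
    challengers = {"naive": naive, "moving_average": moving_average,
                   "linear_trend": linear_trend}
    scores = {name: mape(test, f(train, holdout))
              for name, f in challengers.items()}
    champion = min(scores, key=scores.get)     # lowest holdout error wins
    return champion, challengers[champion](series, horizon)

# Synthetic daily visitor counts with an upward trend
rng = np.random.default_rng(42)
visits = 1000 + 5 * np.arange(60) + rng.normal(0, 20, 60)
name, forecast = champion_forecast(visits, horizon=14)
print(name, forecast[:3].round(0))
```

In a real tool the challenger set would include seasonal and smoothing models, and the displayed confidence interval would come from the champion model's error distribution.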

Figure 1. Visualization of the forecast showing underlying factors of Organic and Paid Search.

Step Two: Running What-If Simulations through Scenario Analysis

Using the forecast prepared in Step One, Suneel adjusted the forecast to explore potential what-if scenarios. He did this by addressing the following question: “If we increased our paid search advertising spend by 50% in the forecasted time period, how would this impact overall site traffic?” Suneel adjusted the future values of paid search traffic upward by 50% for the duration of the forecast. Once he applied that adjustment, he was able to assess the incremental impact of the increased paid search on the forecasted overall site traffic, as shown in Figure 2.
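The adjustment step itself is simple to sketch. The linear response model and its coefficients below are invented for illustration (the post doesn't disclose the fitted model): scale the future paid search inputs by 1.5 and compare the two forecasts.

```python
# Illustrative what-if scenario (not the SAS implementation): inflate
# the paid-search referral inputs by 50% and measure the incremental
# lift in forecasted total visits. Coefficients are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
days = 14
paid_search_baseline = rng.integers(200, 300, days).astype(float)
organic = rng.integers(500, 600, days).astype(float)

# Hypothetical fitted coefficients: each paid click adds ~0.8 visits
beta_paid, beta_organic, intercept = 0.8, 0.6, 150.0

def forecast_visits(paid, org):
    return intercept + beta_paid * paid + beta_organic * org

baseline = forecast_visits(paid_search_baseline, organic)
scenario = forecast_visits(paid_search_baseline * 1.5, organic)  # +50% paid

lift = scenario.sum() - baseline.sum()
print(f"incremental visits over {days} days: {lift:.0f}")
```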

Figure 2. Visualization shows the baseline and inflated forecasted impact of a 50% increase in paid search with respect to overall website visits.

Of course, as soon as Suneel was able to produce a result for a 50% increase in paid search advertising, I wanted to know what the results would look like if we varied the simulations with increases of 15%, 25%, or even 35%. Suneel agreed that running multiple scenarios was one way to arrive at a decision recommendation on how much money should be allocated to a specific digital media channel. However, he suggested an even more powerful approach to arrive at an optimized solution to this digital media channel allocation exercise.

A new feature released in SAS Visual Analytics 7.1 in the fall of 2014 is goal-seeking. Suneel explained that running simulations varying the potential impact of one digital media channel has tremendous value. However, at the end of the day we need to address an overall business objective of increasing website visitation. Rather than manually adjusting the levers of each digital media channel, let’s get straight to the heart of the problem and ask the software to optimize the values of each digital media channel to reach our business objective. This is what SAS Visual Analytics is built to do – run complex math very quickly on large volumes of digital data and arrive at data-driven insight. He reset the scenario in Visual Analytics and applied a new adjustment of 100% to overall site visits – in essence, we want to double website visitation in the two-week forecasted time period. Now tell me – how are we going to get there?
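Conceptually, goal-seeking inverts the forecast model: instead of asking “what traffic do these channel values produce?”, it solves for the channel values that produce a target traffic level. A minimal sketch, with an assumed linear model and invented coefficients:

```python
# Hedged sketch of the goal-seeking idea (the actual Visual Analytics
# solver is not public): invert a simple linear response model to find
# the channel growth needed to double total visits. All numbers are
# hypothetical.
intercept, b_org, b_paid = 150.0, 0.6, 0.8   # assumed fitted model
organic, paid = 550.0, 250.0                 # current daily referrals

current = intercept + b_org * organic + b_paid * paid
target = 2 * current                         # business goal: +100% visits

# Scale both significant channels by a common factor k and solve:
#   intercept + k * (b_org*organic + b_paid*paid) = target
channel_effect = b_org * organic + b_paid * paid
k = (target - intercept) / channel_effect
print(f"each channel must grow by {100 * (k - 1):.0f}%")
```

Note that because the intercept (baseline traffic) does not scale with media spend, the channels must grow by more than 100% to double total visits.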

Figure 3. Visualization shows the recommended values within Organic and Paid Search to arrive at an increase of 100% in overall website visits in the forecasted time period.

Once we had the results, we reviewed them and identified any areas of concern.

Step Three: Assess the optimized scenario and apply business constraints

One result that immediately jumped out at us was the level of spending required on a specific day to obtain the volume of paid clicks that would drive a 100% increase in individual visitors. Suneel knew that marketing leaders would balk at that level of investment, so he decided to assign lower and upper constraints to the paid search allocation to control the level of daily spending on pay-per-click ads. Suneel used a lower constraint of 0 and an upper constraint of 250. Once applied, he re-ran the forecast with the constraints in place. The results can be seen in Figure 4.
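Applying the constraints can be sketched as clipping each day's recommended paid search volume into the allowed band and measuring what the cap costs. The recommended values and response coefficient below are invented for illustration:

```python
# Illustrative constraint application (not the SAS solver): clip each
# day's recommended paid-search volume to the [0, 250] band described
# above, then measure the traffic given up to the cap. The recommended
# values and coefficient are hypothetical.
import numpy as np

b_paid = 0.8                           # assumed visits per paid click
recommended = np.array([310., 180., 420., 90., 275., 260., 150.])
lower, upper = 0.0, 250.0              # daily pay-per-click constraints

constrained = np.clip(recommended, lower, upper)
shortfall = b_paid * (recommended - constrained).sum()
print(constrained, f"visits lost to the cap: {shortfall:.0f}")
```

A real optimizer would re-solve with the bounds in place, often reallocating the capped budget to other days or channels rather than simply forfeiting it.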

Figure 4. Visual display of optimized predicted metrics with business rule constraints by day for business objective and media channels.

In summary, Suneel’s analysis resulted in the identification of more important (and less important) digital media channels with respect to forecasting website visitation. After identifying that Organic and Paid Search were the two most important media channels in predicting future traffic, he highlighted approaches for running scenarios (or simulations) that vary the impact of one digital media channel. Ultimately, using goal-seeking (i.e., optimization) and business rule constraints, Suneel arrived at the optimized values required across all digital media channels to reach an overall business objective of doubling website visits in a specific time period. One could suggest that this is an example of prescriptive analytics!

How do you identify what impact the increase or decrease in digital media channels will have on your website traffic?

Web analytics for hospitality: It’s time for better data and better integration

There’s a lot of talk about modernizing hospitality marketing – and most of it depends on finally cracking the nut of integrating online and offline guest data and bringing predictive analytics into play. But there is a fundamental problem hampering our ability to do this, and it starts with how hospitality manages web analytics. This problem is not new to Suneel Grover, senior solutions architect for digital intelligence at SAS, who started his marketing analytics career building predictive models for online marketing campaigns.

“I really wanted to use web interaction data in my models, to more accurately predict the outcomes of these online marketing campaigns” he told me. “When I went to the web analysts to get access to the underlying data, it fast became apparent that they didn’t have it.” Most web analytics data is being hosted in the cloud, then aggregated to produce the kinds of reports that tell you how many people came to your website last week, how many came from one search engine versus another, what version of the operating system or browser they were using, etc. These reports can be very helpful, but do not give you the type of data you need for a deeper analysis.

Where this impacts hospitality firms the most is when it comes to personalization and targeting. Without a more granular level of data, your segmentation will remain at a very macro level. You also need the underlying data for building predictive models. “And if you cannot do predictive analysis on digital data, then your personalization efforts will plateau very quickly,” Suneel explained.

“Basically, web analytics is still functioning at the descriptive level,” Suneel said, “and few web analysts have ever built predictive models, or are familiar with predictive techniques.” That makes it rather difficult for there to be common ground between the marketing analyst and his/her web analytics counterpart. And these two need to be able to work together to meet the demands of the industry today. “To achieve effective personalization and targeting, we need to be able to work with both online and offline data in an integrated format,” explained Suneel. “This is extremely important – if hospitality companies are going to be relevant and engaging as they interact with their consumers across online devices and offline interactions - these two functions need to come together.”

Until recently, many organizations could not handle detailed digital data because it amassed at such a quick rate. “Digital data has always been a classic big data problem,” Suneel told me. “Aggregating digital data for reports is much easier to manage,” he said, “but this has very limited analytic value.” Now that we are able to work with very large digital data sources, the possibilities really open up. “You can quickly start to move from descriptive analytics – describing things that happened in the past, to diagnostic analytics – statistical validation that analyzes why something happened,” he explained. “Predictive analytics allows you to be more proactive, because you are analyzing what will happen next. But most important is prescriptive analytics, which provides data-driven decision recommendations prioritized for individuals (not mass groups) in outbound and inbound consumer interactions.”

Predictive and prescriptive analytics are needed today by hospitality marketing organizations to answer the questions they have, such as: How can we double website visitation? Should we invest more advertising dollars in Paid Search, Social, Email, or Display? How much more should we invest in each of these channels? Is there a way to arrive at an optimized allocation to support our digital media mix business goals? “With detailed digital data we can answer these questions,” said Suneel. “Using optimization to identify how much money I should spend in each media channel in a given time period to get the highest return on an overall business objective is the queen bee of analytics, and we can do this by re-thinking how we collect, store, and apply advanced analytics to digital data.”

How is your organization thinking about digital data? Are you stuck at the aggregated level? Or do you already have a plan for collecting, storing and using detailed digital data?

Be thankful for great hospitality and customer service

I generally don’t use this blog to air my personal experiences, but recent events have reminded me of a few things that I think would benefit our Analytic Hospitality Executives and their organizations to also be reminded of.

This past week, I took my fifth trip to Asia this year. With such a lengthy flight and so many nights in a row in so many different hotels (including a weekend this time), I was immersed in the hospitality and travel consumer experience for a while. I will keep names out to protect the innocent and the very, very guilty, but being on the receiving end of so much “service”, and a really interesting conversation with a New Zealand technology consultant living in Macau, reminded me of a few things.

In this exciting, fast moving world of technology and analytics, it’s easy to get caught up in the next shiny object, the next innovation, the next “killer app”. However, it is crucial in the midst of what’s new, not to forget about the core service proposition.   My New Zealand colleague and I were trading personal pet peeves about hospitality and travel between meetings in Macau. She used to work in call centers, and pointed out to me that nearly every “hold” message starts with “Your call is very important to us”. Well, if it were that important, wouldn’t you answer it instead of putting me on hold?

This conversation got me thinking about how important it is to keep the guest at the center of the hospitality experience, with everything that happens designed around the experience that the brand promise sets for that guest.   I don’t see this attitude as consistently in the hospitality and travel industries today as I would hope.

For me, the hands-down most frustrating experience of my long-haul trips this year has been the baggage claim experience inside customs and immigration in the US. On my last trip back to the States in early September, I waited 50 minutes at the LAX carousel (not including the 20 minutes it took them to bus us to the immigration area) just to pick up my bag, only to drop it off again two minutes later after going through customs – and it had a priority tag on it (that is, it took them nearly 70 minutes to get my bag onto the carousel)! I nearly missed my connection with an hour-and-fifteen-minute layover – and I had Global Entry, top status with my airline and TSA PreCheck! In Detroit a couple of months ago, one of my bags arrived on the priority carousel and the other on the regular carousel across the baggage claim area (both were priority tagged). On a previous trip to Detroit, the bags came out on a completely different carousel than the one the flight was listed on (again, causing me to nearly miss my connection).

When I arrived in Hong Kong this trip, it was 40 minutes from when I left my seat on the plane to when I got into the cab. This included a long walk to immigration, a lengthy line at immigration and a lengthy cab line. My bag was waiting for me on the carousel when I left the immigration line (I’m guessing 25 minutes after landing). In Singapore this year, my record was 17 minutes – including picking up a checked bag, which again was waiting on the carousel when I got out of immigration!

Hotels, in general, do much better, but are not perfect by any means. I have had great service at some hotels, and touchy service at others. I have endured overly chatty front desk agents, long check-in lines and not-ready rooms when I just want to get to my room after a long trip. I’ve also had very thoughtful housekeepers who noticed that I drank every bottle of water in the room and left me extra the next night (and housekeepers who failed to replace the water). I’ve been turned away with no alternative at hotel restaurants when my delayed flight meant arriving after the kitchen closed, and been “protected” from overly friendly bar patrons by thoughtful bartenders. I’m getting a little tired of crawling under desks to find an outlet to charge my phone, or waking up every five minutes because heavy doors are banging shut up and down my hall. I’ve had to open my bag on the floor because there was no surface large enough to support it in the room, and no luggage rack – and I’ve had to endure long waits for staff to bring an iron when there was none provided and no time to send out for service before my meeting.

This is not a therapy session (although I do feel better now that I’ve gotten it out), but rather a reminder that data and analytics are not just for the next big innovation. It’s great that we can monitor Twitter to intervene when a customer identifies a service problem – but what about preventing that problem in the first place? Data and analytics should be used first and foremost to monitor and improve core service delivery processes to ensure excellent customer experiences.

Forecasting arrivals and departures can ensure lines are kept to a minimum at check-in, and that housekeeping staff is adequate to turn over rooms in time for early arrivals. Simulation and optimization algorithms can model baggage handling processes to streamline delivery. Thoughtfully crafted and carefully analyzed surveys, paired with content analysis of online reviews, will identify key traveler needs to inspire product and service design, as well as opportunity areas for staff training and development. Statistical analysis of demand patterns will help to ensure that operating hours for ancillary outlets align with guests’ travel patterns. Forecasting and optimization algorithms ensure that call hold times are kept to a minimum. Within all of this, your basic instincts, experience and training about what service means and how to deliver it, in the context of your brand promise and your organizational capabilities, will put the right wrapper around the analytic results.
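To make one of these concrete: the post names no specific method, but a classic way to turn a call-volume forecast into minimal hold times is the Erlang C queueing formula, which finds the fewest agents whose expected wait stays under a service target. A sketch:

```python
# Erlang C staffing sketch: given forecasted call volume and average
# handle time, find the smallest agent count that keeps the expected
# hold time under a target. This is a standard M/M/c queueing result,
# offered here as one possible approach (the post names no method).
import math

def erlang_c_wait(calls_per_hour, handle_minutes, agents):
    """Expected wait in minutes for an M/M/c queue (Erlang C formula)."""
    load = calls_per_hour * handle_minutes / 60.0   # offered load (Erlangs)
    if agents <= load:
        return float("inf")                         # queue never clears
    top = (load ** agents / math.factorial(agents)) * agents / (agents - load)
    denom = sum(load ** k / math.factorial(k) for k in range(agents)) + top
    p_wait = top / denom                            # probability a caller waits
    return p_wait * handle_minutes / (agents - load)

def agents_needed(calls_per_hour, handle_minutes, max_wait_minutes):
    """Smallest agent count whose expected hold time meets the target."""
    agents = 1
    while erlang_c_wait(calls_per_hour, handle_minutes, agents) > max_wait_minutes:
        agents += 1
    return agents

# e.g. 120 calls/hour, 5-minute average handle time, 30-second wait target
print(agents_needed(120, 5, 0.5))
```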

This kind of discipline helps with innovation as well. I attended a really interesting talk by Clayton Christensen at a SAS event this fall. Professor Christensen teaches at Harvard Business School and is famous for his work on disruptive technologies. He suggested that companies are spending too much time trying to “know” their customer – understanding everything about them. Rather, they should seek to understand the basic need that is driving that consumer to buy their product or service – or as he put it, “the job that your customer is hiring your product or service to do”. I would “hire” a very different room for a romantic getaway than I would for an overnight business trip. Think about why your guests would “hire” your room – is it because you have good lighting and plenty of outlets in the work area? Is it because you are close to a fun restaurant and bar area? What would you really be able to do with a set of demographic information about a segment (35-40, 45% female, traveling from the east coast of the US), without understanding what that traveler NEEDS from your service offering – why they would hire you to provide it?

So, consider this a call to action for all of our Analytic Hospitality Executives. Are you confident that the service delivery of your core product is measuring up? Can you identify areas of improvement? Can you coherently describe why your guests would hire your hotel for their next stay, and how you will deliver against that need?

 

Revenue Management and Pricing Analytics: A report from the INFORMS Annual Meeting

Last week I had the opportunity to attend the INFORMS Annual Meeting in San Francisco.  For those of you not familiar with this organization or conference, the Institute for Operations Research and the Management Sciences (INFORMS) is the largest society in the world for professionals in the field of operations research (O.R.), management science, and analytics.  Several thousand papers are presented at a typical INFORMS Annual Meeting, which is held over three to four days, and includes many separate tracks covering distinct areas of focus.  The plurality of presentations at an INFORMS Annual Meeting seem to come from academic sources, but there are also papers presented by vendors and industry.  This mix of paper sources makes the conference an excellent place to see where research is being focused (by academics, industry, and vendors), while at the same time gauging where the current “state of the art” is in a given area.  Unfortunately, because the conference is so large, it is virtually impossible to see more than a very small slice of it, which makes getting general impressions across disciplines difficult.  In my entry today I’m going to focus very specifically on Revenue Management and Pricing, as this was my focus at the event.

This year’s conference included 83 separate tracks, ranging from Aviation Applications to Service Science.  It seemed that every available meeting room was taken up by the conference, and most attendees agreed that this year’s event was larger (in terms of numbers of sessions) and better attended than others in recent years.  There were two (sometimes three) tracks dedicated to Revenue Management and Pricing, but several very relevant presentations on revenue management were held in other tracks, so I found myself shifting tracks quite a bit to catch those topics that interested me most.

My own presentation was held in the Practice track, and all three of the presenters in this section focused on revenue management.  I presented “Revenue Management in the Big Data Era” where I focused on the industry and technology changes that are driving new challenges to revenue management, and how Big Data appears to offer the opportunity to face these challenges more successfully in the future – when that data is paired with new analytic approaches.  In my presentation I referenced several studies that are showing the potential value of big data to revenue management, including Kelly McGuire and Breffni Noone’s studies on Hotel Pricing in a Social World.

The two other presenters in my practice section were Warren Lieberman from Veritec Solutions (Warren also chaired our session, which was titled “Succeeding with Revenue Management”), and Cory Canamo from Disney Cruise Line.  Warren’s presentation “Models and Methods” covered a number of topics, including comparing the benefits of centralized vs. decentralized revenue management organizations, and the availability of data and the importance of using a range of data to drive pricing decisions – which dovetailed nicely with my own piece.  Warren then described the benefits that Veritec’s customers are seeing from a multiple signal approach.  Cory’s presentation provided an overview of Disney Cruise Line, and some of the unique challenges that it faces as a niche, family-oriented cruise line.  Cory also discussed some of the industry innovations that Disney Cruise Line has introduced (such as the Magical Porthole), and how customer reactions to these innovations have challenged the revenue management function there.

This year’s INFORMS Annual Meeting saw a large number of papers from SAS, with 27 papers involving SAS authors, presenters, or topics.  A number of these presentations related to revenue management and pricing, including Matt Maxwell’s two presentations “Customer Choice Model Optimization with Overlapping Consideration Sets” and “Optimization Challenges with a Customer Choice Model”.  Matt’s presentations highlighted:

  • Limitations in traditional revenue management forecasting approaches - in particular the assumption of independence between demand streams
  • How customer choice model approaches overcome these limitations, and
  • Challenges that must be overcome to optimize revenues using customer choice models
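A minimal multinomial logit (MNL) model illustrates the second bullet. This is a generic textbook sketch, not Matt's actual model: when one rate class closes, demand partially shifts to the remaining options rather than disappearing, something independent demand streams cannot capture.

```python
# Minimal MNL choice sketch (generic textbook technique, hypothetical
# utilities): closing one rate class shifts some of its demand to the
# remaining options, instead of treating each demand stream as
# independent.
import math

def mnl_probabilities(utilities):
    """Choice probabilities, including a no-purchase option with utility 0."""
    weights = [math.exp(u) for u in utilities] + [1.0]   # 1.0 = exp(0)
    total = sum(weights)
    return [w / total for w in weights]

utilities = {"advance_rate": 1.2, "flexible_rate": 0.4}
open_probs = mnl_probabilities(list(utilities.values()))
closed_probs = mnl_probabilities([utilities["flexible_rate"]])  # advance closed

print("flexible share, all open  :", round(open_probs[1], 3))
print("flexible share, adv closed:", round(closed_probs[0], 3))
```

The flexible rate's share rises when the advance rate closes; an independent-demand forecast would have left it unchanged.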

Jason Chen’s (also from SAS) presentation “A Simulation Study of BAR by Day Heuristics” discussed the complexity in optimizing BAR (best available rate) by Day (where room prices are the same for a given date of stay for all occupants, regardless of length of stay).  As Jason explained, BAR by day is attractive due to its simplicity, but it presents challenges when trying to optimize revenues while accounting for length of stay effects.  Jason presented the outcomes of simulation studies used to gauge several different approaches to optimizing BAR by day pricing.
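A toy example (rates invented) makes the BAR-by-day rule concrete: each stay date has a single posted rate, so a multi-night guest pays the sum of the nightly rates, and changing one night's BAR moves the price of every itinerary spanning that night.

```python
# Toy BAR-by-day pricing (invented rates, not Jason Chen's simulation):
# one posted rate per stay date, identical for all guests regardless of
# length of stay, so a stay's price is just the sum of its nights.
bar_by_day = {"Fri": 180, "Sat": 220, "Sun": 150}

def stay_price(nights):
    return sum(bar_by_day[night] for night in nights)

print(stay_price(["Fri", "Sat"]))         # → 400 (two-night weekend stay)
print(stay_price(["Fri", "Sat", "Sun"]))  # → 550 (raising Sat raises this too)
```

This coupling across lengths of stay is exactly what makes optimizing BAR by day harder than it looks.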

One last presentation that I thought would be interesting to regular readers here at the Analytic Hospitality Executive: “Upgrades and Upsells: Hertz vs. Hilton Models” by Guillermo Gallego from Columbia University.  Upgrades and upsells is an area that has received quite a bit of attention, not only in hotels, but also in airlines (due to the increased number of seat inventory types caused by the addition of extended-legroom coach seating).  Dr. Gallego’s presentation discussed the origin of upgrading / upselling (mismatching demand and inventory types) and compared several different business strategies to maximize revenue.  Dr. Gallego also discussed the issue of “strategic behavior” (i.e. customers choosing not to pay for paid upsells when free upgrades are frequent), and the impact on optimization and outcomes.  One of the things that struck me about this particular presentation was the value that was gained from improving inventory utilization (occupancy) when using just a simple free upgrading policy – value that I think is being lost or forgotten as hotels have begun to follow paid upgrade policies, because the value gained from the payment is simply easier to see.

Overall I had two general impressions regarding revenue management and pricing from this conference:

  1. Choice modeling continues to be a very hot topic among researchers, with at least five separate sessions (three to four presentations each) in the revenue management tracks dedicated to choice modeling in revenue management – and a handful of other sessions outside the official RM tracks covering the topic as well.
  2. I was also intrigued and encouraged by the large number of presentations at the conference regarding the application of game theory in revenue management.  One of the general criticisms of traditional revenue management approaches has been their focus on short-term revenue / profitability, and their inability to capture the complexities of strategic decisions that must be made in a competitive environment, where competitors respond to your actions.  Based on what I saw at INFORMS, researchers are clearly beginning to take a hard look at these very issues.  You might want to study up on “Nash equilibria” before attending your next revenue management conference.
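For readers who want to see what a Nash equilibrium looks like concretely, here is a minimal sketch (with entirely made-up payoffs) of a two-hotel pricing game: each hotel picks a “high” or “low” rate, and a price pair is an equilibrium when neither hotel can improve its payoff by deviating alone. In this classic setup, mutual price-cutting is the only equilibrium even though both hotels would earn more by holding rates high – exactly the kind of competitive dynamic these presentations were wrestling with:

```python
# Payoffs for a toy two-hotel pricing game (illustrative numbers only).
# PAYOFFS[(a, b)] = (hotel A's revenue, hotel B's revenue).
ACTIONS = ["high", "low"]
PAYOFFS = {
    ("high", "high"): (100, 100),
    ("high", "low"):  (40, 120),
    ("low",  "high"): (120, 40),
    ("low",  "low"):  (60, 60),
}

def pure_nash_equilibria():
    """A profile is a Nash equilibrium if neither hotel gains by deviating alone."""
    equilibria = []
    for a in ACTIONS:
        for b in ACTIONS:
            ua, ub = PAYOFFS[(a, b)]
            a_cannot_improve = all(PAYOFFS[(a2, b)][0] <= ua for a2 in ACTIONS)
            b_cannot_improve = all(PAYOFFS[(a, b2)][1] <= ub for b2 in ACTIONS)
            if a_cannot_improve and b_cannot_improve:
                equilibria.append((a, b))
    return equilibria
```

With these payoffs, `pure_nash_equilibria()` returns only the (“low”, “low”) pair – a prisoner’s-dilemma outcome that short-term, single-player revenue optimization simply cannot see.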

The most anticipated session that didn’t actually occur was the planned keynote by Google project leader Anthony Levandowski. Anthony was to present on Google’s work on the driverless car, but unfortunately this session was cancelled at the last minute.  An audible “No!” was heard throughout the conference center when the email went out to all attendees, less than an hour prior to the event, indicating that the session was cancelled – leaving all of us with the question “If they can invent a driverless car, how come they can’t run a speaker-less keynote session?”  (Full credit for this quip goes to my colleague Ed Hughes).


Big data and big analytics are a big opportunity for hotels – Part 2

Big data is of no use unless you can turn it into information and insight.  For that you need big analytics.  Every piece of the analytics cycle has been impacted by big data, from reporting, which needs to quickly render reports from billions of rows of data, through advanced analytics like forecasting and optimization, which require complex math executed over multiple passes through the data set.

Without changes to the technology infrastructure, analytic processes on big data sets will take longer and longer to execute.  You can no longer afford to push the button and wait hours or days for an answer.  Today’s advanced analytics need to be fast and they need to be accessible.  This means more changes to the technology infrastructure to support these new processes.

Analytics companies like SAS have been developing new methods for executing analytics more quickly.  Below is a high level description of some of these new methodologies, including why they provide an advantage.  Once again, the intention is to provide enough detail to start conversations with IT counterparts (or understand what they are talking about), certainly not to become an expert.  There is a ton of information out there if you want more detail!

  1. Grid computing and parallel processing – Calculations are split across multiple CPUs to solve many smaller problems in parallel, as opposed to one big problem in sequence.  Think about the difference between adding a series of 8 numbers in a row versus splitting the problem into four sets of two and handing them out to four of your friends.  To accomplish this, multiple CPUs are tied together, so the algorithms can access the resources of the entire bank of CPUs.
  2. In-database processing – Most analytic programs lift data sets out of the database, execute the “math” and then dump the data sets back into the database.  The larger the data sets, the more time consuming it is to move them around.  In-database analytics bring the math to the data: the analytics run in the database with the data, reducing the amount of time-consuming data movement.
  3. In-memory processing – This capability is a bit harder to understand for non-technical people, but it provides a crucial advantage for both reporting and analytics.  Large sets of data are typically stored on the hard drive of a computer, which is the physical disk inside the computer (or server).  It takes time to read the data off of the physical disk space, and every pass through the data adds additional time.  It is much faster to conduct analysis and build reports from the computer’s memory. Memory is becoming cheaper today, so it is now possible to add enough memory to hold “big data” sets for significantly faster reporting and analytics.
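The split-and-combine idea behind point 1 can be sketched in a few lines of Python – a toy illustration of the principle, not SAS’s grid architecture.  The list of numbers is chunked, each worker sums its chunk in parallel, and the partial answers are combined:

```python
from concurrent.futures import ProcessPoolExecutor

def chunked(seq, n_chunks):
    """Split seq into n_chunks roughly equal slices."""
    k = len(seq) // n_chunks
    return [seq[i * k:(i + 1) * k] for i in range(n_chunks - 1)] + \
           [seq[(n_chunks - 1) * k:]]

def parallel_sum(numbers, workers=4):
    """Each worker sums one chunk simultaneously; partial sums are then combined."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, chunked(numbers, workers))  # the "four friends"
    return sum(partials)  # combine the partial answers

if __name__ == "__main__":
    # Eight numbers, four workers -- two additions each, done at the same time.
    print(parallel_sum([1, 2, 3, 4, 5, 6, 7, 8]))  # prints 36
```

The same map-then-combine shape underlies real grid computing; the difference is that the “workers” are whole servers rather than local processes.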

To give you an idea of the scale of the impact: applying these methodologies, we have been able to render a summary report (with drill-down capability) from a billion rows of data in seconds.  Large-scale optimizations, like risk calculations for major banks or price optimization for thousands of retail products across hundreds of stores, have gone from hours or days to minutes or even seconds.  As you can tell, the advantages are tremendous.  Organizations can now run analytics on their entire data set, rather than a sample.  It is possible to run more analyses more frequently, testing scenarios and refining results.

Here are some examples of how innovative companies are applying big analytics to get value from their big data:

  • Airline companies are incorporating the voice of the customer into their analyses, by mining all of the internal and external unstructured text data collected across channels like social media, forums, guest surveys, call center logs, and maintenance records for passenger sentiment and common topics.  With big text analytics, these organizations are able to analyze all of their text data, as opposed to small samples, to better understand the passenger experience and improve their service and product offerings.
  • A major retailer is keeping labor costs down while maintaining service levels by using customer traffic patterns detected by security video to predict in advance when lines will form at the register.  This way, staff can be deployed to various stocking tasks around the store when there are no lines, yet given enough notice to open a register as demand increases – before lines start to form.
  • A major hotel company has deployed a “what if” analysis in their revenue management system which allows users to immediately see the impact of price changes or forecast overrides on their demand, by re-optimizing around the user’s changes.  Revenue managers no longer have to make a change and wait until the overnight optimization runs.

Unlocking the insights in big data with big analytics will require making some investments in modernizing technology environments.  The rewards for the investment are great.  Organizations that are able to use all that big data to improve the guest experience while maximizing revenue and profits will be the ones that get ahead and stay ahead in this highly competitive environment!


Big data and big analytics are a big opportunity for hotels

By infusing analytics through every phase of the guest journey, hotel managers can help strike the complicated balance between the guest experience and revenue and profit responsibilities – delivering memorable and personalized guest experiences while maximizing revenue and profits.  To accomplish this, hotels need to be able to collect, store and analyze the volumes of data generated by their guest interactions, their operations and the broader market.  As the volume and complexity of data increase, wrapping your head around what is available and how it can be useful becomes challenging.

“Big Data” is a challenge for organizations not just because the volume of data has increased, but also because the variety has increased – it’s gone beyond traditional transactional data into unstructured formats like text, video, email, call logs, images and click-stream data.  It is also coming at us fast: data like tweets or location is stale nearly the minute it is created.

The reason why big data is a “big deal” is because the volume and complexity of the data puts pressure on traditional technology infrastructures, which are set up to handle primarily structured data (and not that much of it).  In these environments, it is difficult, or even impossible, for organizations to access, store and analyze “big data” for accurate and timely decision making.  This problem has driven innovations in data storage and processing, such that it is now possible to access more and different kinds of data.

To a certain extent, big data is forcing business leaders (like our analytic hospitality executives) to get more involved in technology decisions than ever before.  To help with this, in this post, I’ll talk about how technology has evolved to handle big data and give some examples of how companies are innovating with their big data.   Next week, I’ll do the same for big analytics.  This is not intended to make everyone into technology experts, but rather, to provide some basic information that can arm hotel managers to start having conversations with their IT counterparts.

This influx of large amounts of complex data has necessitated changes in the way that data is captured and stored.  To handle the volumes of unstructured data, databases need to be faster, cheaper, scalable and, most importantly, more flexible.  This is why some have been talking about Hadoop as an emerging platform for storing and accessing big data.  Hadoop is not a traditional database, but rather a distributed framework for storing and processing large volumes of unstructured data.  Hadoop works because it is cheap, scalable, flexible and fast.

  1. Cheap & Scalable – Hadoop is built on commodity hardware – which is exactly what it sounds like: really cheap, “generic” hardware.  It is designed as a “cluster”, tying together groups of inexpensive servers.  This means it’s relatively inexpensive to get started, and easy to add more storage space as your data, inevitably, expands.  (It also has built-in redundancy – data is stored in multiple places – so if any of the servers happens to go down, you don’t lose all the data.)
  2. Flexible – The Hadoop data storage platform does not require a pre-defined data structure, or schema.  I use the analogy of the silverware drawer in your kitchen.  The insert that sorts place settings is like a traditional relational database.  You had to purchase it ahead of time, planning in advance for the size of the drawer and the kinds of silverware you wanted to put in it.  It makes it easy for you to grab the four sets of forks and knives you need for a place setting.  However, the pre-defined schema makes it difficult to add additional pieces of silverware should you decide to buy iced tea spoons or butter knives, or if you are looking for a place to store serving utensils.  Hadoop, on the other hand, is more like an empty drawer with no insert – it has no pre-defined schema.  You can put any silverware you want in there without planning ahead of time.  You can see the advantage of this approach with unstructured data.  There is no need to “translate” it into a pre-defined schema; you can just “throw it in there” and figure out the relationships later.
  3. Fast – Hadoop is fast in two ways.  First, it uses massively parallel processing to comb through the data to extract information.  Data is stored in a series of smaller containers, and there are many helpers available to reach in and pull out what you are looking for (extending the drawer metaphor: picture four drawers of silverware with a family member retrieving one place setting from each at the same time).  The work is split up, as opposed to being done in sequence.  The second way that Hadoop is fast is that, because the database is not constrained by a pre-defined schema, the process of loading the data is a lot faster.  (Picture the time it takes to sort the silverware from the dishwasher rack into the insert, as opposed to dumping the rack contents straight into the drawer.)
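To make the “many helpers” idea concrete, here is a single-process Python sketch of the MapReduce pattern that Hadoop popularized, counting words in guest reviews (the sample data is invented).  In a real cluster, the map step would run on each node against its local block of data, and the shuffle would move the pairs between nodes; here all three phases run locally just to show the shape of the computation:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) pairs -- on a cluster, each node maps its local data."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group values by key, as Hadoop does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: combine each key's values into a final count."""
    return {word: sum(counts) for word, counts in grouped.items()}

reviews = ["great room great view", "room was noisy"]
counts = reduce_phase(shuffle(map_phase(reviews)))
# counts["great"] == 2, counts["room"] == 2
```

The drawer metaphor maps directly: each mapper works on its own “drawer” of data simultaneously, and no pre-defined schema is needed before the text is loaded.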

Many companies have put a lot of effort into organizing structured data over the years, and there are some data sets that make sense to store according to traditional methods (like the silverware you use every day).  Because of this, most companies see Hadoop as an addition to their existing technology infrastructure, rather than a replacement for their relational, structured database.

Next week I’ll talk about innovations in the execution of analytics that speed up time to results, allowing organizations to take full advantage of all of that big data.


What can hotels learn from casinos? Focus on non-rooms pricing.

This week I’m going to “fill in” for Natalie to complete our series on “What can hotels learn from casinos?”  (Look for parts 1 and 2 here and here, respectively.)  Today I’m going to address pricing.  We all know that great entertainment comes at a price, and we also know that casinos need to manage demand and supply just like any other business.  But when we are managing revenue in a casino, there are a lot of interrelated factors at play.  The denomination or price of a slot machine can impact demand for that machine.  The price and availability of hotel rooms need to be balanced with the ability to offer comp rooms to the casino’s most valuable players, as well as with the overall goal of driving revenue throughout the estate.  After all, the modern casino resort produces revenue across many fronts (rooms, entertainment, gaming, food and beverage, and more), so any decision that impacts room rates and availability has to consider the broader effect on estate-wide revenues.

Predictive analytics, like forecasting and optimization, have been used in revenue management applications in travel, hospitality AND gaming companies for some time.  But the traditional revenue management approach is revenue-focused, not customer-focused.  Furthermore, typical revenue management applications in hotels (and most hospitality and travel segments) have a notoriously short-term focus: optimization focuses on revenues for a given date or date range, and the longer-term impact of those pricing decisions is not considered.  As businesses built for entertainment, casinos have focused their use of predictive analytics in revenue management not only on their patrons, but on a variety of other factors that impact pricing.

Let’s look at a few examples.  First, let’s consider the pricing of hotel rooms.  In Natalie’s first entry in this series, she talked about the use of analytics to derive customer lifetime value.  This is an important element for a casino when it decides to make a room available for a patron.  Casino patrons tend to play where they stay, and high-roller customers can make several trips to a casino each year.  Traditional revenue management would optimize the best mix of business and prices for the hotel for any given day and length of stay.  When you consider that the more you play at a casino, the more valuable you are and the less you may pay for a room (because discounting for “high rollers” is a common practice at most casino resorts), a typical rooms-revenue-focused optimization would likely recommend closing out the rates for the casino’s most valuable, most frequent patrons – and cost the casino valuable gaming and entertainment revenue.  Furthermore, if a casino doesn’t have room for its most valuable patrons to stay, those patrons will likely stay and play at another hotel and casino – costing the property valuable repeat business.

Today, the modern casino is factoring customer lifetime value into its revenue management and price optimization processes.  By segmenting their forecasts into patron value buckets that consider both estate-wide spending and frequency of visitation, rather than by business type, casinos can ensure that the optimization process leaves rooms available for their most valuable patrons.

Our second example is the pricing of tickets for shows.  As noted earlier, most casino resorts have morphed into multifaceted entertainment destinations.  Nowhere is this truer than in Las Vegas, where we continue to see an array of entertainment options whose goal is to attract guests to the vicinity and keep them entertained – the longer you stay, the more you play (and pay).  The expanding options in Las Vegas include the recently constructed High Roller (billed as the world’s tallest observation wheel), and the recent announcement by MGM of a new 20,000-seat arena to be built just off the Strip.  And of course, when not gaming, watching a show, or going to a concert, you can also eat in a world-class restaurant run by a celebrity chef.

Of course, Las Vegas shows are significant revenue generators.  With multiple shows per day, often seven days per week, hundreds of seats available per show, and operations running practically year-round, these shows can generate a lot of revenue.  But if casinos optimize ticket prices without considering the impact on the rest of the estate, they may miss an important input into the decision-making process: specifically, how do they fill the gaming floor with patrons?  As a result, casinos must balance the need to drive ticket revenues with the need to draw patrons onto the property who will use the gaming floor, bars and restaurants.  So, when making ticket pricing decisions, they use show occupancy as a goal in their optimization process and end up with a price that meets the twin goals of occupancy AND revenue while factoring in the constraint of the number of seats available in the theatre.
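A stripped-down sketch of that twin-goal optimization might look like the following – the demand curve, seat count and occupancy weight are all invented for illustration.  Each candidate ticket price is scored on revenue plus a dollar value placed on occupancy, with attendance capped at the theatre’s seat count:

```python
SEATS = 1200  # theatre capacity (illustrative)

def expected_demand(price):
    """Toy linear demand curve for one show."""
    return max(0.0, 2000 - 10 * price)

def score(price, occupancy_weight):
    """Ticket revenue plus a dollar value on filling the house (and the floor)."""
    attendees = min(SEATS, expected_demand(price))  # seat constraint
    occupancy = attendees / SEATS
    return price * attendees + occupancy_weight * occupancy

def best_price(prices, occupancy_weight):
    return max(prices, key=lambda p: score(p, occupancy_weight))

candidates = range(40, 161, 5)
revenue_only = best_price(candidates, occupancy_weight=0)
balanced = best_price(candidates, occupancy_weight=25_000)
# With these invented numbers, the occupancy-weighted objective picks a lower
# price than pure revenue maximization -- trading some ticket revenue for a
# fuller theatre and more patrons on the gaming floor.
```

The interesting design choice is the occupancy weight: it is effectively the estate-wide value of one full house, and estimating it well is exactly where patron-level analytics earn their keep.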

Finally, because casinos have a high transient quotient, they frequently run campaigns to fill rooms during expected low periods – far more frequently than a typical hotel.  Many (though certainly not all) of these campaigns are oriented towards attracting known customers back to the property.  Because such campaigns are so frequent, marketing and revenue management teams in a typical casino environment work hard to:

  • Ensure that campaigns are properly aligned with “need” periods and pricing strategies
  • Estimate the “lift” from each campaign, so that revenue management can properly manage demand and pricing during campaign-affected arrival periods

Does managing all of these disparate revenue streams make casino pricing more difficult?  Certainly!  But I was not surprised when a panel of revenue managers at the G2E conference this year said that they felt managing revenue for a 200-room hotel was in fact far more difficult than managing revenue for a 2,000-room casino property.  Why?  Because the variability of behavior across the smaller number of rooms makes prediction and revenue management so much more difficult.  The large number of rooms at many casinos leads to much more predictable behavior across the estate – making most day-to-day pricing decisions much easier.
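The panel’s intuition has a simple statistical basis, which can be sketched with a stylized model (independent bookings with a made-up probability – not real data): if demand is roughly binomial, its relative spread shrinks like one over the square root of the number of rooms, so a 2,000-room property’s demand is proportionally about three times steadier than a 200-room hotel’s:

```python
import math

def relative_volatility(n_rooms, p=0.7):
    """Std dev of binomial demand as a fraction of expected demand.

    Assumes each of n_rooms likely guests books independently with
    probability p -- a stylized model, purely for illustration.
    """
    mean = n_rooms * p
    std = math.sqrt(n_rooms * p * (1 - p))
    return std / mean

small = relative_volatility(200)    # ~4.6% relative swing around the forecast
large = relative_volatility(2000)   # ~1.5% -- roughly sqrt(10) ~ 3.2x steadier
```

Ten times the rooms buys you a √10 reduction in relative forecast error, which is why day-to-day pricing at the big casino properties feels so much more predictable.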

Casino properties are different from typical hotels – and it isn’t necessary to follow their every practice – but we should consider what can be learned from these practices in managing pricing for hotels, and apply them when and where they make sense.

How much revenue does your property generate from non-rooms revenue sources?  Do you consider the impacts on these sources when determining your rooms pricing decisions?  What about the long-term value of your guests?  Is this also considered in your decisions?


Text Analytics: The voice of the customer at the speed of business

I’ve written about using text analytics to capture, comprehend and act on the voice of the customer on this blog before. Recently, I recorded a short presentation that outlines how text analytics provides insight from all of the customer experience data that you collect – not just review data from social media sites, but also internal data such as call logs and surveys.

Natalie Osborn discusses using Text Analytics to discover insights about customers.

If you are attending the Cornell Hospitality Research Summit, I’ll be presenting on the voice of the customer on Tuesday 14 October, 2014 at 1:15pm in Statler 398. If you are not attending, but would like more information on this topic – please download the SAS white paper – “Unleashing the Power of Text Analytics for Hotels”.


Can big data help revenue management?

Next week I will be presenting at the Cornell Hospitality Research Summit on big data and revenue management. In preparation for the Summit, I have recorded a short presentation that outlines how big data can help augment revenue management.

Alex Dietz talks about how big data can help revenue management.

If you are attending the Cornell Hospitality Research Summit, I’ll be presenting on big data and revenue management on Tuesday 14 October, 2014 at 10:30AM in Statler 398. If you are not attending, but would like more information on this topic – please read my two blogs: Big data and revenue management part one and part two.
