Stress test your business intelligence apps with a Chaos Monkey!


In the latest news from the frontier of Internet technology, the NY Times published an interesting article about the work that high-tech providers are doing to ensure maximum system availability. Companies strive for "five nines," or 99.999% availability, although a more realistic target is "four nines," or 99.99% (which still represents less than one hour of downtime per year).
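To make those targets concrete, here is a quick back-of-the-envelope calculation (a sketch of the arithmetic, not taken from the article) that converts an availability percentage into an annual downtime budget:

```python
# Convert an availability target into an allowed downtime budget per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes_per_year(availability: float) -> float:
    """Return the minutes of downtime per year allowed at a given availability."""
    return (1.0 - availability) * MINUTES_PER_YEAR

for label, availability in [("five nines", 0.99999),
                            ("four nines", 0.9999),
                            ("typical ISP", 0.998)]:
    print(f"{label:>11} ({availability:.3%}): "
          f"{downtime_minutes_per_year(availability):8.1f} minutes/year")
```

Five nines allows roughly 5 minutes of downtime per year, four nines roughly 53 minutes, and 99.8% more than 17 hours.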

Online movie provider Netflix stress tests its software by releasing a "chaos monkey" into the system. The monkey "creates mischief like shutting down Netflix’s own subsystems randomly and challenging the other subsystems to adapt on the fly...each part of its system is designed to fight its way through on its own, tolerating failure from other systems upon which it normally depends."
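The pattern itself is simple. Here is a minimal, hypothetical sketch of the idea (not Netflix's actual Chaos Monkey; the service names and schedule are illustrative assumptions, and something like this should only ever run in a test environment): on a schedule, pick one disruptable service at random and stop it, then watch how the rest of the system copes.

```python
import random
import subprocess
import time

# Hypothetical list of services that are safe to disrupt in a TEST environment.
CANDIDATE_SERVICES = ["etl-worker", "report-scheduler", "cache-node"]

def unleash_monkey(interval_seconds: int = 3600) -> None:
    """Periodically stop one randomly chosen service to test resilience."""
    while True:
        victim = random.choice(CANDIDATE_SERVICES)
        print(f"Chaos monkey is stopping: {victim}")
        # Stops a systemd unit; adapt to however your services are managed.
        subprocess.run(["systemctl", "stop", victim], check=False)
        time.sleep(interval_seconds)

if __name__ == "__main__":
    unleash_monkey()
```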

Of course, for internet-based applications, availability is determined by the weakest link in the chain, which is generally the internet service provider, with an average availability of around 99.8%. So even if your company's web application were 100% available, your customers still might not experience that level of service.
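That is because the availabilities of components in series multiply, so the end-to-end figure can never be better than the weakest link. A quick illustration using the numbers above (a sketch, not from the article):

```python
# End-to-end availability of components in series is the product of their
# individual availabilities, so the chain is bounded by its weakest link.
app_availability = 0.9999   # "four nines" web application
isp_availability = 0.998    # typical internet service provider

end_to_end = app_availability * isp_availability
print(f"End-to-end availability: {end_to_end:.4%}")  # roughly 99.79%
```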

But it raises an interesting question: how available are not only your customer-facing applications, but your internal ones as well? Do you release Chaos Monkeys into your business intelligence and analytic systems on an ongoing basis? Keep in mind that the availability target worth maintaining, whether "five nines" or some other number appropriate for your organization, depends on the service level agreements you have in place with your end users. As part of your BI strategy, every production dashboard and report should have a business priority assigned, and the BI/data warehousing team should be held accountable for monitoring the ongoing performance of the system.

As BI professionals are aware, a smoothly running BI system is made up of a number of complex processes. Database design, extract-transform-load (ETL) procedures, the BI semantic layer, BI and database architecture, the end user's PC, networks, BI personnel availability, end-user usage patterns, and growth in data volumes are among the many moving parts that drive the performance of BI applications. Typically, performance is considered only during the initial BI implementation or when something goes terribly wrong (and business users or business decisions are impacted). Even if your hardware and software are performing well, a successful BI initiative will probably lead to an increase in the number of users, and lots of new users can be tough on a system that doesn't scale.

Why not throw in an occasional Chaos Monkey to test the system? If business intelligence and analytic decision-making bring competitive advantage to your organization, then proactively testing key components of your BI applications and planning for growth are essential.


About Author

Rachel Alt-Simmons

Business Transformation Lead - Customer Intelligence Practice

Rachel Alt-Simmons is a business transformation practitioner whose expertise extends to operationalizing analytic capabilities vertically and horizontally through organizations. As the Business Transformation Lead for customer analytics at SAS Institute, she is responsible for redesign and optimization of operational analytic workflow, business process redesign, training/knowledge transfer, and change management strategies for customers. Prior to SAS, Rachel served as Assistant Vice President, Center of Excellence, Enterprise Business Intelligence & Analytics at Travelers, and as Director, BI & Analytics, Global Wealth Management at The Hartford. Rachel Alt-Simmons is a certified Project Management Professional, certified Agile Practitioner, Six Sigma Black Belt, certified Lean Master, and holds a post as adjunct professor of computer science at Boston University’s Metropolitan College. She received her master’s degree in Computer Information Systems from Boston University.

