Why Ops-ing is in (DevOps, ModelOps, DataOps)


What’s up with Ops? It seems to be popping up everywhere these days. In fact, “Ops” is about as popular a term in the information technology disciplines now as the term “big” was over the last decade (e.g., big data management, big data quality, big data analytics, big data governance).

Think of what is probably the most well-known Ops term – DevOps. As you may know, DevOps combines software development (Dev) and IT operations (Ops) into a set of best practices intended to shorten the software and application development life cycle. DevOps uses agile methodology to focus on continuous delivery through use of on-demand IT resources and automation. And it improves the velocity, quality, predictability and scale of software engineering and application deployment. While Dev may have been the first to Ops-in (so to speak), in recent years it’s becoming an Ops-tion for models and data as well (sorry for opting for continued puns).

I’m a model, but am I ready for the (production) runway?

Speaking of ModelOps: “Only about 50% of models are ever put in production,” Jeff Alford acknowledged, “and those that are take at least three months to be ready for deployment. This time and effort equal a real operational cost and mean a slower time to value, too.”

ModelOps is a holistic approach for rapidly and iteratively moving models through the analytics life cycle. Whereas DevOps focuses on application development, ModelOps focuses on getting models from the lab to validation, testing and deployment as quickly as possible, while ensuring quality results.

“ModelOps,” Alford explained, “is how analytical models are cycled from the data science team to the IT production team in a regular cadence of deployment and updates. It enables you to manage and scale models to meet demand and continuously monitor them to spot and fix early signs of degradation. ModelOps is based on long-standing DevOps principles. It’s a must-have for implementing scalable predictive analytics. But let’s be clear: Model development practices are not the same as software engineering best practices.”

By encompassing culture, processes and technology, ModelOps can enable enterprises to efficiently and continuously develop and deploy models, not just to get more of them into production but also to deliver more advanced analytics solutions, more often.
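The continuous monitoring that ModelOps calls for can be sketched in a few lines. The example below uses the Population Stability Index (PSI), one common way to spot early signs of model degradation by comparing production scores against a training-time baseline. The bucket count, thresholds, and sample data are illustrative assumptions, not part of any particular ModelOps product.

```python
# A minimal drift-monitoring sketch: compare a model's production scores
# against its training baseline using the Population Stability Index (PSI).
# Buckets and thresholds below are common rules of thumb, not fixed standards.

import math

def psi(expected, actual, buckets=10):
    """Population Stability Index between two numeric samples."""
    lo, hi = min(expected), max(expected)
    step = (hi - lo) / buckets
    edges = [lo + i * step for i in range(1, buckets)]

    def fractions(sample):
        counts = [0] * buckets
        for x in sample:
            counts[sum(1 for e in edges if x > e)] += 1
        n = len(sample)
        # Floor at a tiny fraction so the log term stays finite.
        return [max(c / n, 1e-6) for c in counts]

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# Illustrative interpretation (tune per model):
# PSI < 0.1 stable, 0.1-0.25 drifting, > 0.25 consider retraining.
baseline = [i / 100 for i in range(100)]              # training-time scores
shifted = [0.3 + 0.7 * i / 100 for i in range(100)]   # drifted production scores
print(psi(baseline, baseline))  # identical samples: near zero
print(psi(baseline, shifted))   # shifted samples: well above the retrain threshold
```

In a real deployment this check would run on a schedule against live scoring data, with breaches feeding back to the data science team for retraining, which is exactly the regular cadence of deployment and updates Alford describes.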

DataOps (oops, I did it again) to operationalize analytics

Borrowing methods from DevOps and ModelOps, DataOps seeks to operationalize analytics. It does so by integrating data engineering, data integration, data quality, data security and data privacy with operations to improve the cycle time of extracting value from data analytics.

DataOps, David Loshin explains, facilitates “end-to-end management of ingestion, integration and utilization of data from various sources to targets. Because organizations incorporate traditional (i.e., structured) data and an increasing variety of other types of information – and have to support different use cases – data integration can no longer be limited to a sequence of coordinated batch tasks of data extraction, staging, transformation and loading. Instead, organizations should introduce methods and tools to develop, manage and orchestrate data pipelines so they can develop and deploy analytics on a continuous basis.”
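The pipeline orchestration Loshin describes can be made concrete with a small sketch: ingestion, a data quality gate, and transformation composed as repeatable stages rather than a one-off batch script. The stage names, fields, and quality rule here are illustrative assumptions.

```python
# A minimal data-pipeline sketch in the DataOps spirit: small, orchestrated,
# repeatable stages (ingest -> validate -> transform) that can run continuously.
# The schema and quality rule are illustrative, not from any specific tool.

def ingest(raw_rows):
    """Pull rows from a source (here, an in-memory stand-in for a feed)."""
    return list(raw_rows)

def validate(rows):
    """Data quality gate: drop rows missing required fields."""
    return [r for r in rows
            if r.get("id") is not None and r.get("amount") is not None]

def transform(rows):
    """Standardize types and precision for downstream analytics."""
    return [{**r, "amount": round(float(r["amount"]), 2)} for r in rows]

def run_pipeline(raw_rows, stages=(ingest, validate, transform)):
    """Orchestrate the stages in order; each run is repeatable."""
    data = raw_rows
    for stage in stages:
        data = stage(data)
    return data

raw = [
    {"id": 1, "amount": "19.994"},
    {"id": None, "amount": "5"},   # fails the quality gate
    {"id": 2, "amount": "7"},
]
print(run_pipeline(raw))  # two clean rows; the invalid row is filtered out
```

Production DataOps tooling adds scheduling, lineage, and monitoring around this same stage-composition idea, so that pipelines can be developed and deployed continuously rather than rebuilt per batch.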

Deploying practical artificial intelligence solutions is an enormous challenge for many enterprises. This is another area where DataOps is essential, according to Kirk Borne, since it’s “an iterative, fail-fast, learn-fast, agile process that creates proofs of value. The process involves frequent interactions between the end-users who provide requirements, the developers who build, test, and deploy the technology, and the business owners who provide metrics and evaluation.”

While DataOps began as a set of best practices, it has matured to become a new and independent approach to data analytics. DataOps applies to the entire data life cycle – from data preparation to reporting – and it recognizes the interconnected nature of the data analytics team and IT operations.


About Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. An independent consultant, speaker, and freelance writer, he is the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.
