Sometimes, I look back at my first analytics projects and cringe. When I was completing the capstone project for my graduate degree in analytics, my team made an egregious mistake. The project was for a local health system, and my team and I spent hours cleaning and preparing the data we received from them. We trained several models and determined that our random forest was the most effective. Then, as our final deliverable, we handed our training code to the local health system, told them they could re-create our model with it, graduated, and went on our way. Unfortunately, our program had taught us nothing about how to operationalize models. We never wrote a scoring function or thought about how the model would actually be used. Our role stopped once the model was trained.
Since then, I have spoken with engineers across dozens of organizations about their struggles getting models into production. A common theme among their complaints was that data scientists tend to throw the model over the fence for IT to deal with. IT is then left to write the score code, prepare the data for scoring, and implement the model, often with limited information from the data scientist. With this perspective in hand, I think back to what I would do differently in my capstone project to make our model easier to use.
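Looking back, even a small piece of score code would have gone a long way. Below is a minimal sketch of the kind of scoring function we could have handed over with the training code; the model path and feature names are hypothetical placeholders, and it assumes the random forest was serialized with joblib.

```python
# score.py -- a hypothetical scoring function to ship alongside the training code.
# Assumes the trained model was saved with joblib and that input records arrive
# as a pandas DataFrame containing the same feature columns used in training.
import joblib
import pandas as pd

EXPECTED_FEATURES = ["age", "bmi", "num_prior_visits"]  # illustrative names only

def score(records: pd.DataFrame, model_path: str = "model.pkl") -> pd.DataFrame:
    """Return the input records with a predicted probability column appended."""
    model = joblib.load(model_path)

    # Fail loudly if the caller's data does not match what the model expects.
    missing = set(EXPECTED_FEATURES) - set(records.columns)
    if missing:
        raise ValueError(f"Missing required feature columns: {sorted(missing)}")

    scored = records.copy()
    scored["predicted_probability"] = model.predict_proba(records[EXPECTED_FEATURES])[:, 1]
    return scored
```

Even something this small gives the receiving team a documented contract for what the model expects as input and what it returns.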
At the end of the day, MLOps really is a team sport, and we as data scientists can’t throw our work over the fence and leave the rest of the team to figure it out on their own. In this article, I want to highlight the roles on that team and how they contribute to a flourishing MLOps process.

Data Scientists
Data scientists are central to the analytics process. They not only develop models, but they do so with an understanding of the business problem. Data scientists are versed in the application of the model and the underlying data used to develop it. Furthermore, they should forge common ground with IT and MLOps engineers to understand the limitations, constraints and nuances that separate development, testing and production environments. In many organizations, they also work with risk teams to validate that their models have a low chance of causing issues. Data scientists understand the significance of statistical metrics for model performance and of changes in data distributions. They communicate and collaborate with end users and IT to determine when a model has decayed and to refresh models quickly.
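As one concrete example of watching changes in data distributions, a data scientist might track a population stability index (PSI) for each model input. The sketch below is a generic, minimal implementation; the 0.2 alert threshold is a common rule of thumb rather than a universal standard, and the feature values are simulated.

```python
import numpy as np

def population_stability_index(baseline, current, bins: int = 10) -> float:
    """Measure how far a feature's current distribution has shifted from its
    training-time baseline. Larger values indicate a bigger shift."""
    # Derive bin edges from the baseline so both samples are bucketed identically.
    # (In this simple sketch, production values outside the baseline range are ignored.)
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_counts, _ = np.histogram(baseline, bins=edges)
    curr_counts, _ = np.histogram(current, bins=edges)

    # Convert counts to proportions, flooring at a small value to avoid log(0).
    base_pct = np.clip(base_counts / base_counts.sum(), 1e-6, None)
    curr_pct = np.clip(curr_counts / curr_counts.sum(), 1e-6, None)

    return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))

# Simulated example: the production distribution has drifted from training.
rng = np.random.default_rng(0)
training_values = rng.normal(loc=50, scale=10, size=5_000)
production_values = rng.normal(loc=55, scale=12, size=5_000)

if population_stability_index(training_values, production_values) > 0.2:
    print("Feature has drifted -- review the model and consider retraining.")
```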
MLOps Engineers (IT)
MLOps engineers, also referred to as Engineering or IT at many organizations, validate and test the models from the data scientist, move models into production and make them available to end users. They understand and monitor the testing and production infrastructure. MLOps engineers catalyze the use of AI at their business, guiding the team from experimentation to successful, repeatable enterprise AI implementation. They take the models developed by data scientists and ensure they scale to meet the needs of the business within the constraints of their production environments. They do all this while continuously assessing data to understand how effective the models remain.
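To make "making the model available" concrete, here is a minimal sketch of how score code might be wrapped in a REST endpoint. It assumes a model serialized with joblib and uses FastAPI with Pydantic v2; the feature names, file path and route are hypothetical placeholders rather than part of any particular production stack.

```python
# serve.py -- a hypothetical scoring service (run with: uvicorn serve:app)
import joblib
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Model scoring service")
model = joblib.load("model.pkl")  # artifact assumed to come from the data scientist

class Record(BaseModel):
    # Illustrative feature names; in practice these mirror the training data.
    age: float
    bmi: float
    num_prior_visits: int

@app.post("/score")
def score(record: Record) -> dict:
    features = pd.DataFrame([record.model_dump()])  # Pydantic v2
    probability = float(model.predict_proba(features)[:, 1][0])
    return {"predicted_probability": probability}
```

Behind an endpoint like this, the MLOps engineer still layers in authentication, logging, scaling and monitoring, which is where much of the production effort goes.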
Managers & Executives
Managers and executives spearhead projects to improve outcomes through analytically driven decisions. They look at the effectiveness of an organization's analytical processes and greenlight innovation with AI by funding pilot projects. They are concerned with modernizing the enterprise to cut costs and meet organizational Key Performance Indicators (KPIs), such as Return on Investment (ROI) and Time to Value (TTV).
Risk Analysts & Managers
Depending on your industry, you may have to deal with additional regulatory considerations that impact ML deployment. This is where risk analysts and managers come in. Risk analysts analyze, document and help mitigate model risk. They understand which regulatory requirements apply to their organization. They also help quantify the cost of a bad model. Risk analysts need information about the business process from the end user and about the model from the data scientist.
End Users
End users leverage the output of the model to affect decision making at an organization. Example analytical end users are:
- Loan officers who leverage Probability of Default (PD) models when deciding to extend a loan to an individual.
- Marketers who leverage propensity models to determine which campaigns to run for a set of prospective customers.
- Customer Success Managers who leverage churn models to determine which customers are at high risk of canceling their contract and thus require special attention.
End users want to make the right choice using the information they have available. While they may not understand the analytics that went into the model, they need to trust that the outputs of the model are helping them. They want easy-to-understand information about the analytical process and model outputs, which may include:
- Clear and actionable advice.
- Natural Language Generation (NLG) explanations for how to interpret metrics.
- Model explainability and interpretability at a global and local level, meaning they would like general rules for how inputs affect outputs as well as why the model made a prediction for a specific set of inputs.
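To illustrate what global and local explanations can look like in practice, here is a minimal sketch using the open-source shap package with a tree-based regressor trained on a public demo dataset; the model and data are stand-ins for a real business model, and SHAP is only one of several explainability approaches a team might choose.

```python
# A hypothetical explainability sketch using the open-source shap package.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a simple model on a public dataset as a stand-in for a real business model.
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Compute SHAP values for every record.
explainer = shap.TreeExplainer(model)
shap_values = explainer(X)

# Global view: which inputs matter most across all predictions.
shap.plots.beeswarm(shap_values)

# Local view: why the model produced the prediction it did for one specific record.
shap.plots.waterfall(shap_values[0])
```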
Working Together
Beyond the roles here, we may also see Data Engineers, AI Ethicists, AI Product Managers, Business Analysts, Prompt Engineers and more. When you pull back and look at everyone who impacts an analytics project, it can feel overwhelming. Here are five collective traits that help synchronize teams working toward a unified MLOps strategy:
- Communication: Clear and empathetic communication is essential for MLOps teams, aligning stakeholders and articulating diverse objectives. It's a cornerstone of success, requiring consistency and clarity.
- Engagement from stakeholders: Quality engagement matters more than time spent, with the aim of reaching collective milestones rather than individual wins.
- Teamwork: No more throwing anything over the fence. Teamwork is founded on a unified cadence, fostering a collaborative culture that smooths out friction and amplifies collective achievements. It's less about individual tasks and more about a consistent, cohesive flow of team efforts. Connect, iterate and repeat.
- Solve problems: At its core, AI is about solving people problems through technology. When MLOps teams collaborate and iterate on problems, they cultivate an environment that encourages creative solutions, which often lie at the intersection of different areas of expertise. This collective knowledge is not just about individual domain proficiency but about being adept at navigating the challenges that arise at the seams of these domains.
- Demonstrate leadership: Even with blurred lines of responsibility across roles, leadership remains paramount. Every team member should be empowered to lead and take initiative in the absence of direct instruction. Individuals should keep the team informed about emerging issues, embody the principle that leadership is an action, and consistently do what is right for the team.
When organizations adopt MLOps, the team strives to solve collective problems as they arise while individuals lead their respective areas. As time goes on and synergies form, these disparate stakeholders begin to understand the natural tensions between job roles. This facilitates work across these interpersonal seam lines and enables the group to win as a team.
Conclusion
Rome wasn’t built in a day, but it also wasn’t the work of a single individual. It required countless contributors, from engineers and masons to laborers and decision makers, all working together toward a monumental goal. Like a bustling empire, MLOps thrives on a collaborative spirit with the singular goal of operationalizing AI models. Naturally, building your team is a cornerstone of a successful MLOps strategy.
Learn More
LEARN MORE | Check out our free 1-hour webinar to Supercharge your Analytics Journey
LEARN MORE | Read about how SAS Viya is the perfect choice for MLOps
This article was written with the collaboration of Luis Flynn.