It may be down to organic growth rather than deliberate strategy, but many large-scale organisations today operate in a multi- and hybrid-cloud environment.

But orchestrating a mix of cloud and on-premise infrastructure is a growing challenge. Despite the benefits of flexibility, scalability, security and avoiding a single point of failure, data sprawl, a lack of interoperability and under-utilised features are pushing up human capital costs. According to our research, as many as 99% of technology professionals in large companies are experiencing difficulties. This compromises both the accuracy and timeliness of their analytics, increasing the risk of poor decision-making across the organisation.

Tackling these challenges was the subject of a recent techUK webinar, Bridging Clouds: Interoperability, portability, and a multi-cloud world. Drawing on the panel’s expertise in cloud solutions and enterprise strategy, we’ve distilled their insights into five points Chief Information Officers (CIOs) should consider when looking to optimise, and not just manage, a multi-cloud environment.

5 things CIOs need to know about hybrid and multi-cloud environments

1. No one-size-fits-all approach

Digital maturity varies between organisations, as does the number of vendors they work with. Our survey showed that well over half (56%) use only one public cloud – but a fifth use three or four vendors. The proportion of on-premise versus cloud infrastructure varies a lot too. One panellist pointed to the growing appetite to modernise on-premise estates and make them more ‘cloud-like.’

Whether organisations are early adopters with several vendors or still at the start of the journey, the panel agreed that technology leaders are becoming savvier about how they can use the cloud more effectively. They don’t see the public cloud as a ‘silver bullet’ and are becoming more pragmatic in choosing ‘the right application for the right cloud.’

2. Cost reduction is important – but maximising value is even more critical

A multi-cloud environment brings hidden and not-so-hidden costs, including vendor subscriptions, data storage and computational power. These operational inefficiencies often translate into lost time. Our survey respondents pointed to high analytics and staffing costs, suggesting they need to extract more value from their existing infrastructure.

The panel echoed this sentiment, suggesting it’s not simply about lowering costs but about deriving the most value from the technology. One panellist urged CIOs to choose the right platforms for the right reasons, to converge traditional and cloud-native operating models, and to align processes to ensure full visibility and synergy between teams. A strong business case for multi-cloud investment must exist, including an analysis of the human capital cost.

A further consideration is the efficiency of your analytics platform – faster compute can significantly boost productivity, or deliver the same level of productivity at a significantly lower cost.

3. Know what you’re signing up for

As with any major purchase, the panel highlighted the importance of conducting due diligence so that organisations understand exactly what their cloud vendor offers.

As one panel member explained, decision-makers need to analyse their enterprise strategy for the cloud and undertake a comprehensive discovery of the IT landscape – including applications, servers, networks and storage – before mapping it to business-critical services. Panellists also said organisations must understand the value of their data (more on that later) and know how to exit a vendor contract if it doesn’t offer value.

4. Skills requirements are changing

One panel member pointed to the growing number of roles created by public cloud adoption, including cloud solutions architects and DevOps engineers. The problem is that many organisations cannot recruit or train these people fast enough. As our survey showed, a lack of automation and high staff training costs are compounding these skills shortages.

A consolidated, cloud-agnostic analytics platform, which supports automation and multiple programming languages, including open source, enables organisations to make the most of their scarce data talent. The next step is to adopt a low/no/pro-code multimodal platform that allows non-technical staff to develop their own analytics models in line with their KPIs, freeing up data teams to concentrate on developing and optimising cloud strategies.

5. Recognise the value and portability of data

Earlier, we mentioned recognising the value of your data, including where it resides and how easily it can be retrieved and moved. Given the vast amounts of data being stored in different clouds, one panel member stressed that organisations need to get a handle on how long it would take to get it back in the event of a disaster.

Portability is not just a business advantage but also a requirement in some sectors. Earlier this year, the Bank of England stipulated that financial institutions must be able to ‘resume operations within two hours following disruptive events.’

So what is the way forward for organisations faced with these requirements?

I think there are two options available from SAS:

  • Leveraging SAS Cloud Data Exchange to enable a data mesh. Data can be securely and efficiently accessed where it resides, honouring the authentication and access management of the cloud or data centre local to that data source. At the same time, analytics can be pushed down into that data source, mitigating the need to extract, upload and join data – the steps that lead to the latency and integrity challenges associated with hybrid or multi-cloud environments.
  • Distributing real-time analytics to the point of consumption. SAS’ ability to deploy its workloads in standalone container runtimes allows operational systems to leverage the power of SAS in novel ways. For example, the data cloud company Snowflake is now using SAS Viya AI and decisioning capabilities to expand the scope of its Snowpark Container Services. According to Snowflake, users can ‘deploy, manage, and scale containerised workloads’ to ‘ensure portability and consistency across environments, especially for sophisticated AI/ML models and full-stack data-intensive apps.’

Find out how to optimise your cloud environment in SAS’ report, A silver lining from every cloud: How to avoid the pitfalls of multi cloud and analytics platform environments.

Looking to grow, innovate and transform your organisation?

Join the Leaders’ Network by SAS – an exclusive network for senior executives who are looking to grow, innovate and transform their organisations. Here you can discuss the topics that are top of mind in every boardroom.

 


About Author

David Shannon

David has over 20 years of experience as a Director and Consultant in Analytics. He provides strategic and tactical advice across the analytics industry, delivering cost benefits, productivity and innovation, with in-depth IT knowledge and a reputation for getting things done. Today, David works for SAS, leading the UK & Ireland’s Hyperautomation agenda and helping organisations drive digital transformation with automation. Outside of SAS, David is the volunteer IT Director for The MG Car Club. Formed in 1930, The MG Car Club is the original club for MG owners and one of the world’s oldest car clubs, with around 10,000 members worldwide.

