Adopt AI or fall behind: that’s been the message to many businesses recently. AI can be enormously beneficial, but according to a recent study from Akamai, 82% of businesses in the UK, France and Germany are unable to track the ROI of their AI experiments. As a result, 68% of organisations are now having to cut back either on AI itself or on critical functions such as cybersecurity or staffing.
The cost of experimenting with AI can be high. Cloud GPUs are often five to fifteen times more expensive than basic compute on a per-hour basis. Supply chains are volatile too: shifting international policy and constrained chip supply, particularly for high-end AI chips, make availability and pricing difficult to predict. Inflation has been another unrelenting force for IT decision makers to contend with, with costs rising while budgets often stay flat.
Although technology teams are adept at doing more with less, this perfect storm has prompted many organisations to reconsider their overall approach to data handling, especially with regard to cost control and information sensitivity.
Managing cost control
There are ways to control AI’s costs. The most obvious is to make sure there is a solid business case and clear evaluation criteria behind the experimentation. However, there are other factors to consider as well. For example, model training is generally not sensitive to latency, because it’s not user-facing. This means it can be done in a cheaper region or, if time permits, on slower, cheaper GPUs.
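As a rough illustration of that trade-off, here is a minimal sketch comparing the total cost of the same training job across regions and GPU classes. All speeds and hourly prices are hypothetical placeholders, not quotes from any provider:

    # Hypothetical figures for illustration only; real prices vary by provider and region.
    job_hours_on_fast_gpu = 100  # how long the job takes on the fast GPU

    options = {
        # name: (speed relative to the fast GPU, hourly price in $)
        "fast GPU, premium region": (1.0, 4.00),
        "fast GPU, cheaper region": (1.0, 3.20),
        "slower GPU, cheaper region": (0.5, 1.40),
    }

    for name, (speed, hourly_price) in options.items():
        hours = job_hours_on_fast_gpu / speed  # a slower GPU means a longer run
        cost = hours * hourly_price
        print(f"{name}: {hours:.0f} h, ${cost:,.0f}")

Because training tolerates longer wall-clock times, the slowest option here is also the cheapest overall; for a latency-sensitive, user-facing workload the same arithmetic could easily point the other way.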
A close understanding of the AI equipment – specifically, GPUs – on the market can also be beneficial. According to Cloud Mercato, an L40S might offer around half the performance of an H100 on LLM inference tasks, but it’s usually around half the price. And when it comes to object detection inference specifically, the L40S and H100 have almost the same performance.
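Put another way, the metric that matters is performance per dollar rather than raw performance. A minimal sketch, using illustrative figures in the spirit of the Cloud Mercato comparison rather than actual benchmark results:

    # Illustrative numbers, not benchmark data: relative throughput and hourly price.
    gpus = {
        # name: (LLM inference throughput, object detection throughput, $/hour)
        "H100": (1.00, 1.00, 4.00),
        "L40S": (0.50, 0.95, 2.00),
    }

    for name, (llm, detection, price) in gpus.items():
        print(f"{name}: LLM perf/$ = {llm / price:.2f}, "
              f"detection perf/$ = {detection / price:.2f}")

On these assumed figures, the two cards deliver the same value for LLM inference, while the L40S offers nearly twice the value for object detection.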
All of this data needs to be stored somewhere, and again, these costs can be controlled. For example, teams should consider how often data needs to be retrieved, and how quickly it needs to arrive when it does. Very ‘cold’ (often tape-based) storage is significantly cheaper than regular object or block storage, but retrieval times are much longer.
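One way to make that concrete is to price each tier against retrieval frequency. The per-GB prices and latencies below are hypothetical placeholders, purely to show the shape of the calculation:

    # Hypothetical per-GB prices and retrieval characteristics.
    tiers = {
        # name: ($/GB/month stored, $/GB retrieved, typical time to data)
        "hot object storage": (0.020, 0.00, "milliseconds"),
        "cool object storage": (0.010, 0.01, "milliseconds"),
        "cold archive (tape-backed)": (0.002, 0.02, "hours"),
    }

    data_gb = 50_000          # dataset size
    full_reads_per_year = 2   # how often the whole set is read back

    for name, (store, retrieve, latency) in tiers.items():
        yearly_cost = data_gb * (store * 12 + retrieve * full_reads_per_year)
        print(f"{name}: ${yearly_cost:,.0f}/year, retrieval in {latency}")

On these assumptions, archive storage is an order of magnitude cheaper for rarely-touched data; if the same dataset were read back weekly, the retrieval fees and delays would quickly erode that advantage.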
This raises an important concern that intersects with cost control. Organisations today face more questions than ever before about where their data is stored, how it is handled, and by whom.
Issues of sovereignty
One of the main challenges for data sovereignty is the term itself: there’s no industry-agreed definition. Ultimately, sovereignty is about knowing where your data is, who controls it, which laws it is subject to, how it’s used and who has access to it.
Historically, there have been two main challenges to establishing a robust sovereignty strategy. The first is that the dangers haven’t been apparent; it wasn’t until this year that many companies realised that foreign use of data for economic or political intelligence purposes was a genuine threat. Indeed, the picture was unclear enough that the Dutch government commissioned a group of lawyers to investigate. The report ominously concluded: ‘We qualify central government’s use of the cloud as worrying. The services provided to citizens and businesses, as well as central government’s operational continuity, are exposed to too much risk.’
The second challenge – which intersects with cost issues – has been the sheer complexity and difficulty of change. Cloud migrations are intricate processes at the best of times, and many companies have historically chosen to work with a hyperscaler purely for the initial ease, credits and bundled discounts, only to find themselves locked in further down the road. Until recently, many cloud providers charged egress fees: a per-gigabyte cost for moving data to a different environment.
This is why the CMA’s and Ofcom’s investigations concluded that competition in the cloud market is not working as it should, and why it’s increasingly urgent that these concerns are addressed: AI lives in the cloud.
In the interests of fairness, it’s also worth noting that it’s not always worthwhile moving clouds! Data is often fine where it is, and the cost and difficulty of migrating can exceed the benefits. However, we have seen compelling cases where businesses saved over 50% on their cloud costs by moving to sovereign providers, and others – notably Basecamp owner 37Signals – that saved money by moving workloads back on-premises.
Managing the balancing act
As we’ve said, these matters are rarely a one-size-fits-all exercise. Some workloads are sensitive and need to sit in a highly controlled environment, in a datacentre in their country of origin and in compliance with sector regulations. Other workloads – test and dev environments, or spinning up a server for UAT, for example – don’t usually need such rigour and can live in a cost-effective public cloud anywhere.
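In practice, this often comes down to a simple placement rule derived from workload profiling. The sketch below is one hypothetical way to encode such a rule; any real policy would depend on your sector’s regulations and risk appetite:

    # A hypothetical placement rule: sensitive or regulated workloads stay
    # in a sovereign, in-country environment; everything else can chase price.
    def place_workload(sensitive: bool, regulated: bool, latency_critical: bool) -> str:
        if sensitive or regulated:
            return "sovereign cloud / in-country datacentre"
        if latency_critical:
            return "nearest cost-effective public cloud region"
        return "cheapest public cloud region"

    print(place_workload(sensitive=True, regulated=True, latency_critical=False))
    print(place_workload(sensitive=False, regulated=False, latency_critical=False))  # e.g. test/dev or UAT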
However, it is important to be aware of the complexity so that you can plan effectively. IT teams that understand the nuance, have profiled their workloads, and understand the long- and short-term limitations and benefits of cloud providers and their respective offerings can make informed decisions. In turn, this will help them navigate the trade-offs between cost, control, risk and experimentation – keeping their organisations secure against cyber threats, agile and competitive, and ultimately assured that customer data is safe and well handled.
Matt Tebay
Matt Tebay is Multi-Cloud Evangelist at OVHcloud.