Machine learning and automated networks

Digital transformation has been described as “the fourth industrial revolution” because it is changing how businesses, governments and the wider world operate.

Traditional business functions are being broken into independent microservices that can be deployed across the whole organisation, building a library of reusable service modules that expands and updates in real time as the business environment evolves.

If e-signatures were replaced by palm scans, for example, the update would propagate automatically across the entire business. Changes that once took weeks become almost immediate. Network agility is vital to this ongoing evolution.
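To make the microservice idea concrete, here is a minimal sketch in Python (all names are hypothetical, and a real deployment would sit behind service discovery and network APIs rather than a single process): the approval step is defined once as a service contract, so replacing the e-signature implementation with a palm-scan one changes every consumer at once, without touching the callers.

```python
from abc import ABC, abstractmethod

class ApprovalService(ABC):
    """Hypothetical service contract for approving a document."""
    @abstractmethod
    def approve(self, document_id: str, user_id: str) -> bool: ...

class ESignatureService(ApprovalService):
    """The original implementation."""
    def approve(self, document_id: str, user_id: str) -> bool:
        print(f"{user_id} e-signed {document_id}")
        return True

class PalmScanService(ApprovalService):
    """The drop-in replacement."""
    def approve(self, document_id: str, user_id: str) -> bool:
        print(f"{user_id} approved {document_id} by palm scan")
        return True

# Callers depend only on the contract; redeploying the service with a new
# implementation propagates the change across the whole business at once.
approval: ApprovalService = PalmScanService()  # was: ESignatureService()
approval.approve("contract-42", "alice")
```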

A network of functions

We used to think of a network as connecting physical locations; it is better now to think of it connecting, and shaping itself to, applications. No more separation into hierarchical silos – connections must run horizontally across the organisation, with everything connected to everything else, to harvest and communicate the accumulating wealth of data.

Stocktaking, for example, is no longer a weekly task for people with clipboards. It is an automated function updated in real time by Internet of Things (IoT) chips on every shelved item. Customer choices and buying patterns are registered and recorded along with payment details and other data. When the IoT extends to phones, cars, surveillance cameras, sensors, control systems and more, it generates exabytes of data.
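As a toy sketch of the automated stocktaking described above (the event format and SKUs are invented for illustration), each IoT tag emits an event when an item is shelved or picked up, and a running tally folds the stream into a live count:

```python
from collections import Counter

# Hypothetical events from shelf-mounted IoT tags:
# delta = +1 when an item is placed, -1 when it is picked up.
events = [
    {"sku": "SKU-1001", "delta": +1},
    {"sku": "SKU-1001", "delta": -1},   # item bought
    {"sku": "SKU-2002", "delta": +1},
]

stock = Counter()
for event in events:              # in production: a continuous live stream
    stock[event["sku"]] += event["delta"]

print(dict(stock))                # {'SKU-1001': 0, 'SKU-2002': 1}
```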

Knowledge is power, but unstructured data is simply a burden. A “mountain of data” describes it well, because the mountain can be mined for nuggets of knowledge. Human intelligence is a masterful miner of all the data receivable by human senses – we would instantly recognise that CA, Cal and California all mean the same thing in an address. But this effortless recognition belies the sophisticated intelligence required to parse and analyse such data by machine. Even a relatively well-defined source such as click data from websites still needs to be massaged into consistency by specialist data scientists, because similar data from different users arrives in many different formats depending on the platform.
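That massaging is still largely hand-crafted. A minimal sketch of the rule-based approach (the alias table is invented for illustration) shows both how it works and why it is brittle – every variant has to be anticipated in advance:

```python
# Hand-written normalisation rules: every variant must be listed explicitly.
STATE_ALIASES = {
    "ca": "California",
    "cal": "California",
    "calif": "California",
    "california": "California",
}

def normalise_state(raw: str) -> str:
    """Map a raw state string to its canonical form, if we know it."""
    return STATE_ALIASES.get(raw.strip().lower(), raw)

print(normalise_state("CA"))      # California
print(normalise_state("Cal."))    # unanticipated variant passes through unchanged
```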

Mined by artificial intelligence

With the volume of data gathered through the IoT, such human processing is no longer viable. To identify, correlate and then analyse all the data coming from our machines, we need other machines. Artificial intelligence (AI) is the key.

AI can no longer rely on hand-programmed rules telling it that CA, Cal and California all mean the same thing, let alone rules that anticipate every human misspelling. AI has to evolve more rapidly; it has to teach itself, and that means machine learning – a process that depends on comparing present data with past, comparable data, looking for meaningful patterns and learning what does and does not work.
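As a toy illustration of that shift (not any particular product’s algorithm), the sketch below infers the meaning of an unseen variant by comparing it against labelled past data using character-trigram similarity, so a misspelling no one wrote a rule for is still matched:

```python
def trigrams(s: str) -> set:
    """Character trigrams of a lower-cased, padded string."""
    s = f"  {s.lower()}  "
    return {s[i:i + 3] for i in range(len(s) - 2)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two trigram sets (1.0 = identical)."""
    return len(a & b) / len(a | b)

# "Past data": variants whose meaning has already been established.
history = {
    "california": "California",
    "calif": "California",
    "new york": "New York",
    "ny": "New York",
}

def classify(variant: str) -> str:
    # Choose the known variant most similar to the new one.
    best = max(history, key=lambda known: jaccard(trigrams(known), trigrams(variant)))
    return history[best]

print(classify("Californa"))      # misspelling still resolves to California
```

Real machine-learning pipelines are far more sophisticated, but the principle is the same: the mapping is learned from accumulated data rather than written out rule by rule.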

Machine learning can only work if it has sufficiently rapid access to the data mountain, and so it depends critically on the performance and efficiency of the data centre and storage network.


Playing the game

At the recent GPU Technology Conference in San Jose, California, Nvidia Corp announced a partnership to allow online games to be streamed over tomorrow’s 5G networks.

Nvidia builds graphics chips that make video games more realistic. Now the company is putting those same chips inside servers in data centres so that gamers can stream games without needing to buy the most expensive hardware; the rendering is done in the cloud instead. One “pod” of these graphics cards can support ten thousand streaming gamers at once.

This again will generate massive network traffic, and it demands utterly reliable real-time responsiveness if it is to satisfy skilled gamers.
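A rough back-of-envelope calculation shows the scale. Assuming, purely for illustration, around 25 Mbit/s per game stream (the article gives no figure), ten thousand concurrent gamers on one pod would need sustained egress on the order of:

```python
# Back-of-envelope only: the per-stream bitrate is an assumed figure.
mbit_per_stream = 25            # assumed bitrate of one game stream (Mbit/s)
gamers_per_pod = 10_000         # from the article
total_gbit = mbit_per_stream * gamers_per_pod / 1_000
print(f"≈ {total_gbit:.0f} Gbit/s sustained egress per pod")   # ≈ 250 Gbit/s
```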

How is this achieved?

Evolving microservices, massive volumes of IoT data received and processed by artificial intelligence – these are some of the pressures facing today’s data centres. So how are their operators preparing for this future?

The report ‘Untold Secrets of the Efficient Data Centre’ summarises a recent survey of over 200 senior data centre professionals across China, the USA and the UK to find answers.

It revealed a significant shift away from the practice of adding greater processing power and more servers to optimise data centre performance. Instead, the network is now seen as a key performance driver. Building intelligence into the network itself takes an enormous load off servers, which have been responsible not only for application support but also for a host of virtual network functions. 84% of respondents rated network infrastructure as either “very important” or “important” to supporting artificial intelligence and machine learning.

The report shows how the very techniques developed by hyperscale cloud giants are now migrating to the enterprise, where distributed applications dominate. At every level, automation is the key: automating the collection of data via the Internet of Things; automating the mining of that data; and automating the network itself to support all these functions in real time.

“Hyperautomated connectivity is the nervous system that brings the parts into one intelligent and responsive whole.”

– Amit Krig, Vice President of Software, Mellanox
