By 4 December 2017 | Categories: Events


With 2018 fast approaching, NetApp South Africa’s country manager Morne Bekker detailed the latest chief technology officer (CTO) predictions for the year ahead at an event in Rosebank last week.

While only five predictions were highlighted, none of them were the usual trends we have heard before; rather, each seemed potentially ground-breaking in its own right.

1. Data becomes self-aware

Bekker began by pointing out that today, we have processes that act on data and determine how it’s moved, managed and protected. But, he posed, what if the data defined itself instead?

“As data becomes self-aware and even more diverse than it is today, the metadata will make it possible for the data to proactively transport, categorise, analyse and protect itself,” he enthused.

The flow between data, applications and storage elements is expected to be mapped in real time as the data delivers the exact information a user needs at the exact time they need it. This also introduces the ability for data to self-govern: the data itself will determine who has the right to access, share and use it, which could have wider implications for external data protection, privacy, governance and sovereignty.

Bekker explained that, for example, if you are in a car accident there may be a number of different groups that want or demand access to the data from your car. A judge or insurance company may need it to determine liability, while an auto manufacturer may want it to optimise the performance of the brakes or other mechanical systems. When data is self-aware, it can be tagged so that it controls who sees which parts of it and when, without additional time-consuming and potentially error-prone human intervention to subdivide, approve and disseminate the valuable data.
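As a toy illustration of the idea, the record below carries its own access policy in its metadata and decides which fields each party may see. The class, field names and roles are all hypothetical, not any real NetApp mechanism:

```python
from dataclasses import dataclass, field

@dataclass
class SelfGoverningRecord:
    """A data record that carries its own access policy in metadata."""
    payload: dict
    # Maps a field name to the set of roles allowed to read it.
    policy: dict = field(default_factory=dict)

    def view(self, role: str) -> dict:
        """Return only the fields this role is permitted to see."""
        return {k: v for k, v in self.payload.items()
                if role in self.policy.get(k, set())}

# Crash-telemetry example (invented values): the record itself decides
# who sees what, with no human gatekeeper subdividing the data.
record = SelfGoverningRecord(
    payload={"speed_kmh": 87, "brake_pressure": 0.9, "gps": (-26.1, 28.0)},
    policy={
        "speed_kmh": {"insurer", "manufacturer"},
        "brake_pressure": {"manufacturer"},
        "gps": {"insurer"},
    },
)

print(record.view("insurer"))       # {'speed_kmh': 87, 'gps': (-26.1, 28.0)}
print(record.view("manufacturer"))  # {'speed_kmh': 87, 'brake_pressure': 0.9}
```

The point of the sketch is only that the policy travels with the data rather than living in a separate system that someone must administer.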

2. Virtual machines become “rideshare” machines

According to NetApp, it will be faster, cheaper and more convenient to manage increasingly distributed data using virtual machines, provisioned on webscale infrastructure, than it will be on real machines.

This can be thought of in terms of buying a car versus leasing one, or using a rideshare service like Uber or Lyft. Bekker elaborated that if you are someone who hauls heavy loads every day, it would make sense for you to buy a truck.

However, someone else may only need a certain kind of vehicle for a set period of time, making it more practical to lease. And then, there are those who only need a vehicle to get them from point A to point B, one time only. In this scenario, the type of vehicle doesn’t matter, just speed and convenience, so a rideshare service would be the best option.

Bekker continued that this same thinking applies in the context of virtual versus physical machine instances. Custom hardware can be expensive, but for consistent, intensive workloads, it might make more sense to invest in the physical infrastructure. A virtual machine instance in the cloud supporting variable workloads would be like leasing since users can access the virtual machine without owning it or needing to know any details about it. And, at the end of the “lease,” it’s gone.

He pointed out that virtual machines provisioned on webscale infrastructure (that is, serverless computing) are like the rideshare service of computing where the user simply specifies the task that needs to be done. They leave the rest of the details for the cloud provider to sort out, making it more convenient and easier to use than traditional models for certain types of workloads.
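The division of labour Bekker describes can be caricatured in a few lines of Python: the user writes only the task, and an entirely pretend platform handles provisioning and teardown. Nothing here is a real cloud API; the names are illustrative:

```python
# Toy function-as-a-service runner: the caller supplies only the task;
# provisioning, placement and teardown are the platform's problem.
# All names are illustrative, not any real serverless API.

def run_serverless(task, payload):
    """Pretend platform: spin up an ephemeral worker, run, tear down."""
    worker = {"id": "ephemeral-1"}   # "provisioned" on demand
    try:
        return task(payload)         # the only part the user wrote
    finally:
        worker.clear()               # gone at the end of the "ride"

# The user specifies what to compute, nothing about where or on what.
result = run_serverless(lambda nums: sum(nums) / len(nums), [4, 8, 6])
print(result)  # 6.0
```

The contrast with the "buy" and "lease" models is that no machine, virtual or physical, outlives the single task.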

3. Data will grow faster than the ability to transport it... and that’s OK!

Bekker continued that it’s no secret that data has become incredibly dynamic and is being generated at an unprecedented rate that will greatly exceed the ability to transfer it.

However, he asserted, instead of moving the data, the applications and resources needed to process it will be moved to the data and that has implications for new architectures like edge, core, and cloud. According to this prediction, in the future, the amount of data ingested in the core will always be less than the amount generated at the edge, but this won’t happen by accident. Thus, Bekker stressed, it must be enabled very deliberately to ensure that the right data is being retained for later decision making.

For example, autonomous car manufacturers are adding sensors that will generate so much data that there's no network fast enough between the car and data centers to move it. He pointed out that historically, devices at the edge haven’t created a lot of data, but now with sensors in everything from cars to thermostats to wearables, edge data is growing so fast it will exceed the capacity of the network connections to the core.

Autonomous cars and other edge devices require real-time analysis at the edge in order to make critical in-the-moment decisions. Indeed, it is just not practical for drivers to have to wait, even if only for seconds, for essential data that could impact their lives to be transmitted to the car. As a result, it is predicted that we will move the applications to the data.
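A minimal sketch of that edge-side reduction, using made-up sensor numbers: the raw window stays at the edge, and only a small summary travels over the constrained link to the core:

```python
import statistics

# Sketch with invented readings: a sensor emits data faster than the
# uplink can carry it, so the edge node reduces each window to a
# summary and ships only that to the core.

def summarise_at_edge(readings):
    """Reduce a raw sensor window to the few numbers the core needs."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    }

raw_window = [61.2, 60.8, 62.5, 61.9, 60.4]   # km/h samples, edge-local
to_core = summarise_at_edge(raw_window)
print(to_core)  # five raw readings collapsed into three numbers
```

Deciding which summary to keep is exactly the deliberate choice Bekker describes: the core only ever sees what the edge elects to retain.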

4. Evolving from “Big Data” to “Huge Data” will demand new solid-state-driven architectures

Just when businesses seemed to be coming to grips with big data, it appears that the megatrend itself won’t stop growing. Bekker elaborated that as the demand to analyse enormous data sets ever more rapidly increases, we will need to move the data closer to the compute resources.

He continued that persistent memory is what will allow ultra-low-latency computing without data loss, and that these latency demands will finally force software architectures to change, creating new data-driven opportunities for businesses. Flash technology has been a hot topic in the industry; however, the software running on it hasn’t really changed, it has just got faster.

This is being driven by the evolution of IT’s role in an organisation. In the past, IT’s primary function would have been to automate and optimise processes like ordering, billing, accounts receivable and others. Today, IT is integral to enriching customer relationships by offering always-on services, mobile apps and rich web experiences.

The next step will be to monetise the data being collected through various sensors and devices to create new business opportunities and it’s this step that will require new application architectures supported by technology like persistent memory.
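To give a rough feel for the persistent-memory idea, the sketch below uses an ordinary memory-mapped file as a stand-in: the write looks like a byte-addressable store rather than a storage I/O call, and it survives the writing process. Real persistent memory (such as NVDIMMs) differs substantially, so this is only an analogy:

```python
import mmap
import os
import struct
import tempfile

# Stand-in for persistent memory: a memory-mapped file. The update is a
# plain byte store into mapped memory, not a write() through a block
# storage stack, yet it remains readable after the mapping is gone.

path = os.path.join(tempfile.gettempdir(), "pmem_demo.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 8)                     # reserve one 64-bit slot

with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 8) as region:
        region[0:8] = struct.pack("<Q", 42)  # store-like write
        region.flush()                       # make it durable

with open(path, "rb") as f:                  # a "restarted" reader
    value = struct.unpack("<Q", f.read(8))[0]
print(value)  # 42
os.remove(path)
```

The architectural point is the one Bekker makes: when durability lives at memory latency, software written around a slow storage path has to be rethought, not merely sped up.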

5. Emergence of decentralised immutable mechanisms for managing data

Mechanisms to manage data in a trustworthy, immutable and truly distributed way (meaning no central authority) will emerge and have a profound impact on the data center. Blockchain is a prime example of this.

Decentralised mechanisms like blockchain challenge the traditional sense of data protection and management. Because there is no central point of control, such as a centralised server, it is impossible to change or delete information contained on a blockchain and all transactions are irreversible.

Bekker encouraged thinking of this as a biological system. “You have a host of small organisms and they each know what they're supposed to do without having to communicate with anything else or be told what to do. Then you throw in a bunch of nutrients, in this case, data. The nutrients know what to do and it all starts operating in a cooperative manner, without any central control, like a coral reef,” he elaborated.

Current data centers and applications operate like commercially managed farms, with a central point of control (the farmer) managing the surrounding environment. The decentralised immutable mechanisms for managing data will offer microservices that the data can use to perform necessary functions. The microservices and data will work cooperatively, without overall centrally managed control, Bekker added.
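The immutability property being described can be demonstrated with a minimal hash chain, in which each entry commits to the hash of the one before it; a real blockchain adds distributed consensus among many parties on top of this, which this sketch omits:

```python
import hashlib
import json

# Minimal hash-chain sketch of immutability: each entry stores the hash
# of the previous entry, so altering any record breaks every link after
# it. (Real blockchains add consensus across untrusted parties.)

def entry_hash(entry):
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(chain, data):
    prev = entry_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain):
    return all(chain[i]["prev"] == entry_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append(ledger, {"event": "policy issued"})
append(ledger, {"event": "claim filed"})
print(verify(ledger))                        # True
ledger[0]["data"]["event"] = "claim denied"  # tamper with history...
print(verify(ledger))                        # False: the chain breaks
```

No central server decides that the tampering is invalid; any participant re-running `verify` detects it independently, which is the "no farmer" property in Bekker's analogy.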

Final thoughts

Amid the intricacy of each of the trends that NetApp believes will come to the fore, there was also an evident excitement in Bekker’s presentation, and not just over the growth of NetApp’s fortunes. Rather, he confirmed that many businesses are feeling a sense of anticipation for the year ahead, especially those that endured a difficult 2016 and a tough 2017 and discovered how resilient and adaptable they are. Either way, on the back of the presentation, it was hard to deny that a near palpable sense of momentous change is on the horizon.

TechSmart.co.za is South Africa's leading magazine for tech product reviews, tech news, videos, tech specs and gadgets.