Video: Big mobile tech players eye 5G rollout as groundwork ready
Every telco seems to be pushing out news about 5G these days. With Mobile World Congress taking place, and the event having morphed from a consumer show into a business one, this is hardly surprising.
Alongside 5G, machine learning, AI and cloud are seen as the technologies that will revolutionize domains such as autonomous vehicles and smart cities. But is 5G really a game changer? ZDNet discussed this with leaders in the field, focusing on the impact of 5G on data collection, storage, processing, and applications, and on the interplay with cloud and AI.
Is faster better?
5G is sold mostly on the merit of advancing network speed. But is network speed the key for use cases involving edge analytics and intelligence, such as IoT and autonomous vehicles?
“Perhaps for some very high bandwidth applications, but the key limiting factor for most IoT applications today is data orchestration, not network speed,” says Olivier Pauzet, Vice President & General Manager, IoT Solutions at Sierra Wireless. “This is limiting the growth of IoT applications that use low-bandwidth, low-power IoT devices with limited processing capabilities at the very edge of the IoT — the ‘Deep Edge.’ Data orchestration — not higher network speeds — is what is really needed to better process data on these edge devices, extract data from them, integrate this data with other data sources in the cloud, and update security and other software on these devices.”
Fast is good, but what good is more data if you are sloppy about what you do with it? This is the point Pauzet is making. Sierra Wireless is a global leader in IoT, with products used in everything from utilities and smart meters to PCs, tablets, and vehicles.
Tobi Knaup, Mesosphere CTO and co-founder, says the rise of 5G networks will allow for tremendous growth in edge computing use cases:
“With lower network speeds, service providers must deploy edge compute devices for data cleaning, scrubbing and optimization close to the end device. This requires significant resources, both in the cost of the devices themselves and in their maintenance.
Faster network speed at the edge enables far greater data aggregation for high density environments (like stadiums, tourist areas, etc.) which in turn enables the evolution of use-cases involving edge analytics and intelligence.”
Mesosphere is mostly known as the company behind DC/OS, one of the top platforms for building, deploying, and elastically scaling modern applications and big data. Recently, Mesosphere was named a Technology Pioneer by the World Economic Forum, and on that occasion its CEO noted a few things about edge computing and autonomous / connected cars.
Reliability is key — it would be silly to rely on cloud computing for self-driving
In many ways, cars are becoming big computers on wheels, equipped with an array of sensors such as Lidar. Lidar sensors send out a pulse of light and measure the reflected return to determine the distance to surrounding objects, generating a precise 3D map of the car’s surroundings.
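The time-of-flight principle behind Lidar ranging can be sketched in a few lines; the 200-nanosecond pulse time below is an illustrative number, not a sensor spec:

```python
# Time-of-flight ranging: the light pulse travels out and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458  # speed of light in m/s

def distance_from_pulse(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, from the pulse's round-trip time."""
    return C * round_trip_seconds / 2

# A pulse returning after 200 nanoseconds indicates an object ~30 m away.
print(round(distance_from_pulse(200e-9), 1))  # 30.0
```

A real sensor repeats this measurement hundreds of thousands of times per second across many angles, which is how the 3D point cloud is built up.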
These sensors, and other necessary systems such as GPS, lead to estimates that each autonomous vehicle will generate and consume roughly four terabytes of data for every eight hours of driving. Much of that data needs to be acted upon instantly, as this can make the difference between detecting and reacting to an object in time and being in a deadly accident.
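That figure works out to a sustained average data rate that can be sanity-checked with simple arithmetic (using the decimal convention of 10^12 bytes per terabyte):

```python
# Back-of-the-envelope check of the quoted estimate:
# ~4 TB generated per 8 hours of driving, as an average rate.
TB = 10**12  # terabyte in bytes (decimal convention)

data_bytes = 4 * TB
seconds = 8 * 3600
rate_mb_s = data_bytes / seconds / 10**6  # megabytes per second

print(round(rate_mb_s))  # 139
```

Roughly 139 MB/s on average, which helps explain why shipping all of it over a wide-area network is impractical.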
To accommodate this huge amount of data, many companies today are moving their computing into the cloud, but Mesosphere thinks the future will increasingly be on the edge. The goal with edge computing is to act on data quickly without the latency incurred by transmitting across a wide area network.
“Most autonomous vehicles are building the technology to need minimal connection with the internet to function. Most of the time they’re reporting particular statistics to the internet while driving, but the algorithms and maps are already pre-loaded onto the car. So, I don’t think network speed is a huge limiting factor unless it becomes significantly more reliable.”
Fast is not the point here. The one thing that would make a difference is reliability. That is what Alexandr Wang, CEO of Scale, thinks.
Scale is the maker of the Sensor Fusion Annotation API, used for Lidar and radar point cloud data, which powers autonomous vehicles, drones, maps and more. Scale provides training data for some of the most advanced autonomous vehicle companies in the world, such as GM Cruise, Alphabet, Uber and Honda. Wang adds:
“The biggest question is the reliability of the connection. If the car is reliant on some cloud processing step to function, there will definitely be cases where that connection fails and could jeopardize the safety of the passengers.
5G connectivity would have to prove significant reliability in testing to give any car manufacturer confidence that they can rely on it in any real world circumstance. It would have to be cheap, reliable, and fast. Significantly cheaper and more reliable than anything today, and likely significantly cheaper and more reliable than 5G would be.
It would also have to be fast enough that it would not cause delays in processing time while still providing value. Realistically, the time required for a round trip is the same as how long most of these vision or perception algorithms take to run, so it would be silly to rely on any cloud computing for self-driving.”
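Wang’s latency argument can be sketched as a toy budget: if a network round trip takes about as long as running a perception algorithm locally, offloading to the cloud roughly doubles reaction latency. The millisecond figures below are illustrative assumptions, not measurements:

```python
# Toy latency budget for on-board vs. cloud-offloaded perception.
# Both numbers are assumptions chosen to mirror Wang's claim that a
# round trip takes about as long as the perception algorithm itself.
perception_ms = 50   # assumed on-board inference time
round_trip_ms = 50   # assumed network round trip to a cloud endpoint

local_total = perception_ms                   # compute in the car
cloud_total = round_trip_ms + perception_ms   # send, compute remotely, return

print(local_total, cloud_total)  # 50 100
```

Under these assumptions, the cloud path pays the full perception cost anyway and adds the round trip on top, which is why offloading buys nothing even before reliability is considered.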
Computing on the edge
Cloud computing does not seem to make sense for autonomous vehicles, but edge computing does. Mesosphere’s Knaup notes there are many challenges in making edge computing use cases viable. He also points towards network reliability, and adds that edge nodes are often on heterogeneous platforms, which complicates things.
He adds that managing the infrastructure that processes massive amounts of unstructured data, and meeting privacy-protection requirements around collecting sensitive data (such as data collected from a smart home), also need to be addressed as we move forward with edge compute.
Knaup also points to the opportunities, however: “With the massive increase we expect to see in the number of devices at the edge, the potential flourishes: from autonomous cars to connected stadiums, even major changes in healthcare data analytics.
Edge computing will allow for faster response times (thanks to the lower latency), the ability to offload computing tasks (which in turn reduces energy consumption) and better location awareness, among other benefits.”
Sierra Wireless’ Pauzet, for his part, says standard LPWA technologies, the first cellular data networking technologies designed specifically for the IoT, provide a tremendous opportunity to create new use cases for IoT applications:
“LPWA enables the proliferation of low cost, low-power devices needed to implement truly groundbreaking smart city, grid, farming and other applications. The challenge is combining LPWA with data orchestration, cloud and AI technologies.
This way companies can minimize the time and costs related to developing, deploying and operating these applications, while maximizing the value of insights that the data from these applications deliver.”
If 5G were available today, would it really change everything?
If 5G were available today, Knaup says, he would massively increase the number of sensors in “smart” buildings/highways/infrastructure (i.e. all doors, elevators, fixtures, runways, etc.) for constant, real-time health and safety monitoring, as well as usage monitoring of the relevant systems.
“Availability of 5G would also reduce my dependency on wired backhaul networks to transport data from edge compute/storage devices back to regional compute/storage devices — for example, I could utilize a ‘telephone pole mounted cache’ for content distribution.
The performance of deep learning algorithms increases with the amount of data they can learn from, so I would collect more data from devices than is possible today in order to increase the accuracy of deep learning applications.”
Pauzet, on the other hand, is not sure that much would change in the industry’s data motion, storage and processing philosophy and architecture if 5G were available today:
“While 5G does provide an opportunity to deliver multi-Gigabit speeds and massive capacity over short distances, opening up many new use cases, what will really change these philosophies and architectures is intelligent data orchestration.
“The major challenge facing the industry today is how to extract, process, analyze and update IoT data more efficiently and effectively — and what will solve this challenge in the near term is better orchestration of data, not faster transfer of data over short distances.”
While 5G can facilitate faster data transfer between devices on the edge and the cloud, this does not necessarily open the gates to a brave new world. Data orchestration and management remain key, and there are use cases for which the cloud does not make sense.
These views hint at some really interesting topics to pick up on, and we will be continuing the analysis in future posts.