Originally published by Peggy Keene.
Edge Computing Is on the Rise as IoT Device Use Increases
With the rollout of 5G and the Internet of Things well underway, the next big thing is edge computing. While the technical definition of edge computing and how it works is relatively complicated, edge computing can be explained, at its most basic level, as a process that improves response time, bandwidth use, and connectivity by having computation and data storage happen on devices near the source of the data rather than in a distant cloud data center.
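The response-time benefit can be sketched with a simple back-of-the-envelope calculation. The latency figures below are invented for illustration only; real numbers vary widely by network and workload.

```python
# Illustrative sketch: total response time for processing one sensor
# reading at a nearby edge node versus a remote cloud data center.
# The latency and compute figures are assumptions, not measurements.

EDGE_LATENCY_MS = 5    # assumed one-way network latency to a nearby edge node
CLOUD_LATENCY_MS = 60  # assumed one-way network latency to a distant cloud region

def round_trip_ms(one_way_latency_ms: float, compute_ms: float) -> float:
    """Total response time: send request + process + receive reply."""
    return 2 * one_way_latency_ms + compute_ms

# Same 10 ms of processing, very different total response times.
edge_total = round_trip_ms(EDGE_LATENCY_MS, compute_ms=10)    # 20 ms
cloud_total = round_trip_ms(CLOUD_LATENCY_MS, compute_ms=10)  # 130 ms

print(f"edge:  {edge_total:.0f} ms")
print(f"cloud: {cloud_total:.0f} ms")
```

Under these assumptions, the same workload responds several times faster at the edge simply because the data travels a shorter distance, which is the core intuition behind the latency claims above.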
The Comeback of Edge Computing
Having originated in the 1990s, edge computing is not itself new. With the advent of smart devices, however, the Internet of Things (IoT) has created massive amounts of new data that must be processed constantly and in real time, and that processing often suffers in areas with low or unreliable connectivity.
The advantage of edge computing over cloud computing is that it can run much more reliably in areas with low or unreliable Internet connectivity, because it often uses far less bandwidth than traditional cloud computing, reduces latency, and speeds up applications. As such, experts predict that the increasing use of IoT devices will see a corresponding rise in the use of edge computing.
Edge Computing v. Cloud Computing
Because edge computing generally transmits data to nearby storage devices instead of the cloud, new and different kinds of privacy and security concerns can arise. The distributed nature of edge computing requires different security protocols than those used in cloud computing. Similarly, edge computing also requires different encryption mechanisms than those used in what has become traditional cloud computing. As such, counsel should be aware that these significant differences exist and help ensure that clients properly understand such risks if their businesses intend to involve or rely on smart technology, the Internet of Things, or edge computing.
Key Takeaways on Edge Computing
Experts predict that 2021 will see a rise in the use of edge computing because:
- smart and IoT devices generally rely on edge computing;
- it uses bandwidth efficiently, responds faster, and speeds up applications; and
- it works in areas that have low connectivity or unreliable connections.
For more information on data privacy, see our Technology and Data Privacy Services and Industry Focused Legal Solutions pages.
Curated by Texas Bar Today. Follow us on Twitter @texasbartoday.