Device42 – Official Blog

Understanding the Impact of Edge Computing on Data Center Management

Edge computing has taken the IT industry by storm. During the pandemic, organizations rapidly expanded edge networks, placing new computing capacity close to users. Edge sites enable organizations to deploy just the capacity they need, process data and conduct real-time analytics locally, reduce latency, and deliver personalized digital experiences.

Gartner predicts that by 2025, more than 50% of enterprise-managed data will be created and processed outside the data center or cloud.

TechTarget defines edge computing “as a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible.” Today, that means connecting IT devices to sensors. Increasingly, however, teams will connect IT to operational technology (OT), via IoT devices and sensors, to automate processes across factories and networks of sites, ushering in an era of smart factories, smart grid applications, and smart cities. 

Edge computing leverages a distributed architecture to process data and enable services as physically close to users as possible, enabling new applications that can’t afford latency. This is in contrast to traditional on-premises data centers that process data centrally.

How Edge Site Management Differs from Data Center Management 

Both data center and edge site management are moving to a remote model. Equipping IT and networking teams to work remotely enables leaders to reduce real estate holdings and operational costs while ensuring service continuity across core-to-edge networks. 

IT and security teams use serial and keyboard, video, and mouse (KVM) devices and management platforms to connect to the devices they’re managing offsite. They also leverage data center infrastructure management (DCIM) solutions which provide powerful data center visualization tools, auto-generate rack diagrams, offer detailed device information, and help optimize power management. 

However, data centers typically still have staff onsite, who can physically monitor devices and respond to alerts. Edge sites are usually remote and often dark, requiring costly truck rolls to service. So, being able to troubleshoot issues and power-cycle devices to bring them online is critical to maintaining edge IT system uptime and application performance. Teams that integrate their DCIM with a configuration management database (CMDB) can auto-discover all of their hardware, software and virtualized assets and gain valuable insights into their condition and application dependencies. They can then use this information to identify and mitigate the root causes of any issues to prevent their recurrence. 
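As a rough illustration of how auto-discovered inventory data can be put to work, the sketch below flags devices whose last successful discovery is stale and likely needs attention. The endpoint, field names (`name`, `last_seen`), and data shape are illustrative assumptions, not any product’s actual schema or API:

```python
import json
from datetime import datetime, timedelta, timezone

def flag_stale_devices(devices, max_age_days=7, now=None):
    """Return names of devices whose last discovery is older than max_age_days.

    `devices` is a list of dicts with hypothetical fields `name` and
    `last_seen` (ISO 8601 timestamp), as a CMDB export might provide.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [
        d["name"]
        for d in devices
        if datetime.fromisoformat(d["last_seen"]) < cutoff
    ]

if __name__ == "__main__":
    # In practice this JSON would come from an authenticated request to a
    # DCIM/CMDB REST endpoint; here it is inlined for illustration.
    inventory = json.loads("""[
        {"name": "edge-rtr-01", "last_seen": "2024-01-02T00:00:00+00:00"},
        {"name": "edge-srv-01", "last_seen": "2024-03-01T00:00:00+00:00"}
    ]""")
    now = datetime(2024, 3, 4, tzinfo=timezone.utc)
    print(flag_stale_devices(inventory, now=now))  # → ['edge-rtr-01']
```

A device that stops reporting at a dark site is often the first sign of a failure that would otherwise require a truck roll to diagnose.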

Edge sites experienced an average of 5.39 outages over 24 months, outpacing data centers’ 4.81 outages. As a result, there is work to be done to improve edge site uptime and reliability. 

Advantages of Building Out Edge Networks 

So, why are organizations building out edge sites so rapidly? And why do analysts predict that edge sites will take share from core data centers and cloud networks?

Edge computing offers numerous advantages over both core data centers and cloud computing. These benefits include:

  • Improving data processing: Instead of transporting data over cloud services or corporate networks, data can be processed and stored where it is created. That reduces processing costs, complexity, and timeframes. It also enables the delivery of personalized B2C services, such as product recommendations and special offers. B2B companies benefit from enhanced capabilities such as automation, intelligent forecasting, and predictive maintenance.
  • Reducing latency: Across industries, there are many services that can’t afford latency, including media delivery, gaming, telehealth, video conferencing, and virtual reality/augmented reality (VR/AR) applications, to name just a few. Processing data locally can reduce latency to between five and 50 milliseconds, depending on edge site type, networks, and technology.

Graphic source: STL Partners 

  • Increasing storage efficiency: Edge sites process time-sensitive, “perishable” data, then typically transport it over the cloud, reducing storage requirements.  
  • Improving service reliability: With lower latency, digital services suffer from fewer performance issues, such as screen freezes, out-of-sync video and audio, and other issues that harm streaming media.
  • Abiding by regulations: With local data processing and storage, edge computing makes it easy to abide by data sovereignty requirements, such as local and regional data privacy and protection requirements. That can help organizations avoid eye-popping fines, such as the 2022 €405 million GDPR fine assessed against Instagram, owned by Meta, and the €746 million penalty Amazon incurred in 2021. 
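To see why proximity drives the latency gains above, note that physics sets a floor: light in optical fiber travels at roughly 200,000 km/s, so round-trip distance alone bounds response time before any queuing or processing delay is added. A back-of-the-envelope sketch (distances are illustrative):

```python
# Light in optical fiber travels at roughly 2/3 of c, about 200,000 km/s.
FIBER_KM_PER_SECOND = 200_000.0

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time (ms) from propagation delay alone.

    Real RTTs are higher — routing hops, queuing, and processing all add
    delay — but distance sets a floor that no protocol can beat.
    """
    return 2 * distance_km / FIBER_KM_PER_SECOND * 1000

if __name__ == "__main__":
    print(f"Edge site 50 km away:      >= {min_rtt_ms(50):.1f} ms round trip")
    print(f"Cloud region 2,000 km away: >= {min_rtt_ms(2000):.1f} ms round trip")
```

An edge site tens of kilometers from users keeps the propagation floor under a millisecond, while a distant cloud region can consume much of a latency budget before the application does any work.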

Gartner predicts that 40% of enterprises will have edge use cases in production by 2024.

Edge Computing’s Impact on the Data Center Industry

With so many advantages, it’s not surprising that data center owners and operators are choosing edge sites over traditional new-build facilities.

Edge sites also have the advantage of being deployable much faster than traditional stick-built facilities, which often take 18 to 24 months to complete. One model used frequently for edge sites, the prefabricated modular data center (PMDC), is built with standard processes and technologies. As a result, PMDCs can be deployed up to 30% faster than traditional facilities and also cost 30% less.

In addition, edge sites are fit-for-purpose. Teams can deploy new capacity in building blocks, such as 0.5 to 1.0 MW of power capacity. Then, they can simply add more capacity over time. This enables data center owners to preserve vital capital, avoid overbuilding facilities and purchasing unnecessary devices, and reduce operational costs. Edge sites are typically less expensive to power and cool, given their smaller size, use of modern systems, and capacity that matches real needs. However, like any site or system, edge sites need to be carefully planned and architected to achieve power usage effectiveness (PUE) and other goals. 
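Power usage effectiveness, mentioned above, has a simple definition: total facility energy divided by the energy consumed by IT equipment, with 1.0 as the ideal. A minimal sketch (the readings are hypothetical):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    1.0 is ideal (every watt goes to IT); real sites run above that
    because cooling, lighting, and power conversion all draw energy.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

if __name__ == "__main__":
    # Hypothetical monthly meter readings for a small edge site.
    print(f"PUE = {pue(total_facility_kwh=132_000, it_equipment_kwh=100_000):.2f}")
```

Tracking PUE per site over time makes it easy to spot edge locations whose cooling or power distribution is underperforming the fleet.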

Edge computing will likely continue to take share from core and cloud computing, due to its ability to deliver local data processing and services, reduce the risk of data loss or delay, and speed deployment at lower cost.

Typical Use Cases for Edge Sites 

There are many different use cases for edge sites. Organizations are capitalizing on some already, such as preventive maintenance. In other instances, organizations need to create systems of intelligence that integrate internal data and processes with external capabilities and infrastructure to enable smart grids and cities. Use cases include:

  • Enhancing autonomous vehicle operations and safety by facilitating communication between vehicles and the surrounding environment. Autonomous vehicles will enable fleet operators to run trucks 24/7 and free passengers from the drudgery of driving, enabling them to work or watch entertainment as they travel.
  • Improving oil and gas production by facilitating real-time monitoring and analytics on production conditions. Since drilling for oil is incredibly costly and time-consuming, analytics can help producers focus investments on the best opportunities, maximize production for existing wells, and prevent oil spills and other failures.
  • Enabling smart grid applications by connecting factories, plants, and offices. Energy companies, utilities, and even enterprises will be able to produce energy, decide which energy source to use at any given time, and reduce costs and waste. Enterprises that choose to create their own energy can use more renewables and reduce their carbon footprint.
  • Strengthening predictive maintenance by monitoring equipment health, performing analytics in real-time, and alerting staff about the need for proactive maintenance. By maintaining equipment before major issues occur, teams can drive production throughput and decrease operational costs.
  • Enabling industrial automation by using analytics and automation to execute predefined processes, improving operational efficiency. Smart factories can be operated by smaller teams, use self-healing processes to mitigate issues, and operate on a 24/7 basis.
  • Automating patient monitoring in hospitals to improve the quality of care. Healthcare teams will be able to integrate data from connected devices to develop richer pictures of patient health, proactively identify warning signs, and analyze trends across patient populations. 
  • Enabling cloud gaming by providing a low-latency, immersive gaming experience that players covet. Players can use portable devices, connect to local Wi-Fi, and enjoy a high-fidelity experience with friends.
  • Improving content delivery by caching content to reduce latency and improve service delivery. Companies such as Amazon, Netflix, and others use edge computing to deliver users’ favorite shows and movies without freezes, dropouts, and other issues.
  • Facilitating traffic management by automating traffic management to reduce congestion on the roads. Edge computing processes data on the latest conditions, enabling traffic lights, HOV lanes, and other traffic calming systems to work together to streamline traffic flow.
  • Powering smart homes by processing data locally for smart home devices, such as intelligent assistants, thermostats, and more. These systems provide users with comfort and convenience, while enabling companies to learn more about what consumers want and need.
  • Enabling smart city operations by leveraging 5G, connected devices, edge and cloud computing, and big-data analytics to enable the vast systems of intelligence that power smart cities. Smart cities will optimize people movement, energy use, building comfort controls, and more to improve quality of life for residents, while scaling resources to support population growth.

Challenges of Scaling Edge Networks

So, what are some challenges associated with building out edge networks?

  • Illuminating a need for standardization: Edge sites are designed to support different applications with different business, latency, and environmental requirements. As a result, they can take many forms, such as network and server closets or micro or modular data centers, and use different technologies. Hence, teams are seeking to standardize deployment and management processes. They can use reference designs that match edge site designs and technologies to specific use case requirements. They also can use standardized solutions such as PMDCs to address the challenges of more use cases.
  • Proliferating devices: Edge sites introduce more devices into networks. As a result, they need to be discovered and managed using integrated solutions such as configuration management databases (CMDBs) and data center infrastructure management (DCIM) platforms. Using integrated solutions enables teams to use consistent management practices across sites: deploying, configuring, maintaining, and troubleshooting similar devices the same way.
  • Considering edge security: Edge sites often process sensitive data, which needs to be safeguarded against attacks. Risks include insufficient physical security for data storage and protection, insecure password and authentication practices, and data sprawl. However, with proper encryption and access controls, teams can strengthen edge site security and reduce the attack surface of applications. In addition, DCIM and CMDB data provide vital insights security teams can leverage to identify and prioritize device risks, such as location, condition, and configuration status.

    Teams will want to harden security before connecting more IT and operational technology (OT), such as at manufacturing plants, utilities, and more. They’ll want to protect OT from being tampered with by attackers, which is why most organizations expect to spend 11% to 20% of their edge investments on security. 
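To make the risk-prioritization idea concrete, a minimal sketch might rank devices for security review by combining condition and exposure signals from inventory records. The field names (`patched`, `condition`, `exposed`) and weights are illustrative assumptions, not any product’s schema:

```python
def risk_score(device: dict) -> int:
    """Score a device record for security triage.

    Uses hypothetical CMDB fields: `patched` (bool), `condition`
    ("ok"/"degraded"/"failed"), and `exposed` (reachable from outside
    the site network). Higher scores mean review sooner.
    """
    score = 0
    if not device.get("patched", False):
        score += 3   # unpatched firmware or OS
    if device.get("condition") != "ok":
        score += 2   # degraded hardware is harder to trust and monitor
    if device.get("exposed", False):
        score += 5   # externally reachable devices widen the attack surface
    return score

def triage(devices: list) -> list:
    """Return devices sorted most-risky first."""
    return sorted(devices, key=risk_score, reverse=True)

if __name__ == "__main__":
    fleet = [
        {"name": "edge-cam-07", "patched": False, "condition": "ok", "exposed": True},
        {"name": "edge-plc-02", "patched": True, "condition": "degraded", "exposed": False},
        {"name": "edge-srv-01", "patched": True, "condition": "ok", "exposed": False},
    ]
    for d in triage(fleet):
        print(d["name"], risk_score(d))
```

Even a simple scoring scheme like this lets a small security team work through hundreds of edge devices in a defensible order rather than ad hoc.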

Graphic source: AT&T / Techmonitor 

By 2025, 25% of edge networks will be breached, up from less than 1% in 2021 as enterprises continue to converge IT and OT, Gartner predicts.

Gain Greater Insight into Edge Site Devices and Performance 

With their many business and technical advantages, it’s clear that edge computing is here to stay. Companies benefit by being able to deploy the best computing model — core, cloud, or edge — that meets their unique needs. 

To reap full advantage from edge computing capabilities, IT and security teams need to gain holistic visibility into the types and conditions of devices across sites, monitor and maintain them appropriately, optimize performance, and remediate security risks. 
Device42 DCIM and CMDB solutions can help teams achieve these goals.

Rock Johnston
About the author