Distributed Cloud Computing And Its Impact On The Cabling Infrastructure Within A Data Center

Applications such as high-frequency trading, high-performance computing, AI, and gaming are compute intensive and latency sensitive. To process these workloads efficiently, data center operators must move a significant portion of their computational load toward the edge of the network, closer to the source of the data. This decentralization of the cloud creates challenges for data center operators looking for ways to grow the computational power within their data centers to meet the demands of these emerging applications. This paper covers the various applications driving the shift toward edge computing and how this shift impacts the cabling infrastructure in environments that are highly space constrained.
In a traditional cloud computing architecture, the compute and storage of all data is centralized. The cloud is able to leverage its massive compute and storage capability for large-scale data analysis and data storage. While the intelligence of the infrastructure continues to reside in the cloud, latency-sensitive data is processed using an edge computing model. Edge computing is a means of processing data physically close to where the data is being produced, which reduces latency and bandwidth use.
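To put rough numbers on the latency benefit of processing data near its source, consider propagation delay alone. The sketch below is illustrative and not from the article: it assumes signals travel through fiber at roughly two-thirds the speed of light in vacuum, and ignores switching, queuing, and processing delays, which add further latency in practice.

```python
# Illustrative back-of-the-envelope calculation (assumption, not from
# the article): round-trip propagation delay over optical fiber.
SPEED_IN_FIBER_KM_PER_S = 200_000  # ~2/3 of c in vacuum

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds (propagation only)."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

# A centralized cloud region ~1,000 km away vs. a hypothetical edge
# site ~10 km away (distances chosen only for illustration):
print(round_trip_ms(1000))  # 10.0 ms
print(round_trip_ms(10))    # 0.1 ms
```

Even before accounting for network hops, moving the compute two orders of magnitude closer cuts the round-trip propagation floor by the same factor, which is why latency-sensitive workloads favor edge placement.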
Maximizing compute power can be done by growing the computational footprint, which would mean adding more racks in the data center with more compute, storage, and networking. This approach, however, may be neither physically possible nor economically feasible in a co-location or smaller self-owned data center environment. An alternative for data center operators is to leverage the existing footprint by building out higher-density racks. These density-optimized racks impact the compute, storage, and, in particular, the network cabling infrastructure.