Akamai wires NVIDIA GPUs into 4,400 sites for real-time AI

Akamai has launched its Inference Cloud, leveraging NVIDIA AI Grid across 4,400+ edge locations with RTX PRO 6000 Blackwell GPUs to provide intelligent orchestration for distributed AI inference. This initiative aims to reduce latency and cost for enterprise AI workloads, moving beyond centralized AI factories. The platform is available to qualified enterprise customers and is supported by a $200 million, four-year service agreement for a multi-thousand GPU cluster.
