Edgecore Networks Showcases Comprehensive End-to-End AI Data Center Solutions at the OCP Global Summit.
OCP — Edgecore Networks, a leader in providing innovative network solutions for enterprises, data centers, and service providers, announces a new 32-port switch in its 800G AIS series of AI switches, designed to meet the demanding needs of AI and machine learning (ML) workloads.
These new switches offer low latency, high radix, and dynamic load balancing, which together reduce congestion across a lossless network. Completing the total solution, Edgecore also provides a range of pluggable optical modules and a robust SONiC-based network operating system, enabling seamless integration with modern data centers. Together with Edgecore’s SONiC software ecosystem partners, such as Aviz Networks, BE Networks, Dorado, and Netris, these solutions provide users with unparalleled visibility, performance assurance, and efficient deployment for enterprise data center networks.
Nanda Ravindran, Vice President of Product Management at Edgecore Networks, said: “We are thrilled to showcase our latest AI/ML data center series of 800G platforms at the OCP Summit. Through the open-source community and our own innovation, we offer our customers flexible AI/ML fabric solutions, integrating technologies like RoCEv2, DCQCN, and DLB. The SONiC ecosystem enables seamless scaling and optimization, crucial for handling unique traffic patterns and latency-sensitive AI workloads.”
The Edgecore end-to-end AI data center solutions:
- Rich Ecosystem and Vendor-Agnostic: SONiC’s rich ecosystem supports broad compatibility with the industry’s most popular server NIC devices. Its vendor-agnostic approach and deep integration with telemetry tools ensure adaptability, scalability, and optimal performance for AI/ML workloads.
- 800G x 64-Port Switch: Capable of supporting 128-256 nodes, featuring NCCL PXN to eliminate latency from cross-switch communication and ensure seamless scalability for AI workloads. Offering both 64- and 32-port switches enables maximum flexibility in AI data center design planning.
- Lossless Ethernet for AI RDMA Traffic: Guarantees maximum throughput and minimal latency, optimized for high-performance AI applications.
- DCQCN with PFC/ECN Support: Leverages advanced traffic management techniques to control AI fabric traffic, ensuring smooth operation with minimal congestion.
- ECMP Eligible Mode: Breaks down large “elephant flows” into smaller flowlets, preventing hash polarization and congestion across ECMP links, improving load distribution and performance.
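To make the DCQCN behavior described above concrete, the sketch below models the sender side of ECN-driven congestion control in simplified form: on each congestion notification the rate is cut multiplicatively, and between notifications it recovers toward the pre-cut target. This is an illustrative approximation, not Edgecore’s or any switch vendor’s implementation; the class name, parameters, and recovery step are assumptions for demonstration only.

```python
class DcqcnRateController:
    """Simplified DCQCN-style sender rate control (illustrative only).

    ECN-marked packets trigger congestion notification packets (CNPs)
    back to the sender; each CNP causes a multiplicative rate decrease,
    while quiet periods decay the congestion estimate and recover the
    rate toward its pre-cut target.
    """

    def __init__(self, line_rate_gbps=800.0, gain=1 / 16):
        self.rate = line_rate_gbps    # current sending rate (Gbps)
        self.target = line_rate_gbps  # rate to recover toward
        self.alpha = 1.0              # congestion severity estimate
        self.gain = gain              # smoothing gain for alpha

    def on_cnp(self):
        """Congestion notification received: multiplicative decrease."""
        self.alpha = (1 - self.gain) * self.alpha + self.gain
        self.target = self.rate
        self.rate *= 1 - self.alpha / 2

    def on_quiet_period(self):
        """No congestion this period: decay alpha, recover halfway to target."""
        self.alpha *= 1 - self.gain
        self.rate = (self.rate + self.target) / 2
```

In this toy model, a burst of CNPs drives the rate down quickly, while sustained quiet periods restore it, which is the qualitative behavior that keeps an RDMA fabric lossless without collapsing throughput.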
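The flowlet idea behind the load-balancing bullet above can be sketched as follows: a long-lived elephant flow is split at idle gaps longer than some threshold, and each new flowlet may be rehashed onto a different ECMP link, spreading load without reordering packets in flight. The class, threshold value, and hashing scheme here are hypothetical and purely illustrative, not the switch’s actual mechanism.

```python
import hashlib

class FlowletBalancer:
    """Illustrative flowlet-based ECMP load balancing (not a vendor implementation).

    Packets of the same flow that arrive within `gap_ns` of each other
    belong to one flowlet and stay on the same link (preserving order);
    after a longer idle gap, a new flowlet starts and may be rehashed
    onto a different link, breaking up elephant flows.
    """

    def __init__(self, links, gap_ns=50_000):
        self.links = list(links)
        self.gap_ns = gap_ns
        self.state = {}  # flow_key -> (chosen_link, last_seen_ns)

    def pick_link(self, flow_key, now_ns):
        entry = self.state.get(flow_key)
        if entry is not None and now_ns - entry[1] < self.gap_ns:
            # Same flowlet: keep the current link so packets stay in order.
            link = entry[0]
        else:
            # Idle gap elapsed: start a new flowlet; salt the hash with the
            # arrival time so successive flowlets can land on different links.
            digest = hashlib.sha256(f"{flow_key}:{now_ns}".encode()).digest()
            link = self.links[digest[0] % len(self.links)]
        self.state[flow_key] = (link, now_ns)
        return link
```

Because rehashing only happens when the flow has been idle longer than the worst-case path-delay difference, packets of consecutive flowlets cannot overtake each other, which is what makes this safer than per-packet spraying.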
The Edgecore/Accton end-to-end AI networking solution (along with advanced demonstrations of the industry’s most energy-efficient open-loop and immersion-cooling AI solutions) will be shown at the OCP Global Summit, San Jose Convention Center (Booth # A4), on October 15-17.