

NVIDIA MQM8700-HS2F Mellanox Quantum™ HDR InfiniBand Switch, 40 QSFP56 Ports, 2 Power Supplies (AC), x86 Dual Core, Standard Depth, P2C Airflow, Rail Kit
P/N: MQM8700-HS2F
US$6999.00
3 Reviews · 2 Questions
4-Year Warranty
30-Day Returns
30-Day Exchange
(10 In Stock)

Product Highlights

40 HDR 200Gb/s ports, splittable into 80× 100Gb/s via breakout cables
16Tb/s non-blocking switching fabric for congestion-free data flow
Hot-swappable dual power modules with 100-240V AC compatibility
Advanced adaptive routing for Dragonfly+ and SlimFly topologies
Integrated x86 ComEx Broadwell CPU for intelligent fabric management
Product Description
The QM8700 is a 40-port non-blocking managed HDR 200Gb/s InfiniBand smart switch. Mellanox's smart-switch design enables in-network computing through the Co-Design Scalable Hierarchical Aggregation and Reduction Protocol (SHARP)™ technology. The QM8700 delivers among the highest fabric performance available, with up to 16Tb/s of non-blocking bandwidth and sub-130ns port-to-port latency.
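The 16Tb/s switching-capacity figure follows directly from the port count and per-port rate, counted full duplex. A quick sketch of the arithmetic (variable names are illustrative, not vendor tooling):

```python
# Aggregate capacity of a non-blocking switch: every port can run at
# full line rate in both directions simultaneously.
ports = 40            # QSFP56 ports on the QM8700
rate_gbps = 200       # HDR InfiniBand per-port data rate, one direction
capacity_tbps = ports * rate_gbps * 2 / 1000   # full duplex -> x2
print(capacity_tbps)  # -> 16.0
```

The same arithmetic explains the breakout highlight: each 200Gb/s QSFP56 port can be split into two 100Gb/s links, giving 80 × 100Gb/s with the same 16Tb/s aggregate.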
Specifications
Manufacturer
Mellanox
Ports
40×QSFP56
Switching Capacity
16Tb/s
Fan Number
5+1 Hot-swappable
Input Voltage
100-127VAC 50/60Hz 4.5A, 200-240VAC 50/60Hz 2.9A
Software
MLNX-OS
Data Rate
40/56/100/200 Gb/s
Latency
130ns
Management
Supported (onboard x86 management running MLNX-OS)
Airflow
Back-to-Front (P2C)
AC Power Supplies
1+1 Hot-swappable
Operating Temperature
0 to 40°C (32 to 104°F)
Storage Temperature
-40 to 70°C (-40 to 158°F)
Dimensions (H×W×D)
43.6 × 433.2 × 590.6 mm
Rack Units
1 RU
Questions & Answers
Q:

What is the maximum bandwidth supported by NVIDIA InfiniBand HDR switches?

by B***m on 2025-07-21 16:37:16

A:

NVIDIA HDR InfiniBand switches support up to 200Gb/s per port, delivering extremely high throughput for data-intensive workloads.

by CPLIGHT on 2025-07-21 16:49:20

Q:

Are these switches suitable for large-scale AI and HPC clusters?

by i***m on 2025-07-21 15:12:30

A:

Yes. NVIDIA InfiniBand switches are designed for high-performance computing and AI workloads, offering low latency and high bandwidth essential for multi-GPU or multi-node training.

by CPLIGHT on 2025-07-21 16:49:43

Customer Reviews
P***m
2025-06-27 13:25:45
Confirmed Purchase
5.0

We were able to seamlessly scale our InfiniBand fabric using NVIDIA’s modular switch design. Adding more compute nodes was straightforward and non-disruptive.

z***m
2025-06-10 01:20:40
Confirmed Purchase
5.0

The low-latency and high-throughput performance of the NVIDIA InfiniBand switches exceeded expectations. We achieved full HDR bandwidth with consistent sub-microsecond latency.

y***m
2025-05-30 17:14:52
Confirmed Purchase
5.0

Interoperates flawlessly with our NVIDIA NICs and DGX systems. It was truly plug-and-play with minimal configuration needed.
