

NVIDIA MQM8790-HS2F Mellanox Quantum™ HDR InfiniBand Switch, 40 QSFP56 Ports, 2 Power Supplies (AC), Unmanaged, Standard Depth, P2C Airflow, Rail Kit
P/N: MQM8790-HS2F
US$3899.00
3 Reviews | 2 Questions
4-Year Warranty
30-Day Returns
30-Day Exchange
(10 In Stock)

Product Highlights

40 HDR 200Gb/s Ports with Split Capability to 80× 100Gb/s via Breakout Cables (see the sketch after this list)
16Tb/s Non-Blocking Switching Fabric Delivering Sub-130ns Ultra-Low Latency
Hot-Swappable Dual Power Modules with 100-240V AC Compatibility and Redundant Cooling
Adaptive Routing for Optimized Traffic in Dragonfly+ and SlimFly HPC Topologies
IBTA 1.3-Compliant Architecture with 9-Virtual-Lane Prioritization
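As a quick back-of-the-envelope check of the headline figures above (illustrative only, not vendor tooling), the following Python sketch works out the aggregate fabric bandwidth and the breakout port count; the 16Tb/s figure counts both directions of every port:

```python
# Back-of-the-envelope check of the headline figures (illustrative only).

PORTS = 40                 # physical QSFP56 ports
PORT_RATE_GBPS = 200       # HDR: 4 lanes x 50Gb/s PAM4 per port
BREAKOUT_FACTOR = 2        # each HDR port splits into 2x HDR100 via breakout

# Aggregate switching capacity counts both directions of every port.
aggregate_tbps = PORTS * PORT_RATE_GBPS * 2 / 1000
print(f"Aggregate fabric bandwidth: {aggregate_tbps:.0f} Tb/s")   # 16 Tb/s

# Splitting every port with breakout cables doubles the port count
# while halving the per-port speed.
breakout_ports = PORTS * BREAKOUT_FACTOR
print(f"Breakout configuration: {breakout_ports} x "
      f"{PORT_RATE_GBPS // BREAKOUT_FACTOR}Gb/s ports")           # 80 x 100Gb/s
```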
Product Description
The QM8790 is a 40-port, non-blocking, externally managed HDR 200Gb/s InfiniBand smart switch. As part of the world's smartest switch family from Mellanox, it enables in-network computing through the Co-Design Scalable Hierarchical Aggregation and Reduction Protocol (SHARP)™ technology. The QM8790 delivers the highest fabric performance available on the market, with up to 16Tb/s of non-blocking bandwidth and sub-130ns port-to-port latency.
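To illustrate what SHARP-style in-network aggregation means, here is a conceptual Python model (this is not the SHARP API; the host and switch functions below are hypothetical). Switches on the reduction tree combine partial results as data flows toward the root, so each uplink carries one aggregated vector instead of one vector per host:

```python
# Conceptual model of in-network reduction (NOT the SHARP API).

from typing import List

def host_contribution(host_id: int, size: int) -> List[float]:
    """Hypothetical per-host partial result (e.g., local gradient sums)."""
    return [float(host_id)] * size

def switch_aggregate(children: List[List[float]]) -> List[float]:
    """What an in-network-computing switch does conceptually:
    element-wise sum of the vectors arriving on its downlinks."""
    return [sum(vals) for vals in zip(*children)]

# Two leaf switches, each aggregating two hosts, feeding one spine switch.
leaf0 = switch_aggregate([host_contribution(0, 4), host_contribution(1, 4)])
leaf1 = switch_aggregate([host_contribution(2, 4), host_contribution(3, 4)])
root = switch_aggregate([leaf0, leaf1])
print(root)  # [6.0, 6.0, 6.0, 6.0] == element-wise sum over host ids 0+1+2+3
```

The appeal of doing this in the switch, rather than on the hosts, is that the reduction traffic on each uplink stays constant as the node count grows, which is where the scalability claim for in-network computing comes from.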
Specifications
Manufacturer: Mellanox
Ports: 40× QSFP56
Switching Capacity: 16Tb/s
Fans: 5+1 hot-swappable
Input Voltage: 100-127VAC 50/60Hz 4.5A; 200-240VAC 50/60Hz 2.9A
Software: MLNX-OS
Data Rate: 40/56/100/200 Gb/s
Latency: Sub-130ns port-to-port
Management: Unmanaged (requires an external subnet manager)
Airflow: Back-to-Front (P2C)
AC Power Supplies: 1+1 hot-swappable
Operating Temperature: 0 to 40°C (32 to 104°F)
Storage Temperature: -40 to 70°C (-40 to 158°F)
Dimensions (H×W×D): 43.6 × 433.2 × 590.6mm
Rack Units: 1 RU
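For rack power planning, a minimal sketch that turns the nameplate input ratings above into a worst-case apparent power per feed (a planning aid under the stated assumptions, not measured draw):

```python
# Worst-case apparent input power per AC feed, computed from the
# nameplate ratings above (planning numbers, not measured draw).

RATINGS = {
    "low-line (100-127VAC)":  (100.0, 4.5),  # volts, amps
    "high-line (200-240VAC)": (200.0, 2.9),
}

for feed, (volts, amps) in RATINGS.items():
    va = volts * amps  # apparent power at the bottom of the voltage range
    print(f"{feed}: up to ~{va:.0f} VA per supply")
# With 1+1 redundant supplies, size each circuit so either feed
# can carry the full load alone.
```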
Questions & Answers
Q:

What is the maximum bandwidth supported by NVIDIA InfiniBand HDR switches?

by u***m on 2025-07-21 16:16:58

A:

NVIDIA HDR InfiniBand switches support up to 200Gb/s per port, delivering extremely high throughput for data-intensive workloads.

by Helpful on 2025-07-21 16:49:20

Q:

Are these switches suitable for large-scale AI and HPC clusters?

by Q***m on 2025-07-21 15:43:59

A:

Yes. NVIDIA InfiniBand switches are designed for high-performance computing and AI workloads, offering low latency and high bandwidth essential for multi-GPU or multi-node training.

by Helpful on 2025-07-21 16:49:43

Customer Reviews
o***m
2025-06-28 23:43:40
Confirmed Purchase
5.0

Interoperates flawlessly with our NVIDIA NICs and DGX systems. It was truly plug-and-play with minimal configuration needed.

9***m
2025-06-26 01:40:10
Confirmed Purchase
5.0

The low-latency and high-throughput performance of the NVIDIA InfiniBand switches exceeded expectations. We achieved full HDR bandwidth with consistent sub-microsecond latency.

x***m
2025-06-09 21:21:55
Confirmed Purchase
5.0

We were able to seamlessly scale our InfiniBand fabric using NVIDIA’s modular switch design. Adding more compute nodes was straightforward and non-disruptive.
