
NVIDIA MQM9790-NS2F Mellanox Quantum™-2 NDR InfiniBand Switch, 64-Ports NDR, 32 OSFP Ports, Unmanaged, P2C Airflow
P/N: MQM9790-NS2F
US$21,999.00
4-Year Warranty | 30-Day Returns | 30-Day Exchange
Availability: 0 in stock

Product Highlights

64× NDR 400Gb/s Ports, Splittable into 128× 200Gb/s via OSFP Breakout Cables
51.2Tb/s Non-Blocking Throughput with 66.5B Packets/sec for AI/ML Workloads (see the arithmetic check below)
P2C (Power-to-Connector) Airflow with Back-to-Front Cooling and 1+1 Redundant Hot-Swap PSUs
Native UFM® Compatibility; Externally Managed via a Host-Based Subnet Manager (UFM or OpenSM)
Multi-Speed Auto-Negotiation (NDR/HDR/EDR) and Dragonfly+ Topology Support
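
As a quick sanity check on the headline figures above, the short Python sketch below reproduces the throughput and breakout arithmetic from the published port counts. It is illustrative only, not vendor tooling, and queries no hardware.

    # Back-of-the-envelope check of the headline figures for a Quantum-2 class switch.
    # All inputs come from the published spec sheet.

    NDR_PORT_GBPS = 400      # one NDR port
    NDR_PORTS = 64           # logical 400Gb/s ports
    OSFP_CAGES = 32          # physical cages, two 400Gb/s ports each
    BREAKOUT_FACTOR = 2      # each 400Gb/s port splits into 2x 200Gb/s

    # Aggregate switching capacity: 64 ports x 400Gb/s, counted bidirectionally.
    aggregate_tbps = NDR_PORTS * NDR_PORT_GBPS * 2 / 1000
    assert aggregate_tbps == 51.2  # matches the quoted 51.2Tb/s

    # Breakout: 64x 400Gb/s -> 128x 200Gb/s via OSFP splitter cables.
    breakout_ports = NDR_PORTS * BREAKOUT_FACTOR
    assert breakout_ports == 128 and NDR_PORT_GBPS // BREAKOUT_FACTOR == 200

    # Each OSFP cage carries two 400Gb/s ports (an "800G" cage on the spec sheet).
    assert OSFP_CAGES * 2 == NDR_PORTS

    print(f"{aggregate_tbps} Tb/s aggregate, {breakout_ports}x 200Gb/s in breakout mode")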
As high-performance computing (HPC) and artificial intelligence (AI) applications grow more complex, extreme-scale systems demand the most advanced high-speed networking. The NVIDIA Quantum-2 QM9700 series is the industry-leading switch platform in power efficiency and density, delivering NDR 400 gigabit-per-second (Gb/s) InfiniBand throughput that gives AI developers and scientific researchers the highest networking performance available to take on the world's most challenging problems.
Specifications
Manufacturer: NVIDIA
Software: MLNX-OS
Ports: 32× OSFP 800G (64× NDR 400Gb/s)
Data Rate: 40/56/100/200/400 Gb/s
Switch Series: NVIDIA Quantum-2
Switching Capacity: 51.2Tb/s
Latency: 130ns
Fans: 6+1 Hot-Swappable
Airflow: Back-to-Front (P2C)
Input Voltage: 1×/2×, 200-240VAC, 50/60Hz, 10A
AC Power Supplies: 1+1 Hot-Swappable
Operating Temperature: 0 to 35°C (32 to 95°F)
Storage Temperature: -40 to 70°C (-40 to 158°F)
Dimensions (H×W×D): 43.6 × 438 × 660 mm
Rack Units: 1 RU
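
For teams comparing several switch SKUs, a spec sheet like this is easy to encode for programmatic checks. The sketch below is a minimal, hypothetical Python representation; the SwitchSpec class and the fits_rack_env helper are illustrative names, not part of any NVIDIA tooling.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SwitchSpec:
        """Key fields from a vendor spec sheet (values for the MQM9790-NS2F)."""
        part_number: str
        ports_400g: int
        capacity_tbps: float
        latency_ns: int
        airflow: str          # "P2C" (back-to-front) or "C2P"
        rack_units: int
        op_temp_c: tuple      # (min, max) operating temperature, Celsius

        def fits_rack_env(self, ambient_c: float, units_free: int) -> bool:
            # Hypothetical helper: checks ambient temperature and rack space only.
            lo, hi = self.op_temp_c
            return lo <= ambient_c <= hi and units_free >= self.rack_units

    mqm9790 = SwitchSpec(
        part_number="MQM9790-NS2F",
        ports_400g=64,
        capacity_tbps=51.2,
        latency_ns=130,
        airflow="P2C",
        rack_units=1,
        op_temp_c=(0, 35),
    )

    print(mqm9790.fits_rack_env(ambient_c=27.0, units_free=2))  # True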
Questions & Answers
Q:

What is the maximum bandwidth supported by NVIDIA InfiniBand HDR switches?

by a***m on 2025-07-21 16:18:57

A:

NVIDIA HDR InfiniBand switches support up to 200Gb/s per port, delivering extremely high throughput for data-intensive workloads.

by CPLIGHT on 2025-07-21 16:49:20

Q:

Are these switches suitable for large-scale AI and HPC clusters?

by y***m on 2025-07-21 15:52:32

A:

Yes. NVIDIA InfiniBand switches are designed for high-performance computing and AI workloads, offering low latency and high bandwidth essential for multi-GPU or multi-node training.

by CPLIGHT on 2025-07-21 16:49:43

Customer Reviews
w***m
2025-07-11 21:41:54
Confirmed Purchase
5.0

The low-latency and high-throughput performance of the NVIDIA InfiniBand switches exceeded expectations. We achieved full HDR bandwidth with consistent sub-microsecond latency.

m***m
2025-05-26 14:29:09
Confirmed Purchase
5.0

Interoperates flawlessly with our NVIDIA NICs and DGX systems. It was truly plug-and-play with minimal configuration needed.

x***m
2025-05-07 20:04:35
Confirmed Purchase
5.0

We were able to seamlessly scale our InfiniBand fabric using NVIDIA’s modular switch design. Adding more compute nodes was straightforward and non-disruptive.
