
1m (3ft) NVIDIA (InfiniBand) Compatible 800G OSFP Finned Top to 4 x 200G QSFP112 InfiniBand NDR Passive Direct Attach Copper Breakout Cable
P/N: O4Q112-800G-CU
US$269.00
10 Reviews | 5 Questions
4-Year Warranty
30-Day Returns
30-Day Exchange
(1000 In Stock)

Product Highlights

Qualified for InfiniBand NDR End-to-end Systems
Lowest-latency, Low Insertion Loss, Low Crosstalk, High-speed Interconnect
Pre-FEC Bit Error Rate (BER) better than 2.4E-4 with FS InfiniBand NDR Systems
8x 100G-PAM4 (OSFP) to Four 2x 100G-PAM4 (QSFP112)
Compatible with NVIDIA QM9700/9790 Devices, ConnectX-7 Adapters and BlueField-3 DPUs
Minimum Bend Radius 31mm for Flexible Routing
Simplifies Patching and Offers a Cost-effective Solution for Short Links
Compliant with OSFP MSA, QSFP112 MSA, IEEE 802.3ck and CMIS Rev5.0
Product Description
The NVIDIA/Mellanox InfiniBand MCP7Y40-N001 is a 2x 400Gb/s twin-port OSFP to four 200Gb/s single-port QSFP112 Direct Attach Copper breakout cable. The 8-channel twin-port OSFP end uses a finned-top form factor for use in Quantum-2 and Spectrum-4 switch cages. Each of the four 200G ends carries 2 channels of 100G-PAM4 (200G total) and uses a flat-top QSFP112 for use in ConnectX-7 adapters and BlueField-3 DPUs, whose connector cages have riding heat sinks. This cable is compliant with the OSFP MSA, QSFP112 MSA (Multi-Source Agreement), and IEEE 802.3ck standards.
Because DACs offer the lowest latency, they are widely used in data centers to connect servers and GPU compute systems to top-of-rack (ToR) switches, and to link spine and super-spine switches within racks over short cable lengths. DACs are also used to build InfiniBand storage fabrics with hard-disk-drive (HDD) and flash-memory subsystems.
Featuring a simple design and minimal components, DAC cables offer the lowest-cost, lowest-latency, near-zero-power connections for high-speed links. They increase port bandwidth, density, and configurability while reducing power requirements in supercomputers and hyperscale systems.
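To make the breakout arithmetic concrete, here is a minimal Python sketch of the lane math behind this cable (the constants simply restate the 8x 100G-PAM4 layout from the description; nothing here queries real hardware):

```python
# Lane arithmetic for the 800G OSFP to 4x 200G QSFP112 breakout.
LANE_RATE_GBPS = 100        # each electrical lane runs 100G PAM4
OSFP_LANES = 8              # the twin-port OSFP end carries 8 lanes
LANES_PER_QSFP112 = 2       # each QSFP112 branch takes 2 of those lanes

total = LANE_RATE_GBPS * OSFP_LANES              # 800 Gb/s aggregate
per_branch = LANE_RATE_GBPS * LANES_PER_QSFP112  # 200 Gb/s per branch
branches = OSFP_LANES // LANES_PER_QSFP112       # 4 QSFP112 ends

print(f"{total}G OSFP -> {branches} x {per_branch}G QSFP112")
# 800G OSFP -> 4 x 200G QSFP112
```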
Specifications
NVIDIA/Mellanox InfiniBand Compatible: MCP7Y40-N001
Vendor Name: AICPLIGHT
Form Factor: OSFP (Finned Top) to 4x QSFP112
Max Data Rate: 800Gbps
Media: Copper
Cable Type: Passive Twinax
Minimum Bend Radius: 31mm
Power Consumption: 0.1W
Cable Length: 1m (3ft)
Wire AWG: 28AWG
Temperature Range: 0 to 70°C (32 to 158°F)
Jacket Material: PVC (OFNR)
Protocols: OSFP MSA, QSFP112 MSA, IEEE 802.3ck, CMIS v5.0
Modulation Format: PAM4
Questions & Answers
Q: Do these cables require special configuration after plugging in?
by g***m on 2025-07-10 10:19:53

A: No special configuration is needed. The cables are hot-swappable and designed for plug-and-play operation in InfiniBand environments.
by Helpful on 2025-07-10 10:25:43

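Following up on the answer above: for readers who want to confirm that the links actually came up, here is a minimal sketch. It assumes the host has the `ibstat` tool from the infiniband-diags package, and that an NDR200 breakout end should report a rate of 200 (an assumption based on this cable's 2x 100G-PAM4 branches); ibstat's exact output format can vary between versions.

```python
# Sketch: after cabling, confirm each local HCA port is Active at the
# expected NDR200 rate by parsing `ibstat` output (infiniband-diags).
import subprocess

EXPECTED_RATE = "200"  # assumption: 2 lanes x 100G PAM4 per QSFP112 end

out = subprocess.run(["ibstat"], capture_output=True, text=True,
                     check=True).stdout

state = None
for raw in out.splitlines():
    line = raw.strip()
    if line.startswith("State:"):
        state = line.split(":", 1)[1].strip()
    elif line.startswith("Rate:"):
        rate = line.split(":", 1)[1].strip()
        ok = state == "Active" and rate == EXPECTED_RATE
        print(f"state={state} rate={rate} -> {'OK' if ok else 'CHECK'}")
```
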
Q: Do InfiniBand AOC cables support long-distance connections?
by i***m on 2025-07-10 09:21:39

A: Yes. AOCs (Active Optical Cables) support longer distances than DACs, typically up to 30 meters or more, while maintaining low-latency, high-bandwidth transmission.
by Helpful on 2025-07-10 10:26:17

Q: Can these cables be used for GPUDirect RDMA communication?
by 2***m on 2025-07-10 09:17:41

A: Absolutely. These InfiniBand cables support GPUDirect RDMA, enabling low-latency, high-throughput data transfers directly between GPUs across the fabric.
by Helpful on 2025-07-10 10:24:35

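As a hedged follow-up to the GPUDirect RDMA answer, the sketch below shows one way to sanity-check two common Linux prerequisites. It assumes the nvidia_peermem kernel module is the peer-memory mechanism in use (typical of recent NVIDIA driver stacks) and that rdma-core's `ibv_devinfo` is available; passing both checks does not by itself prove end-to-end GPUDirect functionality.

```python
# Sketch: check two common GPUDirect RDMA prerequisites on Linux:
# 1) the nvidia_peermem kernel module is loaded, and
# 2) the verbs stack can see at least one RDMA device.
import subprocess

with open("/proc/modules") as f:
    peermem = any(line.split()[0] == "nvidia_peermem" for line in f)
print("nvidia_peermem loaded:", peermem)

try:
    # `ibv_devinfo -l` (from rdma-core) lists RDMA device names only.
    res = subprocess.run(["ibv_devinfo", "-l"], capture_output=True, text=True)
    print("RDMA devices:", res.stdout.strip() or "(none)")
except FileNotFoundError:
    print("ibv_devinfo not found; install rdma-core to inspect HCAs")
```
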
Q: Are these InfiniBand DAC/AOC cables compatible with NVIDIA/Mellanox ConnectX NICs?
by u***m on 2025-07-10 09:17:33

A: Yes, our InfiniBand DAC and AOC cables are fully compatible with NVIDIA/Mellanox ConnectX-5, ConnectX-6, and ConnectX-7 NICs, as well as with NVIDIA Quantum and Spectrum switches.
by Helpful on 2025-07-10 10:24:04

Q: What is the difference between HDR and NDR InfiniBand cables?
by J***m on 2025-07-10 09:14:32

A: HDR (High Data Rate) supports up to 200Gb/s per port, while NDR (Next Data Rate) supports 400Gb/s per port. Both are used in high-performance computing, but NDR is designed for newer-generation systems requiring ultra-high bandwidth. This cable is an NDR product: its twin-port OSFP end carries 2x 400Gb/s, split into four 200Gb/s (NDR200) QSFP112 ends.
by Helpful on 2025-07-10 10:23:26

Customer Reviews
d***m | 2025-06-18 15:35:41 | Confirmed Purchase | 5.0
The connectors are solid, and the shielding is robust. No damage or signal degradation even after multiple insertions.

l***m | 2025-06-03 07:44:54 | Confirmed Purchase | 5.0
Running 24/7 in an AI training environment with zero errors so far. Excellent quality and heat management.

4***m | 2025-05-30 11:57:21 | Confirmed Purchase | 5.0
We had tight routing paths between racks and these DACs were flexible enough for clean cable management without strain.

A***m | 2025-05-26 23:01:34 | Confirmed Purchase | 5.0
Cables were clearly labeled and well packaged. Plug-and-play setup made our data center upgrade smooth and fast.

t***m | 2025-05-21 10:21:04 | Confirmed Purchase | 5.0
We used breakout versions (1x400G to 4x100G) to expand our interconnect fabric and they worked flawlessly.

p***m | 2025-05-09 21:47:48 | Confirmed Purchase | 5.0
Order arrived earlier than expected. All cables were individually tested and labeled, saving us time during rack installation.

Y***m | 2025-05-07 13:48:16 | Confirmed Purchase | 5.0
Customer service helped confirm the pinout for a custom breakout setup. Very knowledgeable and responsive team.

9***m | 2025-04-30 07:28:54 | Confirmed Purchase | 5.0
We used these InfiniBand HDR DAC cables to link GPU servers in our HPC cluster. The latency and bandwidth were excellent, with no packet loss even under full load.

5***m | 2025-04-26 08:59:14 | Confirmed Purchase | 5.0
Compared to OEM versions, these cables offer the same performance at a significantly lower cost. Great for large-scale deployment.

S***m | 2025-04-01 18:20:50 | Confirmed Purchase | 5.0
Worked out of the box with ConnectX-6 NICs and Quantum switches. Auto-negotiation and link detection were flawless across all ports.