NVIDIA/Mellanox MCX653105A-HDAT-SP ConnectX®-6 InfiniBand Adapter Card, HDR/200G, Single-Port QSFP56, PCIe 3.0/4.0 x16, Tall & Short Bracket

#102393
Model: MCX653105A-HDAT-SP
Sold: 0
In Stock: 2
$ 1069.00
Brand:
NVIDIA/Mellanox (InfiniBand)

Item Spotlights

  • Up to 200Gb/s connectivity per port
  • Sub-600 ns latency
  • Advanced storage capabilities including block-level encryption and checksum offloads
  • Cutting-edge performance in virtualized networks including Network Function Virtualization (NFV)
  • Smart interconnect for x86, Power, Arm, GPU and FPGA-based compute and storage platforms
  • Flexible programmable pipeline for new network flows
Description

ConnectX®-6 Virtual Protocol Interconnect (VPI) cards are a groundbreaking addition to the industry-leading ConnectX series of network adapter cards. Providing one or two ports of HDR InfiniBand or 200Gb/s Ethernet connectivity, sub-600 ns latency, and 215 million messages per second, ConnectX-6 VPI cards deliver the highest-performance and most flexible solution for the continually growing demands of data center applications.
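
As a quick sanity check after installation, the port's state and link layer (InfiniBand or Ethernet) can be read through the standard libibverbs API. The following is a minimal sketch, not vendor-supplied code, assuming rdma-core is installed, the mlx5 driver is loaded, and the card is the first RDMA device in the system (port 1 on this single-port SKU):

/* Minimal sketch: list RDMA devices with libibverbs and print the
 * first adapter's port state and link layer.
 * Build (assumed environment): gcc query_port.c -o query_port -libverbs */
#include <stdio.h>
#include <infiniband/verbs.h>

int main(void)
{
    int num;
    struct ibv_device **list = ibv_get_device_list(&num);
    if (!list || num == 0) {
        fprintf(stderr, "no RDMA devices found\n");
        return 1;
    }

    struct ibv_context *ctx = ibv_open_device(list[0]);
    if (!ctx) {
        fprintf(stderr, "cannot open %s\n", ibv_get_device_name(list[0]));
        return 1;
    }

    struct ibv_port_attr port;
    if (ibv_query_port(ctx, 1, &port)) {      /* port 1: single-port card */
        fprintf(stderr, "ibv_query_port failed\n");
        return 1;
    }

    printf("device:     %s\n", ibv_get_device_name(list[0]));
    printf("state:      %s\n",
           port.state == IBV_PORT_ACTIVE ? "ACTIVE" : "not active");
    printf("link layer: %s\n",
           port.link_layer == IBV_LINK_LAYER_ETHERNET ? "Ethernet"
                                                      : "InfiniBand");

    ibv_close_device(ctx);
    ibv_free_device_list(list);
    return 0;
}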

Specifications
Part Number
MCX653105A-HDAT-SP
Data Transmission Rate
InfiniBand: SDR/DDR/QDR/FDR/EDR/HDR100/HDR
Ethernet: 10/25/40/50/100/200 Gb/s
Network Connector Type
Single-port QSFP56
Application
InfiniBand/Ethernet
Host Interface
PCIe Gen 3.0/4.0 SERDES @ 8.0 GT/s / 16.0 GT/s
Technology
RDMA/RoCE
Adapter Card Size
6.6 in. x 2.71 in. (167.65mm x 68.90mm)
RoHS
RoHS Compliant
Temperature
Operational: 0°C to 55°C
Storage: -40°C to 70°C
Supported Operating Systems
Linux, Windows, VMware
Product Highlights
GPUDirect RDMA: Enabling Direct Data Transfer Between GPUs for Enhanced Cluster Efficiency

GPUDirect RDMA enables direct memory-to-memory data transfer between GPUs over the network, improving efficiency by reducing latency and host-memory bandwidth consumption, and optimizing GPU cluster performance.
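
As an illustration of what GPUDirect RDMA looks like from software, the sketch below registers a GPU buffer directly with the RDMA stack so the NIC can DMA to and from device memory. It assumes an NVIDIA GPU with CUDA and the nvidia-peermem kernel module loaded; the file name and buffer size are arbitrary:

/* Illustrative GPUDirect RDMA sketch (assumptions: NVIDIA GPU with CUDA,
 * and the nvidia-peermem kernel module loaded so the RDMA stack can pin
 * GPU memory). A cudaMalloc'd buffer is registered directly with
 * ibv_reg_mr(), so the NIC can DMA to/from GPU memory without staging
 * copies through host RAM.
 * Build (include/lib paths as needed): gcc gdr_reg.c -lcudart -libverbs */
#include <stdio.h>
#include <cuda_runtime.h>
#include <infiniband/verbs.h>

int main(void)
{
    int num;
    struct ibv_device **list = ibv_get_device_list(&num);
    if (!list || num == 0) return 1;

    struct ibv_context *ctx = ibv_open_device(list[0]);
    if (!ctx) return 1;
    struct ibv_pd *pd = ibv_alloc_pd(ctx);
    if (!pd) return 1;

    void *gpu_buf;
    size_t len = 1 << 20;                     /* arbitrary 1 MiB buffer */
    if (cudaMalloc(&gpu_buf, len) != cudaSuccess) return 1;

    /* Register the GPU pointer itself -- no host bounce buffer. */
    struct ibv_mr *mr = ibv_reg_mr(pd, gpu_buf, len,
                                   IBV_ACCESS_LOCAL_WRITE |
                                   IBV_ACCESS_REMOTE_READ |
                                   IBV_ACCESS_REMOTE_WRITE);
    if (!mr) {
        fprintf(stderr, "GPU memory registration failed "
                        "(is nvidia-peermem loaded?)\n");
        return 1;
    }
    printf("registered %zu bytes of GPU memory, rkey=0x%x\n", len, mr->rkey);

    ibv_dereg_mr(mr);
    cudaFree(gpu_buf);
    ibv_dealloc_pd(pd);
    ibv_close_device(ctx);
    ibv_free_device_list(list);
    return 0;
}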

Advanced Network Offloads: Accelerating Data Plane, Network, Storage, and Security for Enhanced Host Efficiency

The adapter offloads I/O-related operations from the CPU and enables in-network computing and in-network memory functions, significantly improving host efficiency.

Accelerating Network Performance: Enhancing Speed and Reducing CPU Overhead in IP Packet Transmission

The card uses accelerated switching and packet-processing technologies to boost network performance and reduce the CPU overhead of IP packet transmission, freeing processor cycles for applications.

Questions & Answers
Q:
Does the IB card in Ethernet mode support RDMA?
A:
Yes, it supports Ethernet RDMA, specifically RoCE (RDMA over Converged Ethernet). For large-scale networking, the NVIDIA Spectrum-X solution is recommended.
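
Because librdmacm addresses peers by ordinary IP addresses, the same connection-setup code runs unchanged over InfiniBand and RoCE. A minimal sketch, assuming rdma-core's librdmacm and a placeholder peer address (192.0.2.10:7471):

/* Minimal sketch (placeholder peer 192.0.2.10:7471): librdmacm resolves
 * an ordinary IP address to an RDMA route, so this client code is
 * identical whether the underlying fabric is InfiniBand or RoCE.
 * Build: gcc cm_resolve.c -o cm_resolve -lrdmacm */
#include <stdio.h>
#include <string.h>
#include <arpa/inet.h>
#include <rdma/rdma_cma.h>

int main(void)
{
    struct rdma_event_channel *ch = rdma_create_event_channel();
    struct rdma_cm_id *id;
    if (!ch || rdma_create_id(ch, &id, NULL, RDMA_PS_TCP)) return 1;

    struct sockaddr_in dst;
    memset(&dst, 0, sizeof(dst));
    dst.sin_family = AF_INET;
    dst.sin_port = htons(7471);                       /* example port */
    inet_pton(AF_INET, "192.0.2.10", &dst.sin_addr);  /* example peer */

    /* Same call for IB and RoCE: resolve the IP to a GID and route. */
    if (rdma_resolve_addr(id, NULL, (struct sockaddr *)&dst, 2000)) {
        perror("rdma_resolve_addr");
        return 1;
    }

    struct rdma_cm_event *ev;
    if (rdma_get_cm_event(ch, &ev) == 0) {
        printf("event: %s\n", rdma_event_str(ev->event));
        rdma_ack_cm_event(ev);
    }

    rdma_destroy_id(id);
    rdma_destroy_event_channel(ch);
    return 0;
}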
Q:
Is there a distinction between simplex and duplex for IB network cards?
A:
They are all full duplex. For current devices, the terms simplex and duplex are largely conceptual, since the physical transmit and receive channels are already separate.
Q:
Can a server allow the use of two types of cards (encrypted and non-encrypted) together?
A:
Yes.
Quality Certification
ISO 14001:2015
ISO 9001:2015
ISO 45001:2018
FDA
FCC
CE
RoHS
TUV-Mark
UL
WEEE