When used in AI and ML applications, the Memory-Semantic SSD can deliver up to 20x* higher performance by increasing random read speed and decreasing latency. This is achieved through Samsung's industry-first Compute Express Link (CXL) interconnect technology and a built-in DRAM cache.
* Up to 20x better performance compared to a PCIe Gen4 NVMe SSD.
Optimized for reading and writing small data chunks, the Memory-Semantic SSD (MS-SSD) is an ideal solution for AI/ML workloads that require fast processing of smaller data sets.
Larger capacity with NAND flash. Lower latency with an internal DRAM cache.
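The idea behind pairing large NAND with a small internal DRAM cache can be sketched in a few lines. The model below is purely illustrative (the class name, write-through policy, and LRU eviction are assumptions for the sketch, not Samsung's actual firmware design): frequently read addresses are served from a small, fast "DRAM" tier, while the large "NAND" tier holds everything.

```python
from collections import OrderedDict

class DramCachedStore:
    """Toy model of a DRAM-cache-over-NAND read path.
    All names and policies here are illustrative, not the MS-SSD's
    actual internals."""

    def __init__(self, cache_capacity):
        self.cache_capacity = cache_capacity
        self.cache = OrderedDict()   # "DRAM": small, fast, LRU-ordered
        self.nand = {}               # "NAND": large backing store
        self.cache_hits = 0
        self.cache_misses = 0

    def write(self, addr, data):
        # Write-through: data lands in NAND and is also cached for reads.
        self.nand[addr] = data
        self._cache_put(addr, data)

    def read(self, addr):
        if addr in self.cache:
            self.cache_hits += 1
            self.cache.move_to_end(addr)     # refresh LRU position
            return self.cache[addr]
        self.cache_misses += 1
        data = self.nand[addr]               # slow path: fetch from NAND
        self._cache_put(addr, data)
        return data

    def _cache_put(self, addr, data):
        self.cache[addr] = data
        self.cache.move_to_end(addr)
        if len(self.cache) > self.cache_capacity:
            self.cache.popitem(last=False)   # evict least-recently-used
```

With a hot working set that fits in the cache, repeated small reads hit the fast tier, which is the access pattern the copy above attributes to AI/ML workloads.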