By now, most IT professionals have dipped their toes in the solid-state drive (SSD) vs. hard disk drive (HDD) waters, but may not have had time to sort out the different types of SSDs used in their machines.
Here are some fairly recent commercial products that illustrate the SSD's ubiquity:
- Recently manufactured laptops like the Dell Inspiron 15 7000 series are available with 256GB (and up) SSDs.
- Entry-level laptops like the Chromebook use eMMC, a slower and less costly form of solid state storage.
- The Microsoft Surface Pro 4 uses a Samsung-branded SSD that integrates a PCIe flash controller and (depending on your budget) 128GB, 256GB, 512GB or 1TB of NAND flash. For details, see the iFixit product teardown.
- Business-grade desktops like the Energy Star-certified HP EliteDesk 800 feature a standard 256GB SSD.
- Servers from Supermicro support PCIe SSD solutions by Fusion-io, which the company says "...creates a new tier in the memory hierarchy with 100 times the capacity density and 10 times the capacity per dollar of traditional DRAM."
The SSD is clearly mainstream in some respects, but it is far from fully displacing the hard drive. So is there a best practice for deploying SSDs? Or is this a purely economic decision?
Sensible SSD Scenarios
According to Tech Times, the cost of SSDs still keeps some companies from a full switchover. The decision hinges on a cost/performance crossover point that must be recalculated as prices change. SSD vs. HDD? In the history of computing, this is not a new question.
From the beginning, there have been memory storage tiers: fast memory and slow memory. Cost, capacity and speed are traded off. In 1965, the CDC 6600 computer system featured central memory, extended core storage, fixed HDDs, moving-head HDDs and tape. Managing storage tiers has always been critical to building hardware, creating applications and managing systems, even if few among today's generation of IT managers have had to schedule time to mount tape drives or spin down massive CDC 9760 disk packs.
Fast forward (to stay with the tape metaphor) a few decades to 2011. In that year, Wired introduced its readers to Gordon, "the world's first flash supercomputer." Named after Flash Gordon, the supercomputer installed at the San Diego Supercomputer Center (SDSC) used 300 terabytes of flash memory (initially Intel 710 series drives). Gordon foreshadowed even wider adoption of flash memory. SDSC applications lead Bob Sinkovits told Wired why they bought into SSDs: "For data-intensive applications, though, the biggest advantage is much lower latency."
It was an architectural goal that would have been familiar to the CDC 6600 design team.
Next-Gen Memory Thinking
Whether their ideas are truly new, or a sort of Back to the Future exercise, engineers are busy studying how best to use devices like SSDs. In a 2013 IEEE Computer article introducing a collection of papers on coming memory innovations, Atwood, Chae and Shim explain that as demand for scalable memory systems increases, "...memory technology becomes both a solution and a bottleneck, spurring the industry to redefine how these systems use memory. One of the best examples of this is the emergence of solid-state drives (SSDs) across the range of computing devices."
The changes some anticipate are major. The title of a paper in that issue by Swanson and Caulfield is typical of this vision: "Refactor, Reduce, Recycle: Restructuring the I/O Stack for the Future of Storage." Still more recently, SSDs are a big part of the drive toward software-defined storage, as InfoWorld reports.
Turn It Up to 10: App Speeds Through SSDs
Admins and DIY advocates would do well to hone their matchmaking skills. When SSDs are appropriately matched to the application, the results can bring smiles to users. And even if CFOs aren't thrilled with the price tag, they can live with the results.
What sort of matchmaking analysis? Here are two examples:
- Oracle studied how SSDs could be used to optimize Oracle database performance.
- FlyData studied the impact of SSDs on Amazon's Redshift, and found that Redshift SSD performed 8X faster under their test conditions.
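Matchmaking of this kind usually starts with measuring your own workload rather than trusting vendor numbers. As a minimal sketch (not a substitute for a real benchmarking tool like fio), the following Python snippet times random small-block reads from a test file, the access pattern where SSDs most visibly outrun HDDs; the file paths shown in the usage comment are assumptions, and you would point it at a large file on each drive under test:

```python
import os
import random
import time

def random_read_latency(path, block_size=4096, reads=1000):
    """Time random block-aligned reads from a file.

    Returns mean latency in milliseconds. A crude proxy for the
    random-I/O behavior that separates SSDs from HDDs; production
    benchmarks control for caching, queue depth and more.
    """
    size = os.path.getsize(path)
    blocks = size // block_size
    with open(path, "rb") as f:
        start = time.perf_counter()
        for _ in range(reads):
            # Seek to a random block, then read it.
            f.seek(random.randrange(blocks) * block_size)
            f.read(block_size)
        elapsed = time.perf_counter() - start
    return elapsed / reads * 1000

# Hypothetical usage: compare a file on each drive under test.
# print(random_read_latency("/mnt/ssd/testfile"))
# print(random_read_latency("/mnt/hdd/testfile"))
```

On a spinning disk, each random seek costs several milliseconds of head movement; on an SSD the same reads typically land in the tens of microseconds, which is the latency gap Sinkovits describes above.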
Your mileage may vary, but that's kind of the point. It's been that way with memory staging right from the beginning of computing.