I've blown the dust off from 2015 to bring back the COIL EarlyView podcast. In Session 1 of 2016, I speak with VMware and Hitachi Data Systems about virtualization for compute and storage.
Joining me at the COIL Studio are Bob Goldsand of VMware and Greg Smith of Hitachi Data Systems. In this session, Bob and Greg walk us through a specific data warehouse use case from this project that explains why customers take interest in VMware NSX micro-segmentation for mission-critical architectures.
This is a great session for implementers and IT managers who want to learn how VMware defines and creates software-defined storage via VMware Virtual Volumes on the Hitachi Unified Compute Platform. Using this HDS platform and the vSphere APIs for Storage Awareness (VASA), the project team is able to create VMware Virtual Volumes and storage containers that can be used to rapidly provision SAP landscapes in the Software-Defined Data Center.
These virtual datastores are created from the capabilities that Hitachi's unified storage platform exposes, mapped to the application and database requirements defined by SAP. The virtual datastores discussed address the storage requirements for the entire SAP HANA data life cycle, which includes hot, warm, and cold data as well as archive storage spaces.
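To make that tier mapping concrete, here is a minimal sketch of how the four SAP HANA data temperatures could map to storage capability profiles when provisioning a virtual datastore. The profile fields, values, and function names below are illustrative assumptions for this post, not the actual VASA or Hitachi interfaces:

```python
# Illustrative sketch only -- not real VMware/Hitachi API code.
# Maps SAP HANA data temperatures to hypothetical storage capability
# profiles, the way a VASA-style policy mapping might.

SAP_HANA_TIERS = {
    "hot":     {"media": "flash",  "latency_ms": 1,   "replication": True},
    "warm":    {"media": "hybrid", "latency_ms": 5,   "replication": True},
    "cold":    {"media": "nl-sas", "latency_ms": 20,  "replication": False},
    "archive": {"media": "object", "latency_ms": 100, "replication": False},
}

def pick_profile(data_temperature: str) -> dict:
    """Return the (hypothetical) capability profile for a data temperature."""
    try:
        return SAP_HANA_TIERS[data_temperature]
    except KeyError:
        raise ValueError(f"unknown tier: {data_temperature}")

# A virtual datastore for hot data would request the flash-backed profile:
print(pick_profile("hot")["media"])  # flash
```

The point of the sketch is the design idea Bob and Greg describe: the storage platform advertises its capabilities, and provisioning simply selects the profile that matches the application's requirements rather than hand-carving LUNs per tier.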