DataSphere applies machine learning to give applications infrastructure awareness. Using a metadata engine, DataSphere automates the placement of data across the enterprise and hybrid cloud, non-disruptively separating how an application views data from how it’s managed -- reducing cost and accelerating performance.

DataSphere separates how an application logically views data from where that data is physically stored. It is a new element in modern IT architecture built to maximize the value of existing infrastructure and help enterprises easily integrate with the cloud. The architecture unites different types of storage into a global namespace and automatically places data on the most appropriate storage resource to meet business and IT objectives across performance, protection, and price. This helps enterprises overcome performance bottlenecks, integrate with the cloud for savings and active archival, and easily adopt new resources from any vendor to improve performance, efficiency, and scalability.
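A minimal, purely illustrative Python sketch of the objective-driven placement idea follows. None of the names, tiers, or numbers below are taken from the DataSphere product; they simply show the general pattern of steering data to the lowest-cost storage that still satisfies its performance and protection objectives.

    # Hypothetical sketch -- not DataSphere's API. Each storage resource is
    # described by its performance, protection, and price characteristics, and
    # data is placed on the cheapest resource that meets its objectives.
    from dataclasses import dataclass

    @dataclass
    class StorageTier:
        name: str
        latency_ms: float         # typical access latency
        replicas: int             # copies kept for protection
        cost_per_gb_month: float  # relative price

    @dataclass
    class Objective:
        max_latency_ms: float
        min_replicas: int

    def place(objective: Objective, tiers: list[StorageTier]) -> StorageTier:
        """Pick the lowest-cost tier that satisfies the objective."""
        candidates = [t for t in tiers
                      if t.latency_ms <= objective.max_latency_ms
                      and t.replicas >= objective.min_replicas]
        if not candidates:
            raise ValueError("no storage tier satisfies the objective")
        return min(candidates, key=lambda t: t.cost_per_gb_month)

    tiers = [
        StorageTier("all-flash NFS", latency_ms=1,  replicas=2, cost_per_gb_month=0.30),
        StorageTier("hybrid NFS",    latency_ms=5,  replicas=2, cost_per_gb_month=0.10),
        StorageTier("cloud S3",      latency_ms=50, replicas=3, cost_per_gb_month=0.02),
    ]

    print(place(Objective(max_latency_ms=10,  min_replicas=2), tiers).name)  # hybrid NFS
    print(place(Objective(max_latency_ms=100, min_replicas=3), tiers).name)  # cloud S3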

Modern enterprise applications are some combination of mission-critical, virtualized, scale-out, microservice-architected, containerized, and geo-fenced, which usually means IT is never quite sure where data lives or how to optimize it. So enterprises over-provision and over-spend at every stage of the lifecycle to use and protect their most important asset -- data.

Data migration is one example, typically taking months of planning as IT performs acrobatics to minimize disruption or downtime when moving between storage systems. Even so, IT often has to cut off access to data: halting applications, manually copying data to the new storage, reconfiguring, and then restarting applications.

DataSphere eases or eliminates these common migration issues. Once DataSphere is in place, organizations no longer face the multi-step headache of minimizing disruption or downtime when migrating storage systems: data moves to new resources while applications keep running against the same logical view, as the sketch below illustrates.
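The sketch below (again hypothetical Python, not the DataSphere implementation) illustrates the indirection that makes this possible: applications resolve a stable logical path through a metadata catalog, so the physical location of the data can change without the application stopping or being reconfigured.

    # Hypothetical sketch -- not the DataSphere implementation. A metadata
    # catalog maps the application's logical view of data to its current
    # physical location, so migration only has to update the mapping.
    catalog = {
        "/projects/genomics/run42.dat": "nfs://filer-old/vol1/run42.dat",
    }

    def resolve(logical_path: str) -> str:
        """The application's view: a stable logical path, wherever the data lives."""
        return catalog[logical_path]

    def migrate(logical_path: str, new_location: str) -> None:
        """Copy the data to the new storage (elided here), then switch the mapping."""
        catalog[logical_path] = new_location

    print(resolve("/projects/genomics/run42.dat"))   # nfs://filer-old/vol1/run42.dat

    # Storage refresh: the data moves, but the logical path the application
    # uses never changes, so nothing has to be halted or reconfigured.
    migrate("/projects/genomics/run42.dat", "s3://archive-bucket/genomics/run42.dat")
    print(resolve("/projects/genomics/run42.dat"))   # s3://archive-bucket/genomics/run42.dat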

Key Features

  • Heterogeneous storage (NFS and S3) in a global scale-out namespace
  • Separation of metadata from the data for accelerated performance
  • Non-disruptive file/data migrations

Additional Resources

  • Introduction to DataSphere
  • Datasheet: DataSphere for the Enterprise
  • Demo: Non-disruptive Data Migration
