Storage Vendor News
Hu Yoshida's blog
Gartner has always differentiated object storage from distributed file systems, and has published separate Critical Capabilities reports for each. The last Critical Capabilities for object storage was published March 31, 2016. In this report, written by Gartner analysts Arun Chandrasekaran, Raj Bala and Garth Landers, Gartner recommends that readers “Choose object storage products as alternatives to block and file storage when you need huge scalable capacity, reduced management overhead and lower cost of ownership.” The use cases for object storage and for block and file systems are quite different.
This report clearly showed Hitachi Vantara’s HCP in a leadership position for object storage.
Then in October of 2016, Gartner combined object storage and distributed file systems into one Magic Quadrant (MQ) report with the rationale that they are both scale-out storage systems. However, they still recognized the difference in these two technologies in their report.
“Distributed file system storage uses a single parallel file system to cluster multiple storage nodes together, presenting a single namespace and storage pool to provide high bandwidth for multiple hosts in parallel. Data is distributed over multiple nodes in the cluster to deliver data availability and resilience in a self-healing manner, and to provide high throughput and capacity linearly.”
“Object storage refers to devices and software that house data in structures called "objects," and serve clients via RESTful HTTP APIs, such as Amazon Simple Storage Service (S3) and OpenStack Swift.”
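As a concrete illustration of that access model, the sketch below builds S3-style HTTP requests in Python. The endpoint, bucket, and key names are hypothetical, and a real deployment would also add authentication headers (for example, AWS Signature Version 4); this only shows the shape of the RESTful, whole-object-by-key pattern.

```python
# Minimal sketch of the object-storage access pattern: clients address
# whole objects by key over RESTful HTTP, rather than by blocks or file
# paths. Endpoint, bucket, and key names below are hypothetical.
from urllib.request import Request

ENDPOINT = "http://object-store.example.com"  # hypothetical S3-compatible endpoint


def put_object(bucket: str, key: str, data: bytes) -> Request:
    """Build an S3-style PUT request: one HTTP call stores one whole object."""
    return Request(f"{ENDPOINT}/{bucket}/{key}", data=data, method="PUT")


def get_object(bucket: str, key: str) -> Request:
    """Build the matching GET; objects are retrieved whole, by key."""
    return Request(f"{ENDPOINT}/{bucket}/{key}", method="GET")


req = put_object("sensor-archive", "2017/11/reading-001.json", b'{"temp": 21.5}')
```

Note that there is no seek or partial-update semantics here: the object is the unit of storage, which is what lets these systems scale out so simply.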
To place in the Leaders Quadrant, a vendor needs to have both an object storage system AND a distributed file system. Since only our HCP object storage system was evaluated, HCP was relegated to the Challengers Quadrant, where we were the only vendor. All the other vendors, aside from the three in the Leaders Quadrant, are in the Niche Players or Visionaries Quadrants.
On October 17th, Gartner published their second annual "Gartner Magic Quadrant for Distributed File Systems and Object Storage" report, and we are still the only vendor in the Challengers Quadrant. It is important to note that the Magic Quadrant report is NOT a stand-alone assessment of object store. As the title states, this is a vendor-level analysis based on the COMBINATION of an Object Storage and Distributed File Systems (aka, scale-out NAS) offering.
A new Critical Capabilities for object storage is expected to be published in early 2018. That report will be a more accurate way to evaluate the leadership in object storage systems. We would expect to rank much higher due to the addition of geo-distributed erasure coding and other functionalities in HCP, as well as the addition of Hitachi Content Intelligence to the HCP portfolio.
Top IT Trends for 2018: Part 1
Thu, 16 Nov 2017
2017 was a watershed year for digital transformation. It wasn’t a year of major technology breakthroughs, but it was a year in which many of us began to change the way we use technology. Cloud adoption increased and more applications were being developed for it. Increasingly, corporate executives were more committed to and investing in digital transformation projects; early indications are that we have stopped the decline in productivity and are on an upturn.
For my 2018 IT trend predictions, I’ve decided to focus more on the operational changes I believe will affect IT, rather than changes in technologies like flash drives. Over the next four weeks, I will be posting my predictions under the following groupings:
- Preparing IT for IoT
- IT must do more than store and protect data
- Get ready for new data types
- Methodologies for IT Innovation
These are my own prognostications and should not be considered as representing Hitachi’s opinion.
Preparing IT for IoT
Prediction 1: IT will adopt IoT platforms to facilitate the application of IoT solutions.
The application of IoT (Internet of Things) solutions can deliver valuable insights to support digital transformation and is rapidly becoming a strategic imperative in almost every industry and market sector. To achieve this, IT must work closely with the operations side of the business to focus on specific business needs and define the scope of an IoT project. IoT is an opportunity that can benefit all industries, whether a highly automated manufacturer or a more manually oriented business like agriculture, which can benefit from timely, connected information about weather, soil conditions, equipment maintenance, etc.
Building IoT solutions that provide real value can be difficult without the right underlying architecture and a deep understanding of the business needed to properly simulate and digitalize operational entities and processes. This is where the selection of an IoT platform and the choice of an experienced services provider are important.
IT will be challenged to acquire the skills to build this platform if it has to be developed from scratch. Utilizing a purpose-built IoT platform like Hitachi’s Lumada will speed up time to value and free IT teams to focus on the final business outcomes. Depending on the complexity of the project, the platform might be implemented as an appliance, or as a distributed platform spanning edge, gateway, core and cloud. Evaluate the available IoT platforms from experienced vendors before you commit the time and resources to build your own.
Prediction 2: Movement to the next level of virtualization with containers.
IoT applications are designed to run in a cloud-like environment for scalability and agility. Container-based virtualization is designed for the cloud and will gain wide acceptance in 2018.
Containerization is an OS-level virtualization method for deploying and running distributed applications on a single bare-metal or VM operating system host. It is the next step up from virtual machines (VMs): where a traditional VM abstracts an entire device, including the OS, a container consists only of the application and the dependencies that application needs. This makes it very easy to develop and deploy applications. Monolithic applications can be rewritten as microservices and run in containers for greater agility, scale, and reliability.
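One way to picture the microservice-per-container model is a small, single-purpose process that ships with only its own dependencies. The sketch below, using only the Python standard library, shows such a service; the service name, payload, and port are hypothetical.

```python
# A sketch of the microservice idea: one small, focused process that, in
# a container, would be the entire workload. Names and port are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json


def health_payload():
    """The service's entire business logic: one small, focused function."""
    return {"status": "ok"}


class HealthService(BaseHTTPRequestHandler):
    """A single-purpose HTTP service: reports health, nothing else."""

    def do_GET(self):
        body = json.dumps(health_payload()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


# Inside a container, this one process is the whole workload; the
# container image carries only Python and this file:
# HTTPServer(("0.0.0.0", 8080), HealthService).serve_forever()
```

Because each such service carries nothing but its own code and dependencies, it can be started, stopped, scaled, and replaced independently of the rest of the application.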
Everything at Google, from Gmail to YouTube, runs in containers; each week Google spins up over two billion of them. Almost every public cloud platform uses containers. The level of agility and scalability we see in the public cloud will be required in all enterprise applications if we hope to meet the challenges of exploding data and information in the age of IoT.
Hitachi has adopted containers in all of its new software platforms and is rapidly converting key legacy platforms to containers, not only to realize the benefits of containers in our own operations, but to facilitate the use of containers by our customers. Our Lumada IoT platform is built on containers and microservices to ensure that it can scale and remain open to new developments. We also provide a VSP/SVOS plugin to provision persistent VSP storage for containers. (For more information on our use of containers, see my previous blog post.)
Prediction 3: Analytics and Artificial Intelligence
One of the primary objectives of IoT platforms is to gather data for analysis, which can then be learned from and automated through AI. In 2018, we will see more investment in analytics and AI across the board as companies see real returns on their investments.
According to IDC, revenue growth from information-based products will double the rest of the product/services portfolio for a third of Fortune 500 companies by the end of 2017. Data monetization will become a major source of revenue as the world creates 163 zettabytes of data in 2025, up from 10 zettabytes in 2015. IDC also forecasts that more than a quarter of that data will be real-time in nature, with IoT data making up more than 95 percent of it.
Preparing a wide range of data for analytics is a struggle that affects both business analysts and the IT teams that support them. Studies show that data scientists spend 20% of their time collecting data sets and 60% cleansing and organizing the data, leaving only 20% for the actual analysis. Organizations are increasingly focused on incorporating self-service data preparation tools in conjunction with data integration, business analytics, and data science capabilities. The old adage “GIGO” (garbage in, garbage out) applies to analytics. The work starts with data engineering (cleansing, shaping, transforming, and ingesting data) and data preparation (refining, blending, and enriching it) before analysts can build, score, model, visualize, and analyze.
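The cleanse-then-transform steps described above can be sketched in a few lines of standard-library Python. The field names and sample readings here are hypothetical; the point is simply that bad records are dropped and raw strings are shaped into typed values before any analysis runs.

```python
# A minimal sketch of data preparation: cleanse (drop garbage records),
# then transform (shape raw strings into typed values), then analyze.
# Field names and sample rows are hypothetical.
raw_rows = [
    {"sensor": "A1", "temp": " 21.5 "},
    {"sensor": "A1", "temp": ""},        # missing reading: garbage in...
    {"sensor": "B2", "temp": "19.0"},
]


def cleanse(rows):
    """Drop rows with missing readings; keep only usable records."""
    return [r for r in rows if r["temp"].strip()]


def transform(rows):
    """Shape string fields into typed values ready for analysis."""
    return [{"sensor": r["sensor"], "temp": float(r["temp"])} for r in rows]


prepared = transform(cleanse(raw_rows))

# Only now is the actual analysis meaningful:
avg_temp = sum(r["temp"] for r in prepared) / len(prepared)
```

Self-service preparation tools automate exactly this kind of pipeline at scale, which is why they free analysts from the 80% of time currently spent before the analysis itself.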
AI has become mainstream, with consumer products like Alexa and Siri. Hitachi believes that it is the collaboration of AI and humans that will bring real benefits to society. Through tools like Pentaho Data Integration, our aim is to democratize the data engineering and data science process and make machine intelligence (AI + ML) accessible to a wider variety of developers and engineers. Tooling like the Pentaho Data Science Pack, with its R and Python connectivity, is a step in that direction. Hitachi’s Lumada IoT platform enables scalable IoT machine learning with flexible input (Hadoop, Spark, NoSQL, Kafka streams), standardized connections that can automatically configure and manage resources, and flexible output (databases, dashboards, alert emails, and text messages); in addition to Pentaho analytics, it is compatible with Python, R, and Java for machine learning.
This is an area where IT departments will need to learn new languages and toolsets for data preparation, analytics and AI. Integrated tools, like Hitachi’s Pentaho, will greatly reduce the learning curve and the effort.
In my next post, I will look at how data requirements are expected to shift in 2018, and the tools needed to address the coming changes.