Today, HPE announced it will pay $650 million to acquire SimpliVity. HPE paid this sizeable sum mainly to acquire OmniStack, SimpliVity’s data virtualization software platform. OmniStack’s central value proposition is OPEX reduction through the elimination of traditional infrastructure management tasks. In short, SimpliVity enables customers to manage their infrastructure at the virtual machine (VM) level. Please note that a separate EMA Impact Brief will cover this topic in much more detail.
Customers in 2017 will demand solutions that reclaim today’s massive CAPEX and OPEX waste, which amounts to over 50% of their total IT spend. All of our 2017 predictions derive directly from the rapidly increasing pressure to reclaim these resources and leverage them for direct business advantage in today’s highly competitive and fast-moving marketplace.
In an ideal world, customers would be able to take full advantage of the benefits of hybrid cloud by rationally matching infrastructure parameters (cost, performance, reliability, availability, security, regulatory compliance, scalability) with the requirements and dependencies of each application.
As Evan and I were ranting last week about how OpenStack and VMware fit together (see #EMACloudRants), we were mainly focused on the central conundrum that VMware faces in this context: “Should we support an open platform that could commoditize away a substantial part of our profitable infrastructure business, or should we ignore the threat and do our own thing?”
As promised in my previous post, “Software Defined Storage – Why Customers Should Care,” I want to follow up with a brief overview of the competitive landscape.
Evan Quinn and I have been collecting popular customer questions for a while and want to share our thoughts on them in a new format: EMA CLOUD RANTS. Each week we will discuss one of the hot topics in enterprise IT to provide viewers with rapid analyst insights, without any fluff. Here is the first one:
Of course, I always encourage practitioners to carefully study the full EMA research report, “Obstacles and Priorities on the Journey to the Software-Defined Data Center,” or at least read the research study summary, or at the very least join the EMA SDDC Research webinar on February 18. Still, I want to briefly summarize the key findings here.
What does Big Data mean to traditional enterprise IT? Organizations of every size and industry are becoming increasingly aware of the importance of capturing, managing, and analyzing the data available to them. The more comprehensively companies can tap structured and unstructured data sources, the more quickly they can refresh this data, and the more successfully they can make it available to all business units, the better they can develop advantages in the marketplace. Today’s business units demand the rapid implementation of these big data use cases, as well as optimal resiliency, cost efficiency, security, and performance.
The marketing hype and heated discussions surrounding Software Defined Storage (SDS) are excellent indicators that it is one of the hottest topics in today’s data center. Naturally, every vendor defines SDS based on its own product range, sometimes leaving customers out of the equation.
As it does every year, IBM invited the analyst community to Stamford, CT, for a deep dialogue on today’s most important topics in enterprise IT. Here is a short overview for everyone interested in IBM’s current worldview.