IBM’s Differentiation in 10 Quotes from Think 2018 – Let’s Put Smart to Work

Mar 27, 2018 7:29:10 AM

Here are the 10 quotes that best sum up #Think2018. Inspired by Think 2018, I came up with my own ideas for AI bots and posted them here.

“Building a platform that brings data and AI together to take the organization to the next level,” Ginni Rometty, CEO: Instead of competing with infrastructure vendors and hyperscale clouds, IBM focuses fully on combining cloud, data, and AI into platforms that deliver individualized user experiences based on the end user’s intent, individual background, and role in the organization, as well as external contextual factors. The difficult stuff is where differentiation can be found, and IBM is definitely focusing on the difficult stuff.

“All processes can be made better through AI,” Ginni Rometty, CEO: This statement is as true as it is important, and it captures IBM’s vision of infusing every aspect of business and IT with AI to ensure optimal decision making based on business intent and relevant situational factors. Too many business processes were designed around dated beliefs and implicit assumptions, leading to inefficiencies, frustration, and unhappy customers.

Every process can be improved through AI.

“Artificial intelligence enhances human employees. It does not replace them,” John E. Kelly III, SVP Cognitive Solutions: The key purpose of AI is to transform the approximately 80% of data that goes untapped today into actionable knowledge that makes employees more productive. The goal is to augment reality through real-time, AI-driven data analysis for optimal decision making and to cut tedious manual tasks out of an employee’s daily routine.

“Data is the precondition for successful AI. Trust is key for obtaining access to data,” David Kenny, SVP Watson & IBM Cloud: IBM’s freshly released Cloud Private for Data aims to fully abstract data governance from the algorithms, applications, and APIs consuming and processing this data. This architectural principle of centrally enforcing compliance and security for the entire data access layer prevents leaks and breaches by design. Only when data owners trust their company’s data architecture will they be willing to offer up their data for AI-driven analysis. This has become even more true after the recent Facebook data scandal.
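The principle of decoupling governance from data consumers can be illustrated with a minimal sketch. All names, datasets, and policy rules below are hypothetical, not IBM Cloud Private for Data APIs; the point is simply that every read passes through one enforcement point before any application, algorithm, or API sees the data.

```python
# Minimal sketch of a centralized data-access layer: every consumer goes
# through a single policy enforcement point instead of reading data directly.
# Dataset names, roles, and policy rules are illustrative, not IBM APIs.

DATASETS = {
    "customer_emails": {"classification": "pii", "rows": ["a@x.com", "b@y.com"]},
    "clickstream":     {"classification": "public", "rows": ["pageview", "click"]},
}

POLICY = {
    # role -> classifications that role is allowed to read
    "data_scientist":     {"public"},
    "compliance_officer": {"public", "pii"},
}

def read_dataset(name: str, role: str):
    """Single enforcement point: applications never touch DATASETS directly."""
    dataset = DATASETS[name]
    if dataset["classification"] not in POLICY.get(role, set()):
        raise PermissionError(f"{role} may not read {name}")
    return list(dataset["rows"])

print(read_dataset("clickstream", "data_scientist"))   # allowed by policy
# read_dataset("customer_emails", "data_scientist")    # would raise PermissionError
```

Because compliance lives in one place rather than in each consuming application, changing a rule in `POLICY` immediately governs every access path, which is the "by design" part of the architecture described above.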

The IBM Cloud aims to onboard all applications and enhance them through AI.

“IBM’s one cloud architecture aims to onboard any application, no matter how old or difficult,” David Kenny, SVP Watson & IBM Cloud: Recent EMA research has shown that customers want policy-driven deployment, unified management and security layers, and the ability to enable their existing IT operations staff to operate bare metal, VMs, containers, PaaS, and FaaS. Whether an application should live in the data center on a VM or on a public container service should depend only on where it can be operated and used most efficiently. It is important to note that this differs from one company to the next, as operational efficiency and compliance depend on the skills, experience, and capacity of corporate IT staff and tooling. Long story short, I fully agree with IBM’s vision of a unified architecture for everything from mainframe apps to microservices.
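Policy-driven placement across bare metal, VMs, containers, PaaS, and FaaS can be sketched as a simple decision function. This is a toy illustration with made-up rules, not an actual IBM or EMA model: the target platform falls out of the workload's compliance needs and the skills the operations team actually has.

```python
def place_workload(workload: dict, team_skills: set) -> str:
    """Toy placement policy: pick the most 'modern' platform the workload's
    compliance profile allows and the operations team can actually run.
    All rules here are illustrative assumptions."""
    # Compliance first: regulated data stays on dedicated infrastructure.
    if workload.get("regulated"):
        candidates = ["bare_metal", "vm"]
    else:
        candidates = ["faas", "container", "paas", "vm", "bare_metal"]
    # Operational efficiency second: only platforms the team can operate.
    for platform in candidates:
        if platform in team_skills:
            return platform
    raise ValueError("no platform satisfies both policy and team skills")

print(place_workload({"name": "legacy-erp", "regulated": True},
                     {"vm", "container"}))           # -> vm
print(place_workload({"name": "image-resizer"},
                     {"vm", "container", "faas"}))   # -> faas
```

The same workload lands differently at different companies because `team_skills` differs, which matches the observation that operational efficiency depends on the staff and tooling each IT organization has.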

Containers are an important tool, but they should not dominate the business conversation: This is not a quote, but a key observation from this year's Think 2018. IBM built its cloud on Kubernetes and leverages containers as an efficient means of delivering business services and entire application runtime environments in a modular manner. However, IBM makes the right choice in leaving the marketing arms race over who has created the best container management platform to its competitors and instead focuses on building AI-based business value on top of Kubernetes. Customers ultimately only care about running their enterprise software and AI systems in an environment that offers the desired balance between cost and risk. Whether this happens on containers or the mainframe is just a technical detail.

“Vertical expertise is key in ML/AI”: Because training ML/AI models is costly, slow, and effort-intensive, and because ML/AI still requires very domain-specific training data today, IBM came up with Watson Data Kits: vertical, industry-specific data sets that are accessible via REST API and can be used to train ML/AI models.
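IBM has not published the exact API surface in this post, so the following sketch only illustrates the general shape of requesting a vertical data set over REST; the endpoint path, query parameters, and header names are all hypothetical, not a documented Watson Data Kits API.

```python
# Sketch of fetching a vertical training data set over REST.
# The URL layout, parameters, and auth scheme are assumptions for
# illustration only, not IBM's actual Watson Data Kits interface.
from urllib.parse import urlencode
from urllib.request import Request

def build_data_kit_request(base_url: str, industry: str, api_token: str) -> Request:
    """Construct (but do not send) a GET request for an industry data set."""
    query = urlencode({"industry": industry, "format": "json"})
    return Request(
        f"{base_url}/v1/data-kits?{query}",
        headers={"Authorization": f"Bearer {api_token}"},
    )

req = build_data_kit_request("https://api.example.com", "travel", "TOKEN")
print(req.full_url)  # https://api.example.com/v1/data-kits?industry=travel&format=json
```

The appeal of this model is that a team can pull pre-curated, domain-specific training data with one authenticated call instead of assembling and cleaning it themselves.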

“Democratizing AI is critical”: Today, enterprises that can afford the most data scientists hold a significant advantage when it comes to leveraging AI. But even organizations that have managed to hire a large number of data scientists are unable to exploit AI anywhere near its full potential. This makes intuitive sense if you believe Rometty’s statement that each and every business process can and should be enhanced through AI. Not only are there thousands of processes, but there are also thousands of staff members typically needed to determine how to use AI to enhance the parts of the processes they are involved in. IBM has correctly recognized the massive opportunity that lies in enabling businesses to build their own AI applications based on the internal and external data sources they are familiar with.

Mainframe, bare metal, VM, container, PaaS, FaaS, SaaS - David Kenny wants all your workloads on the IBM Cloud

“Power9 and Quantum are IBM’s horses in the ML/AI hardware race”: Bob Picciano, SVP Cognitive Systems, claims that IBM Power9 infrastructure completed the Clickstream AI benchmark in 91 seconds, while Google required 70 minutes. Power9 can be tried as a service on the IBM Cloud, which makes it a low-risk proposition to see whether it makes sense for your ML/AI training tasks.



Written by Torsten Volk

With over 15 years of enterprise IT experience, including a two-and-a-half-year stint leading ASG Technologies' cloud business unit, Torsten returns to EMA to help end users and vendors leverage the opportunities presented by today's hybrid cloud and software-defined infrastructure environments in combination with advanced machine learning. Torsten specializes in topics that lead the way from hybrid cloud and the software-defined data center (SDDC) toward a business-defined concept of enterprise IT. Torsten spearheads research projects on hybrid cloud and machine learning combined with an application- and service-centric approach to hyperconverged infrastructure, capacity planning, intelligent workload placement, public cloud, open source frameworks, containers and hyperscale computing.
