For centuries, electricity was thought to be the domain of sorcerers – magicians who left audiences puzzled about where it came from and how it was generated. Benjamin Franklin and his contemporaries were well aware of the phenomenon, yet even after he proved the connection between electricity and lightning in 1752, he had difficulty envisioning a practical use for it. In fact, his most prized invention had more to do with avoiding electricity – the lightning rod. All new innovations go through a similar evolution: dismissal, avoidance, fear, and perhaps finally acceptance.


Today, too many people view artificial intelligence (AI) as another magical technology that’s being put to use with little understanding of how it works. They view AI as special, the preserve of experts who have mastered it and dazzle us with it. In this environment, AI has taken on an air of mysticism, full of grand promises and seemingly out of the reach of mere mortals.

The truth, of course, is that there is no magic to AI. The term Artificial Intelligence was first coined in 1956, and since then the technology has progressed, disappointed, and re-emerged. As with electricity, AI breakthroughs will come through mass experimentation. While many of those experiments will fail, the successful ones will have substantial impact.

That’s where we find ourselves today. As others, like Andrew Ng, have suggested, AI is the new electricity. In addition to becoming ubiquitous and increasingly accessible, AI is enhancing and altering the way business is conducted around the world. It is enabling increasingly accurate predictions and automating business processes and decision-making. The impact is vast, ranging from better customer experiences to intelligent products and more efficient services. And in the end, the result will be economic impact for companies, countries, and society.

To be sure, organizations that drive mass experimentation in AI will win the next decade of market opportunity. To break down and help demystify AI, one needs to consider two key elements of the category: the componentry and the process. In other words, what’s behind it and how it can be adopted.

The Componentry

Much like electricity was driven by basic components such as resistors, capacitors, diodes, etc., AI is being driven by modern software componentry:

  1. A unified, modern data fabric. AI feeds on data, and therefore data must be prepared for AI. A data fabric acts as a logical representation of all data assets, on any cloud. It pre-organizes and labels data across the enterprise, and it provides seamless, virtualized access to all of that data, from the firewall to the edge.
  2. A development environment and engine. A place to build, train, and run AI models. This enables end-to-end deep learning, from input to output. Machine learning models help find patterns and structures in data that are inferred, rather than explicit (a brief sketch of this idea follows the list). This is when it starts to feel like magic.
  3. Human features. A mechanism to bring models to life, by connecting models and applications to human features like voice, language, vision, and reasoning.
  4. AI management and exploitation. This enables you to insert AI into any application or business process while understanding versions, what has changed, how to improve impact, and where bias and variance arise. This is where your models live once deployed, and it enables lifecycle management of all your AI. Lastly, it offers proof and explainability for decisions made by AI.
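
To make the distinction between inferred and explicitly programmed behavior concrete, here is a minimal sketch in Python. It assumes a standard environment with scikit-learn installed and uses synthetic stand-in data rather than an enterprise data fabric; the point is only that the model learns its decision rule from labeled examples instead of being handed one.

```python
# Minimal sketch: a model infers a decision rule from labeled examples,
# rather than being explicitly programmed with one.
# Assumes Python with scikit-learn installed; the data here is synthetic.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for data that would normally be served up by a data fabric.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Training: the "rules" are learned weights, never written by hand.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The learned pattern generalizes to data the model has never seen.
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
print("Sample predictions:", model.predict(X_test[:5]))
```

The development environment and engine described above exist to run this same build, train, and evaluate loop at enterprise scale.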

The Process

With these components in hand, more organizations are unlocking the value of data. But to fully leverage AI, we must also understand how to adopt and implement the technology. For those planning the move, consider these fundamental steps first:

  1. Identify the Right Business Opportunities for AI. The potential areas for adoption are vast: customer service, employee/company productivity, manufacturing defects, supply chain spending, and many more. Anything that can be easily described can be programmed. Once it’s programmed, AI will make it better. The opportunities are endless.
  2. Prepare the Organization for AI. Organizations will require greater capacity and expertise in data science. Many of today’s repetitive and manual tasks will be automated, which will evolve the role of many employees. It’s rare that an entire role can be done by AI. But it’s also rare that none of the role could be enhanced by AI. All technology is useless without the talent to put it to use, so build a team of experts that will inspire and train others.
  3. Select Technology & Partners. While it’s unlikely that the CEO will personally select the technology, the implication here is more of a cultural one. An organization should adopt many technologies, comparing, contrasting, and learning through that process. An organization should also choose a handful of partners that have both the skills and technology to deliver AI.
  4. Accept Failures. If you try 100 AI projects, 50 will probably fail. But the 50 that work will more than compensate for the failures. The culture you create must be ready and willing to accept failures, learn from them, and move on to the next. Fail fast, as they say.

AI is becoming as fundamental as electricity, the internet, and mobile were when they entered the mainstream. Not having an AI strategy in 2019 will be like not having a mobile strategy in 2010, or an internet strategy in 2000.

Let’s hope that when you look back at this moment in history, you can do so fondly, as someone who embraced data as the new resource and AI as the utility to harness it.

______________________________________________

A version of this story first appeared on InformationWeek.

 
