Over the last several decades, the evolution of artificial intelligence has followed an uncertain path – reaching incredible highs and new levels of innovation, often followed by years of stagnation and disillusionment as the technology fails to deliver on its promises.
Today we are once again experiencing growing interest in the future possibilities for AI. From voice-powered personal assistants like Google Home and Alexa, to Netflix’s predictive recommendations, Nest learning thermostats and the chatbots used by banks and retailers, there are countless examples of AI seeping into everyday life, and the potential of future applications seems limitless … Again.
Despite the mounting interest and the proliferation of new technologies, is this current wave that much different from what we have seen in the past? Do the techniques of the modern AI movement – machine learning, data mining, deep learning, natural language processing and neural nets – deserve to be captured under the AI moniker, or is it just more of the same?
In earlier peaks of interest, the broad set of activities typically bunched together under the term ‘AI’ was reserved for the labs and, if those efforts ever saw the light of day, they were severely constrained both by what the technology of the day could deliver and by its cost. Many of the algorithms and structures central to AI have been known for some time; rather, previous surges of AI set unrealistic expectations of immediate consumer applications that could never be met given the limitations of the data and techniques available at the time.
However, within the last five years, the combination of enormous amounts of data, improvements in database technology to effectively utilize it, and increases in computing horsepower to process it has enabled AI to move from mainly scientific and academic usage to enterprise software usage, becoming viable as a mainstream business solution.
This time around, the AI movement seems to have tailwinds in the form of a few critical enabling and supporting factors:
- Technology and computing horsepower driving AI capability at the right (aka low) price point
- The availability of platforms from major players in the field like Google, Microsoft, Elon Musk’s OpenAI, Amazon, etc.
- Mainstreaming of practice – slowly building critical mass of practitioners who leverage these platforms
- Mainstream customer interest and demand reflected in real world ‘everyday’ use cases – data security, computer assisted diagnosis in healthcare, fraud detection, purchase prediction, smart home devices and more
- Increasing mass of data that is waiting to be exploited, which cannot be done solely by human means
- Changed customer expectations about what technology can do, which in turn drives further, sustained innovation (e.g., Alexa)
As the tide is turning for AI, innovation- and technology-driven corporations and their leaders – think IBM, Yahoo, Salesforce, Uber and Apple – have become believers in the power of AI and are willing to commit long-term funds to this pursuit. The desire to inject new technology into their operations to drive corporate efficiency or improve workflows (both customer-facing and back-office) has convinced many large corporations that this new iteration of AI is worth investigating and worth funding, both through acquisitions and through investments in startups that innovate independently.
In addition, tech heavyweights Google, Facebook, Amazon, IBM and Microsoft recently joined forces to create a new AI partnership dedicated to advancing public understanding of the sector, as well as coming up with standards for future researchers to abide by.
With so much support from these titans of industry, it’s no wonder that the latest burst of AI interest seems to be gaining momentum rather than losing it. But are the techniques used today truly what is meant by AI?
As is typically the case in questions of technology and business, the answer is yes and no. Just as there are varying levels of complexity in other areas of technology (consider the range of databases from simple to complex, from SQL to NoSQL; or the range of programming languages: LOGO, BASIC, C, Perl, Swift, R), there are many technologies and techniques that naturally fall under the moniker ‘AI.’
AI as a technology is nebulous. Would machine learning be possible without access to large amounts of data from a traditional SQL or a cutting-edge NoSQL environment? Could an AI package be effectively used without modern concepts of APIs and REST services?
In my opinion, all of the tools commonly covered and discussed today are a part of the larger AI family of technologies that are going to drive the next generation of consumer, corporate and government solutions.
On the other hand, you have to remember that true “artificial intelligence” won’t arrive anytime soon – at least not in any form that can act independently of human intervention. A true AI system would be able to learn on its own, making connections and improving on past scenarios without relying on programmed algorithms to improve its capabilities. This is, thankfully, still the realm of science fiction.
What is called AI even today is, in fact, the leveraging of machines with minimal – though not zero – human intelligence to solve specific, narrow problems. Humans still have the upper hand, as machines cannot think on their own and rely on human intervention (through code) and past data in order to work. They can be better than humans at spotting patterns we would miss and at finding similarities between objects, but only through sheer horsepower. Even today’s state-of-the-art systems cannot invent something totally new or independently address a problem they have never encountered before.
Most of what passes for AI today is the sophisticated application of statistical techniques (most of them invented in the past four to five decades) to data, not ‘real’ intelligence. Please note, however, that this designation does not detract from the immense capabilities afforded by the newfound resurgence of AI. It may not be fundamentally “intelligent,” but it is no less useful and impressive.
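To make that point concrete, consider a toy nearest-neighbor classifier, sketched below in plain Python (the function name and data are hypothetical, purely for illustration). It “learns” nothing in a human sense: it simply compares a new input against stored past examples and copies the label of the closest one – pattern matching driven by computation, exactly the kind of statistical technique described above.

```python
import math

def nearest_neighbor(train, point):
    """Classify `point` by copying the label of the closest past example.

    `train` is a list of ((x, y), label) pairs -- the "past data" the
    machine depends on; there is no reasoning, only distance comparison.
    """
    best_label, best_dist = None, math.inf
    for (x, y), label in train:
        dist = math.hypot(x - point[0], y - point[1])
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical past observations forming two clusters.
history = [((1, 1), "A"), ((2, 1), "A"), ((8, 9), "B"), ((9, 8), "B")]

print(nearest_neighbor(history, (1.5, 1.2)))  # prints "A"
print(nearest_neighbor(history, (8.5, 8.5)))  # prints "B"
```

Real systems use far more elaborate models, but the principle is the same: without past data to compare against, the program has no way to answer at all.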
While the core technologies of AI are similar to those of prior years and the term AI has become something of a catchall for a variety of different techniques, the biggest difference – and perhaps what will spur future cycles of interest – is the thirst for and commitment to more from both corporations and consumers. With continued funding, research and interest in AI, and with advances in the tools and techniques needed to capitalize on them, perhaps one day we will finally witness the emergence of true, independent AI.
This article was written by Terence Davis, VP and Chief Architect, Incedo from NetworkWorld and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to firstname.lastname@example.org.