Top Seven Myths of AI
Blog: Jim Sinur
I think we have all realized that 2017 is the year of AI, or at least that we are out of the 35+ year AI winter. When AI first appeared on the big stage in the mid-to-late 80s, it over-promised and under-delivered for the most part. While there were spotty successes, AI was too hard and expensive to leverage. This time there are some great differences in the successful use of AI, but myths still abound. Here are my top seven myths of AI today.
AI is a general intelligence that mimics humans:
While this has always been the dream, the reality of successful use today revolves around specialty problems that involve specific knowledge, rules and constraints. This is a great way to focus on beneficial applications of AI to special problems that require fast learning and adaptation.
About half of all our jobs will disappear soon because of AI:
While it is true that certain jobs will be displaced, other jobs will be created; the same thing was said about computers. Yes, robots and AI will take some jobs away, but the majority of AI applications will help people be better at what they do, in a more efficient way.
AI is data, math, patterns and iteration only:
While a number of early AI problems used this formulaic approach, today we have natural language processing, knowledge, voice, image and video/vision approaches to AI. There are a growing number of approaches emerging that will cross-leverage and converge over a long period of time.
AI is only for the super-intelligent technology elite:
At one time, AI was programmed only by expensive, elite specialists. Today, however, machine learning, model-driven approaches and simple knowledge representation can be used as a starting point for iterative learning. Now we can all use chat bots.
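To make the accessibility point concrete: even without any AI platform, a few lines of plain Python can stand up a simple keyword-matching chat bot. The intents and replies below are hypothetical examples, a minimal sketch rather than a production design.

```python
# A minimal sketch of a rule-based chat bot using only standard Python.
# The intents and canned replies here are hypothetical examples.
def reply(message: str) -> str:
    message = message.lower()
    # Map a keyword (the "intent") to a canned answer.
    intents = {
        "balance": "Your balance is available in the mobile app.",
        "hours": "We are open 9am to 5pm, Monday through Friday.",
    }
    # Return the answer for the first keyword found in the message.
    for keyword, answer in intents.items():
        if keyword in message:
            return answer
    return "Sorry, I didn't understand that."

print(reply("What are your hours?"))
# -> We are open 9am to 5pm, Monday through Friday.
```

Real chat bot platforms replace the keyword lookup with trained language models, but the starting point is this approachable.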
AI is only for difficult or expensive problems:
AI is easily accessible today and embedded in a number of digital business platforms, so the overhead for creating a solution is much lower. As a developer, you can even find libraries of cognitive components (COGs) to leverage.
Algorithms are more important than data:
Yes, we love our algorithms, but data has embedded knowledge that can be learned from to create rules and processing optimizations. Mining data can teach us much about a problem domain. In fact, a large number of robotic programming approaches start with data.