Artificial intelligence · Software development

Is AI just another meaningless buzzword?

Richard Pridham Investor, President & CEO at Retina Labs

December 29th, 2017

AI seems to be the big thing nowadays. Everything is AI this or AI that. What I find remarkable is that some companies that have been around for a while doing pretty much the same thing are now calling themselves AI companies. I'm seeing this in several areas. One example is data analytics: we went from "big data" to "predictive analytics", and now many are referring to themselves as "AI" companies. What's changed here? The same development tools are being used. The applications are doing the same thing. Is AI being used far too flippantly? What technology truly merits the term AI? Is this definable in objective terms?

Rob Sherali Inventive, Creative, Structured, Tenacious - looking to do something that means something!

December 30th, 2017

Is the term AI being used too flippantly? It depends on what the conversation is. If the objective is to make decisions about whether to invest in 'AI' to solve a problem in your domain of focus, then yes, it probably is. It is important to know what you are trying to solve with it, or what opportunities you want to explore, before you open your wallet!


If the overall objective of using the word is to loosely discuss "computers doing something that mimics humans in some way", then I guess it doesn't really matter.


As for the specifics of AI, I think this covers it quickly and nicely: https://www.youtube.com/watch?v=2ePf9rue1Ao

Richard Hammond Quantum is logical. Organizations grow like gardens. Become a knowledge farmer.

January 8th, 2018

Hi Richard. A couple of ideas. Firstly, the term “AI” is such a great marketing term. As another example, we can see the same thing a bit with VR and AR. Augmented reality seems to me to be taking the lead, not because the underlying technologies are merging, but because the term “AR” is a bit flashier. From a marketing standpoint, maybe it's because it starts with the nonthreatening “A” versus the very harsh “V”.


A bit more in the weeds: languages are fractal in nature, meaning in part that as ideas become more complex, the language provides attendant complexification. Specialists have more terms available. Perhaps in your company the term “telemedicine” is broken down into sub-types, but “telemedicine” is an already distributed term that is relatively well known. (And again, IMHO, the nice, soft “tele” and “mmmm” are the equivalent of bedside manner.)


Currently, the term “AI” is a good broadbrush term. However, the underlying technology is like fire to the cave person (even those that lived on the plains). It is great and it is terrible.


There may be only subtle differences between, say, big data and AI, but I think that as a technology it is raising many challenging questions. You may know the parable of the mechanical toys of days gone by: some were so good that people thought they were alive. AI takes up that history, but of course it is significantly more powerful.


If you're really interested, I could recommend Ray Kurzweil's idea of the Singularity, when machines become as smart as their creators... us! Again, I think the term “as smart as” is pretty broad, but four paragraphs seems close to abusive, so I'll stop there.


Thanks!

Erica Singh Patalysical AI developer

December 29th, 2017

It is. Most companies in India just see what's trending on Twitter and call themselves that.

Eugeniu Rotari

Last updated on January 8th, 2018

The author of the deep learning algorithm (invented 30 years ago, by the way) confirms that people tend to ascribe unrealistic capabilities to it; this article, with a few links to a published research paper in it, discusses some of the issues.

Saransh Sharma Product Developer

January 7th, 2018

See the distinctions between these terms:


1. Big data was the era when everyone was collecting data without really having any clue what to do with it.

2. Statistics then came into the picture, applying its machinery to that moving data, and some patterns emerged.

3. Predictive analytics is essentially pattern recognition over that data, but it came with a twist: it actually worked quite well (everyone started using it). More or less it is rule-based, a function (see the sketch below).

4. AI is all of these mixed together.
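A minimal sketch of the rule-based vs. learned distinction in point 3, purely for illustration. The churn feature, threshold and data below are invented, not taken from any product: the hand-written "predictive analytics" rule is fixed up front, while the second version derives the same kind of cutoff from historical data.

# Illustrative only: a hand-written rule vs. a function learned from data.
# The feature, threshold and history are invented for this example.

def rule_based_churn(days_since_last_login: int) -> bool:
    """Hand-coded 'predictive analytics': a fixed rule someone wrote down."""
    return days_since_last_login > 30

def fit_threshold(history):
    """Learn the same kind of rule from (days_since_last_login, churned) pairs
    by picking the cutoff that classifies the historical data best."""
    candidates = sorted({days for days, _ in history})
    def accuracy(cutoff):
        return sum((days > cutoff) == churned for days, churned in history) / len(history)
    return max(candidates, key=accuracy)

history = [(5, False), (12, False), (25, False), (40, True), (60, True)]
learned_cutoff = fit_threshold(history)
print(rule_based_churn(45), learned_cutoff)  # True, plus a data-derived cutoff (25 here)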

Akeem Famuyiwa Intellectual Property Consultant

December 29th, 2017

Hi Richard, I do not think AI in its entirety is a meaningless buzzword. The problem is how easy it is to reduce such words to buzzwords through oversimplification in daily use. "Innovation" was once in the same bracket. I see it this way: headline readers vs. practitioners. I have always used this approach to silence the noise and focus on the music.

Julian Garrett Founder @ Aliniant, Mobile telecommunications performance analytics company

December 29th, 2017

It's not a meaningless buzzword, but it has been abused pretty badly. If you look at the history of AI, you can trace it back to the '80s, when neural networks took off, and there was a somewhat similar buzz back then. DARPA is responsible for much of the recent hype with its work on vision systems. It's not actually hype; you can see the results pretty clearly. It works well enough for car manufacturers to deploy the systems in an environment where you may not get a second chance to get things right.

But that's just one application. And when you look at that application, it is a very narrowly focused system with a limited number of inputs and an output that controls three things: accelerator, brake and steering wheel (plus gearbox, lights, etc., but that is fairly minor). That's the one major breakthrough we've had. IBM Watson has made advances, especially in the field of medicine, and a lot of legal work has been analysed, but any real advances are fairly limited. It has been good at digesting data and categorising things, which is exactly what it was designed to do.

The cases where companies like Huawei and Apple say that they have AI in their phones: utter tripe. Complete bollocks. Something that looks at patterns in the way you normally do things, such as waking up at 7am on average (would you like an alarm for 7am?) or getting a text from your mates about a party (would you like me to set an alarm for that?), is not AI. At all. It recognises a dog as opposed to a cat. Woohoo. AI? Barely. And again, based on vision systems.
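To make that concrete, the kind of phone feature being marketed as AI is often little more than averaging and a canned prompt. A deliberately crude sketch, with invented times and an invented rule:

# Deliberately crude sketch of the kind of "smart" phone feature sold as AI:
# average the recent wake-up times and suggest an alarm. No learning in any deep sense.
from datetime import time

def suggest_alarm(wake_minutes_past_midnight):
    """Average the recent wake-up times and propose that as tomorrow's alarm."""
    avg = round(sum(wake_minutes_past_midnight) / len(wake_minutes_past_midnight))
    return time(hour=avg // 60, minute=avg % 60)

recent = [7*60, 7*60 + 10, 6*60 + 55, 7*60 + 5]   # roughly 7am each day
print(suggest_alarm(recent))                       # prints 07:02:00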

True AI, where you have contextualised learning and adaptive discrimination of events ("I haven't seen this before, but based on what I know, I'd say the problem is this"), is something we're miles away from.

There's a good presentation on YouTube on this from DARPA; just search for John Launchbury DARPA and it'll be the first one to appear.

My limited work on ML is in cellular networks - anomaly detection and performance management. Very slow going. We've made some progress but I'd hesitate to call it ML. It's more about algorithm selection and clustering of data - and in very limited roles. The one thing it has taught us is how we go about approaching the problem if we want to consider and combine a lot of use cases.
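For what it's worth, the "clustering of data" approach described above often boils down to something like the following sketch: cluster normal KPI samples, then flag anything that sits far from every cluster centre. This is only an illustration on made-up data and an assumed threshold, not a description of any particular system.

# Illustrative sketch: clustering-based anomaly flagging on invented KPI vectors.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# pretend KPIs: throughput (Mbps) and call success rate
normal = rng.normal(loc=[50.0, 0.99], scale=[5.0, 0.005], size=(500, 2))
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(normal)

# typical within-cluster distance on the training data
train_dists = np.linalg.norm(normal - model.cluster_centers_[model.labels_], axis=1)
typical = np.median(train_dists)

def is_anomalous(sample, k=4.0):
    """Flag a KPI vector whose nearest-cluster distance exceeds k times the
    typical within-cluster distance seen during training."""
    dists = np.linalg.norm(model.cluster_centers_ - sample, axis=1)
    return dists.min() > k * typical

print(is_anomalous(np.array([51.0, 0.99])))   # False: looks like normal traffic
print(is_anomalous(np.array([5.0, 0.40])))    # True: far from every cluster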


Justin T. Heck Owner @ GetFire.net

December 29th, 2017

Machine learning is pretty huge, but it has to be tuned for every application. Nothing out there comes close to the traditional sci-fi idea of AI.

Seems like a good term to help juice VCs though.

Roajer Gilbert Entrepreneur, Like to connect with highly motivated people

December 30th, 2017

AI solves an automation problem, but in a fancy way. You could still write complex programs to do the automation, or just grab a proven AI algorithm and train it.
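A rough illustration of the "grab a proven algorithm and train it" route; the ticket-routing task, categories and examples here are invented, and a few lines of an off-the-shelf text classifier stand in for pages of hand-written rules.

# Illustrative only: instead of hand-coding rules to route support tickets,
# train an off-the-shelf text classifier. Categories and examples are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = ["cannot log in to my account", "payment was charged twice",
           "app crashes on startup", "refund has not arrived"]
labels  = ["auth", "billing", "bug", "billing"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(tickets, labels)
print(clf.predict(["charged twice for my payment"]))  # likely ['billing'] on this toy data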


Every business needs automation for better ROI, but it shouldn't work the other way around, where you introduce a new component and then deal with the issues it causes. I see that happening a lot just because everyone is talking about it.


The only challenge I have is tagging AI with innovation. Of course, if we build a new algorithm that solves a complex problem, that is innovation, but simply using those existing algorithms is not.


AI has been around for a long time, and many (large) companies have been using it.


It has become a popular buzzword now because it is easily accessible to individuals. This happened because of the explosion of cloud services and the reduced cost of these previously expensive, complex solutions.


These advancements by cloud providers let the average developer implement a fancy prediction solution without becoming a data scientist.


For example, a few years ago Netflix spent millions of dollars and months building its recommendation algorithm; something comparable can now be put together over a weekend using a free cloud subscription.
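As a toy illustration of why the barrier has dropped, and obviously nothing like Netflix's actual system: a weekend-sized item-based recommender can be a few lines of similarity arithmetic on a ratings matrix. The matrix below is invented.

# Toy item-based recommender on an invented ratings matrix.
import numpy as np

# rows = users, columns = items, 0 = not rated
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

# cosine similarity between item columns
norms = np.linalg.norm(R, axis=0)
sim = (R.T @ R) / np.outer(norms, norms)

def recommend(user, top_n=1):
    """Score unrated items by similarity-weighted ratings of the items the user did rate."""
    rated = R[user] > 0
    scores = sim[:, rated] @ R[user, rated]
    scores[rated] = -np.inf          # don't recommend items already rated
    return np.argsort(scores)[::-1][:top_n]

print(recommend(0))  # [2]: the only item user 0 hasn't rated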


Johnathan Lightfoot INFORMATION TECHNOLOGY | BOT ARCHITECT | AI

December 29th, 2017

Hi Richard,

I hear you; AI and related terms (ML, big data, IoT) are being thrown around with no regard to what they truly mean. In my opinion, true AI isn't a buzzword; it's just that some marketing types have gotten hold of the word and use it willy-nilly to make their service or product seem more attractive. I am currently reading "The Future of Leadership: Rise of Automation, Robotics, and Artificial Intelligence"; the author does a wonderful job of defining the terms as well as qualifying what constitutes AI, and also does a good job of predicting how these technologies will play out in the future. A very enjoyable and educational read that I would recommend you have a look at.

Johnathan