Human supervision of machine learning can always be replaced by Darwinism with the right "kill function". It's not as algorithmically efficient, but it moves training from supervised to unsupervised. Just a matter of time.
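The idea can be sketched as a toy evolutionary loop: instead of a supervisor providing labels, a fitness function decides which candidates get "killed" each generation. Everything below (the hidden target, population size, mutation rate) is an illustrative assumption, not a reference implementation:

```python
import random

# Toy Darwinian optimisation: no labelled data, just a "kill function"
# that culls the worst candidates each generation. The (made-up) goal
# here is to evolve a number close to a hidden target.

TARGET = 42.0  # stands in for whatever the environment rewards

def fitness(x):
    # Higher is better; the kill function removes the lowest scorers.
    return -abs(x - TARGET)

def evolve(generations=200, pop_size=50, mutation=1.0, seed=0):
    rng = random.Random(seed)
    population = [rng.uniform(-100, 100) for _ in range(pop_size)]
    for _ in range(generations):
        # "Kill" the worse half of the population...
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # ...and refill it with mutated copies of the survivors.
        children = [s + rng.gauss(0, mutation) for s in survivors]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(round(best, 1))  # converges towards 42.0
```

Note the commenter's efficiency caveat shows up immediately: this spends thousands of evaluations on a problem a supervised gradient step would solve in a handful.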
Despite the vendor-driven hype, machine learning is not a miracle cure for businesses. In fact, it's something of a problem child, one that requires quite a bit of handholding and coddling. In a paper published this week, Ilias Flaounas, senior data scientist at developer tools biz Atlassian, argues that companies should be …
"we use it to predict future values of key business metrics"
And how did reality match the predictions? Or are those still future values, and hence untested? Or are the circumstances such that it becomes a self-fulfilling prophecy, because the business works towards those targets? In that case there's no evidence that this was the best use of the effort dedicated to it.
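One honest way to answer that question is a simple backtest: hold out the most recent periods, predict them, and only then compare against what actually happened. A minimal sketch, assuming a naive last-value forecaster and a made-up metric series (both are placeholders, not anything the Atlassian paper describes):

```python
# Backtesting a forecast: train on the past, predict the held-out
# recent periods, then score against what actually happened.
# The data and the naive forecaster are illustrative assumptions.

def naive_forecast(history, horizon):
    # Simplest possible baseline: repeat the last observed value.
    return [history[-1]] * horizon

def mean_abs_pct_error(actual, predicted):
    return sum(abs(a - p) / abs(a) for a, p in zip(actual, predicted)) / len(actual)

# Monthly values of some key business metric (made up for illustration).
metric = [100, 104, 110, 108, 115, 121, 119, 126, 131, 129, 137, 142]

holdout = 3                       # pretend the last 3 months are "the future"
train, actual = metric[:-holdout], metric[-holdout:]
predicted = naive_forecast(train, holdout)

mape = mean_abs_pct_error(actual, predicted)
print(f"MAPE: {mape:.1%}")
```

Any model claimed to "predict future values" should be able to beat a baseline like this on held-out data; without that comparison, hitting the target may just mean the business steered towards it.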
This validated machine-learning prediction tool told me - on test data - that I will win the lottery tomorrow; on the training data it said I had already won yesterday. Hence, today's prediction is: I will keep my job.
And talking about NLP: is it me saying that I will keep my job, or the machine predicting it? And what is it then with the lottery?!
Although I work in the medical sector, my original background is in software.
Just for fun, I thought I would dip my toes in the Deep Learning waters.
I bought a new fast desktop PC with a stack of RAM and an Nvidia multiprocessor graphics card, and parked it in a quiet corner of the office.
I soon found that I had to use Linux, not Windows, so I had to learn how to install and use Linux. I then had to install all the software bits & pieces needed to run AI on the Nvidia card ... sheesh, a non-trivial exercise.
I then had to find a reasonable Deep Learning example to play with - again not easy.
Finally, after 3 months of tinkering, in spare moments between patient appointments, I got the flipping thing to run. I now have a real-time image classification application using a webcam. It's amazing to see the system identify various objects & people in my office at around 10 FPS!
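For anyone wondering what such an app boils down to, the core is just a grab-classify-report loop with a throughput counter. A minimal sketch of that loop, with the camera and the neural network stubbed out so it runs anywhere; in a real setup the stubs would be something like OpenCV's `VideoCapture` and a pretrained network, and everything named below is an illustrative assumption:

```python
import time

# Skeleton of a real-time webcam classifier: grab a frame, classify it,
# and measure throughput (FPS). The frame source and the classifier are
# stubs standing in for a camera and a trained network.

def grab_frame(i):
    # Stub: a real app would read a frame from the webcam here.
    return f"frame-{i}"

def classify(frame):
    # Stub: a real app would run the frame through a neural network.
    return "person" if frame.endswith(("0", "5")) else "chair"

def run(num_frames=50):
    start = time.perf_counter()
    labels = [classify(grab_frame(i)) for i in range(num_frames)]
    elapsed = time.perf_counter() - start
    return labels, num_frames / max(elapsed, 1e-9)

labels, fps = run()
print(f"classified {len(labels)} frames at {fps:.0f} FPS")
```

In the real app the 10 FPS figure falls out of exactly this measurement: the network's per-frame inference time dominates the loop.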
My conclusion? Firstly, developing - or even trying out - Deep Learning apps is currently not for the non-technical user ... although perhaps there are some super-duper easy-to-use development suites out there, I haven't seen any yet ...
Secondly, AI & Deep Learning apps are about to sweep the world ... I am still astonished at what my £700 computer can do. When this technology starts to run on cheap hardware, our world will change drastically. BTW, the first low-cost AI 'run time' inference chips and boards will be available this year or next. We are essentially at Year Zero of Deep Learning adoption.
Biting the hand that feeds IT © 1998–2019