jameskobielus
First question. http://www.via-cc.at...
Peter Burris
Which decade?
George Gilbert
how about algorithms that learn to make predictions or prescriptions from data rather than from explicitly programmed rules?
jameskobielus
@plburris Your choice.
Bert Latamore
I would define it as the ability of a system to reach conclusions based on the data available to it, to learn from those conclusions, and, in some cases, to make decisions based on them.
Peter Burris
I think it's interesting that the history of AI really is a history of hardware and data acquisition -- and less a history of insight into intelligence or algorithms.
jameskobielus
Augmented intelligence: AI augments humans’ organic powers of cognition, reasoning, natural language processing, predictive analysis, and pattern recognition. The incorporation of these capabilities into mobile, Internet of Things, wearable, and other mass
Bert Latamore
@plburris Is that changing now? For instance, Watson is a software system, not hardware.
jameskobielus
@BertLatamore Yes, it's data-driven adaptive algorithmic intelligence that gets smarter as you feed it more data.
jameskobielus
@ggilbert41 Yes, that's right. It's all about the business logic being derived from correlations (predictive, etc.) learned implicitly & algorithmically from the data, rather than being explicitly programmed into it.
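The distinction drawn here, business logic learned from correlations in data rather than explicitly programmed, can be made concrete with a minimal, hypothetical Python sketch (the task, data, and function names are illustrative, not from the chat):

```python
# Hypothetical sketch: the same classification task solved two ways.

# Explicitly programmed: a human hard-codes the decision boundary.
def rule_based(x):
    return 1 if x > 5.0 else 0

# Data-driven: the boundary is *learned* from labeled examples.
def learn_threshold(samples):
    """Pick the threshold that best separates the labeled samples."""
    xs = sorted(x for x, _ in samples)
    candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    def accuracy(t):
        return sum((x > t) == bool(y) for x, y in samples) / len(samples)
    return max(candidates, key=accuracy)

data = [(1.0, 0), (2.0, 0), (3.0, 0), (6.5, 1), (7.0, 1), (8.0, 1)]
t = learn_threshold(data)  # boundary derived from the data, not hand-written
learned = lambda x: 1 if x > t else 0
```

Feed the learner more (or different) data and the decision boundary moves, which is the sense in which such a system "gets smarter as you feed it more data."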
Peter Burris
@BertLatamore I'm not sure that's entirely true. Many of the algorithms behind Watson -- per a briefing from a few years ago -- have been around for a long time. What's especially new is the packaging and interface and cloud access.
Bert Latamore
Are Cognitive and Machine Learning true AI? I read an interesting article in Discover a month or two ago that strongly implied that real AI is still a little over the horizon.
jameskobielus
@plburris I disagree. AI has evolved through theoretical advances in the understanding of cognition (e.g., by neuroscientists, psychologists, and theorists such as Minsky) and through practical advances in areas like neural nets & big data.
jameskobielus
@BertLatamore "True AI"? I doubt there is such a thing. "AI" is a hodgepodge category of rule-driven expert systems, stat-driven machine learning, evolutionary & genetic algorithms, etc. It's no ONE thing.
Peter Burris
Minsky -- a super smart, influential guy -- has been dead for over a year. His theory of mind has been in place for 35 to 40 years. Similarly, base neural net algorithms have been defined since the 1970s.
jameskobielus
@plburris When you look at the guts of, say, Watson, though some of the core techs (e.g., semantic web, NLP, machine learning) have been around for a while, IBM has poured massive R&D into evolving them. Ditto Google, FB, etc.
Peter Burris
Certainly, AI algorithms have advanced, but often in response to optimization challenges as hardware evolves.
Bert Latamore
I think one problem with the discussion in general (not in this group) is that the definitions of terms are often fuzzy.
Neil Raden
@ggilbert41 I wouldn't define AI as algorithms. Algorithms drive the calculations in AI, but they are a computer-science proxy for AI.
jameskobielus
@NeilRaden No. AI is not defined as algorithms, but it's algorithmic intelligence. AI relies on an ever-growing array of data-driven statistical algorithms, including machine learning, deep learning, reinforcement learning, etc.
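The "data-driven statistical algorithms" point can be illustrated with one of the simplest such methods, a gradient-descent fit of a one-parameter model (a made-up toy example, not anything from the chat; the data and learning rate are arbitrary):

```python
# Toy sketch: fit y ≈ w*x by gradient descent on squared error.
# The parameter w is adjusted statistically from data, not hand-coded.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # roughly y = 2x

w = 0.0                      # parameter starts uninformed
for _ in range(200):         # each pass nudges w toward the data
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.05 * grad         # learning rate 0.05

# w now approximates the slope implied by the data (close to 2.0)
```

Deep learning and reinforcement learning scale this same idea up: many parameters instead of one, updated from data (or from reward signals) rather than from explicitly programmed rules.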
jameskobielus
@plburris Sure, but that's like arguing that there have been no advances in, say, distributed computing protocols since core protocols such as TCP/IP came along in the 70s.
Peter Burris
@NeilRaden We need AI to define AI.
jameskobielus
@plburris Here's one way of getting our heads around the continued advances of ML and its march into the deepening layers of DL: the Asimov Institute's "Neural Network Zoo" http://www.asimovins...
jameskobielus
@plburris I'm reading new R&D papers on advances in this field every day. It's amazing how fertile this is for innovation in the underlying AI algorithms and approaches. Transfer learning, for example, is super hot now in R&D.