#IoTAIMicroservices

Building IoT AI Microservices
Discuss development, deployment, and management of AI microservices all the way to the IoT edge
jameskobielus
First question. http://www.via-cc.at...

George Gilbert
how about algorithms that learn to make predictions or prescriptions from data rather than from explicitly programmed rules?
Bert Latamore
I would define it as the ability of a system to reach conclusions based on the data available to it, to learn from those, and to in some cases make decisions based on them.
Peter Burris
I think it's interesting that the history of AI really is a history of hardware and data acquisition -- and less a history of insight into intelligence or algorithms.
jameskobielus
Augmented intelligence: AI augments humans’ organic powers of cognition, reasoning, natural language processing, predictive analysis, and pattern recognition. The incorporation of these capabilities into mobile, Internet of Things, wearable, and other mass-market devices.
Bert Latamore
@plburris Is that changing now? For instance, Watson is a software system, not hardware.
jameskobielus
@BertLatamore Yes, it's data-driven adaptive algorithmic intelligence that gets smarter as you feed it more data.
jameskobielus
@ggilbert41 Yes, that's right. It's all about the business logic being derived from correlations (predictive, etc.) learned implicitly & algorithmically from the data, rather than being explicitly programmed into it.
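A minimal sketch of that contrast, assuming scikit-learn is available; the feature names and thresholds are made up purely for illustration:

```python
# Toy contrast between explicitly programmed rules and logic learned from data.
# Feature names and threshold values here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def rule_based_churn(monthly_usage_hours, support_tickets):
    # Explicitly programmed business logic: thresholds chosen by a human.
    return monthly_usage_hours < 5 and support_tickets > 3

# Data-driven alternative: a comparable decision boundary is learned from examples.
X = np.array([[2, 5], [1, 4], [20, 0], [15, 1], [3, 6], [25, 2]])  # [usage, tickets]
y = np.array([1, 1, 0, 0, 1, 0])                                    # 1 = churned

model = LogisticRegression().fit(X, y)
print(model.predict([[2, 4]]))  # prediction derived from correlations in the data
```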
Peter Burris
@BertLatamore I'm not sure that's entirely true. Many of the algorithms behind Watson -- per a briefing from a few years ago -- have been around for a long time. What's especially new is the packaging and interface and cloud access.
Bert Latamore
Are Cognitive and Machine Learning true AI? I read an interesting article in Discover a month or two ago that strongly implied that real AI is still a little over the horizon.
jameskobielus
@plburris I disagree. AI has evolved through theoretical advances in the understanding of cognition (e.g., neuroscientists, psychologists, as well as theorists such as Minsky) and through practical advances in areas like neural nets & big data.
jameskobielus
@BertLatamore "True AI"? I doubt there is such a thing. "AI" is a hodgepodge category of rule-driven expert systems, stat-driven machine learning, evolutionary & genetic algorithms, etc. It's no ONE thing.
Peter Burris
Minsky -- a super smart, influential guy -- has been dead for over a year. His theory of mind has been in place for 35 to 40 years. Similarly, base neural net algorithms have been defined since the 1970s.
jameskobielus
@plburris When you look at the guts of, say, Watson, though some of the core techs (e.g., semantic web, NLP, machine learning) have been around for a while, IBM has poured massive R&D into evolving them. Ditto Google, FB, etc.
Peter Burris
Certainly, AI algorithms have advanced, but often in response to optimization challenges as hardware evolves.
Bert Latamore
I think one problem with the discussion in general (not in this group) is that the definitions of terms are often fuzzy.
Neil Raden
@ggilbert41 I wouldn't define AI as algorithms. Algorithms drive the calculations in AI, but they are a computer-science proxy for AI.
jameskobielus
@NeilRaden No. AI is not defined as algorithms, but it's algorithmic intelligence. AI relies on an ever-growing array of data-driven statistical algorithms, including machine learning, deep learning, reinforcement learning, etc.
jameskobielus
@plburris Sure, but that's like arguing that there have been no advances in, say, distributed computing protocols since core protocols such as TCP/IP came along in the 70s.
Peter Burris
@NeilRaden We need AI to define AI.
jameskobielus
@plburris Here's one way of getting our heads around the continued advances of ML and its march into the deepening layer of DL. Asimov Institute's "Neural Network Zoo" http://www.asimovins...
jameskobielus
@plburris I'm reading new R&D papers on advances in this field every day. It's amazing how fertile this is for innovation in the underlying AI algorithms and approaches. Transfer learning, for example, is super hot now in R&D.
jameskobielus
Question 3: http://www.via-cc.at...

George Gilbert
is Kubernetes the control plane for this?
Peter Burris
Big challenge. A variant on the question, "How fast will software eat the world?" Lots of sensor-specific hardware, stacks, and protocols out there in the world of operational technology; most works fine, even if not "modern."
jameskobielus
@ggilbert41 Any cloud-native orchestration backbone could suffice: Kubernetes, Docker Swarm, Mesos DC/OS, etc. However, the "control plane" might be deeper than that, including perhaps traffic/policy layers such as Istio.
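As one illustration of the Kubernetes option, here is a sketch using the official kubernetes Python client; the image name, labels, and namespace are hypothetical stand-ins:

```python
# Sketch: deploy a hypothetical AI-inference microservice onto a Kubernetes backbone.
# Assumes a reachable cluster and a prebuilt image named "example/ai-inference:latest".
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in a pod

container = client.V1Container(
    name="ai-inference",
    image="example/ai-inference:latest",
    ports=[client.V1ContainerPort(container_port=8080)],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "ai-inference"}),
    spec=client.V1PodSpec(containers=[container]),
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="ai-inference"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "ai-inference"}),
        template=template,
    ),
)
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```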
jameskobielus
@plburris Yes, not every element of the application architecture can or should be containerized. Stateful apps can't, for example. If an AI app has stateful capabilities (such as managing persistent data), it may not be suitable for containerization.
George Gilbert
remedial question: how do you manage stateful microservices?
jameskobielus
@ggilbert41 Stateful microservices need to maintain state in some repository--e.g., a database--that is commonly accessible.
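A sketch of that pattern, with Redis standing in as the commonly accessible repository; the host name and key layout are assumptions for illustration:

```python
# Sketch: a microservice keeps its state in an external, commonly accessible store
# (Redis as a stand-in) so any replica can read or update it.
import redis

state = redis.Redis(host="state-store.internal", port=6379, decode_responses=True)

def record_inference(device_id: str, score: float) -> None:
    # Shared state lives outside the container, not on its local filesystem.
    state.hset(f"device:{device_id}", mapping={"last_score": score})

def last_score(device_id: str) -> float:
    value = state.hget(f"device:{device_id}", "last_score")
    return float(value) if value is not None else 0.0
```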
George Gilbert
Can't those databases or KV stores be embedded in the microservice, so it manages its own state rather than managing shared state in an external DBMS?
jameskobielus
Question 2: http://www.via-cc.at...

George Gilbert
my take: AI microservices are the models that define the structure and behavior of Digital Twins
Peter Burris
Microservices facilitate different rates of change within an AI system, especially rates of change for models.
jameskobielus
@ggilbert41 Conceivably, you can architect each "digital twin" as its own intelligent AI microservice that is "entangled" (through a constant feed of training/sensor data) with its analog twin.
George Gilbert
also: a #digitaltwin could integrate multiple AI microservices, no?
jameskobielus
@plburris Yes, that's important as AI-based orchestrated apps grow more sophisticated (e.g., generative adversarial nets, which pair a "generator" net with a "discriminator" net). If each of them is its own microservice, you can decouple their learning rates.
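A minimal PyTorch sketch of that decoupling: generator and discriminator each get their own optimizer, so their learning rates can differ; the layer sizes and rates are illustrative only:

```python
# Sketch: a GAN's generator and discriminator as separately tunable components,
# each with its own optimizer and learning rate (values are illustrative).
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))
discriminator = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

# Decoupled learning rates: the two nets can evolve at different speeds.
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=4e-4)
loss_fn = nn.BCELoss()

real = torch.randn(32, 2)    # stand-in for real samples
noise = torch.randn(32, 16)

# One discriminator step.
d_opt.zero_grad()
d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
         loss_fn(discriminator(generator(noise).detach()), torch.zeros(32, 1))
d_loss.backward()
d_opt.step()

# One generator step.
g_opt.zero_grad()
g_loss = loss_fn(discriminator(generator(noise)), torch.ones(32, 1))
g_loss.backward()
g_opt.step()
```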
jameskobielus
@ggilbert41 For sure. Let's say the "analog" twin is an industrial robot that has separate cognitive, affective, and geospatial faculties. Its digital twin could be an orchestrated triad of AI microservices associated with each faculty.
jameskobielus
Question 4: http://www.via-cc.at...

George Gilbert
smaller edge devices (below gateway) are best for complex event processing of sensor data. full models need bigger gateways and the ability to send anomalies to the cloud for retraining.
jameskobielus
Yes. There may be plenty of anomalous "edge cases" for AI-driven apps that the "provisioned from cloud" algorithms aren't sufficiently trained to handle accurately. Either the edge devices do some retraining locally, or request cloud-based retraining.
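A rough sketch of that edge-to-cloud split; the scoring function, threshold, and retraining endpoint are hypothetical stand-ins:

```python
# Sketch: an edge device scores sensor events locally and forwards only the
# anomalous ones to the cloud for retraining. Names and URLs are hypothetical.
import json
import queue
import urllib.request

ANOMALY_THRESHOLD = 0.9          # illustrative cutoff
retrain_queue: "queue.Queue[dict]" = queue.Queue()

def score_event(event: dict) -> float:
    """Stand-in for a locally deployed, cloud-provisioned model."""
    return abs(event["reading"] - event["expected"]) / max(event["expected"], 1e-6)

def handle_event(event: dict) -> None:
    if score_event(event) > ANOMALY_THRESHOLD:
        retrain_queue.put(event)   # edge case the provisioned model may mishandle

def flush_to_cloud(endpoint: str = "https://cloud.example.com/retrain") -> None:
    while not retrain_queue.empty():
        payload = json.dumps(retrain_queue.get()).encode()
        req = urllib.request.Request(endpoint, data=payload,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)   # ship anomalies back for cloud-side retraining
```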
jameskobielus
Google has a "federated training" service in alpha that addresses this to some degree. See my Wikibon blog here: https://wikibon.com/...
George Gilbert
has anyone even tried offering this federated training publicly?
jameskobielus
Question 8: http://www.via-cc.at...

Bert Latamore
Is the market anywhere near defining what is really needed for a well-designed tool set?
jameskobielus
@BertLatamore It isn't even close to defining a standard AI/deep learning workbench. The Spark arena--for ML/streaming--is much further along in that regard (e.g., IBM, Cloudera, Databricks, etc.)
Bert Latamore
Based on history, it will probably take a decade or more to arrive at a real definition that everybody accepts.
jameskobielus
@BertLatamore I doubt there will ever be a definition of "AI" or "microservices" or even "IoT" that literally EVERYBODY accepts. Heck, we've emerged from Big Data mania with a zillion divergent definitions. None of these other paradigms will settle.
Bert Latamore
It took more than a decade to arrive at a good definition of what word processing should encompass, and this is much more complex.
John Furrier
I think that the tools are weak but emerging
jameskobielus
Question 6: http://www.via-cc.at...

George Gilbert
software companies trying to manage edge devices tell me that for all but large products like cars and industrial assets, the compute and memory footprint is very constrained at the edge
Bert Latamore
@ggilbert41 That relates directly to Peter's comment on the first question, that advances in AI are often connected to advances in the underlying hardware.
jameskobielus
@ggilbert41 Yes. But there are a range of industry initiatives to compress the algorithms and data for AI deployment on resource-constrained endpoints.
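Two of the generic techniques such initiatives lean on, magnitude pruning and 8-bit quantization, sketched in NumPy; the sparsity level and bit width are illustrative:

```python
# Sketch of two generic model-compression techniques for constrained edge devices:
# magnitude pruning (zero out small weights) and 8-bit linear quantization.
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Zero out the smallest-magnitude fraction of weights."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a scale factor for on-device dequantization."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

w = np.random.randn(4, 4).astype(np.float32)
pruned = prune_by_magnitude(w)
q, scale = quantize_int8(pruned)
restored = q.astype(np.float32) * scale   # approximate reconstruction at the edge
```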
jameskobielus
@BertLatamore For sure. These are compute-intensive, data-intensive, memory-intensive algorithms: often massively parallel fast matrix transformations. Neuromorphic chips, GPUs, ASICs, etc. are essential to progress.