The oil and the engine, the sensors and the senses

Technologists dream of Artificial General Intelligence (AGI) - the day when machines can understand our world and perform almost any task that humans can. 

Raju Chithambaram | January 14, 2019

My daughter, an undergraduate business student, was irritated. “Dad, every week you bring up a different technology. Big data, artificial intelligence, machine learning, the internet of things (IoT), augmented reality. How do I keep pace with all this?”
 
“Can you remember four things - the oil and the engine, the sensors and the senses?”
 
She looked bemused. I elaborated, “Data is the oil, artificial intelligence (AI) is the engine, IoT is the sensor, and immersive experiences (virtual/augmented/mixed reality) are the senses. Together they are going to ‘intelligize’ companies in 2019.” Now, I had her attention!
 
Data is the oil, AI is the engine

Data is the oil on which AI engines run today (this idea is a variant of the concept described in the book Technology vs. Humanity by Gerd Leonhard). Twenty years ago, AI was represented by expert systems: complex algorithms working with small amounts of data on computers with a fraction of today’s available computing power.
 
AI, or more specifically machine learning, has now flipped the equation. Simpler algorithms like linear and logistic regression, probabilistic graphical models (PGMs), clustering, outlier detection, and time series filters are applied to massive data sets to derive insights, make decisions, and automate some actions. The cloud provides access to zettabytes of data, and computing power has increased exponentially. Machine learning, in essence, is the “extraction” of patterns from these datasets.
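To make that concrete, here is a minimal sketch (mine, not from any product mentioned here; the data and feature names are invented) of fitting one of those simple algorithms, scikit-learn’s logistic regression, to a toy spending dataset:

    # A minimal sketch: "extracting" a pattern from toy data with
    # logistic regression. All numbers and labels are invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Features: [monthly_spend, missed_payments]; label: 1 = credit risk
    X = np.array([[200, 0], [150, 1], [900, 4], [700, 3], [300, 0], [850, 5]])
    y = np.array([0, 0, 1, 1, 0, 1])

    model = LogisticRegression().fit(X, y)
    print(model.predict([[400, 2]]))        # predicted class for a new customer
    print(model.predict_proba([[400, 2]]))  # and its probability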
 

According to Forbes (Bernard Marr, April 2018), credit reference agency Experian applies machine learning to almost four petabytes of credit and spending data to create a product customized for each individual user. My daughter’s eyes lit up. “Seriously, regression and time series? That is so simple, we do that in the second year at our college!” I savored the dad-daughter connection, but only for a moment. We had more to cover.
 
Deep learning (another branch of AI) takes AI up a notch, from extraction to anticipation. It mimics the human brain in several ways: an array of neurons processes sound, images, and other data to recognize patterns, then layers that information to infer results, much as the neocortex does in the human brain. Deep learning is used for image analytics in healthcare, natural language processing for virtual assistants, detection of deviant behavior in cybersecurity, and goal setting in financial portfolios. Google’s deep learning tool, for example, can rapidly identify metastasized breast cancer with 99 percent accuracy, helping patients recover and reducing costs for the service provider.
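As an illustration of that layering idea (my sketch, not Google’s tool; it assumes TensorFlow/Keras is installed), each layer of a small image classifier transforms its input into progressively more abstract features:

    # A minimal sketch of information layering: early layers pick up
    # edges and textures, later layers compose them into shapes, and
    # the final layer maps those features to class scores.
    from tensorflow import keras

    model = keras.Sequential([
        keras.layers.Input(shape=(28, 28, 1)),          # small grayscale image
        keras.layers.Conv2D(16, 3, activation="relu"),  # layer 1: edges
        keras.layers.MaxPooling2D(),
        keras.layers.Conv2D(32, 3, activation="relu"),  # layer 2: shapes
        keras.layers.Flatten(),
        keras.layers.Dense(10, activation="softmax"),   # layer 3: class scores
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.summary()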
 
Engines don’t run well on crude, unrefined oil. Data, like oil, must be refined: data locked in “swamps” must be brought into a virtual “lake,” duplicates reconciled, gaps imputed, and outliers removed. Only then can the AI engine crank away and deliver the benefits.
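In code, that refining step is often the bulk of the work. A minimal sketch with pandas (invented data; real pipelines are far larger):

    # A minimal sketch of data refining: reconcile duplicates,
    # impute gaps, and remove outliers before the "engine" runs.
    import pandas as pd

    df = pd.DataFrame({
        "customer": ["a", "a", "b", "c", "d"],
        "spend":    [200, 200, None, 300, 99999],  # a gap and an outlier
    })

    df = df.drop_duplicates()                               # reconcile duplicates
    df["spend"] = df["spend"].fillna(df["spend"].median())  # impute gaps
    df = df[df["spend"] <= df["spend"].quantile(0.95)]      # remove outliers
    print(df)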
 
“Very cool,” my daughter said. “So where do the sensors and the senses fit in?”


I explained, “The sensors tell us when and where those benefits need to be delivered, and they measure the efficacy of the delivery. The senses immerse the user in the delightful experiences that deliver them.”
 
IoT is the sensor, immersive experiences are the senses

The internet of things consists of some 10 billion devices that collect and share data through internet-connected chips. They range from smart medication pills that use ingestible sensors to track treatment for conditions like schizophrenia or bipolar disorder, to automotive sensors that detect faults and automatically notify technicians, to smart-city street networks that sense traffic and optimize lane management and traffic lights. Singapore was recently named “Smart City 2018” at the Smart City Expo World Congress for leveraging all things smart, from dynamic public bus routing to real-time parent-teacher portals to predicting leaks in water pipes.
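The pattern behind the automobile example is simple: read a sensor, check the reading against a fault condition, and raise an alert. A minimal sketch (the device name, threshold, and alert format are all invented; a real device would publish over HTTPS or MQTT rather than print):

    # A minimal sketch of the IoT sense-and-notify loop. The sensor
    # driver is stubbed out; all values here are invented.
    import json

    FAULT_THRESHOLD_C = 110.0  # assumed engine-temperature limit

    def read_engine_temperature() -> float:
        """Stand-in for a real sensor driver; returns degrees Celsius."""
        return 112.4

    def notify_technician(reading: float) -> None:
        # A real device would POST this payload to an alerting service.
        alert = json.dumps({"device": "car-42", "temp_c": reading})
        print("ALERT ->", alert)

    reading = read_engine_temperature()
    if reading > FAULT_THRESHOLD_C:
        notify_technician(reading)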
 
Immersive experiences extend our senses and make us part of the experience through a variety of illusory environments. Virtual reality (VR) simulates real-life experience: a surgeon in training can not only observe a surgery, she or he can be in the operating room while the surgery is performed. Augmented reality (AR) layers computer-generated enhancements atop an existing reality so that the user can interact with them. A building architect can walk through a building and use AR to experiment with different placements and styles of windows, doors, hallways, and arches before deciding on the best design. Haptic retargeting allows the user not only to see and hear the virtual environment but to touch and feel it. Gaze-tracking technology can follow your eyes, read your intent, and adapt the experience accordingly.

According to a report by CB Insights, Walmart “retail pods” will bring an immersive shopping experience to offices and other common locations. Users can shop in these virtual reality pods and pick up items from shelves. Computer vision and sensors will detect the shopper’s purchases. Robots in fulfillment centers will pack and ship the order to the user.
 
“I get it now,” my daughter declared. “Oil, engine, sensors, and senses equate to data, AI, IoT, and immersive experiences. Together they ‘intelligize.’”
 
After a slight pause she asked, “Will these technologies make machines indistinguishable from humans?”

“Hmmm, not today, or for some time,” I replied. “Humans have consciousness and emotions; machines don’t. But technologists dream of Artificial General Intelligence (AGI) - the day when machines can understand our world and perform almost any task that humans can. Bring that question up again in 2019.”

Raju Chithambaram is Chief Transformation Officer at UST Global

Disclaimer: This article is published as part of the IDG Contributor Network. The views expressed in this article are solely those of the contributing authors and not of IDG Media and its editor(s).