Emotional AI - Jerric @Nucarbn

Coverage by Bhat Dittakavi of Variance.AI on “AI Deep Learning” by Jerric Lyns John, CEO of Nucarbn, at T-HUB on 7th March.
Speaker Jerric, from Bangalore, with his trademark ponytail and business acumen mixed with the occasional swear word, let his love for AI speak.
His best quote of the evening: “Only when you imagine a promised land will you eventually make it there. Being futuristic is the key.” His wearable startup is called Nucarbn.
They are building a wearable device similar to a Fitbit, but with the ability to track your emotions and understand the human inside you.
Excerpts from his talk and presentation:
 
Artificial General Intelligence (AGI: Strong AI)
Artificial Super Intelligence (ASI)
Singularity > intelligence of the human brain.
At some point the network wakes up! That is the Singularity.
If research money in a technology crosses one billion dollars, something serious is cooking. ASI (Singularity) is one such area.
Ethics and human-centric design (so we don’t trouble humans).
Artificial Emotional Intelligence.
Ethics on how not to kill human beings 🙂
The driving force in everyone’s life is emotion.
Fear is the negative extreme and joy, on the other side, is the positive one. We can only capture a few emotions, not all of them.
Emotions lead to actions. At times, we think with our emotions.
Behaviour
Environmental stimuli.
Emotion –> Behaviour
Behaviour –> Emotion.
Next revolution is quantification of emotion, not just steps.
Science
-ANS (Autonomic Nervous System)
-Sympathetic Nervous System (SNS)
-Directly linked to emotion. Neural projections.
-Heart and sweat glands.
-Excitatory impulses.
-Inhibitory impulses (to bring you down from the high, else you may die).
Wearables
Heart Rate
HRV (Heart Rate Variability)
Electrodermal Activity (bio-impedance)
Skin Responsiveness
PPG
Combinational light of different wavelengths
Oxygenated haemoglobin
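The PPG readings above reduce to a peak-detection problem: the sensor shines light of different wavelengths, oxygenated haemoglobin modulates the absorption, and heart rate is recovered from the spacing of pulse peaks. A minimal sketch of that last step in Python, using a synthetic waveform in place of a real sensor trace (the sampling rate, threshold, and signal shape are all assumptions for illustration):

```python
import numpy as np

FS = 50  # sampling rate in Hz (assumed)

# Synthetic PPG-like signal: a 1.2 Hz pulse (72 bpm) plus slow baseline drift.
t = np.arange(0, 10, 1 / FS)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.sin(2 * np.pi * 0.1 * t)

# Crude peak detection: samples larger than both neighbours and above a threshold.
peaks = [i for i in range(1, len(ppg) - 1)
         if ppg[i] > ppg[i - 1] and ppg[i] > ppg[i + 1] and ppg[i] > 0.5]

# Inter-beat intervals (seconds) -> beats per minute.
ibis = np.diff(peaks) / FS
bpm = 60.0 / ibis.mean()
print(round(bpm))  # close to 72 for a 1.2 Hz pulse
```

Real devices add filtering and motion-artifact rejection before this step; a library peak finder (e.g. `scipy.signal.find_peaks`) would replace the hand-rolled loop.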
Electrodermal Activities
-Monitor voltage across skin
-Skin Conductivity
-Tonic: Background conductance level
-Phasic: Fluctuations with events
-Latency, Amplitude, Recovery Time
-Hands sweat when you are nervous.
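The tonic/phasic split above can be approximated by treating a slow moving average of the conductance trace as the tonic background level and the residual as phasic, event-related activity, from which amplitude and latency fall out. A rough sketch on a synthetic trace (sampling rate, window length, and signal shape are illustrative assumptions):

```python
import numpy as np

FS = 4  # EDA is typically sampled at a few Hz (assumed)

# Synthetic skin-conductance trace (microsiemens): slow rising baseline
# plus a brief event-related response around t = 30 s.
t = np.arange(0, 60, 1 / FS)
tonic_true = 2.0 + 0.005 * t                      # background conductance level
phasic_true = 0.6 * np.exp(-((t - 30) ** 2) / 8)  # event-related burst
eda = tonic_true + phasic_true

# Tonic estimate: moving average over a long (20 s) window,
# edge-padded so the average stays valid at both ends.
win = 20 * FS
pad = win // 2
padded = np.pad(eda, pad, mode="edge")
kernel = np.ones(win) / win
tonic_est = np.convolve(padded, kernel, mode="valid")[:len(eda)]

# Phasic activity is what remains after removing the tonic level.
phasic_est = eda - tonic_est
amplitude = phasic_est.max()          # response amplitude
latency = t[phasic_est.argmax()]      # time of the response peak
```

The moving-average tonic estimate slightly absorbs the burst, so the recovered amplitude is a little below the true 0.6 μS; dedicated EDA toolkits use more careful decompositions.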
Enable AI to target you as a person: custom AI, tailored to understand you from your own data. The activity you do affects your heart and your skin. More beats, more heat; fewer beats, less heat.
Let us label emotions now
 
Anger
-HRV index decreases (HRV tracks the distance between two heartbeats; a shorter distance means heart rate has increased)
-EDA increases skin conductivity: conductance goes up, you get minutely “electrocuted”, and we notice that electrical activity
-Skin temperature changes
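The HRV index mentioned for anger can be made concrete with RMSSD, one common HRV measure computed from successive inter-beat intervals (the talk does not name a specific index; the interval values below are hypothetical):

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences of inter-beat intervals (ms)."""
    diffs = np.diff(np.asarray(ibi_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical inter-beat intervals in milliseconds.
calm = [850, 820, 870, 840, 860, 830]   # varied beat spacing -> high HRV
angry = [600, 605, 598, 602, 601, 599]  # fast, uniform beats -> low HRV

print(rmssd(calm) > rmssd(angry))  # True: HRV drops under anger/arousal
```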
Affection
-Emotionless; the body is at peace.
-Heart rate decreases (inhibitory impulses outweigh excitatory ones)
-EDA increases in tonic activity
-Skin temperature decreases
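The two labeled emotions above combine the same few signals in different directions, which a toy rule-based labeler can illustrate. All thresholds here are made up for illustration; they are not from the talk:

```python
def label_emotion(hrv_index, eda_phasic, skin_temp_delta):
    """Toy labeler for the two emotions described above.
    Inputs are normalized features; thresholds are illustrative only."""
    if hrv_index < 0.5 and eda_phasic > 0.3 and skin_temp_delta > 0:
        return "anger"      # HRV down, conductance up, skin temp up
    if hrv_index > 0.8 and skin_temp_delta < 0:
        return "affection"  # heart settles, skin temp down
    return "unlabeled"

print(label_emotion(0.3, 0.5, 0.4))   # anger
print(label_emotion(0.9, 0.1, -0.2))  # affection
```

A production system would learn these boundaries from labeled data rather than hand-coding them, which is where the classification and deep-learning work mentioned later comes in.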
Fitbit is goal driven and disconnected from your current emotion.
Apple Watch is going down and wearable devices are becoming a fad. It is now about understanding the person: enable a system that understands the person.
2020: when 5G networks arrive with a latency of 1 ms, connectivity will be no issue and there will be billions of devices. Hope to see system-controlled IoT; today most IoT devices are controlled manually.
Look at home automation systems: they lie wasted after the initial excitement. Automation without understanding the person is going to fail. We track emotion because we can’t improve a person’s life otherwise.
Q) Stickiness of wearable?
In the wearable device industry, only the first two months create excitement. After that the readings stay the same, users stop seeing value, and the device gathers dust on the shelf. Readings alone are not the core value: with no improvement to the user’s life, the wearable is abandoned.
Wearable device makers
Integrating home control and personal fitness: this is our goal.
Most wearable devices understand your sleep pattern, but their advice is no good.
How long do you interact with a chatbot? Now imagine a wearable that adjusts the room temperature based on your body temperature right after you return from a morning jog.
We are releasing a fully connected wearable device in four months. We use bio-impedance systems.
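That jog-to-thermostat scenario is a simple closed loop from a body reading to a home-control action. A sketch with a hypothetical setpoint rule (the function, gains, and limits are invented for illustration, not Nucarbn's API):

```python
# Hypothetical zero-interaction rule: cool the room when the wearable
# reports an elevated body temperature (e.g. right after a jog).
NORMAL_TEMP_C = 37.0

def thermostat_setpoint(body_temp_c, current_setpoint_c):
    """Lower the setpoint in proportion to how elevated body temperature is."""
    excess = body_temp_c - NORMAL_TEMP_C
    if excess <= 0:
        return current_setpoint_c  # body temp normal: leave the room alone
    # Drop up to 3 degrees C, at 1.5 degrees per degree of elevation (illustrative gains).
    return current_setpoint_c - min(3.0, 1.5 * excess)

print(thermostat_setpoint(38.0, 24.0))  # 22.5
print(thermostat_setpoint(36.8, 24.0))  # 24.0
```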
Pricing of our devices to be sold in the USA:
$129 per clothing item
$199 per wearable device
We use a kind of wireless charging. No batteries.

Artificial emotional intelligence
An AIBRAINDUMP.COM session is coming to Bangalore on March 24th at midnight. Join us: we bring Valley folks in over Skype, along with some of the best startups.

 
Deepak from Samson
We have a public initiative. In AI, our focus is NLU beyond today’s capabilities: making human dialogue systems more natural. When you have multiple rounds of discussion, what kinds of action items can we extract?
Try these things on the app store (Jiffy Cal).
There are concepts like “here” and “there” that are related to something: you, your friend and so on. There is some disambiguation to do.
We are solving anaphora and deictic ambiguities, and we try to understand the temporal flow.
What is the standard NLP stack?
Stanford Parser
Open-source parsers
NLP is all about accuracy. We modified the parser for the specific domains we concentrate on. We do lots of pre-processing and post-processing in both syntax and semantics, not just syntax.
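Deictic words like “here” and “there” only resolve against dialogue context. A toy illustration of the idea (this is not the speaker's system; the context values are invented, and a real resolver works on parsed structure, not token replacement):

```python
# Toy resolution of deictic references against a dialogue context.
# The context entries are illustrative placeholders.
context = {
    "here": "T-HUB, Hyderabad",
    "there": "Bangalore",
    "you": "the listener",
}

def resolve(utterance, ctx):
    """Replace known deictic/anaphoric tokens with their referents."""
    return " ".join(ctx.get(tok.lower(), tok) for tok in utterance.split())

print(resolve("Meet me there tomorrow", context))
# Crude: a real system would rewrite the sentence grammatically.
```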

Jerric) Design thinking and user experience have moved from PC to mobile to wearables, and in our case to zero-interaction design: don’t ask the person to do it, let us do it ourselves.
Accuracy matters most for medical devices. FDA approval for wearable clothing is easily possible.
Q) You are building personal assistance that doesn’t get you bored. Right?
Yes.
Q) I work in Kavitha Vemuri’s lab at IIIT and we have many devices. Getting a value is one thing; how you interpret it is another.
Jerric) Sleep tracking can’t be understood by the device alone. We club the sleep data with the brain waves that are linked to sleep patterns; this is how we get things cross-checked.
The first feed of data comes in and we learn. Then we take conversational feedback from the user. Then we apply probabilities.
We use different streams of data for self-training the machine:
-Vision
-Motion tracking
-Wearable devices
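The learn-then-ask-then-reweight loop described above is essentially a Bayesian update: a sensor-driven prior over emotions revised by the likelihood of the user's conversational feedback. A sketch with made-up numbers:

```python
# Updating an emotion estimate with user feedback via Bayes' rule.
# All probabilities here are illustrative, not measured values.
prior = {"anger": 0.6, "calm": 0.4}  # from the sensor model

# Likelihood of the user replying "I'm fine" under each emotion.
likelihood = {"anger": 0.2, "calm": 0.9}

unnorm = {e: prior[e] * likelihood[e] for e in prior}
z = sum(unnorm.values())
posterior = {e: unnorm[e] / z for e in unnorm}

print(posterior)  # calm becomes more probable than anger after the reply
```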
From lab to reality, the percentage accuracy goes haywire. Deep learning on the time series and waveforms can extract more information; we can see the difference between standing and sitting.
You enable the system to work. Before, we used sigmoid functions. Now we don’t know; my co-founder is asking for systems with memory in them.
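The earlier sigmoid-based approach can be illustrated with a single logistic unit trained by gradient descent on a toy standing-vs-sitting feature (the data and features are synthetic; this is not Nucarbn's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature: mean accelerometer magnitude per window (made-up values).
# Sitting windows cluster near 0.2, standing windows near 0.8.
x = np.concatenate([rng.normal(0.2, 0.05, 50), rng.normal(0.8, 0.05, 50)])
y = np.concatenate([np.zeros(50), np.ones(50)])  # 0 = sitting, 1 = standing

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One logistic unit trained by gradient descent on cross-entropy loss.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    p = sigmoid(w * x + b)
    grad = p - y                  # dLoss/dz for the cross-entropy loss
    w -= lr * np.mean(grad * x)
    b -= lr * np.mean(grad)

acc = np.mean((sigmoid(w * x + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

The "systems with memory" the co-founder wants would replace this stateless unit with a recurrent model that carries context across time steps.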
How do you provide the user a holistically proper experience?
It takes two days to understand your habits. Understanding the person helps us improve their life.
A friend has a policy in his company of reading at least eight research papers a day. By now all the interns can read eight research papers every day, and they are doing a great job.
When we do this, we keep it updated. We are getting insights we never had.
Q) Deep learning?
We did normal classification earlier; we are doing deep learning now.
Q) Do we track gender relevant emotions?
Yes.
Q) Segments?
The segment we are targeting now is white-collar workers. They are early adopters and are OK to go through the pains.
Q) Lab data versus field data?
Lab data is controlled; we recalibrate for the field data.
Q) What kinds of emotions?
Those easily labeled emotions.

 
