As AI robots and machines become more integrated into our everyday lives, complementary human-robot relationships and rituals will continue to develop.
Around the world, more and more drones, vehicles, and smaller autonomous collaborative robots are joining human society every day, learning and adapting to the way we move, work and live. To make this coexistence seamless, the next logical step is to develop interaction paradigms for AI-driven devices, enabling them to more closely resemble the way humans interact with one another.
Artificial intelligence is set to analyze interpersonal relationships and human rituals in order to understand how to make people feel comfortable. The future of the technology lies in how successfully it can address real human issues.
For this technology to become even more deeply integrated into our lives, AI needs to be increasingly location-aware, human, and empathic.
Robots are joining human society, learning and adapting to the way we move, work and live.
AI-driven assistants are growing in popularity and becoming ever more involved in daily life. From groceries to recipes, soundtracks to heating, users increasingly rely on the convenience and connectivity of their virtual attendants.
Devices like the Sanbot Nano come with advanced communication functions and feature Amazon Alexa’s smart IoT integration.
The next logical step is the development of interaction paradigms.
As the capabilities of AI technologies increase, so does their potential to integrate seamlessly into every aspect of our lives. The more they learn, the more capable they are of solving issues unique to each user’s behavior and needs.
LG’s latest prototype, the Airport Guide Robot, provides travelers with directions and information about boarding times. It speaks four languages — Korean, English, Chinese, and Japanese — and can even scan boarding passes and escort users to their departure gate.
Analyze interpersonal relationships and human rituals in order to understand how to make people feel comfortable.
Robots equipped with advanced cognitive capabilities are already attempting to understand body language and behavioral codes, and to analyze their interactions with humans, including:
- comforting people with sonic and motion effects
- learning when to approach and start a conversation with an appropriate topic
- making their intentions more transparent to people
- learning shared control policies by observing a human assistant (in care)
- making their intentions legible and expressing trade-offs in both directions
- incorporating gender cues to influence acceptance by humans.
The future of the technology lies in how successfully it can address real human issues.
Current research is exploring how human intelligence can be augmented by artificial intelligence to boost learning in users of all ages.
One prototype car builds a picture of the driver’s skill level and constructs a training program that delivers advice through dialogue and a large dashboard display. As a result, drivers can adapt, experience a vehicle that behaves the way they want it to, and enjoy driving even more.
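The coaching loop described above can be sketched in a few lines: keep a smoothed skill estimate per driving maneuver and surface advice for the weakest one. This is a hypothetical illustration, not the car’s actual system; the maneuver names, the smoothing factor, and the advice strings are all assumptions.

```python
from collections import defaultdict

ALPHA = 0.3  # exponential smoothing factor for new observations (assumed)

ADVICE = {   # illustrative tips keyed by maneuver (names are assumptions)
    "braking": "Brake earlier and more gradually.",
    "cornering": "Slow down before the turn, not during it.",
    "lane_keeping": "Keep a steady position between lane markings.",
}

class DriverModel:
    """Hypothetical driver-skill model behind an adaptive coaching display."""

    def __init__(self):
        # Scores in [0, 1]; every maneuver starts at a neutral 0.5.
        self.skill = defaultdict(lambda: 0.5)

    def observe(self, maneuver: str, score: float) -> None:
        """Blend a new 0-1 performance score into the running estimate."""
        self.skill[maneuver] = (1 - ALPHA) * self.skill[maneuver] + ALPHA * score

    def coach(self) -> str:
        """Return advice for the maneuver with the lowest estimated skill."""
        weakest = min(self.skill, key=self.skill.get)
        return ADVICE.get(weakest, "Keep practicing.")
```

A real system would draw on far richer sensor data, but the shape is the same: continuous observation updates a per-skill model, and the interface targets the area where advice yields the most benefit.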
Adaptive Robot Language Tutoring
These robots trace the learner's knowledge to decide which skill to teach next, based on the likely effect on the learner. Interacting with a peer-like social robot that models a growth mindset can promote the same mindset in children, while personalized timing strategies help promote learning.
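The skill-selection idea above can be sketched with a Bayesian Knowledge Tracing-style update: maintain a probability that each skill is known, revise it after every answer, and pick the skill whose next practice promises the largest expected gain. The parameter values and skill names below are assumptions for illustration, not the tutors’ actual model.

```python
# Assumed model parameters (illustrative, not from the cited research).
P_LEARN = 0.2   # chance an unknown skill is learned on a practice attempt
P_SLIP = 0.1    # chance of a wrong answer despite knowing the skill
P_GUESS = 0.25  # chance of a right answer without knowing the skill

def update_mastery(p_known: float, correct: bool) -> float:
    """Posterior P(known) after one observed answer, then apply learning."""
    if correct:
        evidence = p_known * (1 - P_SLIP) + (1 - p_known) * P_GUESS
        posterior = p_known * (1 - P_SLIP) / evidence
    else:
        evidence = p_known * P_SLIP + (1 - p_known) * (1 - P_GUESS)
        posterior = p_known * P_SLIP / evidence
    # An unknown skill may be learned during the practice opportunity.
    return posterior + (1 - posterior) * P_LEARN

def next_skill(mastery: dict) -> str:
    """Pick the skill whose next practice has the largest expected gain."""
    return max(mastery, key=lambda s: (1 - mastery[s]) * P_LEARN)
```

With a constant learning rate this simply targets the least-mastered skill; the research systems weigh richer effects, such as timing and the learner’s engagement, when choosing what to teach next.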
Growing Growth Mindset with Social Robot Peer
This work explores a novel paradigm for fostering a growth mindset in young children through playing a puzzle-solving game with a fully autonomous, peer-like social robot.
Want to read more? Download our full 2018 Location Trends report here.