The creator of HuggieBot shares how she stayed true to her vision to create the first human-sized hugging robot with visual and haptic perception.

If anyone has ever laughed at your ideas, Alexis Block doesn't want you to get discouraged. She is the creator of HuggieBot, the first human-sized hugging robot with visual and haptic perception, and she says she had to believe in herself and her idea to get to where she is today. She is a doctoral fellow with the Max Planck ETH Center for Learning Systems and earned her B.S.E. in Mechanical Engineering and Applied Mechanics with double minors in Mathematics and Entrepreneurship. She then earned an M.S.E. in Robotics from the University of Pennsylvania. She also co-founded the MPI Athena Group to support women in science, technology, engineering, mathematics, robotics, intelligent systems, and related fields. Her research has been featured all over the world, including in The New York Times and on Late Night with Seth Meyers. She sat down with Jessica Abo to discuss her career path and the importance of being true to yourself.
Jessica Abo: Alexis, tell us, where were you in your studies when you came up with this idea?
Alexis Block: I actually started this project as my master’s thesis in the fall of 2016. I was going through a difficult time emotionally. My father had passed away when I was a freshman and I was still struggling with it, and I really wanted a hug, but a hug from somebody who understood the depth of the emotion I was dealing with and could really support me in the way that I needed it. I wanted a hug from my mom or from my grandmother.
I spoke with my professor and we said, “Wouldn’t it be wonderful if we could come up with a way that people could send each other customized hugs so you could receive that comfort from that special loved one no matter the distance between the two of you?” I was so passionate and so excited about this topic that I’ve continued it through to my doctorate, which is what I’m working on finishing up now.
What was the first step you took to actually make this HuggieBot a reality?
First, I started with a commercially available robot called the PR2, which stands for Personal Robot 2, made by a company called Willow Garage. That company no longer exists, but I gave this robot custom hardware and software upgrades to see if I could make it hug and if people would be accepting of that. So, that was my master's thesis, and I also tested different parameters, like whether the robot was soft or warm and how tightly and for how long it would hug a person. Once I found the optimal parameters, I decided for my doctorate that there were no commercially available robotic platforms that were really ideal for this kind of close, social, physical human–robot interaction, and I maybe naively thought the best idea was to design and build my own robot, which was just so much work, but very, very rewarding.
This is all so fascinating. How does your HuggieBot work?
The first thing that happens when people hug each other is you see someone approaching you and you can tell by their forward movement and how they're lifting their arms that they want to hug you. You can also easily and very quickly estimate their size and their height. So, that's the first thing my robot needs to do. Using an Intel RealSense depth-sensing camera located on top of the robot's head, I notice an approaching user walking forward with their arms out for a hug, quickly estimate the user's height, and interpolate from that the joint angle the robot should lift its arms to, so the hug lands at an appropriate height for them.
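As a rough sketch of the height-to-arm-angle step she describes, a simple linear interpolation over a calibration table could look like the following. The heights, angles, and table values here are illustrative stand-ins, not HuggieBot's actual calibration:

```python
def arm_angle_for_user(height_cm, calibration):
    """Linearly interpolate an arm-lift joint angle from the user's height.

    `calibration` is a list of (height_cm, angle_rad) pairs sorted by
    height; heights outside the range are clamped to the endpoints.
    """
    if height_cm <= calibration[0][0]:
        return calibration[0][1]
    if height_cm >= calibration[-1][0]:
        return calibration[-1][1]
    for (h0, a0), (h1, a1) in zip(calibration, calibration[1:]):
        if h0 <= height_cm <= h1:
            t = (height_cm - h0) / (h1 - h0)
            return a0 + t * (a1 - a0)

# Hypothetical calibration points (height in cm, shoulder angle in rad).
CALIBRATION = [(150.0, 0.35), (170.0, 0.55), (190.0, 0.75)]

angle = arm_angle_for_user(175.0, CALIBRATION)
```

Interpolating between a few measured calibration points, rather than fitting a formula, keeps the mapping easy to re-tune if the robot's arm geometry changes.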
Then the robot uses torque sensors at each joint of its arms to grasp a person, much as a pair of robotic grippers might grasp an object. This way it can adjust to the person's size and also their location if someone is not hugging from directly in front. If they're off to the side a little bit, it'll still make sure to grasp them very well, firmly, but not too tight. I use torque thresholds to ensure a secure embrace that neither leaves air gaps nor applies excessive pressure to the user's body. I also developed a novel inflatable sensing torso. What's cool about this is that it serves a dual purpose: it softens the robot while also acting as a sensing system. I use it to detect contacts that the user makes on the robot's torso. This tells me when the person begins hugging the robot and when they let go, but it also tells me if the person is performing any actions on the robot, like patting it, squeezing it, or rubbing its back.
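The torque-threshold grasp she describes can be sketched as a closing loop that stops once joint torque indicates firm contact. This is a minimal illustration, not HuggieBot's control code; the threshold, step size, and simulated sensor model are all assumed values:

```python
from dataclasses import dataclass

SQUEEZE_TORQUE_NM = 2.0   # assumed threshold: firm embrace, no excess pressure
STEP_RAD = 0.01           # assumed per-step arm closing increment

@dataclass
class ArmJoint:
    angle: float = 0.0

    def measured_torque(self) -> float:
        # Stand-in for a real torque-sensor reading: torque stays near
        # zero in free air, then grows as the arm presses on the body.
        contact_angle = 0.4   # angle at which the arm first touches the user
        stiffness = 25.0      # Nm per radian of compression (illustrative)
        return max(0.0, (self.angle - contact_angle) * stiffness)

def close_until_contact(joint: ArmJoint) -> float:
    """Close the arm in small steps until the torque threshold is reached,
    so the embrace leaves no air gap but never squeezes too hard."""
    while joint.measured_torque() < SQUEEZE_TORQUE_NM:
        joint.angle += STEP_RAD
    return joint.angle
```

Because the loop reacts to measured torque rather than a fixed final angle, the same logic adapts to users of different sizes and to off-center hugs, as described above.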
After running a long user study to find out how the robot should respond to these actions, I developed a behavioral algorithm. I added a little variety, a little spontaneity, to make the robot seem not so robotic but more natural in how it responds. Typically, when a person squeezes, the robot also squeezes, but occasionally there's a little variety in what it will do. Then there are two ways to release: you either let go of the robot or you lean back against its arms, and either way the robot will release you.
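The mirror-with-occasional-variety behavior could be sketched like this. The gesture names and the 20% variety probability are assumptions for illustration; HuggieBot's actual response mapping came from the user study:

```python
import random

# Alternative responses for each detected gesture (illustrative set).
ALTERNATIVES = {
    "squeeze": ["pat", "rub"],
    "pat": ["squeeze", "rub"],
    "rub": ["pat", "squeeze"],
}

def respond(gesture, rng, variety_p=0.2):
    """Usually mirror the user's gesture (a squeeze gets a squeeze back);
    occasionally pick a different response so the robot feels less mechanical."""
    if rng.random() < variety_p:
        return rng.choice(ALTERNATIVES[gesture])
    return gesture
```

Passing in a `random.Random` instance keeps the spontaneity reproducible during testing while still feeling unpredictable to the user.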
You were saying earlier that you could personalize the hugs. How does that work?
So, that’s something that I’m still working on. It’s not ready yet, but this was kind of the driving force for all of these years. I’m working on developing a mobile app where you can send customized hugs to another person. You can record a video. So your mom’s face could replace the animated face and she could say, “Oh, I love you so much,” or, “Everything’s going to be okay.” The customizer can also determine how long the hug should last and if the robot should perform any gestures on the person. You’ll get a notification on your phone and you’ll scan a QR code at the robot.
Who has access to your HuggieBot?
Right now, the robot is physically located in Stuttgart, Germany, at the Max Planck Institute for Intelligent Systems. I've been getting people to come to the institute by posting in an expats-in-Stuttgart Facebook group and saying, "I'm running a user study. If anyone's interested, please contact me." We schedule a time, and they come to the institute and get some hugs.
What is next for you and next for HuggieBot?
I want to run a long-term study over maybe three months where people can use this application to send custom hugs, and I want to see if this robot could help strengthen personal relationships that are separated by physical distance, which is why I originally came up with the idea. I live in Europe and my family lives in the U.S., but now, in the times of COVID, it's even more relevant than I ever could have imagined.
What advice do you want to give to the aspiring entrepreneurs out there who feel like everyone is laughing at their ideas?
Working on a project that will truly have some kind of lasting effect or impact takes years. When you find an idea for a topic you’re passionate about, you need to be thinking 20 years in the future, not just five. It takes a lot of hard work and dedication over a long time to bring something truly visionary to fruition, so when you first share your idea with people they might laugh and think it’s funny, but it’s just because they can’t see that far into the future. But if it’s something you’re really passionate about and you put in the work for, in time, they’ll catch up with you and your vision.
Take HuggieBot, for example. When I started my project four years ago, even some of my professors laughed at me. Then, three years ago, when I was presenting my work at an academic conference, someone actually came up to my poster, laughed in my face, and told me, "This is the dumbest thing I've ever seen. I can't believe you actually spent time to make a hugging robot." You see, these people could never have imagined a world without social touch.
Flash forward to the Covid pandemic, and unfortunately, people are now realizing just how hard it is to not be able to hug their loved ones. I never envisioned this kind of future, but I saw other applications for this robot, other examples of where people are physically separated by a distance who would benefit from social touch, but that the rest of society didn’t really care about: elderly people in nursing homes, prisoners, college students far away from their families maybe for the first time in their lives. These groups of people face high levels of depression for a number of reasons, but a contributing factor is the lack of social affective touch. Thanks in part to the pandemic, people are now beginning to realize its importance and take my research more seriously.