{"id":558880,"date":"2021-02-09T12:16:10","date_gmt":"2021-02-09T17:16:10","guid":{"rendered":"https:\/\/www.therobotreport.com\/?p=558880"},"modified":"2021-02-09T12:28:51","modified_gmt":"2021-02-09T17:28:51","slug":"shadow-help-robots-understand-human-touch","status":"publish","type":"post","link":"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/","title":{"rendered":"How shadows can help robots understand human touch"},"content":{"rendered":"<div id=\"attachment_558882\" style=\"width: 955px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-558882\" class=\"wp-image-558882 size-full\" src=\"https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM.png\" alt=\"shadow\" width=\"945\" height=\"444\" srcset=\"https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM.png 945w, https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM-300x141.png 300w, https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM-150x70.png 150w, https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM-768x361.png 768w, https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM-368x173.png 368w\" sizes=\"(max-width: 945px) 100vw, 945px\" \/><p id=\"caption-attachment-558882\" class=\"wp-caption-text\"><strong>By placing a camera inside a robot, researchers can infer how the person is touching it and what the person\u2019s intent is just by looking at the shadow images. 
| Credit: Cornell University<\/strong><\/p><\/div>\n<p>Cornell University researchers created a low-cost method for <a href=\"https:\/\/www.therobotreport.com\/category\/technologies\/soft-robotics\/\">soft robots<\/a> to detect a range of physical interactions, from pats to punches to hugs, without relying on touch at all. Instead, a USB camera located inside the robot captures the shadow movements of hand gestures on the robot\u2019s skin and classifies them with machine-learning software.<\/p>\n<p>The new ShadowSense technology is the latest project from Cornell&#8217;s Human-Robot Collaboration and Companionship Lab. The group published a paper on the research called &#8220;<a href=\"https:\/\/dl.acm.org\/doi\/abs\/10.1145\/3432202\" target=\"_blank\" rel=\"noopener\">ShadowSense: Detecting Human Touch in a Social Robot Using Shadow Image Classification<\/a>.&#8221;<\/p>\n<p>&#8220;Touch is such an important mode of communication for most organisms, but it has been virtually absent from human-robot interaction,&#8221; said Guy Hoffman, associate professor at Cornell&#8217;s Sibley School of Mechanical and Aerospace Engineering and the paper&#8217;s senior author. &#8220;One of the reasons is that full-body touch used to require a massive number of sensors, and was therefore not practical to implement. This research offers a low-cost alternative.\u201d<\/p>\n<p>The technology originated as part of a collaboration to develop inflatable robots that could guide people to safety during emergency evacuations. Such a robot would need to be able to communicate with humans in extreme conditions and environments. Imagine a robot physically leading someone down a noisy, smoke-filled corridor by detecting the pressure of the person\u2019s hand.<\/p>\n<p>Rather than installing a large number of contact sensors \u2013 which would add weight and complex wiring to the robot, and would be difficult to embed in a deforming skin \u2013 the team took a counterintuitive approach. 
In order to gauge touch, they looked to sight.<\/p>\n<p>\u201cBy placing a camera inside the robot, we can infer how the person is touching it and what the person\u2019s intent is just by looking at the shadow images,\u201d said Yuhan Hu, the paper\u2019s lead author. &#8220;We think there is interesting potential there, because there are lots of social robots that are not able to detect touch gestures.&#8221;<\/p>\n<div align=\"center\">\n<p><iframe loading=\"lazy\" title=\"Researchers give soft robots a human touch\" width=\"740\" height=\"416\" src=\"https:\/\/www.youtube.com\/embed\/Ipb6VxPqGqs?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/p>\n<\/div>\n<h2>Prototype robot<\/h2>\n<p>The prototype robot consists of a soft inflatable bladder of nylon skin stretched around a cylindrical skeleton, roughly four feet in height, that is mounted on a mobile base. Under the robot\u2019s skin is a USB camera, which connects to a laptop. The researchers developed a neural network-based algorithm that uses previously recorded training data to distinguish between six touch gestures \u2013 touching with a palm, punching, touching with two hands, hugging, pointing and not touching at all \u2013 with an accuracy of 87.5% to 96%, depending on the lighting.<\/p>\n<p>The robot can be programmed to respond to certain touches and gestures, such as rolling away or issuing a message through a loudspeaker. And the robot\u2019s skin has the potential to be turned into an interactive screen.<\/p>\n<p>By collecting enough data, a robot could be trained to recognize an even wider vocabulary of interactions, custom-tailored to fit the robot\u2019s task, Hu said. The device doesn\u2019t even have to be a robot. 
ShadowSense technology can be incorporated into other materials, such as balloons, turning them into touch-sensitive devices.<\/p>\n<p>\u201cWhile the technology has certain limitations, for example requiring a line of sight from the camera to the robot\u2019s skin, these constraints could actually spark a new approach to social robot design that would support a visual touch sensor like the one we proposed,\u201d Hoffman said. \u201cIn the future, we would like to experiment with using optical devices such as lenses and mirrors to enable additional form factors.\u201d<\/p>\n<h2>Increasing privacy<\/h2>\n<p>In addition to providing a simple solution to a complicated technical challenge, and making robots more user-friendly to boot, ShadowSense offers a comfort that is increasingly rare in these high-tech times: privacy.<\/p>\n<p>\u201cIf the robot can only see you in the form of your shadow, it can detect what you\u2019re doing without taking high fidelity images of your appearance,\u201d Hu said. \u201cThat gives you a physical filter and protection, and provides psychological comfort.\u201d<\/p>\n<p>The ability to physically interact and understand a person\u2019s movements and moods could ultimately be just as important to the person as it is to the robot.<\/p>\n<p>\u201cTouch interaction is a very important channel in terms of human-human interaction. It is an intimate modality of communication,\u201d Hu said. \u201cAnd that\u2019s not easily replaceable.\u201d<\/p>\n<p><em><strong>Editor&#8217;s Note:<\/strong> This article was republished from Cornell University.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Cornell University researchers created a low-cost method for soft robots to detect a range of physical interactions, from pats to punches to hugs, without relying on touch at all. 
Instead, a USB camera located inside the robot captures the shadow movements of hand gestures on the robot\u2019s skin and classifies them with machine-learning software. The&hellip;<\/p>\n","protected":false},"author":622,"featured_media":558882,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"rbr50_analysis":"","rbr50_state":"","rbr50_country":"","rbr50_description":"","rbr50_numemps":"","rbr50_text_taxonomy_radio":"","rbr50_text_taxonomy_select":"","rbr50_url":"","rbr50_yearfounded":"","_genesis_hide_title":false,"_genesis_hide_breadcrumbs":false,"_genesis_hide_singular_image":false,"_genesis_hide_footer_widgets":false,"_genesis_custom_body_class":"","_genesis_custom_post_class":"","_genesis_layout":"","ngg_post_thumbnail":0,"footnotes":""},"categories":[1753,1401,2160],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v22.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>How shadows can help robots understand human touch<\/title>\n<meta name=\"description\" content=\"Using shadow images, robots can detect a range of human physical interactions, from pats to punches to hugs, without ever relying on touch.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"How shadows can help robots understand human touch\" \/>\n<meta property=\"og:description\" content=\"Using shadow images, robots can detect a range of human physical interactions, from pats to punches to hugs, without ever relying on touch.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/\" \/>\n<meta 
property=\"og:site_name\" content=\"The Robot Report\" \/>\n<meta property=\"article:published_time\" content=\"2021-02-09T17:16:10+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-02-09T17:28:51+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM.png\" \/>\n\t<meta property=\"og:image:width\" content=\"945\" \/>\n\t<meta property=\"og:image:height\" content=\"444\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"David Nutt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@therobotreport\" \/>\n<meta name=\"twitter:site\" content=\"@therobotreport\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"David Nutt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/\",\"url\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/\",\"name\":\"How shadows can help robots understand human 
touch\",\"isPartOf\":{\"@id\":\"https:\/\/www.therobotreport.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM.png\",\"datePublished\":\"2021-02-09T17:16:10+00:00\",\"dateModified\":\"2021-02-09T17:28:51+00:00\",\"author\":{\"@id\":\"https:\/\/www.therobotreport.com\/#\/schema\/person\/230ee33dc921939a0f745c93671f50e0\"},\"description\":\"Using shadow images, robots can detect a range of human physical interactions, from pats to punches to hugs, without ever relying on touch.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/#primaryimage\",\"url\":\"https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM.png\",\"contentUrl\":\"https:\/\/www.therobotreport.com\/wp-content\/uploads\/2021\/02\/Screen-Shot-2021-02-09-at-12.10.09-PM.png\",\"width\":945,\"height\":444,\"caption\":\"shadows\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.therobotreport.com\/shadow-help-robots-understand-human-touch\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.therobotreport.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"How shadows can help robots understand human 
touch\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.therobotreport.com\/#website\",\"url\":\"https:\/\/www.therobotreport.com\/\",\"name\":\"The Robot Report\",\"description\":\"Robotics news, research and analysis\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.therobotreport.com\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.therobotreport.com\/#\/schema\/person\/230ee33dc921939a0f745c93671f50e0\",\"name\":\"David Nutt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.therobotreport.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/f0e31e560517200dd3e3f76c85492536?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/f0e31e560517200dd3e3f76c85492536?s=96&d=mm&r=g\",\"caption\":\"David Nutt\"},\"url\":\"https:\/\/www.therobotreport.com\/author\/dnutt\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","_links":{"self":[{"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/posts\/558880"}],"collection":[{"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/users\/622"}],"replies":[{"embeddable":true,"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/comments?post=558880"}],"version-history":[{"count":0,"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/posts\/558880\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/media\/558882"}],"wp:attachment":[{"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/media?parent=558880"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/categories?post=558880"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.therobotreport.com\/wp-json\/wp\/v2\/tags?post=558880"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}