{"id":3109,"date":"2020-04-01T08:19:02","date_gmt":"2020-04-01T13:19:02","guid":{"rendered":"http:\/\/scienceandsf.com\/?p=3109"},"modified":"2020-04-01T08:19:03","modified_gmt":"2020-04-01T13:19:03","slug":"robot-report-for-mar-2020","status":"publish","type":"post","link":"https:\/\/scienceandsf.com\/index.php\/2020\/04\/01\/robot-report-for-mar-2020\/","title":{"rendered":"Robot Report for Mar 2020."},"content":{"rendered":"\n<p>There have been some very interesting engineering developments in both robotics and artificial intelligence (AI) recently. These new designs clearly show what I consider to be the main theme of these subjects: a convergence of the artificial and organic, as engineers learn how to copy the abilities of living creatures, taking advantage of the strengths of biological systems in order to improve the functioning of their designs.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter\"><img loading=\"lazy\" decoding=\"async\" width=\"480\" height=\"360\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Robot1.jpg\" alt=\"\" class=\"wp-image-3110\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Robot1.jpg 480w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Robot1-300x225.jpg 300w\" sizes=\"auto, (max-width: 480px) 85vw, 480px\" \/><figcaption>Learning how to use the abilities of organic life in mechanical systems is about more than just making a robotic copy of ourselves. (Credit: YouTube)<\/figcaption><\/figure><\/div>\n\n\n\n<p>We are all aware of how awkward and clumsy the movements of robots appear when compared to the grace and dexterity of living creatures. The mechanical walk of a robot as depicted in SF movies of the 50s and 60s may be a clich\u00e9, but it&#8217;s still pretty much true. Because of this inflexibility, robots are usually designed for a single, repetitive task; multi-tasking is simply out of the question. 
<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1200\" height=\"675\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/robots-1939-worlds-fair-2.jpg\" alt=\"\" class=\"wp-image-3111\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/robots-1939-worlds-fair-2.jpg 1200w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/robots-1939-worlds-fair-2-300x169.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/robots-1939-worlds-fair-2-768x432.jpg 768w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>A robot from the 1939 World&#8217;s Fair in New York City. He looks like he could hardly move! (Credit: History.com)<\/figcaption><\/figure>\n\n\n\n<p>With that in mind, I&#8217;ll start today&#8217;s post by describing some of the work of Doctor Fumiya Iida of the Department of Engineering at the University of Cambridge in the UK. Throughout his 20-year career Dr. Iida has studied the anatomy of living creatures in an effort to improve the agility of his own robotic creations.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2000\" height=\"1890\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Fumiya-2000x1890.jpg\" alt=\"\" class=\"wp-image-3113\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Fumiya-2000x1890.jpg 2000w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Fumiya-300x283.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Fumiya-768x726.jpg 768w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Fumiya-1200x1134.jpg 1200w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>Doctor Fumiya Iida with some of his work. 
(Credit: Phil Mynott)<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"590\" height=\"288\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotPiano.jpg\" alt=\"\" class=\"wp-image-3115\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotPiano.jpg 590w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotPiano-300x146.jpg 300w\" sizes=\"auto, (max-width: 590px) 85vw, 590px\" \/><figcaption>Doctor Iida&#8217;s robotic hand for playing the piano. (Credit: LinkedIn)<\/figcaption><\/figure>\n\n\n\n<p>Dr. Iida has found inspiration in a wide range of different anatomical structures. Everything from the prehensile tail of a monkey to the sucker mouth of a leech can become for him a new way for a robot to move and manipulate objects. Dr. Iida and his colleagues refer to this program as &#8216;Bio Inspired Robotics&#8217;. Dr. Iida&#8217;s latest success has been the demonstration of a robot that can perform a laborious and backbreaking job that until now could only be accomplished by humans: picking lettuce.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"768\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/lettuce.jpg\" alt=\"\" class=\"wp-image-3117\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/lettuce.jpg 1024w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/lettuce-300x225.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/lettuce-768x576.jpg 768w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>Picking lettuce is extremely labour intensive. Automating this and other vegetable harvesting would eliminate a great deal of low-paid, hard labour. 
(Credit: Flickr)<\/figcaption><\/figure>\n\n\n\n<p>At first you might think that picking lettuce would be an easy job to design a robot to handle. After all, lettuce heads are all planted evenly spaced in straight rows. All a robotic picker has to do is go along the rows and grab the lettuce heads.<\/p>\n\n\n\n<p>It&#8217;s not that simple. First of all, a lettuce head is fairly soft, and every individual head of lettuce is a somewhat different size and shape. This makes picking the lettuce heads difficult for most robots, resulting in a considerable amount of damage to the lettuce. Also, the outermost leaves of a lettuce head are generally so dirty or damaged that they have to be removed, a task that hitherto no robot has been able to carry out reliably.<\/p>\n\n\n\n<p>Putting all that he&#8217;s learned into the problem, Dr. Iida utilized a combination of visual sensors, soft grippers and a pneumatically activated knife for his robot picker. First the robot uses its cameras to locate a lettuce head before positioning itself directly above it. Then, lowering itself onto the lettuce, the robot pushes the unwanted leaves down and out of the way before cutting the head at its base. The robot&#8217;s soft grippers then lift the head up and place it in a basket.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1067\" height=\"655\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/rob21888-fig-0009-m.jpg\" alt=\"\" class=\"wp-image-3114\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/rob21888-fig-0009-m.jpg 1067w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/rob21888-fig-0009-m-300x184.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/rob21888-fig-0009-m-768x471.jpg 768w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>Procedure used by Doctor Iida&#8217;s robot lettuce picker. 
(Credit: Wiley Online Library)<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1350\" height=\"836\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotLettuce.jpg\" alt=\"\" class=\"wp-image-3116\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotLettuce.jpg 1350w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotLettuce-300x186.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotLettuce-768x476.jpg 768w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/RobotLettuce-1200x743.jpg 1200w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>The lettuce picking robot in action. (Credit: The Robot Report)<\/figcaption><\/figure>\n\n\n\n<p>So far Dr. Iida&#8217;s robot has achieved an 88% harvest success rate, a good result, but one that still needs improvement before it can replace human pickers. Nevertheless, once perfected this technology could be adapted to other types of produce, finally automating what has remained one of the hardest and lowest-paying of all jobs.<\/p>\n\n\n\n<p>So, if engineers are starting to construct robots to harvest our vegetables for us, what other boring, repetitive jobs can they be built to take off our hands? Well, researchers at the Massachusetts Institute of Technology (MIT) are actually developing robots that can learn to do common household chores, like setting the table, by watching us do it!<\/p>\n\n\n\n<p>The technology has been given the name &#8216;Planning with Uncertain Specifications&#8217; or PUnS, and the idea is to enable robots to perform human-like planning based on observations rather than simply carrying out a list of instructions. By watching humans complete a task, like setting a table, the robot learns the goal of the task and a general idea of how to accomplish that goal. 
The observations are converted into computer-generated formulas known as &#8216;Linear Temporal Logic&#8217; or LTL, which serve as templates for the robot to follow in order to accomplish its goal.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"550\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Roomba.jpg\" alt=\"\" class=\"wp-image-3120\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Roomba.jpg 1050w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Roomba-300x157.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/Roomba-768x402.jpg 768w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>Robotic vacuum cleaners are a simple example of planning under uncertainty. Whenever they bump up against an obstacle they just make a turn and head off in a different direction. Eventually they will clean the entire floor! (Credit: The New York Times)<\/figcaption><\/figure>\n\n\n\n<p>In the study the PUnS robot observed 40 humans carry out the task of setting a table and from those observations generated 25 LTL formulas for how to complete the task. At the same time the computer assigned each formula a different confidence-of-success value. Starting with the highest-value formula, the robot was then ordered to attempt the task and, based on its performance, was either rewarded or punished. 
<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1280\" height=\"720\" src=\"http:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/PUnS.jpg\" alt=\"\" class=\"wp-image-3118\" srcset=\"https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/PUnS.jpg 1280w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/PUnS-300x169.jpg 300w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/PUnS-768x432.jpg 768w, https:\/\/scienceandsf.com\/wp-content\/uploads\/2020\/03\/PUnS-1200x675.jpg 1200w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption>A robotic arm that has learned, not been programmed, to set a table! (Credit: YouTube)<\/figcaption><\/figure>\n\n\n\n<p>In a series of 20,000 tests starting from different initial conditions the robot made a mistake only 6 times. In one test, for example, the fork was hidden at the start. Despite not having all of the items required to completely set the table, the robot went ahead and set the rest of the dinnerware correctly. Then, when the fork was revealed, the robot picked it up and placed it in the correct position, completing the task. This degree of flexibility in an automated system is unprecedented and points the way to robots learning how to accomplish different jobs not by mindlessly following a long list of instructions, in other words a program, but rather the same way humans do: by watching someone else do it.<\/p>\n\n\n\n<p>So, robots are now being designed to move more like living creatures do, and computers are being programmed to learn more like humans do. It took evolution billions of years to give living creatures those abilities, but by observing and copying biological systems our robots and computers are quickly catching up. Who knows where they&#8217;ll be in another few decades? 
<\/p>\n","protected":false},"excerpt":{"rendered":"<p>There have been some very interesting engineering developments in both robotics and artificial intelligence (AI) recently. These new designs clearly show what I consider to be the main theme of these subjects, a convergence of the artificial and organic as engineers learn how to copy the abilities of living creatures, taking advantage of the strengths &hellip; <a href=\"https:\/\/scienceandsf.com\/index.php\/2020\/04\/01\/robot-report-for-mar-2020\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Robot Report for Mar 2020.&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[12],"tags":[1062,1060,1061],"class_list":["post-3109","post","type-post","status-publish","format-standard","hentry","category-science","tag-planning-with-uncertain-specifications","tag-robot-report","tag-robotic-lettuce-picker"],"_links":{"self":[{"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/posts\/3109","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/comments?post=3109"}],"version-history":[{"count":4,"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/posts\/3109\/revisions"}],"predecessor-version":[{"id":3123,"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/posts\/3109\/revisions\/3123"}],"wp:attachment":[{"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/media?parent=3109"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http
s:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/categories?post=3109"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scienceandsf.com\/index.php\/wp-json\/wp\/v2\/tags?post=3109"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}