
Amazon Robotics

2 Projects
  • Funder: UK Research and Innovation Project Code: EP/V008102/1
    Funder Contribution: 1,718,900 GBP

    To be really useful, robots need to interact with objects in the world. The current inability of robots to grasp diverse objects with efficiency and reliability severely limits their range of application. Agriculture, mining and environmental clean-up are just three examples where - unlike a factory - the items to be handled could have a huge variety of shapes and appearances, need to be identified amongst clutter, and need to be grasped firmly for transport while avoiding damage. Secure grasp of unknown objects amongst clutter remains an unsolved problem for robotics, despite improvements in 3D sensing and reconstruction, in manipulator sophistication and the recent use of large-scale machine learning.

    This project proposes a new approach inspired by the high competence exhibited by ants when performing the closely equivalent task of collecting and manipulating diverse food items. Ants have relatively simple, robot-like 'grippers' (their mouth-parts, called 'mandibles'), limited sensing (mostly tactile, using their antennae) and tiny brains. Yet they are able to pick up and carry a wide diversity of food items, from seeds to other insect prey, which can vary enormously in shape, size, rigidity and manoeuvrability. They can quickly choose between multiple items and find an effective position to make their grasp, readjusting if necessary. Replicating even part of this competence on robots would be a significant advance. Grasping thus makes an ideal target for applying biorobotic methods that my group has previously used with substantial success to understand and mimic insect navigation behaviours on robots.

    How does an ant pick up an object? The first part of this project will be to set up the methods required to observe and analyse in detail the behaviour of ants interacting with objects.
    At the same time we will start to build both simulated and real robot systems that allow us to imitate the actions of an ant as it positions its body, head and mouth to make a grasp, using an omnidirectional robot base with an arm and gripper. We will also examine and imitate the sensory systems used by the ant to determine the position, shape and size of the object before making a grasp.

    What happens in the ant's brain when it picks up an object? The second part will explore what algorithms insect brains need to compute to be able to make efficient and effective grasping decisions. Grasping is a task that contains in miniature many key issues in robot intelligence. It involves tight coupling of physical, perceptual and control systems. It involves a hierarchy of control decisions (whether to grasp, how to position the body and actuators, precise contact, dealing with uncertainty, detecting failure). It requires fusion of sensory information and transformation into the action state space, and involves prediction, planning and adaptation. We aim to understand how insects solve these problems as a route to efficient and effective solutions for robotics.

    Can a robot perform as well as an ant? The final part will test the systems we have developed in real-world tasks. The first task will be an object-clearing task, which will also allow benchmarking of the developed system against existing research. The second task will be based on a pressing problem in environmental clean-up: detection and removal of small plastic items from amongst shoreline rocks and gravel. This novel area of research promises significant pay-off from translating biological understanding into technical advance because it addresses an important unsolved challenge for which the ant is an ideal animal model.
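The hierarchy of control decisions described above (whether to grasp, positioning, precise contact, detecting failure) can be pictured as a simple state machine. The sketch below is purely illustrative and is not the project's actual controller; the state names and the `percept` flags are invented for this example.

```python
from enum import Enum, auto

class GraspState(Enum):
    """Illustrative levels in a grasp-control hierarchy."""
    DECIDE = auto()    # whether to attempt a grasp at all
    POSITION = auto()  # position body and actuators relative to the object
    CONTACT = auto()   # make precise contact and close the gripper
    VERIFY = auto()    # detect failure; retry or report success
    DONE = auto()

def step(state, percept):
    """One transition of the hypothetical controller.

    `percept` is a dict of boolean sensor summaries (e.g. from
    tactile antennae-like sensors in the ant analogy)."""
    if state is GraspState.DECIDE:
        return GraspState.POSITION if percept.get("object_detected") else GraspState.DECIDE
    if state is GraspState.POSITION:
        return GraspState.CONTACT if percept.get("aligned") else GraspState.POSITION
    if state is GraspState.CONTACT:
        return GraspState.VERIFY
    if state is GraspState.VERIFY:
        # a failed grasp drops back to repositioning ("readjusting if
        # necessary"), rather than abandoning the attempt
        return GraspState.DONE if percept.get("grasp_secure") else GraspState.POSITION
    return GraspState.DONE
```

The key property of such a hierarchy is that failure at a lower level (an insecure grasp) feeds back only as far as the level that can fix it (repositioning), not all the way to the top-level decision.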

    more_vert
  • Funder: UK Research and Innovation Project Code: EP/V052659/1
    Funder Contribution: 1,196,800 GBP

    This Fellowship focuses on robotic object manipulation. Object manipulation refers to all the different ways robots can interact with objects in their environments. Consider the example of packing different items into a box in a warehouse to ship the box to a customer. To perform this, a robot would need to pick and insert the items into the box one by one, while nudging, pushing and squeezing objects, to achieve a tight packing. The robot would need to plan and control its actions, as well as use sensors to estimate the positions and deformations of the objects in the box.

    The dominant approach to object manipulation in the literature is geometry-based, where the world is represented using shapes and configurations only. While this simplifies planning and control, it also results in robots that are extremely limited in their skills. The central vision of this Fellowship is to go beyond that, by enabling robots to reason about, plan in, and control the full physics of the world. This has the potential to transform robots' object manipulation skills and our lives, because robots will be able to perform a much more diverse variety of object manipulation skills applicable to manufacturing, assembly, and services.

    This Fellowship will create fundamental algorithms --- algorithms that can be applied to different object manipulation problems by other researchers and engineers. However, this Fellowship will also target a particular application area: picking and packing of objects for warehouse automation. With the rapid advance of e-commerce over the past decade, there is a pressing need for efficient warehouse automation systems, in the UK and worldwide. However, existing robotic systems do not have physics-based reasoning, which limits their applications drastically.
    The physics-based picking and packing approach that I propose will enable robots to reach into cluttered bins/shelves/bags, pushing, nudging, and squeezing arbitrary objects to search and retrieve a particular object or to pack multiple objects tightly for shipping --- skills that do not exist in any current system.

    There are significant challenges to using physics-based models during robotic manipulation. An important one is computational expense. We have low-level physics models and physics engines, similar to the ones used in computer games, which can be used by robots. However, computing such models is expensive (i.e. takes too much computer time), and robotic algorithms need to query such models thousands, and sometimes millions, of times before choosing an action, making such a straightforward approach infeasible. Instead, I propose to develop and use hierarchical models of physics. At higher levels in this hierarchy are coarse, approximate physics models, i.e. models that are computationally cheap (fast to compute) but may be inaccurate. At lower levels are fine models, i.e. models that are computationally expensive (slow to compute) but accurate. I will investigate a variety of methods (including data-driven methods as well as parallel computing methods) to learn and compute such a hierarchy of physics models. I will also develop new planning, control, and state estimation algorithms that can use these new hierarchical physics models.

    I will also use these new algorithms and systems to accelerate the adoption of this technology in the UK. I will work with the EPSRC UK Robotics & Autonomous Systems Network and industrial stakeholders to develop a roadmap for the integration of autonomous picking and packing robots into existing industrial workflows. I will also aim to create a new national organisation to focus on this important technology.
To achieve these aims, I will work with many academic and industrial partners, including the Advanced Supply Chain Group, a leading UK-based supply chain and warehouse management company, as well as a key international company in this area, Amazon.
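The coarse-to-fine planning idea behind hierarchical physics models can be sketched in a few lines. This is a minimal illustration, not the Fellowship's actual algorithm: the two model functions below are stand-ins (a real system would use an approximate analytic model at the coarse level and a full physics-engine rollout at the fine level), and the scalar "action" and scoring functions are invented for the example.

```python
def coarse_model(action):
    """Cheap, approximate physics (stand-in): e.g. treating the object
    as a point mass and ignoring friction. Fast but possibly inaccurate."""
    return -abs(action - 3.0)  # crude score of predicted outcome quality

def fine_model(action):
    """Expensive, accurate physics (stand-in): e.g. a full rigid-body
    simulation rollout in a physics engine."""
    return -abs(action - 3.2) - 0.1 * (action % 1.0)

def plan(candidate_actions, top_k=3):
    """Screen every candidate with the cheap coarse model, then
    re-evaluate only the most promising few with the fine model.
    The fine model is queried top_k times instead of len(candidates)."""
    shortlist = sorted(candidate_actions, key=coarse_model, reverse=True)[:top_k]
    return max(shortlist, key=fine_model)

# Thousands of coarse queries are affordable; only the shortlist
# pays the cost of the fine model.
actions = [i * 0.5 for i in range(20)]
best = plan(actions)
```

The design point is the query budget: if the fine model is 1000x slower than the coarse one, filtering with the coarse model first makes the overall search roughly as fast as the coarse model alone, while the final choice is still ranked by the accurate model.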
