
Ultraleap
7 Projects
Project 2022 - 2026
Partners: Ultraleap
Funder: UK Research and Innovation
Project Code: MR/W013576/1
Funder Contribution: 772,637 GBP

When we touch a physical object, a sequence of mechanical events occurs whereby vibration is transmitted via the hard and soft tissues of the hand. The signals generated during object manipulation are transduced into neural signals via ascending sensory pathways that our brain interprets as touch. Combined with signals from our other senses, memories and expectations, this information forms our realisation of the physical and psychological worlds. With modern technology it is possible to generate immersive environments with breath-taking graphics, yet touch technologies (also known as haptics) capable of realistically and unobtrusively emulating the sense of touch have only just begun to emerge. This Future Leaders Fellowship (FLF) aims to unlock new potential in non-contact touch technologies by holistically understanding both the physical and psychophysical dimensions of ultrasound mid-air haptics. To that end, we will lead ground-breaking R&D across acoustics, biophysics, neuroscience and artificial intelligence (AI).

Mid-air haptics refers to electronically controlled collections of ultrasound speakers (phased arrays) that collectively generate complex acoustic fields in 3D space that can be touched and felt with our bare hands. Holographic 3D objects and surfaces can therefore be "haptified" and interacted with in mid-air, without the need to wear or hold any specialised controllers; a feature particularly appreciated in public display interfaces to limit the spread of pathogens. Coupled with augmented and virtual reality solutions, the technology enables the design and remote collaboration scenarios often seen in sci-fi movies such as Iron Man and Minority Report.

R&D in mid-air haptics has been accelerating in recent years, yet has almost exclusively focused on hardware advancements, acoustic signal processing, and human-computer interaction (HCI) use cases. We believe that the true potential of ultrasound mid-air haptics is still unexplored, an opportunity uniquely available to be exploited by this FLF. Current mid-air haptic displays, such as those commercialised by Ultraleap, target only one type of touch receptor (mechanoreceptors), which limits device expressivity. Biophysical models capturing how acoustic waves interact with the skin are in their infancy and experimentally unverified. Generative and computational models connecting phased array output, acoustic focusing waves, skin vibrations, mechanoreceptors, and psychophysical experiences are absent. This fellowship will be the first to thread these together.

We will study ultrasonic mid-air haptics from first principles (i.e., acoustics and biophysics) all the way to perception and neurocognition. We will understand how localised acoustic energy generates non-localised skin vibrations, how those vibrations activate different touch receptors in the skin, and how receptors encode information that our somatosensory system then understands as touch. Once the forward problem is pieced together, our aim is to use machine learning to construct generative AI models enabling us to solve the inverse problem. What input ultrasound signals should be used to create the tactile sensation of holding a high-quality piece of paper?
Today, there is no scientific way of answering such a question, even if we know that something like this is possible. Bridging the scientific fields related to ultrasonic mid-air haptics to create a holistic understanding of holographic touch is uniquely enabled by this FLF. This 4-year, full-time, reduced-hours FLF will support a cross-disciplinary and agile team of two postdoctoral research associates (RAs) led by the fellow, hosted at the only company in the world commercialising mid-air haptics, providing the fellowship with access to unique resources, engineering insights, and a direct pathway to economic and societal impact.
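To make the focusing principle behind phased arrays concrete, the minimal sketch below computes per-transducer phase offsets so that all emitted wavefronts arrive in phase at a chosen focal point. This illustrates only the textbook phase-compensation idea, not Ultraleap's implementation; the array geometry, the `focus_phases` helper, and all parameter values are hypothetical.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
FREQ = 40_000.0          # Hz; 40 kHz is typical for airborne ultrasound arrays
K = 2 * np.pi * FREQ / SPEED_OF_SOUND  # wavenumber

def focus_phases(transducer_xyz: np.ndarray, focal_point: np.ndarray) -> np.ndarray:
    """Per-transducer drive phase (radians) so all waves arrive in phase at the focus.

    transducer_xyz: (N, 3) array of transducer positions in metres.
    focal_point:    (3,) target point in metres.
    """
    distances = np.linalg.norm(transducer_xyz - focal_point, axis=1)
    # With an outgoing wave exp(j(wt - k*r)), advancing each element's phase
    # by k*d cancels its propagation delay, so all wavefronts coincide at the focus.
    return (K * distances) % (2 * np.pi)

# Example: a 16x16 grid with 10.5 mm pitch, focused 20 cm above the array centre.
pitch = 0.0105
xs = (np.arange(16) - 7.5) * pitch
grid = np.array([(x, y, 0.0) for x in xs for y in xs])
phases = focus_phases(grid, np.array([0.0, 0.0, 0.2]))
```

In practice the focal point is additionally amplitude-modulated at low frequencies (on the order of 100-200 Hz), the band to which skin mechanoreceptors such as Pacinian corpuscles are most sensitive, which is what makes the focus perceptible as touch; choosing those modulation signals to evoke a specific sensation is the inverse problem the fellowship targets.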
Project 2022 - 2026
Partners: Ultraleap
Funder: UK Research and Innovation
Project Code: MR/V025511/1
Funder Contribution: 798,384 GBP

In this FLF proposal, we expand the advances of mid-air haptic technology through the lens of multisensory experiences by applying the principles of human-computer interaction (HCI). Traditional user interfaces typically employ physical touch, e.g., pressing buttons and tapping on touchscreens. Mid-air interaction technology replaces "tapping on screens" with hand-gesture recognition algorithms and holographic touch. Cameras track your hands to activate actions on a computer system from a distance, while phased ultrasonic arrays deliver tactile sensations to your palms and fingertips in mid-air. This unique combination enables novel interaction paradigms previously only seen in sci-fi movies (e.g., Minority Report). We can now interact with computers, digital objects and each other in immersive 3D environments that we can not only see and hear, but also touch and feel, without the need to wear or hold any additional controllers.

To date, innovation in mid-air technologies has focused on hardware and software development to advance engineering methods related to accuracy, recognition and rendering. However, very little is known about how these technologies influence human behaviour, and it is therefore unknown how to exploit human perception to improve interactivity and help society. This fellowship is at the forefront of enabling the evolution of these novel interaction paradigms through the specific lens of multisensory experiences: not just taking a leap from physical to mid-air touch interactions, but also integrating them with other sensory modalities (vision, audio, smell). By studying the cross-sensory associations and integration of mid-air touch into multisensory experiences, this fellowship aspires to create more emotionally engaging and compelling digital experiences in which the human senses are as important as in the real world.

Then, taking advantage of the effect of multisensory stimuli on human decision-making, we take a unique perspective on understanding the impact of such novel mid-air interfaces on societal responsibility. In particular, we focus on improving the user's sense of agency (SoA) when interacting with autonomous systems. Today, intelligent systems involve increased automation and make many decisions for us (cars, smart homes, robots), which tends to reduce the SoA, i.e., the feeling of being in control of one's actions. By understanding the effects of mid-air interactions enhanced by multisensory experiences on the SoA, we aim to design and develop more responsive systems that give the user a feeling of being in control and strengthen responsible interactions. As a result of this project, we aim to help create a future where the holographic display in an autonomous vehicle can be touched and felt, with the tactile dimension generated by focused ultrasound patterns designed to promote a sense of trust towards the autonomous system; and a future where we can hold a 3D video call with friends and family, communicating our affection by sending tactile sensations generated by a beam of focused ultrasound, and even perceiving smells from their environment.
This FLF will challenge existing interaction paradigms in HCI through a systematic understanding of the role of mid-air haptics from a multisensory perspective, aiming to design digital interactions where you can see, hear, touch and smell, and where this multisensory experience helps to share agency between users and technology. Our approach will break from conventional studies in mid-air technology by fostering a new and inclusive ecosystem of multidisciplinary research around it, involving psychology, neuroscience and HCI, so that the impact of these technologies is not limited to hardware and software but also delivers a positive impact on society amid the accelerated digitisation of human experiences.
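The interaction loop the proposal describes, hand tracking triggering mid-air tactile feedback, can be sketched as follows. The `Hand`, `HapticArray` and `inside_button` names are hypothetical stand-ins, since real trackers and arrays expose vendor-specific APIs; the point is only how gesture input and haptic output are coupled in one update cycle.

```python
from dataclasses import dataclass

# Hypothetical interfaces standing in for vendor-specific tracker/array APIs.

@dataclass
class Hand:
    fingertip: tuple  # index fingertip position in array coordinates (metres)

class HapticArray:
    def render_point(self, xyz, modulation_hz):
        # A real array would focus ultrasound here; we just log the command.
        print(f"focal point at {xyz}, modulated at {modulation_hz} Hz")

def inside_button(p, centre=(0.0, 0.0, 0.2), radius=0.03):
    # Spherical hit-test for a virtual mid-air button 20 cm above the array.
    return sum((a - b) ** 2 for a, b in zip(p, centre)) <= radius ** 2

def update(hand: Hand, array: HapticArray):
    # Couple what the user sees with what they feel: when the fingertip
    # enters the mid-air button, project a modulated focal point onto it.
    if inside_button(hand.fingertip):
        array.render_point(hand.fingertip, modulation_hz=200)

update(Hand(fingertip=(0.0, 0.0, 0.2)), HapticArray())
```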
Project 2021 - 2026
Partners: UCL, Ultraleap
Funder: UK Research and Innovation
Project Code: EP/V037846/1
Funder Contribution: 916,580 GBP

The UK is a world leader in creating interactive applications enabled by computational manipulation of acoustic wavefronts. Three examples of such applications are mid-air haptics, directional audio, and volumetric 3D particle displays. Using a phased array of ultrasonic speakers that are precisely and individually controlled, we can create high-pressure focal points in mid-air. By modulating these focal points in various ways, it is possible to 1) create rich tactile sensations when they are touched by the bare hands, 2) steer directional sounds that propagate over long distances with little attenuation, and 3) levitate small particles that, when rapidly moved in space, emulate volumetric 3D shapes thanks to a phenomenon called persistence of vision.

The exploitation of these new technologies is uniquely available to Ultraleap (a UK-based company) and Sussex University, who have a long and productive history of collaboration. For example, Ultraleap is currently combining hand-tracking and ultrasonic mid-air haptic feedback solutions for applications ranging from VR training simulators and automotive interfaces to gaming machines and next-generation digital signage kiosks. Similarly, Sussex University is creating multimodal 3D displays based on rapidly updating ultrasonic phased arrays that create persistence of vision by moving acoustically levitated objects.

A significant constraint on the wide-scale deployment of phased arrays is the cost and complexity of non-modular systems, which limit applicability: there is no one-size-fits-all phased array, and most integrated solutions need to be custom developed. In this project, we will circumvent such problems altogether by creating simple, low-cost, modular spatial sound modulator (SSM) units, i.e., smaller arrays of acoustic sources placed around the interactive space that can collectively outperform the single large monolithic solutions we currently have. Moreover, we will take a leap forward in sound-field control by removing scalability and reusability issues, opening up the exploitation of phased array technologies to other application domains that can benefit from the non-contact delivery of haptic feedback, steerable directional sound, and/or volumetric 3D particle displays.

Specifically, we will draw on the well-developed literature of multi-agent game theory and distributed computing to build a decentralised swarm architecture that can flexibly accommodate numerous SSM units. Each SSM will emit and modulate the sound field near it while sharing a common awareness of contextual details with its swarm host; the desired collective behaviour will emerge from the interactions between multiple SSMs and between the SSMs and their environment.

There are several anticipated benefits to our proposed approach. Firstly, by designing simple, independent SSMs we can address multiple commercial applications using the same primitive unit, while simultaneously streamlining the manufacturing pipeline. These modular units can be used individually or combined in a myriad of ways to create new applications. Secondly, by enabling a distributed control architecture, swarm SSMs can seamlessly and progressively scale up.
Customers can start with a small number of SSM units and dynamically grow the capabilities of their interactive multimodal system by adding new devices as application needs evolve. Finally, using game theory we will enable SSMs to cooperate dynamically so as to meet application objectives independently of the application logic and the arrangement of the modular devices, simplifying the development and design process and letting creative designers focus on delivering ever more immersive and multimodal experiences.
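As a toy illustration of such decentralised cooperation (not the project's actual architecture), the sketch below reduces each SSM to a single acoustic source and lets each unit repeatedly re-phase itself against the field the rest of the swarm produces at a shared target. This is a best-response dynamic from game theory; once phases align, the focal pressure approaches what a monolithic array would achieve. All class and parameter names are illustrative.

```python
import numpy as np

K = 2 * np.pi * 40_000.0 / 343.0  # wavenumber for 40 kHz ultrasound in air

class SSMUnit:
    """One modular spatial sound modulator, reduced here to a single source."""
    def __init__(self, position):
        self.position = np.asarray(position, dtype=float)
        self.phase = 0.0

    def contribution(self, target):
        # Toy far-field model: unit amplitude, arrival phase = drive phase + k*d.
        d = np.linalg.norm(target - self.position)
        return np.exp(1j * (self.phase + K * d))

    def best_response(self, target, others_field):
        # Align this unit's arrival phase with the field the rest of the
        # swarm already produces at the target (one best-response step).
        d = np.linalg.norm(target - self.position)
        self.phase = np.angle(others_field) - K * d

def cooperate(units, target, rounds=5):
    # Round-robin best responses; no central controller is needed.
    for _ in range(rounds):
        for u in units:
            others = sum(v.contribution(target) for v in units if v is not u)
            u.best_response(target, others)
    return abs(sum(u.contribution(target) for u in units))  # focal pressure (a.u.)

units = [SSMUnit(p) for p in [(-0.1, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.15, 0.0)]]
print(cooperate(units, np.array([0.0, 0.0, 0.25])))  # approaches len(units)
```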
Project 2023 - 2025
Partners: Ultraleap, Advanced Manufacturing Research Centre, The Product Partnership, Autodesk, Autodesk Ltd, University of Bristol, Bristol Digital Futures Institute
Funder: UK Research and Innovation
Project Code: EP/W024152/1
Funder Contribution: 344,002 GBP

To design the future of products we need the future of prototyping tools. Across the £30Bn+ consumer product markets, priorities such as the demand for a non-technical user voice vie against increasingly advanced products and tough time/cost targets. These pressures are acutely felt in the prototyping process, where models often number in the hundreds for a single product and are inflexible, technically advanced, and resource-intensive to create. To succeed and evolve, prototyping needs to do more, more quickly and cheaply, and with higher accessibility. This project aims to enhance learning, accessibility, and efficiency during prototyping. It will explore the feasibility and value of seamlessly integrating physical and digital prototyping into a single workflow.

Recent and rapidly emerging technologies such as mixed reality, haptic interfaces, and gesture control have revolutionised the way we interact with the digital world. This technology is predicted to be ubiquitous by 2025, to be disruptive for the next decade, and to drive the way we work and interact across the future digital workplace, with engineering a top-five sector in which to realise value. In prototyping, these technologies will break down the physical-digital divide and create seamless experiences where the strengths of each domain are realised simultaneously. This new physical-digital integrated workflow brings profound opportunities for both engineers and users, supporting technical activities and simplifying communication. Amongst many possibilities, users may physically create and feel digital changes to prototypes in real time, dynamically overlay advanced analyses onto physical models, and support early-stage decision-making with physical-digital, tactile, interactive prototypes. These capabilities will allow more learning per prototype, widen accessibility to technical design, and streamline the prototyping process. However, we don't yet know how this exciting vision may be fulfilled, what benefits, value or costs there may be, how feasible implementation is, or which workflow approaches are effective.

The project will explore the physical-digital workflow by creating and investigating several demonstrator platforms that combine and apply haptic, mixed reality, and gesture control technologies in targeted prototyping scenarios. Technologies will first be explored in isolated sprints to understand their capabilities, before prioritisation and development into focused demonstrator tools that allow us to explore the integrated workflow across real prototyping cases spanning activities, types, and stakeholders. Demonstrators will be evaluated and verified with end-users, industry partners, and the public to establish learning, speed, cost, and usage characteristics. Project outcomes will comprise workflows for integrated prototyping together with knowledge of their value, effectiveness, feasibility, and future opportunities.
A 'toolkit' of implementations will also provide exemplars for industrial partners and academia and lead the effective use of the integrated physical-digital workflow in engineering. All software and hardware will be open-sourced via GitHub and the project webpage, letting researchers and the public worldwide create their own systems and build upon the work. Future work will extend capabilities in line with the project's outcomes, leading to the next generation of engineering design and prototyping tools. Industrial partners The Product Partnership (Amalgam, Realise Design, and Cubik) and the AMRC will bring prototyping, engineering, and end-user expertise, and will benefit from the workflows and technologies that are developed. OEMs Ultraleap and Autodesk will bring immersive-technology expertise and access to cutting-edge design systems, and will benefit from the case-study implementations and from future application opportunities. The Bristol Digital Futures Institute will facilitate collaboration across 20+ partner businesses and the public, with outputs supporting its mission of digital solutions that tackle global problems.
Project 2021 - 2025
Partners: Oxfam, Ultraleap, Laudes Foundation, Arcade Ltd, IBM Hursley, Fashion District, H&M Foundation, EPSRC Future Composites Manufacturing Hub, SharpEnd, ON ROAD, ReLondon, Abertay University, UK Fashion & Textile Association, UK-CPI, Business Growth Hub, THP, REGEMAT 3D SL, Fashion for Good BV, Technical Fibre Products Ltd, SUEZ Recycling and Recovery UK Ltd, Kiosk N1C, Swift Analytical Ltd, Yoox Net-a-Porter Group, James Cropper, Novozymes A/S, Vireol Bio Industries plc, Henry Royce Institute, Reskinned Resources Ltd, University of Innsbruck, Jesmond Engineering, Circular Systems, Wandsworth Borough Council, Materials and Design Exchange, NYC Economic Development Corporation, University of Warwick, HKRITA, Neurosketch, LMB Textile Recycling (Lawrence M Barry), Wilson Biochemicals Ltd, IDEO, Manor Farms, University of Portsmouth, The Royal Society of Arts (RSA), RAFC, London Cloth Company, Pentland Brands, Royal College of Art, Presca Teamwear, Fashion Revolution
Funder: UK Research and Innovation
Project Code: EP/V011766/1
Funder Contribution: 4,436,880 GBP

The current global fashion supply chain is characterised by its lack of transparency, forced labour, poor working conditions, unequal power relationships, and overproduction caused by fast fashion. Beyond these ethical failings, the global fashion supply chain is also highly polluting: the total footprint of clothing in use in the UK, including global and territorial emissions, was 26.2 million tonnes of CO2 in 2016, up from 24 million tonnes in 2012 (equivalent to over a third of household transport emissions). The Textiles Circularity Centre (TCC) proposes materials security for the UK by circularising resource flows of textiles. This will stimulate innovation and economic growth in the UK textile manufacturing, SME apparel, and creative technology sectors, whilst reducing reliance on imported and environmentally and ethically impactful materials and diversifying supply chains. The TCC will provide the underpinning research needed to enable the transition to a more circular economy that supports the brand 'designed and made in the UK'. To enact this vision, we will catalyse growth in the fashion and textiles sector by supporting the SME fashion-apparel community with innovations in materials and product manufacturing, access to circular materials through supply chain design, and consumer experiences. Central to our approach is enabling consumers to be agents of change by engaging them in new cultures of consumption.
We will effect a symbiosis between novel materials manufacturing and agentive consumer experiences through a supply chain design comprising innovative business models and digital tools. Using lab-proven biotechnology, we will transform bio-based, waste-derived feedstock (post-consumer textiles, crop residues, municipal solid waste) into renewable polymers, fibres and flexible textile materials, as part of a circular economy (CE) transition strategy to replace imported cotton, wood pulp and synthetic polyester fibres and petrochemical finishes. We will innovate advanced manufacturing techniques that link biorefining of organic waste, 3D weaving, robotics and additive manufacturing to circular design, producing flexible continuous textiles and three-dimensional textile forms for apparel products. These techniques will enable manufacturing hubs to be located on the high street or in local communities, and will support SME apparel brands and retailers in offering on-site/on-demand manufacture of products for local customisation. These hubs would generate regional cultural and social benefits through business and related skills development.

We will design a transparent supply chain for these textiles through industrial symbiosis between waste management, farming, biorefinery, textile production, SME apparel brand, and consumer stakeholders. Apparel brands will access this supply chain through our digital 'Biomaterials Platform', which gives them access to the materials and to data on their provenance, properties, circularity, and life-cycle extension strategies. Working with SME apparel brands, we will develop an in-store Configurator and novel affective and creative technologies to engage consumers in digitally immersive experiences and services that amplify couplings between the resource flow, human well-being and satisfaction, thus creating a new culture of consumption. This dematerialisation approach will necessitate innovation in business models that add value to the apparel in order to counter overproduction and detachment. Consumers will become key nodes in the circular value chain, enabling responsible and personalised engagement. As a human-centred, design-led centre, the TCC is uniquely placed to generate these innovations, which will catalyse significant business and skills growth in the UK textile manufacturing, SME fashion-apparel, and creative technology sectors, and drastically reduce waste, carbon emissions, and environmental and ethical impacts for the textiles sector.