66 Projects, page 1 of 14
  • Funder: UK Research and Innovation
    Project Code: ST/Z510087/1
    Funder Contribution: 116,629 GBP

    The overarching aim of our research is to study the structure and behaviour of the dusty and icy material that dominates the space environments where stars and planets form. In particular, we are interested in understanding how ice particles collide, stick, aggregate and grow, since these are the first stages of planet formation. The challenge is that our icy particles are often very small (about the same diameter as the width of a human hair) and moving very slowly (relatively speaking, just a few centimetres per second - at that pace it could take you 45 minutes or more to swim one length of a 25 m swimming pool!). At these velocities particles on Earth are dominated by gravity, which makes it difficult to collide them together, so we typically conduct these experiments in microgravity. To complicate matters further, the type of ice that dominates in space is not like an ice cube from a fridge, but more like a fluffy sponge - it is amorphous ice.

    To study such systems we combine constraints determined from observations with world-class telescopes with laboratory experiments, first conducted on Earth and then on parabolic or sub-orbital flights, to study icy grain aggregation. These experiments combine many techniques, but the dominant one is ultra-fast camera technology: much like the slow-motion footage you may have seen of crash-test dummies, we take multiple images (a video) of our particles colliding to work out what happens to them. And although this studentship is motivated by a science research question, what we really need is increasingly sophisticated camera technology to elucidate the collision outcome processes. So-called light-field tracking enables us to identify the exact positions (locations in space) and velocities of our particles during the experiments. But it turns out ice is not very easy to spot.

    The aim of this proposal is therefore to develop hyperspectral infrared camera technology: instead of looking for the icy grains with visible light, we will look at them at a variety of infrared wavelengths (up to 4 filters) where ice has spectral features specifically associated with water, and will therefore be able to distinguish icy grains from dusty ones, and potentially amorphous from crystalline ice too. The development of hyperspectral light-field tracking camera technology will greatly benefit our research, enable us to study more complex systems and feed data back to the astronomy and space science community, but a hyperspectral light-field tracking IR camera also has the potential to be a more widely applicable technology - think of areas like transport and food manufacture where ice plays an important role. This project relies on the unique partnership between the OU (academia) and DIAL Ltd, an SME with patents and expertise in camera technologies. Without this partnership the proposed technology could not be developed and tested in a research environment, nor lead on to potential applications beyond astronomy.
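    As a rough illustration of how multi-filter classification could work, here is a minimal Python sketch, assuming a hypothetical four-filter set and an arbitrary band-depth threshold (neither is specified by the project): a grain that is much darker in a filter inside the ~3 micron water-ice absorption band than in a nearby continuum filter is flagged as icy.

    ```python
    import numpy as np

    # Hypothetical filter set (wavelengths in micrometres). A water-ice grain
    # absorbs strongly near 3.0 um, so it looks darker there than in the
    # neighbouring continuum; a bare dust grain shows no such dip.
    FILTERS = {"continuum_2.2um": 2.2, "ice_band_3.0um": 3.0,
               "continuum_3.6um": 3.6, "ice_band_4.5um": 4.5}

    def band_depth(flux_band, flux_cont):
        """Fractional absorption depth: 0 for no dip, approaching 1 for a deep band."""
        return 1.0 - flux_band / flux_cont

    def classify_grain(fluxes, threshold=0.2):
        """Label a grain from its per-filter fluxes (same aperture in each image).

        fluxes: dict mapping filter name -> background-subtracted flux.
        threshold: minimum 3 um band depth to call the grain icy (assumed value).
        """
        depth = band_depth(fluxes["ice_band_3.0um"], fluxes["continuum_2.2um"])
        return ("icy" if depth > threshold else "dusty"), depth

    # Example: a grain that is 40% darker inside the ice band than on the continuum.
    label, depth = classify_grain({"continuum_2.2um": 1.00, "ice_band_3.0um": 0.60,
                                   "continuum_3.6um": 0.95, "ice_band_4.5um": 0.70})
    print(label, round(depth, 2))  # icy 0.4
    ```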

  • Funder: UK Research and Innovation
    Project Code: BB/Y513830/1
    Funder Contribution: 237,478 GBP

    The ability to accurately characterise complex variation in organismal colouration has important implications for multiple fields of bioscience research, and there is a need to develop efficient and effective new workflows for harnessing the vast potential of image-based biological datasets. Recent advances in artificial intelligence (AI) and computer vision provide a powerful opportunity to achieve this, but software pipelines making these approaches generalisable and readily accessible to the bioscience research community have yet to be developed. In this project we will build on our track record of colouration, AI and computer vision research to develop a 'next generation' software toolkit for extracting and analysing high-dimensional colour pattern information from images. These tools will be integrated into a user-friendly interactive software package that will have wide applicability across the biosciences and will transform the ability of researchers to rapidly characterise colour pattern phenotypes from image datasets.

    To achieve this, the project is divided into three work packages: two devoted to developing advanced tools for segmenting and analysing complex organismal colour pattern variation from images, and one to implementing these tools in a user-friendly interactive framework. In the first work package, we will adopt a cutting-edge hierarchical semantic segmentation strategy to develop models capable not only of accurately detecting a specimen within an image but also of simultaneously segmenting regions within the specimen, thus enabling a detailed analysis of its constituent parts. In the second work package, we will develop a powerful new workflow for colour pattern analysis that leverages the potential of deep learning. This new pipeline consists of multiple steps, each employing cutting-edge techniques, and will equip researchers with the power to efficiently perform advanced colour pattern analysis in their system. In the final work package, we will incorporate the new segmentation and analysis tools into 'Phenolearn', an easy-to-use Python-based software program developed by us for biological image analysis using AI.

    Together, these work packages will produce a powerful and generalisable toolkit for extracting and analysing colour pattern information from biological images. The availability of these tools via this project has the potential to catalyse existing bioscience research programmes and to open up new fundamental and applied research areas involving colour pattern phenotyping that are currently intractable.
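    The hierarchical segmentation idea can be illustrated with a minimal NumPy sketch; the function, mask format and region names below are hypothetical, not the project's actual models or the Phenolearn API. A specimen-level mask gates the image, and region-level masks nested inside it yield per-region colour summaries.

    ```python
    import numpy as np

    def region_colour_summary(image, specimen_mask, region_masks):
        """Summarise colour per region, with regions constrained to the specimen.

        image:         H x W x 3 array of RGB values.
        specimen_mask: H x W boolean array from a specimen-level segmenter.
        region_masks:  dict of region name -> H x W boolean array from a
                       region-level segmenter (hypothetical output format).
        """
        summaries = {}
        for name, mask in region_masks.items():
            # Hierarchical constraint: a region pixel must also be a specimen pixel.
            mask = mask & specimen_mask
            if mask.any():
                pixels = image[mask]  # N x 3 colours inside the region
                summaries[name] = {"mean_rgb": pixels.mean(axis=0),
                                   "area_frac": mask.sum() / specimen_mask.sum()}
        return summaries

    # Toy example: a 4x4 "image" whose top half is the specimen, split into
    # a red left region and a blue right region (hypothetical region names).
    img = np.zeros((4, 4, 3))
    img[0:2, 0:2] = [200, 50, 50]
    img[0:2, 2:4] = [50, 50, 200]
    spec = np.zeros((4, 4), bool)
    spec[0:2, :] = True
    regions = {"forewing": spec & (np.arange(4) < 2),
               "hindwing": spec & (np.arange(4) >= 2)}
    print(region_colour_summary(img, spec, regions))
    ```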

  • Funder: UK Research and Innovation
    Project Code: MR/Z503800/1
    Funder Contribution: 246,546 GBP

    We have invented a new and disruptive technology that can mitigate a common dose-limiting and/or debilitating side-effect of chemotherapy. We want to optimise this technology, embed it into a wearable product, and put it in/on the hands of cancer patients.

    Context & Need
    Chemotherapy-induced peripheral neuropathy (CIPN) is a form of nerve damage caused by chemotherapy drugs - it often starts in the hands and feet and causes pain, loss of function and visible damage. Depending on the type of chemotherapy, 30-90% of patients will suffer from CIPN and its side effects (Beijers et al. 2012; Maihöfner et al. 2021). Given that the number of UK cancer survivors is set to increase from 3 million today to roughly 4 million by 2030 (Macmillan Cancer Support, 2022), the impact of CIPN will continue to grow. There are currently no effective treatments to prevent or reduce CIPN, and symptoms get worse as chemotherapy treatment progresses. This means that patients are often forced to stop their treatment, reducing the likelihood of their cancer being cured. Moreover, even if chemotherapy is successful, one in four cancer survivors suffer disabilities resulting from the side effects of chemotherapy, among which CIPN is dominant (Macmillan Cancer Support, 2022). The needs of chemotherapy patients are not being adequately met. Current solutions involve cooling of the skin and/or compression of the hands and feet. They are effective to some extent, but their working principles need to be better understood. Crucially, however, these approaches are not well tolerated: they can be heavy, restrictive and generally uncomfortable. Their cold temperatures and pressure can themselves cause pain and damage, sometimes even frostbite. Furthermore, these technologies can be cumbersome to manage, meaning they are difficult for cancer care staff to use and support. These challenges significantly limit the uptake of these solutions. Of the millions of people who could benefit from the prevention of CIPN, many won't, because these solutions create too much discomfort and cannot be sustained. Our solution can do better.

    Proposed Solution (Potential Application & Benefits)
    We have discovered a way to reduce peripheral blood flow during chemotherapy using targeted electrical stimulation. Reducing blood flow will lower the amount of toxic chemotherapy drugs available for absorption by the nerve endings in the fingertips and toes, thereby protecting the peripheral nerves. We plan to create the 'Stim-mitts' (working product name): a medical device that can transform the experience and outcomes of cancer patients and survivors who have undergone or will undergo chemotherapy. Uniquely, the Stim-mitts work quietly, affordably and without restricting hand movements, meaning that patients undergoing chemotherapy for hours or days are not bound helplessly to what are essentially arm restraints. This product stands to benefit an additional 10 million patients per year worldwide, and the impact of its use would be felt during chemotherapy and for the rest of their lives.

  • Funder: UK Research and Innovation
    Project Code: AH/Z505432/1
    Funder Contribution: 1,092,530 GBP

    The Deaf, British Sign Language (BSL)-using community in Wales faces more challenges than hearing populations in accessing healthcare services (especially in emergency situations), health information, mental health care services and support, and family-related services and training. Deaf people in Wales also suffer from inconsistent interpreting services and poor communication in many healthcare settings and, as a result, are at greater risk of underdiagnosis and under-treatment of chronic diseases, and tend to have poorer health than the general population (Sign Health, 2019; Shank & Foltz, 2019; Foltz & Shank, 2020; Foltz et al., 2022). Deaf people also have an incidence of mental health problems twice as high as the general population's and face barriers accessing support services (Terry et al., 2021). Research has shown that access to nature and outdoor activities is a health asset (Houlden et al., 2018; Rebar et al., 2015); however, many of Wales's cultural and natural assets are not accessible because materials are rarely available in BSL.

    The goal of this project is to design, implement and evaluate Deaf-community-led solutions for these known and documented health inequities and inequalities. We are a transdisciplinary team of academic and non-academic, hearing and Deaf partners. Our project will use innovative social networking techniques, community outreach, focus groups, interviews, and custom video-based questionnaires and app technologies to identify sustainable, community-led, culturally and linguistically driven solutions to improve the health and wellbeing of this community. We will then develop, implement and evaluate these solutions in five areas that impact Deaf Welsh citizens' health and wellbeing: (1) public health, (2) mental health, (3) interpreting services, (4) access to natural resources, parks and the natural environment, and (5) language and communication.

    In the areas of (1) public health, (2) mental health and (3) interpreting services, we will work with the Deaf community as well as the NHS, Health Boards, interpreters and other service providers to develop culturally and linguistically driven proposals to improve services at every stage of the healthcare delivery process. Solutions will focus on access to interpreters and BSL language services, and on improving d/Deaf awareness with respect to language and culture and d/Deaf rights with respect to the law. In the area of (4) natural resources, we will develop BSL video guide apps for better accessibility to Welsh natural resources, parks and heritage sites at three locations across Wales. In the area of (5) language and communication, the Welsh dialect of BSL, with its lexicon and regional variations, remains undocumented and undescribed; this project will develop an online dictionary and corpus resources, with a particular focus on medical terminology, to aid interpreters and service providers in Wales.

    We will assess the potential positive impact of the resources we are developing and use the results of this project to inform and influence current local and national healthcare policy, services, practices and delivery, to aid compliance with the Equality Act 2010 and the Well-being of Future Generations (Wales) Act 2015.

  • Funder: UK Research and Innovation
    Project Code: EP/J01205X/1
    Funder Contribution: 817,019 GBP

    Any system used for a safety-critical task - like a pollution-monitoring unmanned aerial vehicle, a robot inspecting a nuclear plant, or a human-assistive nursebot in a hospital or at home - must have enough evidence to demonstrate its safety before we can use it. Gathering such evidence involves verification, the process of demonstrating that the implementation of a system meets the requirements laid down in its specification. Much work has been done to develop tools and methods for verification of microelectronic designs and software. When we try to verify an autonomous, intelligent system (AIS) with existing methods, two problems arise.

    First, traditional verification techniques rely on a specification that fully defines the functional behaviour of the system to be verified. But we want to use an intelligent system - one that can adapt to circumstances, deciding what to do without being told exactly how - precisely so we can avoid having to specify a response for every possible scenario. There are usually far too many possible scenarios for this to be practical. Instead, we need flexible specifications expressed in terms of acceptable and required behaviour, with precise limits for critical properties complemented by vaguer indications of desired actions.

    Second, the control software that achieves dynamic adaptation is very complex, using iterative optimization algorithms to combine discrete and continuous decision-making. Although there has been much research on how to design these algorithms, their verification is still an open research question.

    The RIVERAS project aims to tackle both of these problems. First, we will develop a way of verifying a system against a flexible specification. This will require a formal way to write such a specification, using a modelling language that can capture flexible requirements. Then, fuzzy concepts will be used to analyse how well the system meets the specification. Fuzzy concepts are graded: properties or statements involving them are true (or false) to some degree. This means that specifications may only be partially satisfied, which introduces new challenges when verifying them. Second, we will develop ways of verifying control software that uses optimization, which is a general approach for making decisions: given a cost model and a set of constraints that define permitted limits, an optimizer finds the best set of decisions to minimize (or maximize) the cost while staying within those limits. Most planning problems for intelligent systems can be expressed in the form of optimization, and research in control theory proves properties that help us understand how well it should work. We will use the properties established with control theory as a specification to demonstrate that the optimizer software does what it should. Moreover, we will integrate these properties into the software, allowing us to detect, contain and correct failures should they occur.

    Finally, we will integrate all these developments into an innovative "Design for Verification" (DFV) method. Engineers who use our DFV methods when specifying and designing an intelligent system, and when producing its optimization-based control software, will immediately be able to use our verification methods to determine if they have done it right. This will be far easier and a lot more efficient than designing the system first, without thinking about verification, and then figuring out how to verify it afterwards.

    To help refine our methods and to evaluate them afterwards, RIVERAS will try them out on real robots. For example, we will design an intelligent exploration system for a Mars rover, implement it on a robot on Earth, and produce all the verification evidence to demonstrate that it works as intended.
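    To make the graded-truth idea concrete, here is a small Python sketch of scoring a flexible requirement as a degree of satisfaction in [0, 1] rather than a hard pass/fail. The limits, margins and the min-over-time aggregation are illustrative assumptions, not the RIVERAS formalism.

    ```python
    def satisfaction(value, hard_limit, desired, larger_is_better=True):
        """Degree to which a measured value meets a flexible requirement.

        Returns 0.0 at (or beyond) the hard limit, 1.0 at (or beyond) the
        desired level, and a linear grade in between. The limits here are
        illustrative stand-ins for a real flexible specification.
        """
        if not larger_is_better:
            value, hard_limit, desired = -value, -hard_limit, -desired
        if value <= hard_limit:
            return 0.0
        if value >= desired:
            return 1.0
        return (value - hard_limit) / (desired - hard_limit)

    # "Keep at least 0.5 m clearance, ideally 1.0 m": 0.8 m is partially satisfied.
    print(satisfaction(0.8, hard_limit=0.5, desired=1.0))   # ~0.6

    # A whole run can be scored by its worst instant, in the spirit of the
    # robustness semantics used in some temporal-logic verification.
    clearances = [1.2, 0.9, 0.7, 1.1]
    print(min(satisfaction(c, 0.5, 1.0) for c in clearances))  # ~0.4
    ```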

