Powered by OpenAIRE graph

University of Graz

3 Projects
  • Funder: UK Research and Innovation
    Project Code: EP/X010740/1
    Funder Contribution: 348,065 GBP

    Inverse problems are concerned with reconstructing the causes of a physical phenomenon from observational data. They have wide applications across science and engineering, including medical imaging, signal processing, and machine learning. Iterative methods are a particularly powerful paradigm for solving a wide variety of inverse problems. These are often posed by defining an objective function that encodes data fidelity and assumptions about the sought quantity, which is then minimised through an iterative process. Mathematics has played a critical role in analysing inverse problems and the corresponding algorithms.

    Recent advances in data acquisition and precision have resulted in datasets of increasing size for a vast number of problems, including computed and positron emission tomography. This increase in data size poses significant computational challenges for traditional reconstruction methods, which typically require all of the observational data in each iteration. Stochastic iterative methods address this computational bottleneck by using only a small subset of the observations in each iteration. The resulting methods are highly scalable and have been successfully deployed in a wide range of problems. However, stochastic methods have thus far been limited to a restrictive set of geometric assumptions, requiring Hilbert or Euclidean spaces.

    The proposed fellowship aims to address these issues by developing stochastic gradient methods for solving inverse problems posed in Banach spaces. The use of non-Hilbert spaces is gaining increased attention within the inverse problems and machine learning communities. Banach spaces offer much richer geometric structures and are a natural domain for many problems in partial differential equations and medical tomography. Moreover, Banach-space norms are advantageous for preserving important properties, such as sparsity.
This fellowship will introduce modern optimisation methods into classical Banach space theory, and its successful completion will create novel research opportunities for inverse problems and machine learning.
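The computational bottleneck described above can be illustrated with a minimal sketch in the Hilbert/Euclidean setting the abstract identifies as the current state of the art. This is a hypothetical toy example, not the fellowship's method: a stochastic gradient iteration for a linear inverse problem that touches only a small random batch of observations per step, rather than the full forward operator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear inverse problem: recover x_true from noisy data b = A @ x_true + noise.
n_obs, n_param = 500, 50
A = rng.standard_normal((n_obs, n_param))      # forward operator
x_true = rng.standard_normal(n_param)          # quantity sought
b = A @ x_true + 0.01 * rng.standard_normal(n_obs)

def stochastic_gradient_descent(A, b, n_iter=5000, batch=10, step=1e-3):
    """Minimise (1/2)||Ax - b||^2, touching only `batch` random rows per iteration."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        idx = rng.integers(0, A.shape[0], size=batch)     # random subset of observations
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch   # unbiased estimate of the mean gradient
        x -= step * grad
    return x

x_hat = stochastic_gradient_descent(A, b)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

Each iteration costs O(batch × n_param) rather than O(n_obs × n_param), which is the scalability gain the abstract refers to; extending such iterations beyond Euclidean geometry to Banach spaces is the fellowship's subject.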

  • Funder: UK Research and Innovation
    Project Code: NE/V011790/1
    Funder Contribution: 648,804 GBP

    This project will build the foundational techniques needed to understand what is required of an ensemble of computer model simulations to provide robust, reliable knowledge. Such knowledge includes progress in scientific understanding of complex multi-component systems on the one hand, and guidance for societal decisions on the other. The focus is on climate models, where ensembles are a core element of research activities. The research will involve, indeed requires, integrating expertise across a range of disciplines.

    Global climate models (GCMs) are complex, high-dimensional, discretised systems. They use the latest computer technology to solve a large number of simultaneous differential equations. Different disciplines and researchers view them in radically different ways and hence have very different perspectives on how to explore their errors and uncertainties. For physicists, they represent physical understanding, so the term "model error" encompasses their failure to effectively represent what we know about physical processes. For nonlinear dynamicists, they are high-dimensional systems of nonlinear equations, so there is an expectation that profoundly different results could arise from even the smallest uncertainty in initial conditions or model formulation (model error). For risk analysts and forecasters, they are generators of time series whose errors can be judged against historic observations, although physicists and statisticians might be concerned that the extrapolatory nature of the climate change problem undermines such an assessment. For "users" such as adaptation planners and policy makers, they provide climate projections that represent the starting point for their own work; any uncertainties provided by the scientists are assumed to be reliable estimates of the best current knowledge.
    This variety of perspectives leads to many different ways of interpreting the errors and uncertainties, and creates conflicting demands on the models themselves. Projection uncertainties are typically quantified from ensembles of simulations, which come in a variety of shapes and sizes. These ensembles are used to explore the impact of initial condition uncertainty (ICU: the consequence of not knowing the current state of the climate system when trying to simulate the future) and of model uncertainty (MU: the consequence of our models being different from reality). Today's ensembles (and models) have been built under the constraint of limited computational capacity, so their designs start from the question: "what's the best we can do with today's technology?" By contrast, one of the unique and innovative aspects of this project is that its starting point is "what type and size of ensembles are necessary to provide the information we want?" It will develop designs for "aspirational ensembles", i.e. ensembles necessary to answer a particular set of questions without regard to current computational limitations. From this foundation it will evaluate how best to approach the trade-offs involved in building practical ensembles which DO allow for current computational limitations.

    The approach taken will be twofold. First, low-dimensional nonlinear systems will be used to study the consequences of nonlinearity for ensemble design in climate-like situations. Second, a collaborative, collective picture will be built of the demands and constraints on model ensembles from a wide range of disciplinary and national perspectives. The project will provide a solid foundation for future ensemble designs and will inform the widely debated computational conflict between uncertainty exploration, resolution and complexity. The latter two are much studied, but there has been no comprehensive assessment of the former. This project will fill that gap from the perspective of what is needed rather than what is available.
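The first strand, studying nonlinearity with low-dimensional systems, can be sketched with the classic Lorenz-63 equations. This is an illustrative stand-in chosen by the editor, not the project's actual model: an initial-condition ensemble of near-identical states diverges under chaotic dynamics until member-to-member spread saturates at the scale of the attractor, which is precisely the ICU effect an ensemble design must resolve.

```python
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 system (standard chaotic parameters)."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt=0.01):
    """One classical fourth-order Runge-Kutta step."""
    k1 = lorenz_rhs(s)
    k2 = lorenz_rhs(s + 0.5 * dt * k1)
    k3 = lorenz_rhs(s + 0.5 * dt * k2)
    k4 = lorenz_rhs(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(1)
base = np.array([1.0, 1.0, 1.0])

# Initial-condition ensemble: 20 members differing only by O(1e-6) perturbations.
ensemble = np.stack([base + 1e-6 * rng.standard_normal(3) for _ in range(20)])

for _ in range(2500):  # integrate each member for 25 model time units
    ensemble = np.stack([rk4_step(s) for s in ensemble])

spread = ensemble.std(axis=0)  # member-to-member spread in each variable
```

Despite initial differences of one part in a million, the spread grows to the order of the attractor itself; how many members, and how they are perturbed, determines how well that spread is sampled, which is the "aspirational ensemble" design question in miniature.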

  • Funder: UK Research and Innovation
    Project Code: EP/S022473/1
    Funder Contribution: 5,345,840 GBP

    The CDT in Molecules to Product addresses an overarching concern articulated by industry operating in the area of complex chemical products: the lack of a pipeline of doctoral graduates who understand the cross-scale issues that need to be addressed within the chemicals continuum. Translating this concern into a vision, the focus of the CDT is to train a new generation of research leaders with the skills and expertise to navigate the journey from a selected molecule or molecular system through to a final product that delivers the desired structure and required performance. To address this vision, three inter-related Themes form the foundation of the CDT: Product Functionalisation and Performance, Product Characterisation, and Process Modelling between Scales. More specifically, industry has identified a real need to recruit PGR graduates with the interdisciplinary skills covered by the CDT research and training programme. As future leaders they will be instrumental in delivering enhanced process and product understanding, and hence the manufacture of a desired end effect such as taste, dissolution or stability. For example, if industry is better informed about the effect of the manufacturing process on existing products, can the process be made more efficient and cost-effective by identifying what changes can be made to the current process? Alternatively, with an enhanced understanding of the effect of raw materials, could stages in the process be removed, i.e. are some stages simply historical and no longer needed? For radically new products, is it possible through characterisation techniques to understand (i) the role/effect of each component/raw material on the final product; and (ii) how the product structure is affected by the process conditions, both chemical and mechanical? Finally, can predictive models be developed to realise effective scale-up?
    Such a focus will help industry mitigate wasted development time and costs, allowing companies to concentrate on products and processes where the risk of failure is reduced. Although the ethos of the CDT embraces a wide range of sectors, it will focus primarily on companies in the speciality chemicals, home and personal care, fast-moving consumer goods, food and beverage, and pharma/biopharma sectors. The focus of the CDT is not limited to technical challenges: a core element will be to incorporate the concept of 'Education for Innovation' as described in the Royal Academy of Engineering report, 'Educating engineers to drive the innovation economy'. This will be facilitated through the inclusion of innovation and enterprise as key strands within the research training programme. Through the combination of technical, entrepreneurial and business skills, the PGR students will have a unique skill set that will set them apart from their peers and help them become the next generation of leaders in industry and academia.

    The training and research agendas depend on strong engagement with multinational companies, SMEs, start-ups and stakeholders. Core input includes the offering and supervision of research projects; hosting students on site for a minimum of 3 months; mentoring students; and engagement with the training through the shaping and delivery of modules and the provision of in-house courses. In addition there will be, where relevant, access to materials and products that form the basis of projects, the provision of software, access to on-site equipment and the loan of equipment. In summary, the vision underpinning the CDT is too big and complex to be tackled through individual PhD projects; only by bringing academia and industry together across multiple disciplines will a solution be achievable. The CDT structure is the only route to addressing the overarching vision in a structured manner and delivering this new approach to product development.

