Powered by OpenAIRE graph

Berlin University of Technology

Country: Germany


10 Projects, page 1 of 2
  • Funder: UK Research and Innovation Project Code: AH/V008331/1
    Funder Contribution: 33,167 GBP

    The project brings together academics, performers, sound curators and archivists, and sound engineers and technicians to engage in discussion concerning the use and status of early recordings (1890-1945) as sources for the study of performance practice and performance history, establishing foundations for further collaborative research and knowledge exchange in the area. Researchers and performers have been using early recordings as primary sources for the study of performance and music history for the last thirty years, in topics ranging from the minutiae of performance practice in specific styles and instruments, to the radical transformations that early recording technologies introduced in listening practices and in discourses around music and performance. During this period, technological advances have made early recordings more widely accessible, with collections and archives around the world digitizing their holdings and making them available online for free or at negligible cost. However, most such research activity has been conducted in relative isolation, and opportunities for researchers to engage in discussion about their work with an audience of their peers are few and far between. This lack of connectedness has prevented the field from tackling ambitious, comparative research questions centred on systematic historical change, and has detracted from its relevance and visibility both in the broader field of musicology and among non-academic performers and general concert audiences.
The project proposes to tackle these issues through the following interconnected collaborative activities:
- Five symposia (4 in different cities across the UK, 1 hosted by partner TU Berlin) will provide opportunities for experts (musicologists, performers, sound curators, archivists and engineers engaged in sound curation and digitizing initiatives, both based in and outside HEIs) to engage in methodological discussion, with the aim of both co-creating collaborative resources and identifying collaborative research and knowledge exchange opportunities in the field.
- A concert series attached to the symposia will allow audiences across the UK to familiarize themselves with practice-led research conducted by network members, while allowing the latter to reflect, in conversation with other network members, on good practice, opportunities and challenges for knowledge exchange.
- A series of video interviews with network members, filmed at the symposia and concert series, will make an array of approaches to early recordings accessible to other HEI and non-HEI experts, as well as to musicologists, performers and performance students beyond the immediate area of study.
These videos will be accompanied by an open-access handbook for similar audiences, expanding on the issues raised in the interviews (to be published after the grant period). The project will also establish a permanent forum for those interested in early recordings as sources for the study of performance practice and history. This open international research network will organize regular conferences and meetings, fostering collaborative activities between its members. The forum's establishment will be supported by an 'early recording roadmap', drafted collaboratively by network members, identifying urgent research questions and flagging potential areas for knowledge exchange collaborations.

  • Funder: UK Research and Innovation Project Code: EP/D055075/1
    Funder Contribution: 184,068 GBP

    Oscillations are everywhere, ranging from perfectly ordered periodic and quasiperiodic oscillations to completely disordered, irregular ones, often described by probabilistic laws. Turbulence, climate change, neuron spiking, the electrical activity of the heart: all of these are examples of processes where irregular oscillations are possible and play a prominent role. The irregularity of oscillations can have two different origins: deterministic and stochastic. In the former case, although the dynamics of the oscillating system is defined by deterministic laws, the oscillations themselves are very sensitive to the initial state of the system: even a very small change in this initial state can lead to a substantial difference in behaviour. This kind of oscillation is usually called deterministic chaos. The other type of irregular oscillation occurs when the dynamics of the system is driven by random fluctuations existing within, or applied to, the system. Notably, in spite of their randomness, the noise-induced oscillations in some systems can look quite regular and closely resemble deterministic ones. One of the clearest examples is a sensory neuron, which shows no oscillations unless the signal at its input exceeds a certain level, after which the neuron generates one electrical pulse whose shape and duration are defined deterministically and are almost independent of the input. A typical signal coming from the environment is in fact random, and when applied to the neuron it can generate a sequence of pulses that looks quite coherent. Remarkably, irregular oscillations of this kind are widespread in nature and technology. Besides neurons and neural networks, such oscillations can arise in semiconductor nanostructures, chemical reactions, and engineering mechanisms such as drill strings, among many others.
It is obvious that our ability to control irregular oscillations, for example by making them more predictable, is hugely beneficial to industry, technology, medicine, etc. Control means that by imposing some, preferably small, forcing or feedback on the system, one is able to change the amplitude, timescale or regularity of the oscillations, or even to suppress them altogether. In the last decade good progress has been made in the control of deterministic chaos. Advanced nonlinear control methods exploit the fact that deterministic trajectories with the desired timescales already exist in the system, but are unstable and thus invisible in experiment; the control tools simply stabilize them. However, systems in which oscillations occur only due to noise have no such trajectories and no deterministic timescales: all timescales or orbits can be introduced only in a statistical sense. The control of oscillations that are purely random has never (or rarely, depending on what one means by control) been addressed previously. The main aim of the current research is to develop a general, effective method for controlling oscillations induced merely by external random fluctuations that is feasible for real-life problems. Delayed feedback will be considered as the control tool, since it looks the most promising from the viewpoint of simplicity and efficiency. The main objectives are: (i) to develop qualitative theories for delayed feedback as applied to minimal models that describe a large class of nonlinear systems in which noise can induce oscillations; (ii) to establish whether the method is applicable to more realistic models such as neuron-like networks; (iii) to verify whether delayed feedback can work in a real experiment on the control of heart rate and its variability in experiments with healthy human volunteers.
The work will be carried out at Loughborough University in collaboration with the Technical University of Berlin and the Department of Cardiovascular Sciences at the University of Leicester.
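The delayed-feedback idea described above can be sketched in a few lines: a feedback term proportional to the difference between the current and a time-delayed value of a variable is added to a noise-driven oscillator. This is a minimal illustration only; the oscillator model, the parameter values and the function names are our own assumptions, not the project's actual systems or methods.

```python
import math
import random

def simulate(K, tau_steps, n_steps=5000, dt=0.01, seed=1):
    """Euler-Maruyama integration of a noisy damped oscillator with a
    delayed feedback term F(t) = K * (x(t - tau) - x(t)).
    All parameter values below are illustrative assumptions."""
    rng = random.Random(seed)
    gamma, omega, sigma = 0.5, 1.0, 0.3   # damping, frequency, noise strength
    x, v = 0.0, 0.0
    buffer = [0.0] * tau_steps            # stores past values of x for the delay
    traj = []
    for _ in range(n_steps):
        x_delayed = buffer[0]             # x from tau_steps time steps ago
        feedback = K * (x_delayed - x)    # vanishes on any tau-periodic trajectory
        noise = sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        v += (-gamma * v - omega**2 * x + feedback) * dt + noise
        x += v * dt
        buffer.pop(0)
        buffer.append(x)
        traj.append(x)
    return traj

traj = simulate(K=0.2, tau_steps=300)
```

A notable property of this kind of feedback is that it vanishes on any trajectory that repeats with period tau, so a small gain K can alter the timescale or coherence of noise-induced oscillations without imposing an external rhythm on the system.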

  • Funder: UK Research and Innovation Project Code: EP/R044732/1
    Funder Contribution: 99,492 GBP

    Modern genetic data sets are vast, both in terms of the number of sequenced individuals and the length of the sequenced DNA segments. Patterns within these data sets carry information about the biological and demographic histories of the population, which cannot usually be observed directly. The central tool connecting observed patterns to predictions and inference is the Kingman coalescent: a random tree that provides a model for the unobserved ancestry of the sampled DNA sequences. Since the ancestry is unobserved, inferences are made by averaging over all possible ancestries. In simple cases the average over ancestries can be calculated analytically, but in most biologically relevant scenarios the average has to be approximated. This is usually done by simulating an ensemble of possible ancestral trees, and treating the ensemble average as an approximation of the true, unknown average. The quality of the approximation depends on the degree to which the ensemble is representative of the set of all possible ancestries. Ensuring that an ensemble is both representative and not infeasibly large is a challenging problem. Existing methods for producing ensembles split into two categories: importance sampling (IS) and Markov chain Monte Carlo (MCMC), of which the latter is typically more flexible and easier to implement. Both are known to scale poorly with the size and complexity of the data set. This proposal seeks to improve the scalability of state-of-the-art MCMC methods in three related ways:
1. Much work has been done to characterise optimal IS algorithms, which have been observed to perform roughly as well as naive implementations of the more flexible MCMC. Preliminary results for this project show that optimality results for IS can also be used to characterise optimal MCMC algorithms, but this has never been done. This work will investigate and thoroughly benchmark the performance of the resulting, optimised MCMC algorithms.
2. The practical utility of MCMC algorithms has improved dramatically through so-called optimal scaling results, which provide a guide for how to tune the algorithm as the data set grows. However, these typically apply only to settings in which the distribution being simulated consists of independent, real-valued components. In genetics, the distributions of interest consist of trees, and are hence much more complicated. This project will investigate extensions of optimal scaling results to tree-valued settings using the recently developed machinery of optimal scaling via Dirichlet forms, which are a natural way to analyse tree-valued algorithms.
3. A recently published algorithm called msprime uses a novel data structure, called a sparse tree, to improve the speed and memory consumption of naive coalescent simulation by many orders of magnitude. This does not immediately translate into improved inference algorithms, because naive simulation typically results in ensembles that are poor representations of the true average. The sparse tree structure cannot be inserted directly into an MCMC algorithm, but preliminary work has identified several ways in which MCMC can be modified to use data structures resembling sparse trees. This project will implement and benchmark all of the resulting algorithms to determine which is the most effective.
The end result of these three streams will be a highly optimised, flexible, open-source algorithm for inference in genetics. It will have unprecedented performance on large data sets due to a combination of mathematical optimisation (objectives 1 and 2) and optimisation of the underlying data structure (objective 3). MCMC algorithms also provide automatic, rigorous uncertainty quantification for their estimates, which many state-of-the-art competitors are not able to provide. This makes MCMC particularly well suited to, e.g., clinical practice, where understanding uncertainties is crucial for medical outcomes.
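The ensemble-averaging idea described above can be illustrated in its simplest form: simulate many Kingman coalescent trees and average a summary of them, here the tree height, for which a closed-form expectation is known. This sketch is our own illustration, not the project's inference algorithms, which average data likelihoods over trees sampled conditionally on observed sequences.

```python
import random

def coalescent_tmrca(n, rng):
    """Sample the height (time to most recent common ancestor) of a Kingman
    coalescent tree on n lineages: while k lineages remain, the next pair
    merges after an Exp(k*(k-1)/2)-distributed waiting time."""
    t, k = 0.0, n
    while k > 1:
        t += rng.expovariate(k * (k - 1) / 2.0)
        k -= 1
    return t

rng = random.Random(0)
n = 10
ensemble = [coalescent_tmrca(n, rng) for _ in range(100000)]
estimate = sum(ensemble) / len(ensemble)   # ensemble average over sampled trees
exact = 2.0 * (1.0 - 1.0 / n)              # known closed form: E[TMRCA] = 2(1 - 1/n)
```

In realistic inference the average is over likelihoods of the data given each tree rather than over tree heights, and representative trees must be sampled conditionally on the data, which is exactly where IS and MCMC enter.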

  • Funder: UK Research and Innovation Project Code: AH/X004031/1
    Funder Contribution: 30,865 GBP

    In the museum sector today, many organisations are campaigning for fairer recruitment and career structures. In the UK, Museum Detox highlights the 'systems of inequality' facing people of colour, while Fair Museum Jobs campaign for museum recruitment based on 'fairness, transparency, equity and inclusivity'. This new network will support and develop such campaigns by investigating the historical roots of the museum professions and the structures that supported them, from the birth of the modern museum (c. 1850) to the present day. It asks how a variety of museum professions came into being, and how they acted to produce particular competences and ways of working. It considers how they reflected and produced hierarchies, especially around race, class, gender, sexuality and disability, and how such hierarchies were challenged and negotiated by those excluded and disempowered by museums. The project is based on the principle that modern museums developed and are developing in an increasingly globalised world, and aims to investigate professional development transnationally, recognising that people, training methods and standards were all highly mobile. It understands museums both as part of colonising Western scholarship, and as sites of potential resistance and social justice. Critically, the network also seeks to develop productive links between academics and museum professionals, creating spaces, practices and outputs for dialogue between past, present and future, and to conceptualise historical practice as a tool to improve accessible professionalisation today. The network will have three strands: 1) it will think about who is included and excluded by barriers to museum work and professional structures. 
This means understanding boundaries of race, class, gender, sexuality and disability, and how they have been created and maintained, and considering distinctions between amateur and professional, and 'insider' and 'outsider'; 2) it will consider how museum careers have played out and continue to play out within museums: we will analyse training, promotion, the range of distinct roles open to museum staff, occupational organisations, and the hierarchies between different kinds of professional in the museum, including the curator, designer, conservator, educator and volunteer; 3) it will investigate the effects of transnational forces on museum professionals, including colonialism, 'development' and war: we will consider professionals' mobility (willing or unwilling), networks of patronage and influence, locations for training, and transnational bodies, e.g. the International Council of Museums (ICOM; est. 1946). The network will support the development of deep, transnational historical scholarship through three international workshops and a special journal issue. It will also produce and disseminate a set of tools through which the museum sector can campaign against barriers to accessible professionalisation. A central network aim is to inform collective ways of working across HE and the museum sector: the workshops will seek to move beyond the delivery of a series of formal papers, prioritising practical engagement with source material and data from historic and contemporary museums, and shared writing and output production. The network is committed to the development and contribution of emerging and early career researchers and practitioners, and marginalised voices from the museum sector, and aims to democratise the process of research itself by making sure a wider range of voices are heard.
Based on ongoing project evaluation, co-written outputs will consider the methods, benefits and challenges of collaboration between historians and museum practitioners, and the ways in which historical methods can be used to understand the museum professions and professionalisation. The network aims to offer a model for a scholarly but campaigning understanding of the museum as a historically produced but contemporaneously responsive institution.

  • Funder: UK Research and Innovation Project Code: EP/R008205/1
    Funder Contribution: 100,852 GBP

    A path models the evolution of a variable in a certain state space. The state space could represent physical quantities, such as the position of a gas particle, or data such as future sea levels. A common feature of these examples is that they are random processes. Since at each time a random path could move in any direction, its trajectory is in general erratic and not smooth. Remarkable theories of calculus have been developed to describe how these oscillatory paths affect each other. A first major success was Itô's theory, which applies to systems driven by Brownian motion, a canonical mathematical model for random particle motion. Another breakthrough occurred in the late 1990s with the advent of rough path theory. Unlike Itô's construction, rough path theory is able to handle paths that move in much more irregular ways than Brownian motion. It has also led to breakthroughs in the modelling of surface growth, an achievement recognized by the award of the Fields Medal to Martin Hairer in 2014. Meanwhile, many successful applications of rough path theory have been established, ranging from new numerical and statistical methods to an international award-winning algorithm for Chinese handwriting recognition. Most of these applications use a tool, known as the signature, to analyze irregular paths. The signature is purpose-built to describe paths that move so erratically that they can, for example, fill an entire square. The first term of the signature captures the one-dimensional aspects of the path, such as the displacement. The second term represents two-dimensional aspects such as the area, and so on. Successive terms in the signature tell us higher and higher dimensional information about the path. The signature has a complex structure, and this means that many fundamental problems have remained unresolved. For example: Problem 1: How do we calculate the average values of signatures of random paths?
Problem 2: How is the signature related to other key features of paths? As rough path-based methods demonstrate their initial promise, these problems have emerged as the main challenges hindering further development. This state of affairs is the main motivation for the current proposal. Instead of studying the signature directly, we will first examine the properties of functions on signatures. Crucially, most recent advances on signatures have used the qualitative properties of these functions. Their quantitative aspects have remained underused, possibly due to their complex structure. We will develop new methods for understanding these structures, making novel use of important tools from other areas of mathematics, including Lie algebras, hyperbolic geometry and stochastic analysis. The study of Problems 1 and 2 is expected to reveal the deep relationship between the signature and other important ideas in mathematics, such as the notion of length. This is a worthwhile pursuit because many mathematical breakthroughs were born out of linking two hitherto unrelated ideas, with the proof of Fermat's Last Theorem being a famous example. A key element of this project is to disseminate our new results in rough path theory beyond our usual audience in probability theory, as the biggest gains will come from reaching those who have not been aware of rough path theory and its potential relevance to their work. There will also be impact beyond academia. Scientists have observed that many real-world random processes, such as river flow and stock prices, have rough path behaviour. If we can resolve Problem 1, it will extend the existing applications of signatures to these real-world processes. For Problem 2, any progress will provide crucial insights into why signature-based methods work, and could lead to tangible improvements to the efficiency of, for instance, recognition methods that use the signature.
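The first two signature levels described above (displacement and area) can be computed exactly for piecewise-linear paths: each linear segment has a simple closed-form signature, and segments are combined with Chen's relation. The function names below are our own; the formulas are the standard iterated-integral identities for a 2D path.

```python
def segment_sig2(d):
    """Level-1 and level-2 signature of a single linear segment in 2D with
    increment vector d:  S1 = d,  S2 = (d ⊗ d) / 2."""
    s1 = list(d)
    s2 = [[d[i] * d[j] / 2.0 for j in range(2)] for i in range(2)]
    return s1, s2

def chen(a, b):
    """Chen's relation for concatenating two paths with signatures a and b:
    S1 = S1_a + S1_b,  S2 = S2_a + S2_b + S1_a ⊗ S1_b."""
    s1 = [a[0][i] + b[0][i] for i in range(2)]
    s2 = [[a[1][i][j] + b[1][i][j] + a[0][i] * b[0][j] for j in range(2)]
          for i in range(2)]
    return s1, s2

# L-shaped path: one unit to the right, then one unit up.
s1, s2 = chen(segment_sig2((1.0, 0.0)), segment_sig2((0.0, 1.0)))
levy_area = (s2[0][1] - s2[1][0]) / 2.0   # antisymmetric part: the Lévy area
```

Here the level-1 term recovers the total displacement (1, 1), and the antisymmetric part of the level-2 term gives the Lévy area of magnitude 1/2, matching the triangle enclosed between the L-shaped path and its chord.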



