
QinetiQ (Malvern)
21 Projects, page 1 of 5
Project (2007 - 2012)
Partners: University of Sheffield, Jeol UK Ltd, Qioptiq Ltd, QinetiQ (Malvern), [no title available], QinetiQ
Funder: UK Research and Innovation. Project Code: EP/E034055/1. Funder Contribution: 4,327,930 GBP

At the beginning of the 20th century, scientists discovered how to measure the size and spacing of atoms using a technique called diffraction, which led to a revolution in the understanding of chemistry, biology and solid-state physics. X-rays and electrons behave like waves, but with a wavelength much smaller than the spacing between the atoms of a solid. These waves scatter and interfere with one another, producing strong beams coming out of the object at particular angles. By measuring these angles, and knowing the wavelength of the waves, the separation of atoms could be calculated. It was using this method that Watson and Crick determined the structure of DNA in the 1950s. However, diffraction is only useful if the object is a regular lattice structure. To look at more complicated atomic structures, scientists have relied on electron or X-ray microscopes. In a standard microscope, a lens is used to produce a magnified image, but the method still relies on the waves that make up the radiation (light, electrons or X-rays) interfering with one another to build up the image. With light, this is experimentally easy, but with very short-wavelength radiation (a fraction of an atomic diameter), the tiniest error in the lens or the experimental apparatus makes the waves interfere incorrectly, ruining the image.
For this reason, a typical electron or X-ray microscope image is about one hundred times more blurred than the theoretical limit defined by the wavelength.

In this project, we aim to unify the strengths of these apparently very different techniques to get the best-ever pictures of individual atoms in any structure (which is not necessarily crystalline). Our approach is to use a conventional (relatively bad) X-ray or electron lens to form a patch of moderately focussed illumination (like burning a hole in a piece of paper with the sun's rays through a magnifying glass). In fact, we do not need a lens at all: a moveable aperture placed in front of the object of interest will suffice. We then record the intensity of the diffraction pattern which emerges from the other side of the object on a good-quality, high-resolution detector, for several positions of the illuminating beam. This data does not look anything like the object, but we have worked out a way of calculating a very good image of the object by a process called 'phase retrieval'. To make an image of an object we have to know what is called the relative phase (the different arrival times) of the waves that get scattered from it. In diffraction, this information is lost, although some of it is preserved (badly) by a lens. Our data is a complex mixture of diffraction and image data, and the key innovation in this project is that we can use a computer to calculate the phase of the very high-resolution data which could never be seen by the lens alone. Other workers in the United States have demonstrated very limited versions of this new approach, but we have a much more sophisticated computational method which eliminates essentially all earlier restrictions.

The new method, which has received patent protection, could be implemented on existing electron or X-ray microscopes, greatly enhancing their imaging capability.
It is even possible to contemplate a solid-state optical microscope, built into a single chip with no optical elements at all. All the weaknesses, difficulties and costs of lenses would be replaced by a combination of good-quality detectors and computers. Our ultimate aim is to be able to image any molecular structure directly in 3D (using X-rays or electrons), although this will require a great deal of research. The work put forward in this proposal will build the Basic Technology foundations of this new approach to the ultimate microscope.
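The phase-retrieval idea described above can be illustrated with a toy alternating-projection loop in the style of the classic Gerchberg-Saxton algorithm. This is a minimal sketch assuming NumPy and an invented toy object; it is not the project's patented multi-position method, which the abstract does not specify in detail:

```python
import numpy as np

def gerchberg_saxton(fourier_mag, obj_mag, n_iter=200, seed=0):
    """Alternating-projection phase retrieval: repeatedly enforce the
    measured Fourier-plane magnitude and the known object-plane magnitude."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, fourier_mag.shape)
    F = fourier_mag * np.exp(1j * phase)            # random starting phase
    for _ in range(n_iter):
        obj = np.fft.ifft2(F)                       # back to the object plane
        obj = obj_mag * np.exp(1j * np.angle(obj))  # keep phase, fix magnitude
        F = np.fft.fft2(obj)                        # forward to detector plane
        F = fourier_mag * np.exp(1j * np.angle(F))  # impose measured intensity
    return np.fft.ifft2(F)

# Toy "measurement": a unit-amplitude object with a smooth phase bump.
n = 32
y, x = np.mgrid[:n, :n]
true_obj = np.exp(1j * np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / 40.0))
fourier_mag = np.abs(np.fft.fft2(true_obj))         # detector records |F| only

rec = gerchberg_saxton(fourier_mag, np.ones((n, n)))
```

The recovered field matches the measured diffraction magnitude exactly by construction, while the iterations drive its object-plane amplitude towards the known constraint; in real experiments, recording several diffraction patterns for overlapping beam positions is what removes the ambiguities a single pattern leaves behind.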
Project (2008 - 2013)
Partners: Qioptiq Ltd, QinetiQ, Intel Corporation, University of St Andrews, QinetiQ (Malvern), Intel (United States)
Funder: UK Research and Innovation. Project Code: EP/F001622/1. Funder Contribution: 1,155,940 GBP

Silicon photonics is a field that has seen rapid growth and dramatic changes in the past 5 years. According to the MIT Communications Technology Roadmap, which aims to establish a common architecture platform across market sectors with a potential $20B in annual revenue, silicon photonics is among the top ten emerging technologies. This has in part been a consequence of the recent involvement of large semiconductor companies in the USA such as Intel and IBM, who have realised the enormous potential of the technology, as well as large investment in the field by DARPA in the USA under the Electronic and Photonic Integrated Circuit (EPIC) initiative. Significant investment in the technology has also followed in Japan, Korea and, to a lesser extent, the European Union (IMEC and LETI). The technology offers an opportunity to revolutionise a range of application areas by providing excellent performance at moderate cost, primarily because silicon is a thoroughly studied material, unsurpassed in quality of fabrication with very high yield thanks to decades of investment from the microelectronics industry. The proposed work is a collaboration between 5 UK universities (Surrey, St Andrews, Leeds, Warwick and Southampton) with input from the industrial sector in both the UK and the USA. We will primarily target interconnect applications, as they are receiving the most attention worldwide and have the largest potential for wealth creation, based on the scalability of silicon-based processes. However, we will ensure that our approach is more broadly applicable to other applications.
This can be achieved by targeting device functions that are generic, and introducing specificity only when a particular application is targeted. The generic device functions we envisage are: optical modulation; coupling from fibre to sub-micron silicon waveguides; interfacing of optical signals within sub-micron waveguides; optical filtering; optical/electronic integration; optical detection; optical amplification. In each of these areas we propose to design, fabricate and test devices that will improve the current state of the art. Subsequently we will integrate these optical devices with electronics to further improve the state of the art in optical/electronic integration in silicon. We have included in our list of objectives benchmark targets for each of our proposed devices, to give a clear and unequivocal statement of ambition and intent. We believe we have assembled an excellent consortium to deliver the proposed work and to enable the UK to compete at an international level. The combination of skills and expertise is unique in the UK and entirely complementary within the consortium. Further, each member of the consortium is recognised as a leading international researcher in their field. The results of this work have the potential to make a very significant impact on wealth-creation opportunities within the UK and around the world. For example, emerging applications such as optical interconnect (intra-chip, inter-chip, board-to-board and rack-to-rack) and Fibre To The Home for internet and other large-bandwidth applications will require highly cost-effective, mass-production solutions. Silicon photonics is seen as a leading candidate technology in these application areas if suitable performance can be achieved.
Project (2007 - 2010)
Partners: HPLB, QinetiQ, QinetiQ (Malvern), British Telecom, British Telecommunications plc, Imperial College London, BT Group (United Kingdom), Qioptiq Ltd, Hewlett-Packard Ltd
Funder: UK Research and Innovation. Project Code: EP/D076633/1. Funder Contribution: 353,183 GBP

Mark Weiser's vision of ubiquitous computing, in which computers become transparently and seamlessly woven into the many activities of our daily lives, is slowly becoming a reality. Researchers have created prototype ubiquitous computing environments such as 'smart homes' that can automatically sense the presence of a resident in a particular room and change some aspect of the environment of the room, such as turning on the lights, or 'smart museums' that can play recorded information about the museum artefact a visitor is standing in front of. There seem to be limitless possibilities for the kinds of environments and applications that can be developed for ubiquitous computing, yet the very nature of ubiquitous computing creates new and significant challenges for engineers who would like to build these environments and applications. Anybody who has ever used a computer has experienced the extreme frustration of using a software package that doesn't work the way it's supposed to, or that unceremoniously crashes in the middle of its operation, or that runs extremely slowly, or that transmits sensitive information such as credit card numbers over untrusted networks. For ubiquitous computing to achieve true transparent and seamless integration with its surroundings, it is important to prevent such mishaps, crashes, inefficiencies and insecurities to the greatest extent possible.
This project will define and implement a suite of sound, systematic methods that engineers can use to create correctly functioning, efficient and secure ubiquitous computing environments and applications. The research will be conducted and evaluated using the smart urban spaces and applications being developed in another ubiquitous computing project called Cityware.
Project (2007)
Partners: Qioptiq Ltd, BT Laboratories, QinetiQ, Djinnisys Corp., University of Birmingham, QinetiQ (Malvern)
Funder: UK Research and Innovation. Project Code: EP/D07956X/1. Funder Contribution: 583,564 GBP

Large-scale distributed systems, such as the Internet, broadband wireless at home and mobile phone networks, raise many challenges for the design and engineering of the underlying infrastructure. Such systems crucially depend on robust and efficient communication and coordination protocols that ensure that the overall system is self-organising, timely and energy-efficient, possibly in the presence of unreliable network services and malicious or uncooperative agents. New protocols for distributed coordination are being introduced to manage the limited resources. They increasingly often rely on randomisation, which plays an important role in achieving de-centralisation, and resource awareness, for example adapting to the power level. The combination of randomness and nondeterminism that arises from the scheduling of distributed components introduces complex behaviours that may be difficult to reason about. Assuring correctness, dependability and quality of service of such distributed systems is thus a non-trivial task that necessitates a rigorous approach, and methods for quantitative evaluation of such systems against properties such as "the probability of battery level dropping below minimum within 5 seconds is guaranteed to be below 0.01 in all critical situations" are needed. Theoretical foundations of such quantitative analysis have been proposed, with some implemented in software tools and evaluated through case studies.
However, no existing tools and techniques can directly address real programming languages endowed with features such as random choice and timing delays. This proposal is to further develop the foundations for reasoning about probabilistic systems, to enable quantitative analysis of real programming languages. The research will involve extending the successful quantitative probabilistic model checker PRISM (www.cs.bham.ac.uk/~dxp/prism/) via predicate abstraction, and developing additional enhancements to the PRISM toolkit in collaboration with its extensive user community. The resulting techniques will also be relevant to other domains in which probabilistic model checking has proved successful, e.g. performance analysis, planning and systems biology.
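A tool like PRISM checks a time-bounded probability property of this kind exactly, by exhaustive analysis of the model. The intuition behind such a property can be sketched with a Monte Carlo toy model; everything here (the battery model, its drain range and the function name) is invented for illustration and has nothing to do with PRISM's actual input language:

```python
import random

def prob_below(min_level=0.2, horizon=5, start=1.0,
               drain=(0.05, 0.30), trials=20000, seed=1):
    """Estimate P(battery level drops below min_level within `horizon`
    one-second steps), each step draining a uniformly random amount."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        level = start
        for _ in range(horizon):
            level -= rng.uniform(*drain)
            if level < min_level:   # property violated on this run
                hits += 1
                break
    return hits / trials

p = prob_below()   # fraction of simulated runs that violate the threshold
```

Simulation like this only estimates the probability with statistical error, which is exactly why model checking, which computes it exactly and over all schedulings of nondeterminism, needs the rigorous foundations the proposal describes.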
Project (2007 - 2012)
Partners: University of Bristol, Macquarie University, QinetiQ, Hewlett-Packard Ltd, CIP, QinetiQ (Malvern), Materials Modelling NPL, HPLB, Centre for Integrated Photonics, Qioptiq Ltd
Funder: UK Research and Innovation. Project Code: EP/F010524/1. Funder Contribution: 1,056,310 GBP

Quantum mechanics tells us how the world works at its most fundamental level. It predicts very strange behaviour that can typically only be observed when things are very cold and very small. It has an inbuilt element of chance, allows superpositions of two different states, and admits super-strong correlations between objects that would be nonsensical in our everyday world: entanglement. Despite this strange behaviour, quantum mechanics is the most successful theory that we have ever had: it predicts what will happen almost perfectly! Although the theory of quantum mechanics was invented at the beginning of the last century, quantum information science has only emerged in the last decades to consider what additional power and functionality can be realised by specifically harnessing quantum mechanical effects in the encoding, transmission and processing of information. Anticipated future technologies include quantum computers with tremendous computational power, quantum metrology, which promises the most precise measurements possible, and quantum cryptography, which is already being used in commercial communication systems and offers perfect security. Single particles of light (photons) are excellent quantum bits, or qubits, because they suffer from almost no noise.
They also have great potential for application in future quantum technologies: schemes for all-optical quantum computers are a leading contender, and photons are the obvious choice both for quantum communication and for quantum metrology schemes for measuring optical path lengths. There have already been a number of impressive proof-of-principle demonstrations of photonic quantum information science. However, photonic quantum technologies have reached a roadblock: they are stuck in the research laboratory. All of the demonstrations to date have relied on imperfect, unscalable and bulky elements, with single photons travelling in air. This is not suitable for future technologies. In addition, there has been no integration of the critical components that will be essential for the realisation of scalable and practical technologies. This project aims to address these problems by developing single-photon sources based on diamond nanocrystals, optical wires on optical chips, and superconducting single-photon detectors, to the high performance levels required. It also aims to integrate all of these components on a single optical chip, and thus bring photonic quantum technologies out of the laboratory and towards the marketplace.