
Sun Microsystems

Country: United Kingdom

11 Projects, page 1 of 3
  • Funder: UK Research and Innovation
    Project Code: EP/H017461/1
    Funder Contribution: 1,029,470 GBP

    The use of computers and computer programs is pervasive nowadays, but every computer user knows that programs go wrong. While it is merely annoying when our favourite text editor loses a bit of our work, the consequences are potentially much more serious when a computer program that, for instance, controls parts of an airplane goes wrong. Software validation and verification are central to the development of this sort of application; indeed, the software industry spends a very large amount of money on these activities. One of the measures taken to promote the correctness of programs is the use of a restricted set of the features available in programming languages. This usually means that most of the more recent advances in software engineering are left out. In this project, we propose to provide development, validation, and verification facilities that allow object-orientation and a modern real-time computational model to be used in the programming of safety-critical systems. In particular, we will work with one of the most popular programming languages, Java, or more specifically, with its profiles for high-integrity engineering proposed by the Open Group. As our main case study, we will verify parts of the controller of the first Java Powered Industrial Robot, developed by Sun. One of our collaborators, a senior engineer at Sun, notes in an interview that distributed real-time systems are really hard to build, and that the engineering community does not really know how to build them in a coherent, repeatable way (java.dzone.com/articles). Real-Time Java is entering the industrial automation and automotive markets, yet lawyers did not allow the Java robot to get anywhere near a human, even in a JavaOne conference demo. To make progress in that kind of market, better support is needed.

    Programming is just one aspect of the development of a modern system; typically, a large number of additional artefacts are produced to guide and justify its design. Just as several models of a large building are produced before bricks and mortar are put together, several specification and design models of a program are developed and used before programs are written. These models assist in the validation and verification of the program. To take our civil-engineering metaphor one step further, just as there can be various models of a building that reflect different points of view (electricity cabling, plumbing, and floor plans, for example), we also have several models of a system. Different modelling and design notations concentrate on different aspects of the program: data models, concurrent and reactive behaviour, timing, and so on. No single notation or technique covers all aspects of the problem, and a combination of them needs to be employed in the development of large, complex systems.

    In this project, we propose to investigate a novel integrated approach to validation and verification. Our aim is to provide a sound and practical technique that covers data modelling, concurrency, distribution, and timing. For that, we plan to investigate the extension and combined use of validation and verification techniques that have been successfully applied in industry. We do not seek an ad hoc combination of notations and tools, but a justified approach that provides a reliable foundation for the use of practical techniques.
    We will have succeeded if we verify a substantial part of the robot controller: using a model written in our notation, we will apply our techniques to verify parts of the existing implementation, and we will execute it using our verified implementation of Safety-Critical Java. A further measure of success will be provided by our industrial partners and by the influence of our results on their practice or business plans.
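
    To make the target concrete, the sketch below mimics, in plain standard Java, the periodic, time-triggered control-loop style that the Safety-Critical Java profiles enforce. It deliberately does not use the SCJ (JSR 302) API, and the robot-specific names (readJointSensor, writeActuator) and the 100 Hz rate are illustrative assumptions, not details of the actual controller.

        import java.util.concurrent.Executors;
        import java.util.concurrent.ScheduledExecutorService;
        import java.util.concurrent.TimeUnit;

        // Hypothetical sketch of the periodic, time-triggered task style that
        // Safety-Critical Java profiles mandate, written against the plain
        // standard library rather than the SCJ (JSR 302) API. In SCJ this loop
        // body would be the handleAsyncEvent() of a PeriodicEventHandler and
        // would be verified against a statically known deadline.
        public final class RobotControlLoop {
            private static final long PERIOD_MS = 10; // assumed 100 Hz control rate

            public static void main(String[] args) {
                ScheduledExecutorService scheduler =
                        Executors.newSingleThreadScheduledExecutor();
                scheduler.scheduleAtFixedRate(RobotControlLoop::controlStep,
                        0, PERIOD_MS, TimeUnit.MILLISECONDS);
            }

            // One release: read sensors, compute a command, actuate. No
            // allocation in the loop, as a safety-critical profile would require.
            private static void controlStep() {
                double position = readJointSensor(); // hypothetical sensor read
                double command = -0.5 * position;    // toy proportional controller
                writeActuator(command);              // hypothetical actuator write
            }

            private static double readJointSensor() { return 0.0; } // stub
            private static void writeActuator(double cmd) { }       // stub
        }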

  • Funder: UK Research and Innovation
    Project Code: EP/F067135/1
    Funder Contribution: 123,385 GBP

    We propose a project to assess the viability of the many opportunities that are becoming available to develop innovative Internet-based services. The investigation will be carried out as a collaboration between the Imperial College Internet Centre and the Imperial College Innovation Studies Centre, and will involve a range of commercial partners including Vodafone, Sun Microsystems, Transport for London, The 451 Group, the BBC, O2 and BT. The project will assess the economic and technical feasibility of a range of potential innovative Internet services, assess any generic underlying factors that need to be addressed in order to realise them, identify the most promising, and determine the economic, regulatory and technical obstacles that need to be overcome to fully realise these opportunities. The outcome will be an overall assessment of these opportunities and a development and exploitation plan for those services identified as the most viable and economically important.

    Recent developments in computing technologies, including service-oriented architectures, encapsulation, Grid computing and virtualisation, have provided the opportunity to repackage many ICT processes as composable, use-on-demand services. This raises the possibility that the next-generation Internet could be refactored as a series of markets in use-on-demand, pay-per-use services, which in turn provides the opportunity to develop many new innovative services or to repackage existing services more efficiently. Being virtual or software-based, many of these service opportunities are low-cost with very low barriers to entry, and thus represent attractive opportunities for innovative enterprises. We identify service opportunities in third-party-authored Web Services, Software as a Service, Virtual Facilities, Consumer Support Services, Community Services, Computational Resources Brokering and a General Service Broker; a sketch of the broker idea follows this description.

    We will conduct a series of studies to assess the feasibility and viability of these services. The studies will be carried out in collaboration with our commercial partners through interviews and discussion, and will address the economic and technical feasibility of the ideas put forward here, seeking to identify any steps, whether economic, regulatory or technical, that need to be taken in order to realise these opportunities to the full. An initial assessment of the field will identify any generic or underlying factors that need to be addressed; it may be that a few factors, e.g. secure payment and trust, are common to all applications, and that their resolution would enable a whole family of opportunities to be realised. An evaluation will then identify the most immediately promising opportunities, and those identified will be studied in more depth. The output of these studies will include an overall assessment of the field and, for those opportunities studied in depth, a roadmap detailing the business and technical developments needed to realise them. At the same time we will discuss with our commercial partners a potential development and exploitation plan. In this way we hope to be in a position to rapidly develop and exploit the most promising opportunities thus identified.
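
    As a hedged illustration of the "markets in use-on-demand, pay-per-use services" idea above, the Java interfaces below sketch what a minimal general service broker might expose: discovery by capability, a simple price-based selection rule, and a metered invocation. All names and signatures here are hypothetical, not an API proposed by the project.

        import java.util.Comparator;
        import java.util.List;
        import java.util.Optional;

        // Hypothetical sketch of a "general service broker" for pay-per-use
        // services; every name here is illustrative, not a project API.
        interface ServiceOffer {
            String providerId();
            String capability();   // e.g. "payment", "storage", "compute"
            double pricePerCall(); // pay-per-use tariff
        }

        interface ServiceBroker {
            // Find all providers currently offering a capability.
            List<ServiceOffer> discover(String capability);

            // A simple market-style selection rule: the cheapest offer wins.
            default Optional<ServiceOffer> cheapest(String capability) {
                return discover(capability).stream()
                        .min(Comparator.comparingDouble(ServiceOffer::pricePerCall));
            }

            // Invoke the chosen service and meter the call for billing.
            <T> T invoke(ServiceOffer offer, Object request, Class<T> responseType);
        }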

  • Funder: UK Research and Innovation
    Project Code: EP/D076935/1
    Funder Contribution: 929,809 GBP

    The SESAME consortium is a newly formed multidisciplinary group that proposes to investigate the use of wireless sensor-based systems in enhancing the performance of elite athletes and of young athletes who have been identified as having world-class potential. The project has the goals of enhancing performance, improving coach education, and advancing sports science. Despite a specific focus on athletics, the technical approach and its solutions will be deliberately generic, to enable their subsequent application to a wider range of training and healthcare scenarios.

    At present, only a limited set of sensing technologies is available for the coaching of elite athletes, including motion capture, fixed force plates, and video recording for feedback. These often disrupt the sporting activity, and the data they return are difficult to interpret in a way that provides appropriate feedback. Wireless sensing technologies, ranging from accelerometry and magnetometry through to accurate positioning systems, have the capacity to revolutionise the field by providing information about limb positioning and orientation, athlete location, muscular function, and physiological status, all in real time. Through the SESAME project, dynamic data will come from wearable, non-intrusive sensors, augmented by passive video capture. Raw sensor data will be processed using a combination of sensor fusion and stochastic signal processing to derive information that is meaningful to coaches and athletes (a small example of such a fusion step follows this description). This will take place in the knowledge that human biomechanics constrains movement, and will take account of errors introduced by sensor attachment mechanisms and sensor mispositioning. Biomechanical and physiological performance models will be informed by captured sensor data, and from them idealised movements and the performance effects of deviations will be derived.

    A comprehensive study of human factors is essential if coaches and athletes are to derive real benefit from SESAME. Ethnographic studies will be undertaken with coaches to build expert domain-specific knowledge, to capture their cognitive models of performance, and to assist in the design of user interfaces. Feedback to coaches and athletes will take two forms: (i) graphical, both as a data stream processed to respect the coaches' cognitive models and as sensor data overlaid on video; and (ii) real-time feedback where feasible, e.g. using buzzers. Analysis of an athlete's performance is not only a real-time activity: a definitive record of sensor data, decision-support recommendations, medical advice and any clinical events will be maintained, allowing users to take account of relevant medical inputs. Such an approach also allows for comparative studies between athletes and for the mining of this information, both to improve biological performance models and to understand the effects of deviation from the ideal and the precursors to injury.

    The focus of the work will be on running, specifically sprinting. However, given the national importance of the 2012 Olympic Games, we will also explore the possibility of using the technology in other athletic disciplines, more general forms of exercise, and rehabilitation following injury. Should time permit, wider applications such as gait analysis for cerebral palsy patients will also be explored. Athletic training is a highly demanding application domain from the viewpoint of wireless sensor networking: it is necessary to develop and integrate novel sensors, QoS-driven real-time networking, and system autoconfiguration, all within an extensible, generic software infrastructure. Consequently, solving problems in this challenging domain will provide a necessary building block for the solution of more general problems in ubiquitous and sentient computing. The SESAME consortium contains the blend of expertise that is essential for progress in deploying technology in this domain.
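
    As a minimal sketch of the kind of sensor-fusion step described above, the Java class below fuses a gyroscope rate with an accelerometer-derived angle using a complementary filter, a standard lightweight technique for wearable inertial sensors. The filter constant and the 100 Hz sample rate are illustrative assumptions, not SESAME parameters.

        // Hypothetical sketch: one complementary-filter step fusing a
        // gyroscope rate with an accelerometer-derived angle to estimate
        // limb orientation. ALPHA and DT are illustrative, not SESAME values.
        public final class LimbAngleEstimator {
            private static final double ALPHA = 0.98; // trust gyro short-term, accel long-term
            private static final double DT = 0.01;    // assumed 100 Hz sample period, seconds

            private double angleRad; // current orientation estimate, radians

            // gyroRateRadPerSec: angular rate from the gyroscope
            // accelAngleRad: angle inferred from the accelerometer's gravity vector
            public double update(double gyroRateRadPerSec, double accelAngleRad) {
                // Integrate the gyro (smooth but drifts), then pull the estimate
                // towards the accelerometer angle (noisy but drift-free).
                angleRad = ALPHA * (angleRad + gyroRateRadPerSec * DT)
                         + (1 - ALPHA) * accelAngleRad;
                return angleRad;
            }
        }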

  • Funder: UK Research and Innovation
    Project Code: EP/H005633/1
    Funder Contribution: 1,523,820 GBP

    Computer systems have been pervasive for many years but, despite this, and despite the huge resources devoted to their construction, they are still typically insecure, prone to failure, and hard to use. Major failures are commonplace, in sharp contrast with the products of other engineering industries, and dealing with them, and with the day-to-day lesser flaws, has huge economic and social costs. The core technical difficulty is system complexity: the range of behaviour, the large scale, and the legacy of old design choices combine to make it hard to understand these systems well enough to engineer them well. My main research goal is to develop intellectual tools that suffice for solid system-building, analogous to the applied mathematics of more traditional engineering disciplines. This must be grounded in real systems; it cannot be done in theoretical isolation. My approach, as documented in the Track Record, is to focus on the key articulation points in the hierarchy of abstractions used to build systems: programming languages, processor instruction sets, network protocols, and so forth. These are relatively stable points in a rapidly changing environment, are critical to all system development, and are small enough that a modest team can address them. Each demands different research: new language constructs, new specification, reasoning, and testing techniques, and so forth.

    In this Fellowship I will pursue this approach, focussing on the problems of building computer systems above the intricate relaxed memory models of modern multiprocessors. Multiprocessor systems are now the norm (as further speed-up of sequential processors has recently become impractical), but programming them is very challenging. A key difficulty is that these systems do not provide a sequentially consistent memory, in which events appear to occur in a single global time order, but instead permit subtle reorderings, rendering intuitive global-time reasoning unsound. Much previous work across a range of Computer Science, in programming languages, program logics, concurrency theory, model checking, and so on, makes the now-unrealistic assumption of sequential consistency, and must be revisited in this more complex setting.

    I will develop precise mathematical models of the behaviour of real-world multiprocessors that take such reorderings into account, and develop semantics and reasoning techniques above them. Using those, I will consider the verification of high-performance concurrent algorithms (as used in operating system and hypervisor kernels), the design of higher-level languages, and verified compilation of those languages to real machines. This will enable future applications to be developed above a high-confidence, high-performance substrate. It should also have a broader beneficial effect on research in Computer Science, drawing together mathematically well-founded theory and systems-building practice.
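
    The relaxed-memory behaviour alluded to above can be demonstrated even from Java. The sketch below is the classic "store buffering" litmus test: under sequential consistency at least one thread must observe the other's write, yet on real multiprocessors, and under the Java memory model (since the fields are not volatile), both loads may return 0. A single run may not exhibit the relaxed outcome; a harness such as OpenJDK's jcstress runs tests like this many times to catch it.

        // The classic store-buffering (SB) litmus test. Under sequential
        // consistency the outcome r1 == 0 && r2 == 0 is impossible: at least
        // one thread must see the other's write. Because x and y are plain
        // (non-volatile) fields, the Java memory model, like the underlying
        // x86/ARM hardware, permits both loads to return 0.
        public final class StoreBufferingLitmus {
            static int x, y, r1, r2; // deliberately non-volatile

            public static void main(String[] args) throws InterruptedException {
                Thread t1 = new Thread(() -> { x = 1; r1 = y; });
                Thread t2 = new Thread(() -> { y = 1; r2 = x; });
                t1.start(); t2.start();
                t1.join(); t2.join();
                System.out.println("r1=" + r1 + " r2=" + r2); // "r1=0 r2=0" is a legal outcome
            }
        }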

  • Funder: UK Research and Innovation
    Project Code: EP/H008063/1
    Funder Contribution: 102,184 GBP

    Information holds the key to success. The accuracy of data determines operational performance, regulatory compliance and the effectiveness of business strategy. Organisations in every industry worldwide are increasingly aware of the costs and risks caused by data that is inconsistent, inaccurate, stale or deliberately falsified. The Data Warehousing Institute estimates that poor-quality data costs US businesses $600 billion annually. Many companies are investing in data quality solutions that help increase transparency and productivity and, as a result, the data quality market is experiencing rapid growth. While these companies are making progress on internal clean-up and consolidation tasks, such activities require large amounts of manual effort. A breakthrough that provides the theoretical background and practical algorithms for data quality management has been pioneered at the School of Informatics. This approach, based on a novel extension of classical dependency theory, increases the level of automation and improves accuracy in the data quality process. In 2008, Prof. Wenfei Fan was awarded the British Computer Society's Roger Needham Award, along with China's Yangtze River Scholar award, for his research in this area. Building on the output of this award-winning research, we aim to deliver a concept system, Quaid, which scales to real commercial datasets and addresses the needs of industrial customers. Through Quaid, we envisage new products and services being generated from existing digital data sources.
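
    One representative form of the "novel extension of classical dependency theory" mentioned above is the conditional functional dependency (CFD), which applies a functional dependency only to tuples matching a pattern. The Java sketch below checks a hypothetical CFD, "[country = UK]: zip -> city", over toy records; the schema and data are illustrative only, not part of Quaid.

        import java.util.ArrayList;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;

        // Hypothetical sketch: checking the CFD "[country = UK]: zip -> city",
        // i.e. among UK rows, equal zip codes must imply equal cities.
        // The schema and data are toy examples, not part of Quaid.
        public final class CfdCheck {
            record Customer(String country, String zip, String city) {}

            static List<Customer[]> violations(List<Customer> rows) {
                Map<String, Customer> firstSeen = new HashMap<>();
                List<Customer[]> bad = new ArrayList<>();
                for (Customer c : rows) {
                    if (!"UK".equals(c.country())) continue; // pattern: rule applies to UK rows only
                    Customer prev = firstSeen.putIfAbsent(c.zip(), c);
                    if (prev != null && !prev.city().equals(c.city())) {
                        bad.add(new Customer[] { prev, c }); // same zip, different city
                    }
                }
                return bad;
            }

            public static void main(String[] args) {
                List<Customer> rows = List.of(
                        new Customer("UK", "EH8 9AB", "Edinburgh"),
                        new Customer("UK", "EH8 9AB", "Glasgow"),    // violates the CFD
                        new Customer("NL", "1012 AB", "Amsterdam")); // outside the pattern
                violations(rows).forEach(v ->
                        System.out.println(v[0] + " conflicts with " + v[1]));
            }
        }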
