
Raytrix GmbH
Country: Germany
6 Projects (page 1 of 2 shown)
  • Project: NimbleAI; Funder: European Commission; Project Code: 101070679
    Overall Budget: 8,822,240 EUR; Funder Contribution: 8,822,240 EUR

    Today, only very light AI processing tasks are executed in ubiquitous IoT endpoint devices, where sensor data are generated and access to energy is usually constrained. This approach is not scalable, however, and incurs high penalties in terms of security, privacy, cost, energy consumption, and latency, as data need to travel from endpoint devices to remote processing systems such as data centres. The inefficiencies are especially evident in energy consumption. To keep pace with the exponentially growing amount of data (e.g., video) and allow more advanced, accurate, safe, and timely interactions with the surrounding environment, next-generation endpoint devices will need to run AI algorithms (e.g., computer vision) and other compute-intensive tasks with very low latency (i.e., units of ms or less) and energy envelopes (i.e., tens of mW or less). NimbleAI will harness the latest advances in microelectronics and integrated circuit technology to create an integral neuromorphic sensing-processing solution that efficiently runs accurate and diverse computer vision algorithms in resource- and area-constrained chips destined for endpoint devices. Biology will be a major source of inspiration in NimbleAI, with a particular focus on reproducing the adaptivity and experience-induced plasticity that allow biological structures to continuously become more efficient at processing dynamic visual stimuli. NimbleAI is expected to deliver significant improvements over the state of the art (e.g., commercially available neuromorphic chips), and at least 100x better energy efficiency and 50x shorter latency than the state of the practice (e.g., CPUs/GPUs/NPUs/TPUs processing frame-based video). NimbleAI will also take a holistic approach to ensuring safety and security at different architecture levels, including the silicon level.
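
    The abstract gives no implementation detail, but the event-driven sensing principle that neuromorphic vision builds on can be illustrated with a short sketch. The snippet below is a hypothetical emulation, not project code: it converts two grayscale frames into DVS-style events, where only pixels whose log-intensity changed beyond a threshold fire, so downstream compute scales with scene activity rather than frame rate. The function name and threshold are assumptions for illustration.

```python
# Illustrative sketch (not NimbleAI code): emulating a dynamic vision
# sensor (DVS). Instead of shipping full frames, only pixels whose
# log-intensity changed beyond a threshold emit an event, which is
# what saves bandwidth and energy downstream.
import numpy as np

def frames_to_events(prev_frame, curr_frame, threshold=0.2):
    """Return (y, x, polarity) events for pixels whose log-intensity
    changed by more than `threshold` between two grayscale frames."""
    eps = 1e-3  # avoid log(0)
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(np.int8)  # +1 ON, -1 OFF
    return np.stack([ys, xs, polarity], axis=1)

# Static scenes produce almost no events, so compute follows activity:
rng = np.random.default_rng(0)
f0 = rng.random((240, 320))
f1 = f0.copy()
f1[100:110, 150:160] += 0.5   # a small moving object
events = frames_to_events(f0, np.clip(f1, 0, 1))
print(f"{len(events)} events instead of {f0.size} pixels")
```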

  • Project: CITCOM; Funder: European Commission; Project Code: 768883
    Overall Budget: 5,641,190 EUR; Funder Contribution: 4,763,040 EUR

    For the latest generation of micro-fabricated devices currently being developed, no suitable in-line production inspection equipment is available, simply because current inspection equipment assumes planar processing while many of these devices, e.g. medical ones, are highly 3D in nature. This lack of automated processing feedback makes it difficult to steer process development towards higher yields in micro-component and MEMS production. A further problem is the need to document and record process data, even at the individual device level, with the degree of traceability required, for example, for medical devices fabricated under ISO 13485. Both factors ultimately limit the possibility of reliable and cost-effective manufacturing of MEMS and micro-components. CITCOM has therefore been proposed to address these industrial needs of MEMS and micro-manufacturing by offering an in-line production inspection and measurement system for micro-components, developed and demonstrated at TRL7. The system will be based on optical and X-ray techniques combined with computed tomography and an advanced robotic system capable of analyzing defects that occur in the production of micro-components (e.g. stains, debris, fractures, abnormal displacements, chemical composition of surface coatings, surface traces), enabling 98% yield and 100% reliability. Ultimately, CITCOM will cut these costs by 60%, as it will offer a system with automated, knowledge- and inspection-data-based process feedback that allows the detection and traceability of faults that may occur in MEMS production, especially for critical applications such as aerospace, space, and healthcare. CITCOM will give Europe a technological and competitive advantage in the growing manufacturing and production industry. The consortium behind this action is strongly driven by industrial needs and problems, with Philips and Microsemi as end users and validators of the technology.
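
    The abstract names the defect classes but not the detection method. As a hedged illustration of the kind of automated inspection feedback described, the sketch below implements a common optical-inspection baseline, comparison against a defect-free "golden" reference image; the thresholds and function names are assumptions, not CITCOM's actual pipeline.

```python
# Illustrative sketch (not the CITCOM system): golden-reference
# comparison, a common baseline for automated optical inspection of
# defect types like stains, debris, and fractures. Thresholds and
# names are assumptions for illustration.
import numpy as np
from scipy import ndimage

def detect_defects(reference, inspected, diff_threshold=0.15, min_area=5):
    """Flag connected regions where the inspected image deviates from
    a defect-free golden reference by more than `diff_threshold`."""
    diff = np.abs(inspected.astype(float) - reference.astype(float))
    mask = diff > diff_threshold
    labels, n = ndimage.label(mask)  # group defect pixels into blobs
    defects = []
    for i in range(1, n + 1):
        area = int((labels == i).sum())
        if area >= min_area:         # ignore single-pixel sensor noise
            ys, xs = np.nonzero(labels == i)
            defects.append((ys.min(), xs.min(), ys.max(), xs.max(), area))
    return defects  # bounding boxes + areas, e.g. for traceability logs
```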

  • Project: 6G-XR; Funder: European Commission; Project Code: 101096838
    Overall Budget: 9,421,130 EUR; Funder Contribution: 9,024,620 EUR

    6G eXperimental research infRastructure to enable next-generation XR services (6G-XR) will develop, for the duration of the SNS programme, an evolvable experimental infrastructure for demonstrating the performance of key B5G/6G candidate technologies, components, and architectures, keeping the infrastructure valid now, in the mid-term, and in the long-term. It will demonstrate the technological feasibility of “better than 5G” KPIs, of innovative radio spectrum technologies, and of the use and sharing of spectrum applicable to beyond-5G and 6G, and it will validate a representative end-to-end beyond-5G (and later 6G) architecture, including end-to-end service provisioning with slicing capabilities and at the cloud implementation level (Open RAN). Furthermore, 6G-XR will validate multi-access edge computing scenarios and their integration into a complete cloud continuum, support innovative use cases with vertical actors building on beyond-5G capabilities, and support showcasing events. In addition, 6G-XR will demonstrate and validate the performance of innovative 6G applications, with a focus on demanding immersive applications such as holographics, digital twins, and XR/VR. 6G-XR will support impactful contributions to standards and demonstrate the technological feasibility of key societal requirements and objectives, such as energy reduction at both the platform and network levels.
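
    Validating “better than 5G” KPIs ultimately comes down to measuring quantities such as end-to-end latency against a budget. The sketch below is a generic illustration rather than 6G-XR tooling: it measures UDP round-trip times against an assumed echo server and checks a percentile against a latency budget; the host, port, and 5 ms budget are placeholders.

```python
# Illustrative sketch (not 6G-XR code): measuring a round-trip latency
# KPI against a budget. The echo server at host:port is assumed to
# exist; 192.0.2.1 is a documentation-only placeholder address.
import socket
import statistics
import time

def measure_rtt(host="192.0.2.1", port=9000, samples=100):
    """Return UDP echo round-trip times in milliseconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for i in range(samples):
        t0 = time.perf_counter()
        sock.sendto(i.to_bytes(4, "big"), (host, port))
        sock.recvfrom(64)  # wait for the echo
        rtts.append((time.perf_counter() - t0) * 1e3)
    return rtts

# A KPI check might compare the 99th percentile against the budget:
# rtts = measure_rtt()
# p99 = statistics.quantiles(rtts, n=100)[98]
# print("PASS" if p99 <= 5.0 else "FAIL", f"p99={p99:.2f} ms")
```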

  • Project: D4FLY; Funder: European Commission; Project Code: 833704
    Overall Budget: 6,984,730 EUR; Funder Contribution: 6,984,730 EUR

    The D4FLY project will augment the current capabilities and capacities of border authorities in countering emerging threats in document and identity verification (e.g., forged documents, impostor fraud, morphed faces) at manual and highly automated border control points and in the issuance process of genuine documents. The confluence of the D4FLY set of tools and systems will improve the quality of verification and remove major time sinks from these processes, enabling a true on-the-move border-crossing experience for travelers. Novel sensor hardware based on advanced lightfield cameras, together with novel algorithms developed in the project, will enhance verification accuracy and robustness through the combined use of 2D+thermal face, 3D face, iris, and somatotype biometrics. Analytical means to identify known criminals based on somatotype and 3D face data generated from mugshots and observation data will also be developed. The varied operational needs of end-users facing different threat landscapes form the backbone of the D4FLY development effort. D4FLY will create a resilient document verification system that can verify a multitude of physical and electronic security features (e.g. Kinegrams®, MLIs, CLIs), detect complex forms of electronic fraud and advanced morphing, and identify fraud in breeder documents. The potential benefit of blockchain technology for identity verification will also be investigated. The D4FLY solution will consist of a border control kiosk equipped with enhanced enrolment, verification, and detection capabilities; smartphone applications with improved performance and verification capabilities; and a non-stop on-the-move system for biometric verification. The innovation will be validated against European societal values, fundamental rights, privacy, data protection, and applicable legislation. Four different border control points and one document fraud expertise center will form the project’s testing and demonstration ground.
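
    The abstract describes combining 2D+thermal face, 3D face, iris, and somatotype biometrics. A standard way to combine such modalities is score-level fusion, sketched below; the weights, threshold, and scores are illustrative assumptions, not D4FLY's actual decision logic.

```python
# Illustrative sketch (not the D4FLY implementation): weighted
# score-level fusion across biometric modalities. All weights,
# thresholds, and scores here are made-up examples.
def fuse_scores(scores, weights, threshold=0.7):
    """Each score in [0, 1] is a per-modality match confidence;
    returns (accept, fused_score)."""
    assert scores.keys() == weights.keys()
    total = sum(weights.values())
    fused = sum(scores[m] * weights[m] for m in scores) / total
    return fused >= threshold, fused

# Somatotype is a soft biometric, so it gets a low weight here:
scores  = {"face_2d": 0.91, "face_3d": 0.88, "iris": 0.95, "somatotype": 0.60}
weights = {"face_2d": 0.3,  "face_3d": 0.3,  "iris": 0.3,  "somatotype": 0.1}
accept, fused = fuse_scores(scores, weights)
print(f"accept={accept}, fused={fused:.3f}")
```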

  • Project: PRESENCE; Funder: European Commission; Project Code: 101135025
    Overall Budget: 7,655,710 EUR; Funder Contribution: 7,655,710 EUR

    The concept of presence can be understood as a synthesis of interrelated psychophysical ingredients in which multiple perceptual dimensions intervene. A better understanding of how specific aspects, such as plausibility (the illusion that virtual events are really happening), co-presence (the illusion of being with others), or place illusion (the feeling of being there), impact XR experiences is key to improving their quality. Current technologies do not yet offer the availability and performance needed to reach high levels of presence in XR, which is essential to get us closer than ever to the perennial VR dream: to be anywhere, doing anything, together with others, from any place. PRESENCE will advance multiple dimensions of presence in physical-digital worlds, addressing three main challenges: i) how to create realistic visual interactions among remote humans, delivering high-end holoportation based on live volumetric capturing, compression, and optimization techniques under heterogeneous computation and network conditions; ii) how to provide realistic touch between remote users and synthetic objects, developing novel haptic systems and enabling spatial multi-device synchronisation in multi-user scenarios; iii) how to produce realistic social interactions among avatars and agents, generating AI virtual humans that represent actual users or AI agents. PRESENCE will ensure the future uptake of its research results following a threefold evaluation method: 1) each technology will be independently evaluated to understand its impact on the illusion of presence; 2) each component will be evaluated by the integration team, providing scientific and technical feedback to facilitate its use in each project iteration as well as beyond the project scope, towards technology transfer and exploitation; 3) all components will be integrated in two demonstrators (professional and social setups), following a human-centred design approach and ultimately evaluating the user experience.
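
    Challenge i) hinges on fitting live volumetric streams into whatever bandwidth the network offers. The sketch below is a deliberately crude stand-in for the compression and optimization techniques the project develops: it subsamples a point-cloud frame to a per-frame byte budget; the point format and budget arithmetic are assumptions for illustration.

```python
# Illustrative sketch (not PRESENCE code): adapting a volumetric
# (point-cloud) frame to a bandwidth budget by random subsampling.
import numpy as np

BYTES_PER_POINT = 15  # assumed: 3 x float32 position + 3 x uint8 color

def adapt_point_cloud(points, colors, budget_bytes, rng=None):
    """Subsample so the frame fits budget_bytes; keeps all points if
    the budget allows."""
    rng = rng if rng is not None else np.random.default_rng()
    max_points = budget_bytes // BYTES_PER_POINT
    if len(points) <= max_points:
        return points, colors
    keep = rng.choice(len(points), size=max_points, replace=False)
    return points[keep], colors[keep]

# e.g. a 30 fps stream over a 20 Mbit/s link leaves ~83 kB per frame:
budget = (20_000_000 // 8) // 30
pts = np.random.rand(200_000, 3).astype(np.float32)
cols = (np.random.rand(200_000, 3) * 255).astype(np.uint8)
p2, c2 = adapt_point_cloud(pts, cols, budget)
print(f"kept {len(p2)} of {len(pts)} points")
```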

