
Immersion (France)
13 Projects, page 1 of 3
Project (2010–2013)
Partners: Immersion (France), ENTENTE POUR LA FORÊT MÉDITERRANÉENNE, SEDU, CEN, Crisisplan, CRS4, CNR, Groupe Up (France)
Funder: European Commission | Project Code: 242341

Project (2022–2025), Open Access Mandate for Publications and Research data
Partners: EUROPEAN BROADCASTING UNION, AFP, Immersion (France), WLT, MT, F6S IE, BALTIC FILM AND CREATIVE TECH CLUSTER, LIT, RTV SLOVENIJA, CREATIVE SATOR E STUDIO LDA, SPARKNEWS, STORYPACT GMBH, VUB, IMC, CERTH, Trinity College Dublin, Ireland, VRAI, KHORA APS, TEILIFIS NA GAEILGE TG4, NWO-I, BEELD EN GELUID
Funder: European Commission | Project Code: 101070109 | Overall Budget: 10,087,300 EUR | Funder Contribution: 8,997,630 EUR

The future of media experiences is still to be written, and the maturity of eXtended Reality (XR) and Artificial Intelligence (AI) technologies provides a unique window of opportunity for the European Creative and Cultural Sector (CCS) to reimagine digital co-creation, interaction and engagement. TransMIXR, a consortium of eight representatives from the CCS and twelve leading industrial and academic partners, combines the necessary interdisciplinary skill sets and domain expertise to create a range of human-centric tools for remote content production and consumption via social virtual reality. The TransMIXR platform will provide (i) a distributed XR Creation Environment that supports remote collaboration practices, and (ii) an XR Media Experience Environment for the delivery and consumption of highly evocative and highly social immersive media experiences. Ground-breaking AI techniques for the understanding and processing of complex media content will enable the reuse of heterogeneous assets across immersive content delivery platforms.
Using the Living Labs methodology, TransMIXR will develop and evaluate pilots that bring the vision of future media experiences to life in CCS domains including (i) news media & broadcasting, (ii) performing arts, and (iii) cultural heritage. Additionally, the project will harness the creativity of the CCS and forge interdisciplinary collaborations to demonstrate how immersive social experiences could be transferred to new application areas beyond the CCS. The results of the project will contribute to the European Commission's Media and Audiovisual Action Plan, in particular by boosting the adoption of XR technologies, opening new business-model opportunities in new application areas and markets, and gaining worldwide leadership in XR technologies while remaining deeply grounded in European values such as veracity, diversity, connectedness and universality.
Project
Partners: CNR, CRS4, ESRI R&D Z, KUL, BLOM CGR SPA, Immersion (France), Groupe Up (France)
Funder: European Commission | Project Code: 231199
Project (2025–2027), Open Access Mandate for Publications and Research data
Partners: University of York, IMT, ECL, Immersion (France), CodiumAI, F6STECH, NBG, Unparallel Innovation (Portugal), University of L'Aquila, INTRASOFT International, ALES, LIST
Funder: European Commission | Project Code: 101189664 | Funder Contribution: 5,218,190 EUR

The reliable application of LLM-based agents to software engineering (SE) requires a tremendous increase in their accuracy and a minimisation of their bias. While LLMs continue to increase in size and performance, phenomena such as hallucination in a single agent appear essentially inevitable, since they are linked to the fundamental inference mechanism of generative models. On the other hand, evidence is accumulating that the required performance can be achieved through collaboration and debate among groups of agents. As among humans, the quality of work increases with the specialisation of workers on tasks, organised collaboration, and discussion among workers with different backgrounds. Unlike with humans, instantiating the required AI agents, and running the collaboration and discussion among them, is very fast and cheap, making this approach particularly convenient. MOSAICO proposes the theoretical and technical framework to implement this approach and to scale it to very large groups of collaborating agents, i.e. AI-agent communities. The developed solutions are composed into an integrated MOSAICO platform handling communication, orchestration, governance, quality assessment, benchmarking and reuse of AI agents. MOSAICO is integrated with existing development environments to present the results to software engineers and to allow expert users to intervene in the AI's decisions.
The performance and reliability of MOSAICO technologies and tools in achieving given software engineering tasks will be assessed within four use-case scenarios drawn from the immersive technologies, banking/finance, aerospace and Internet of Things sectors. The long-term adoption of MOSAICO results and technologies will be ensured by open-sourcing the code and fostering open collaboration to enhance user engagement in the MOSAICO community.
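The "collaboration and debate among groups of agents" idea above can be sketched in a few lines. This is an illustrative toy, not the actual MOSAICO design: the agents here are plain callables standing in for LLM-backed agents, and the round-based revision plus majority vote is one common aggregation rule assumed for the sake of the example.

```python
# Hypothetical sketch of multi-agent debate: each round, every agent sees
# the peers' previous answers and may revise its own; a majority vote
# picks the final answer. Names and the voting rule are assumptions.
from collections import Counter
from typing import Callable, List

Agent = Callable[[str, List[str]], str]

def debate(task: str, agents: List[Agent], rounds: int = 2) -> str:
    answers: List[str] = ["" for _ in agents]
    for _ in range(rounds):
        # Every agent answers the task, seeing last round's answers.
        answers = [agent(task, answers) for agent in agents]
    # Aggregate by majority vote over the final round.
    winner, _count = Counter(answers).most_common(1)[0]
    return winner
```

In a real system the callables would wrap LLM calls and the aggregation could be a learned judge rather than a vote, but the control flow is the same.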
Project (2019–2022), Open Access Mandate for Publications and Research data
Partners: UV, Smartex (Italy), Goa University, INRIA, TECNALIA, Immersion (France), TECNALIA SERBIA DOO BEOGRAD, AAU, Manus Technology Group B.V.
Funder: European Commission | Project Code: 856718 | Overall Budget: 3,799,950 EUR | Funder Contribution: 3,799,950 EUR

TACTILITY is a multidisciplinary innovation and research action with the overall aim of incorporating rich and meaningful tactile information into novel interaction systems through technology for closed-loop tactile interaction with virtual environments. By mimicking the characteristics of natural tactile feedback, it will substantially increase the quality of immersive VR experiences used locally or remotely (tele-manipulation). The approach is based on transcutaneous electro-tactile stimulation delivered through electrical pulses with a high-resolution spatio-temporal distribution. Achieving this requires significant development of technologies for transcutaneous stimulation, textile-based multi-pad electrodes and tactile-sensing electronic skin, coupled with ground-breaking research on the perception of elicited tactile sensations in VR. The key novelty is in the combination of: 1) ground-breaking research on the perception of electrotactile stimuli, to identify the stimulation parameters and methods that evoke natural-like tactile sensations; 2) advanced hardware that will integrate the novel high-resolution electrotactile stimulation system and state-of-the-art artificial electronic skin patches with smart textile technologies and VR control devices in a wearable mobile system; and 3) novel firmware that handles real-time encoding and transmission of tactile information from virtual objects in VR, as well as from distant tactile sensors (artificial skins) placed on robotic or human hands.
The proposed research and innovation action will result in a next generation of interactive systems with a higher-quality experience for both local and remote (e.g., tele-manipulation) applications. Ultimately, TACTILITY will enable a high-fidelity experience through low-cost, user-friendly, wearable and mobile technology.
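The "real-time encoding of tactile information" step can be illustrated with a minimal sketch: mapping a normalised virtual contact pressure to pulse parameters for one electrode pad. This is not the TACTILITY firmware; the parameter names and ranges are assumptions chosen only to show the shape of such an encoder, and a linear mapping is the simplest possible choice.

```python
# Illustrative sketch (assumed design, not TACTILITY's): encode a
# normalised contact pressure [0, 1] from the VR engine into pulse
# parameters for a single electrotactile pad. Ranges are hypothetical.
from dataclasses import dataclass

@dataclass
class Pulse:
    amplitude_ma: float   # pulse amplitude, milliamps
    frequency_hz: float   # pulse repetition rate, hertz
    width_us: float       # pulse width, microseconds

def encode_pressure(pressure: float,
                    amp_range=(0.5, 4.0),
                    freq_range=(10.0, 100.0),
                    width_us=300.0) -> Pulse:
    # Clamp to [0, 1] so out-of-range physics values stay in a safe band.
    p = min(max(pressure, 0.0), 1.0)
    # Linear mapping from pressure to stimulation intensity.
    amp = amp_range[0] + p * (amp_range[1] - amp_range[0])
    freq = freq_range[0] + p * (freq_range[1] - freq_range[0])
    return Pulse(amplitude_ma=amp, frequency_hz=freq, width_us=width_us)
```

A real encoder would additionally be calibrated per user and per skin site, since perceived intensity for the same current varies widely between individuals.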