
Information Commissioner's Office

Country: United Kingdom


3 Projects
  • Funder: UK Research and Innovation Project Code: EP/L021285/1
    Funder Contribution: 699,390 GBP

    Lifelogging and self-quantifying used to be niche areas for athletes, people with certain medical conditions, and those with the time, money, and motivation to use expensive specialist equipment to monitor themselves. Now the technology for ordinary people to track and analyse many aspects of their lives is becoming both affordable and invisible, requiring little effort or expense to collect data. Many business models are based on mining the so-called 'digital exhaust' of people's online activity to provide apparently free services. The fact that so many more people can now automatically log so many aspects of their lives (beyond which web pages they visited) is creating opportunities for new business models that actually provide services for the people generating the data. For example, some people may wish to sell their data for cash rather than give it away; some may wish to donate it to worthy scientific causes, such as health research; others may wish to share data only in a non-identifying aggregated form, or perhaps not at all. Lifelogging data ranges from the relatively benign (such as the number of keystrokes typed in a day) to the highly personal (such as emotional arousal state), and the ways in which the data is shared may be highly nuanced. This project seeks to understand how privacy and sharing requirements vary across different demographic groups and to build a sharing and privacy infrastructure specifically designed for lifelogging data.

  • Funder: UK Research and Innovation Project Code: EP/W005271/1
    Funder Contribution: 1,283,040 GBP

    Vision: In this fellowship, I aim to address a major challenge in the adoption of user-centred privacy-enhancing technologies: Can we leverage novel architectures to provide private, trusted, personalised, and dynamically configurable models on consumer devices to cater for heterogeneous environments and user requirements? Importantly, such properties must provide assurances of data integrity and model authenticity/trustworthiness, while respecting the privacy of the individuals taking part in training and improving such models. Innovation and adoption in this space require collaboration between device manufacturers, platform providers, network operators, regulators, and users. The objectives of this fellowship will take us far beyond the status quo of one-size-fits-all solutions, providing a framework for personalised, trustworthy, and confidential edge computing, with the ability to respect dynamic policies, in particular when dealing with sensitive models and data from consumer Internet of Things (IoT) devices. I aim to address these challenges by designing and evaluating an ecosystem where analytics from, and interaction with, consumer IoT devices can happen with trust in the model and its authenticity, while enabling auditing and personalisation, hence pushing today's boundaries on all-or-nothing privacy and enabling new economic models. This approach requires designing for capabilities beyond the current trusted memory and processing limitations of the devices, and a cooperative dialogue and ecosystem involving service providers, ISPs, regulators, device manufacturers, and end users.
By designing our framework around the latest architectural and security features in edge devices, before they become commercially available, we provision for Model Privacy and a User-Centred IoT ecosystem, where service providers can trust the authenticity, attestability, and trustworthiness of the valuable models running on user devices, without users having to reveal sensitive personal information to cloud-based centralised systems. This approach will enable advanced and sensitive edge-based analytics to be performed without jeopardising individuals' privacy. Importantly, we aim to integrate mechanisms for data authenticity and attestation into our proposed framework, to enable trust in models and the data used by them. Such privacy-preserving technologies have the capacity to enable new forms of sensitive analytics without sharing raw data, thereby providing legal balancing capabilities that might enable certain sensitive (or currently unlawful) data analysis.

  • Funder: UK Research and Innovation Project Code: EP/K039989/1
    Funder Contribution: 662,804 GBP

    Despite constantly being asked to "agree" to terms of service, we do not currently have "meaningful consent." It is unclear whether simple and meaningful consent mechanisms would change business fundamentally or enable new kinds of economics around personal data sharing. Consent is deemed necessary and is part of a social contract for fairness; without meaningful consent, that social contract is effectively broken and the best intent of our laws undermined. The research challenges in addressing this gap are interdisciplinary: meaningful consent has implications for transforming current digital economy data practices; change will require potentially new business models, and certainly new forms of interaction that highlight policy without overburdening citizens as we go about our business. We have set out a vision to achieve an understanding of meaningful consent through a combination of interdisciplinary expert and citizen activities, delivering useful policy, business, and technology guidelines.

