Personal Data and Trust Network Inaugural Event, 11 March 2015 - Record

www.pdtn.org | #PDT | Inaugural Event, Digital Catapult Centre, 101 Euston Road, London, 11 March 2015




Executive summary Slide 3

What happened Slides 4-13

What was discussed / produced Slides 14-29

Annex: presentations and notes Slides 30-122


Executive summary

Around 100 experts in the field of personal data, privacy and trust met at the Digital Catapult Centre to

• Learn about the purpose and aims of the network

• Hear perspectives from industry on the key issues

• Hear examples of the world-leading research in this topic that has been conducted in the UK over the last few years

• Identify the most important priorities for the network …

• … and make a start on determining how the network would address them

Priorities identified were:

• How can we share best practice?

• What are the implications of digital social innovation?

• How should we deal with (EU) regulation?

• How can consent in pervasive environments best be managed?

• Who owns the rights to use personal data?

Groups worked to begin to answer these questions: their work is presented in this document

There was great energy in the room from participants representing many different types of organisation, and a clear commitment to work together

3


What happened Slides 4-13

Read the slide headings to get an overview in 30 seconds


Andy and Matt welcomed delegates

Andy Green briefly explained the role of the Catapult and the centres around the UK (Sunderland, Bradford and Brighton), and stressed the significance of the personal data and trust issue.

He said there was an opportunity for network members to gain insight from each other and contribute to greater understanding.

Matt Stroud explained the genesis of the network, and its role in helping to unlock value for multiple parties from personal data.

He explained how the network drew together members with multiple perspectives on the issues.

5

“What we need is for the innovators in the room to come together and contribute new ways to think about the issues”

“There are practical issues here – but also ethical issues”

See slides 31-37 for more detail


Jon introduced Alex Craven, who presented professional and personal perspectives on personal data and trust

Jon Kingsbury ran through the agenda, and pointed out that the room contained representatives of a world-class research base, businesses (innovative, data-focused SMEs and larger companies for whom the issue was becoming more important, and professional services companies working in this area), policy makers, public-sector organisations and trade associations in multiple sectors.

Alex Craven spoke about how ad agencies increasingly make use of personal data in their work, and about the potential for measurable public good in its use.

He spoke about how he believed that there could be a way to give individuals proper control over the use of their data. His idea is called “Our Data Mutual”.

6

“Privacy is dead … we must move on”

See slides 38-53 for more detail


Rav spoke about his work with banks, and the approach taken by corporations to data and trust …

Rav spoke about “conduct risk” and compliance with regulations – including the impact this is having on systems and processes.

He said the issues faced were similar in other big companies with huge volumes of data, and said they are pretty good at managing data.

He said that big financial services companies and their consultants don’t have all the bright ideas, and that he hoped the network would help build consensus and also a critical mass of thinking.

7

“I’m not sure how the market is turning – it’s like a kids’ football match … we are all chasing the ball wherever it goes”

See slides 54-55 for more detail


… and Jon led a discussion of the main issues arising from these presentations

Questions and discussion points ranged across many topics:

• Personalised online advertising and mashups of multiple data sets

• Trust frameworks

• Making sense of ‘big data’

• Examples of the use of personal data for good

• Language used within the network

• Privacy and EU regulations

• Who owns ‘personal’ data?

• The special case of health data in specific contexts

8

See slides 15-21 for more detail


After a networking lunch, delegates heard about funding opportunities from Innovate UK competitions …

Jon explained that much public funding was based around collaboration; he promoted the KTN’s Digital Business Briefing, and three colleagues spoke briefly about their competition-based funding programmes:

• Jonny Voon – Protecting data in industry looking at digital disruption (cyber attacks) – opening 31 March (GBP4 million)

• Tom Fiddian – Enhancing user experience using personal data opens 16 March; a feasibility design study (GBP2M)

• Agata Samojlowicz – Enhancing user experience in retail (up to GBP4M two-stage collaborative R&D) opens 16 March

9


… and heard three presentations on personal data research

Jerome Ma explained the purpose of the Research Councils’ Digital Economy theme, and introduced three speakers from the digital economy research hubs:

• Derek McAuley (Horizon Digital Economy Research Institute)

• Paul Watson (SiDE Hub)

• Pete Edwards (dot.rural Digital Economy Hub)

They gave thought-provoking details of some of the personal data and trust issues of their research work.

10

See slides 56-113 for more detail

“I can tell if you didn’t have a shower this morning”

“Sometimes giving people more control over the use of their data might increase their personal risk”

“Rural bus service planning might result in individuals being traced”


Groups then thought about how they might work together on specific priority issues …

The initial five priority areas identified were:

1. Sharing best practice

2. Digital social innovation

3. Dealing with (EU) regulation

4. Consent in pervasive environments

5. Who owns the rights to use the data

Groups noted down key points from their discussions.

11

See slides 22-29 for more detail


… and briefly fed back their discussions to all delegates

Sharing best practice

• Objective models of risk; voluntary certification

Digital social innovation

• There is a “pyramid of trust”

Dealing with (EU) regulation

• Let’s make use of it – there are good bits

Consent in pervasive environments

• It’s not informed, and it’s not consent

Who owns the rights to use the data

• It’s complicated!

12

See slides 22-29 for more detail


Matt explained possible next steps for the network and thanked participants for their work

Jon said he felt the day had generated some very interesting debate. He said there were two ways to take this conversation forward:

• The website

• Sharing personal contact details (delegates would be emailed to ask if they were happy to share their details)

Jon said he had realised there was a very large cohort of people that really understand the issues and the technicalities of the issues.

Matt closed by saying where the network might go from here. He said there were several things that might happen:

• Future regular meetings – quarterly? And thematic meetings – around the country; these may use the National Virtual Incubator (teleconferencing facility); he asked if anyone might like to host a meeting

• Digital presence (website, a quarterly digital journal)

• Community interest groups – vertical or horizontal, e.g., a privacy working group (or security or psychology) – who might want to lead or get involved? – and a PIMS provider forum

• Grow the membership base – writing papers and working with the media, as well as spreading the word within relevant organisations

• Other? Suggestions please

13

See slides 114-122 for more detail


What was discussed / produced Slides 14-29


Notes from the morning plenary Q&A session [1]

Q: Advertising and big data – where is the cutting edge of personal data use?

A [Alex]: It’s personalised advertising online (e.g., mashing online activity data with datasets from Experian, for instance). This sort of thing is diverting advertising revenue from traditional channels.

Q: What about IoT? We are all becoming generators of our own data – is this a service provider data gold rush? And what are the killer apps?

A: It should be a gold rush – but the trust framework needs to be in place first.

Q: Fintech Innovation Lab – how many innovations are coming here that are geared around mining personal data?

A [Rav]: I haven’t seen many … but … personal data has always been there – what’s changed is the way the data can be captured and quantified. But there’s too much, it’s fractured and siloed. This creates opportunities for arbitrage between data silos, for the consumer and for organisations. The big game-changer is when companies work out how to use this data in different ways to create and sell products. One of the biggest issues we have is the capture of social media data; social media is highly qualitative and you need to interpret it to use it. We need to have a discussion about trust that looks at this. The next killer apps will be those that make sense of qualitative data (literally “making sense of it”, and making use of the data – e.g., predictive analytics, statistics).

15


Notes from the morning plenary Q&A session [2]

Q: We run a digital service for academics; Alex’s examples are a good example of why consent doesn’t work (installing cookies … mortgage applications – of course stop sending me information on mortgages when I already have one!) – so there are some things that should be “never do”; some that are “of course do it”; and some that are in the middle. I have no problem with finding this in proposed or current European legislation – so what we need to do is identify what things fit into which category – I like the idea of a “mutual” doing this.

Response: Do you think there are organisations or services that get it right?

A: Learning analytics (e.g., can a university improve its education to students by identifying students it could help in a different way?), or providing federated access management (a service provider doesn’t need to know all the details of a student to grant access to specific applications in specific ways).

Q: I campaign in this area. An appeal: be precise about language. The EU legislation is data protection, not privacy, legislation. I am concerned about the way we are talking about privacy – it is a fundamental human right and needs to be respected. In trying to frame how people can come to trust institutions and companies, a balance of consensual, safe and transparent things must be arrived at – this varies by context. But all three must be addressed – for control, and for understanding what that control actually is. Many of us here are tackling very difficult but not intractable problems. To find mutual benefit we must recognise that this is inherently wrapped up with “privacy” – though definitions of privacy are subjective and can’t be easily predicted.

16


Notes from the morning plenary Q&A session [3]

Q: You say EU frameworks won’t work – what does this mean? I think it will (EU Directive 95 etc.).

A [Alex]: Doing this top down is fundamentally wrong – we need data protection, and you can’t have trust if it’s mandated from the top down; if it’s my data, then I want to say how it’s used. The EU should not say how it can be used. The principle is wrong and there should be no one-size-fits-all European decision. But there is no alternative being put up against the EU way.

Comment: There is a publication “The Lord of The Things” – when data is “ours”, are we just “stakeholders” in our data?

Q [“Patients Like Me”]: We should think about things at a community level – you can see some really interesting things on sharing personal data in the health sector. Patients Like Me is one of many similar groups / communities online doing the “quantified self” – sleep patterns, blood pressure etc. … increasingly YouTube channels are created as well, like “Embarrassing Bodies”! Very useful and informative.

A: Lowering the cost and inertia of signing up to things like this is important.

17


Notes from the morning plenary Q&A session [4]

Q: Thinking about the Tesco diabetes thing – what is the appetite from banks etc. for sharing of data for social good?

A [Rav]: They know about life events. Typically, if you are going to divorce, the party you are divorcing changes spending patterns a year before it happens. Financial services organisations, if they choose to, can know more about you than Tesco, because they can match more types of data – and they can link family bank accounts etc.; Tesco doesn’t do this. Banks also have many years of data (they have to keep it for legal reasons). They can profile customers to a frightening degree. Most financial services organisations choose not to do this because their customers don’t want them to. There are strict guidelines about this … but they could do an awful lot more than they do. For the social good of mining this data – leveraging a small amount of my data – I don’t think it would be a problem, but it must be driven by the customer; banks can’t do it themselves.

Q: I don’t believe there’s no need for regulation – it’s essential. In Europe we have two fundamental rights – privacy and data protection – you can’t get away from this. But we are thinking only of personal data here. I live in a multi-connected world: there are types of data all over the place that might impact on my privacy – it’s not enough to think about personal data alone. So how do we deal with privacy in a context-based way? We need to find a way to help people protect their privacy. There has been work on “meaningful consent” at Southampton University and elsewhere – how do we get the work out of the lab into people’s hands?

18


Notes from the morning plenary Q&A session [5]

Point: I have an app that can track my emotional state using my phone. I can’t see a way beyond individuals being responsible for their own data and privacy – I don’t trust any privacy network – people click to give consent, but they don’t know where the data goes – they must know this.

Point [Patient Opinion]: The language at today’s meeting has been all consumerist. We are treating trust as a black box, but it varies a lot (there’s a difference between what trust means as a patient and as a consumer) – having a Mercedes is different to having a heart attack. When you want something for yourself, that’s one thing; wanting something for the public good is different – can the network address this aspect, and keep the distinction clear? There’s a danger of skewing everything to the consumer angle.

A: Yes – it’s your network – we can do this if we want to – there is a whole range of interests represented in the room, including medical.

Point: I manage Warwick University’s Hub of All Things – we address some of the issues that have emerged: our project recognises the need for people to own their own personal data and manage its availability in different contexts, where the value can be understood (e.g., retailers, healthcare, wholesale etc. – to get different types of value). We are creating a tool – we recognise the opportunities and challenges, and we are looking to have 1000s of people collecting their data into a repository.

19


Notes from the morning plenary Q&A session [6]

Q [Loughborough University, speaking as a consumer]: Do we need to be careful in assuming that trust is always good? Alex’s point about mortgages hit home – anything that improves the process is good. I had to go online, search products, do some sums, sit with an advisor, and I came out with a different, better product. There are benefits of engaging fully like this – if we trust automated solutions, we might cut out the benefits of traditional personal interactions. Do we need a series of nuanced approaches to trust / scepticism?

A [Alex]: What you describe is horrible – I don’t have time to do what you did. I want to do it quicker online. In my work I want to do it my way; you can do it your way. There is an opportunity to do something in between too – and turn your trust up or down.

A [Rav]: We are on a journey here – it’s not going to change overnight – and it’s a generational thing.

Q: Let’s get the data owner back in the picture. There are billions of data creators facing a handful of big brands. Normally when you own something, you can sell it for money. How can the individual get a share of the value of their data?

A: Yes – the Data Mutual can only be funded that way – like a Tesco Clubcard.

20


Notes from the morning plenary Q&A session [7]

A [Rav]: But what is the currency? Not everything is monetisable. Data is everything to do with you: it’s not just you, it’s the wider context – so where does the value get created? It’s not just because of your data, it’s because of the context of that data (that you don’t own).

Point [Governor Technology (Richard Beaumont)]: We have learned how nuanced the decisions are that websites make when using personal data. Consent and control play a large part in trust (as does accountability) – it all needs to be lined up to give strong trust and a strong economy, especially if you want it to be fair.

A [Rav]: One of the network’s fundamental challenges is data literacy: generations are coming through that aren’t aware of what data, privacy and trust are. So let’s go to the grass roots – e.g., cookie caches – there is a whole subset of society that is completely data-unaware. If we can address this, that would be a big step forward, I suggest.

21


Group 1: Sharing best practice

• Best practice must be user-centric – user control; instead of common standards, it can involve certification and verification

• Best practice looks at objective models of risk; risk is very hard to quantify for individuals, and for people doing risk assessment, but there are commonalities across organisations

• Common risk models could be identified, with common mitigations – this can be a good way of sharing best practice

• Along with this we propose voluntary certification

22


Group 2: Digital social innovation [1]

• What are the principles you need to operate by to generate trust? Here’s our trust pyramid: we are trying to get to being trustworthy (not trusted)

• The building blocks are user empowerment in the process: transparency, and accountability or power to remove data – this is a remedy

• There are operational principles that companies must adhere to: ‘security by design’, ‘privacy by default’, and other things: open business model (be clear about how money is going to be made) and data minimisation (important in the big data era) …

23


Group 2: Digital social innovation [2]

• … we know we can’t keep data secure, so we must work to minimise the data that we keep; and be clear that there is no covert tracking or profiling going on

• The most interesting discussions we had on our trust pyramid were those to do with “remedy” – is removal of data really empowering? And ultimately how ‘validatable’ is all this?

24


Group 3: Dealing with (EU) regulation

• The bad stuff in the regulations will hit us anyway, so how do we make the best of the good stuff?

• Two things are “privacy by design” and “privacy impact assessments” – they could be positive tools to encourage people to trust us. We could present these in citizen-friendly ways

• Also, if you think that consent doesn’t fit your application, there are five other lawful bases allowed

25


Group 4: Consent in pervasive environments [1]

• Informed consent problems: it’s not informed; and it’s not consent

• The consumer doesn’t know what’s going on or understand the risks, costs or benefits of giving consent

• What we should do is “surprise minimisation” – nothing that happens should surprise the consumer

• You can’t consent if you don’t understand, so you must “empower” users. It’s a dynamic process. People are willing to be fluid in data exchange if feedback exists – something needs to support this dynamic process, such as trust agents

• The main thing is to enable a “supported user” – with visualisations

26


Group 4: Consent in pervasive environments [2]

27


Group 5: Who owns the rights to use the data [1]

• You own the rights to your data (enshrined by Magna Carta). The individual is a creator of data, so the individual should own it

• But data must be interpreted in some cases – e.g., by a doctor. Sometimes you might not trust your GP, and you want access to your own data not mediated by the GP

• Data is linked to community groups – data is collected within a context. Using the data is not like consuming it; you need to protect the access rights – and enforce this

• You must work out how enforcement can be managed – it’s complex. Content protection? Authorities need rules, and the issue extends to secondary and tertiary use of data – it’s easy to lose control. How can you constrain the inheritance of the data and access rights across multiple users? This could be controlled and enforced through technology and enshrined in law

28


Group 5: Who owns the rights to use the data [2]

• You might want an “authorised witness” – a notary – certifying the data as yours. In the medical domain it’s often a committee that certifies who can do what.

• There are differences of opinion about this, though – the goals of research are evolving

• There is some more complexity – co-ownership of some data. For example, consider a delivery driver who might have stayed for hours at a pub. His car belongs to a fleet. It might be a Ford (Ford might have rights). The payload owner has rights / interests in what’s happening to the car too

• We talked about taxonomy and ontology and instantiation (because we are computer scientists)

29


Annex: Notes from presentations, slides presented, and plenary discussions Slides 30-122


Notes from Andy Green’s opening talk

Andy Green stressed the significance of the personal data and trust issue – along with security, he said these were the two most important issues facing the development of the digital economy.

He said there was a spectrum of opinion on the issue of personal data and privacy – but the consensus is that only one or two bad events would change the balance.

He said the biggest brands understand the significance of dealing properly with private data – they are extraordinarily careful with this data – it’s not a legal issue, it’s a consumer moral boundary issue. It’s a complex issue, and there has been work on codes of practice – but what we need is for the innovators in the room to come together and contribute new ideas and new ways to think about the issue. The other side of the issue is about value – I get great value from people knowing about me; but I can see the importance of protection too.

Andy added that for the long term, the issue will lead to a big evolution of the Internet – we will have to rethink rights management. It’s not easy – but we need to think about it … and work out some policies for it all.

He concluded by saying that this network is a collaborative venture – and the area is important to lots of us here. He hoped that people would find others to talk to, find customers and so on, and gain insight from each other by talking about, and understanding, the issues.

31


Notes from Matt Stroud’s talk

Matt explained why the network had been set up, by drawing a parallel with the development of the railways in Victorian times – the real value generated then arose from their enablement of other services; the Internet was very similar – and the data carried by the Internet had already generated huge economic benefits.

Private data was harder to unlock value from, though, because of the complexity around the different aspects of that data – personal trust and legal and political, for instance.

It was important for people with an interest in this area to get together and work out how we could look at the central challenges around trust in the use of personal data. There are practical issues here – but also ethical issues (the data affects people).

The network created today draws together the academic community, SMEs and corporate enterprises: all have different, valuable perspectives. The network is a physical and virtual environment for these groups to work together.

Some issues are open innovation and collaboration (useful for corporates); SMEs can meet potential customers and research the thought leadership from universities; academia can learn what the commercial world’s challenges are and find opportunities to commercialise their work.

He encouraged people to register as a member of the network.

32


33


Purpose of the network
Dr Matt Stroud, Head of Personal Data & Trust, Digital Catapult


Personal Data and Trust Innovators Network: The rationale

• The next growth of the Internet is likely to rely on the successful generation and management of personal data. High levels of trust and confidence in these data are a pre-requisite for successful new services, which have both huge economic and social potential.

• The Personal Data and Trust Network, building on world-class research and business insight, will help organisations to develop the next generation of personal data management, giving the UK clear advantages for consumers and citizens.


Personal Data and Trust Network: The Community

SMEs, Universities, Corporates


Benefits of membership …

Corporate benefits:
• Access to potential innovation partners
• Supports corporate open innovation programmes
• Gain insight into evolving market trends, capabilities and opportunities
• Visibility of academic research & innovation

SME benefits:
• Opportunity to meet and work with potential customers
• Opportunities to meet other innovation partners to further differentiate your product
• Visibility of, and contribution to, cutting-edge thinking
• Identification of commercially important problems

Academic benefits:
• Problem definition of commercially important problems
• Build research roadmap
• Partners to commercialise capabilities

36


Personal Data and Trust Innovators Network: Registering

Membership of the network is free. So that we can best organise events, could you register at:

http://www.PDTN.org or
http://www.digitalcatapultcentre.org.uk (in the “Get involved” section)


Notes from Alex Craven’s presentation [1]

Alex runs an advertising agency and has trust issues – he spoke about his professional and personal issues.

His agency uses individuals’ Twitter and other data, from advertisers and elsewhere, as part of its work for ITV – there was a huge amount of data. Twitter was a very large source of useful data – e.g., about the conversation that happens when the X Factor is broadcast. There is real value, and his clients and he make money from it.

Alex is a member of the Open Data Institute, and is thinking about how data can be used for good; for instance, analysis of Tesco Clubcard data can identify potential diabetes sufferers two years before they present themselves – saving treatment costs worth a huge amount of health service money; McKinsey reckons big data can save an enormous amount of healthcare money.

But as a consumer he is less interested in the technology, and more interested in the trust issue.

He sees lots of technology innovation, but where is the trust? He is trying to launch “Our Data Mutual” – and he presented briefly about it. He made the point that banks protecting you from identity theft is for their own benefit, not yours.

Ipsos MORI reported last year on public attitudes to trust – Bloom has won a contract to do some research in this area; he picked out some highlights from the Ipsos MORI report – including where trust was low (large companies) and that people could not see the benefit of the use of much personal data by companies (or the state, or academia …).

38


Notes from Alex Craven’s presentation [2]

So how do we get trust into private organisations?

There are contradictions here: he made the point about people’s stated attitudes to trust and privacy, and their practice (e.g., with how they use social media).

Alex said he felt we are at a crossroads – and this forum today is critical and must arrive at a consensus. Prohibition will not work. We can’t go back. We can’t do nothing (it’s the wild west, and the only people benefiting are a few corporates) – society can benefit, and we can do better.

Privacy is dead – it never existed. We must move on, and we must do something. We need a trust framework that enables a market to exist while giving individuals some control over the use of their data.

39

WHERE IS THE TRUST? @alexcraven @bloomagency

Tesco Clubcard can identify type 2 diabetes 2 years before you present yourself to your doctor with symptoms – the rise of diabetes could bankrupt the NHS within a generation

50% of global advertising spend is wasted – $250bn

Data can save 8% on US health care – $300bn (McKinsey)

Our data opportunity

Where is the trust?

46

Version 1 | Public © Ipsos MORI

Public attitudes to the use and sharing of their data – research for the Royal Statistical Society by Ipsos MORI, July 2014

Royal Statistical Society / Ipsos MORI report finds ‘data trust deficit’
http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

‘In particular, there may be big benefits to be had from data sharing within government, but to get a public mandate policymakers must be clear about the benefits and show how they will safeguard individual privacy’

The Problem

48


A hierarchy of support for data sharing? (by whom / for whom)

• University researchers: government-funded – 50%; company-funded – 45%

• Charities: who provide public services – 42%; for themselves – 36%

• Companies: who provide public services – 36%; for themselves: mentioning controls – 27%, no mention of controls – 26%

But at this level, do controls make no difference?

We contradict ourselves

If we are asked whether we are worried about privacy, we are – yet we repeatedly ignore this in our everyday lives.

We want to receive the benefit of our data, but we can’t really be bothered to do anything about it.

We assume we are being protected from abuse of our data, but actually we are not.

[Quadrant diagram: PRIVACY / MUTUAL BENEFIT vs GO BACK / WILD WEST]

There is no going back – privacy is dead, in fact it never existed; move on.

EU regulation won’t work.

The wild west abuses the citizen and will eventually ‘eat itself whole’, fuelling the privacy hawks.

There is a huge opportunity, but we won’t realise it until we can establish trust.

There is clear evidence of a requirement for new institutions to establish a trusted environment.

Whose data is it anyway?

Where is the trust?

Any questions? Please contact:
Alex Craven
[email protected]
@alexcraven
@bloomagency

www.pdtn.org

#PDT

Notes from Rav Bain’s talk [1]

Rav advises banks on "conduct risk" – helping them conduct themselves better and reduce operational risks. Banks must comply
with lots of new regulation – an eye-watering amount. They spend billions on compliance, and there is a lot of data involved. Senior

executives will be held personally responsible for breaches of trust – so they must rely on their companies’ strong policies and

frameworks and systems; they have to trust their colleagues to do the right thing – and measure those colleagues!

There must be microscopic tracking and surveillance of trading-floor activity and branch activity. Rav mentioned the MiFID and MiFID 2
regulations (governing certain types of financial products) and having transparency of a trade before and after it happens. He said
that to achieve this you need to rearchitect systems. It is a multi-dimensional problem for banks

Organisations like the Personal Data and Trust Network will create a body of work that can be used by corporates – and not just

banks; telcos and other corporates that manage personal data or large amounts of data all have similar issues

Rav said that if you are made to do something about this, you need to work out what you are going to do. We all agree that there
should be an appropriate level of regulation (and compliance with it)

He gave an example: think about data entering the banking system with a transaction (e.g., applying for a loan). Do you want this
data held forever? So how do you decide how to get rid of data at the right time? He said he was not sure how the market is turning;
it feels like kids' football, with everyone chasing the ball wherever it goes

54


Notes from Rav Bain’s talk [2]

In defence of big corporates, Rav said they are pretty good at managing your data (though there are exceptions). A typical UK high

street bank has 8-15m customers – and they look after a lot of data. But we need to crack down on the exceptions. And we should

think in fresh ways about the issues

Big banks and other financial services companies and consultancies don’t have all the bright ideas – there are many ideas “out

there”. A network like this will not just build consensus but will build a critical mass of thinking. If we find Europe is more heavily

regulated on privacy, and we are quicker and more agile, then can we make this work for the UK?

55

www.rcuk.ac.uk/digitaleconomy

Personal data and trust research

Personal Data and Trust Network Inaugural Event

Digital Catapult, London, Wednesday 11 March 2015

Jerome Ma

RCUK Digital Economy Theme

[email protected]

www.rcuk.ac.uk/digitaleconomy

@RCUK_DE #PDT


RCUK Digital Economy Theme

• >£150M since 2008; 400 user partners

• Co-creation approach (users, society, business and/or

government)

• Interdisciplinarity is key

Rapidly realise the transformational impact

of digital technologies on the UK


DE Theme priorities: 2015 onwards

Trust, identity, privacy & security

What are the urgent challenges/questions?

Research roadmap


University Expertise

Aberdeen: computational models of norms & trust

Birmingham: applied criminology

Buckinghamshire: information security management; cybercrime; compliance

Cambridge: systems for privacy preservation

Cardiff: politics and social issues of the creative & digital economies

City: cloud security and trust; identity management

Edinburgh: design informatics; big data ethics

Imperial: defence and security; language-based computer security and data analytics

KCL: computational models of trust & reputation

Lancaster: privacy; identity management; access control models; reputation

Leeds: digital identity

Leicester: online trust, privacy, security & surveillance issues; cyber-ethics

Loughborough: empathy and trust in online communication

Newcastle: user experience; cybersecurity; defence; critical infrastructure protection

Nottingham: personal data in media, services & products

Northumbria: identity, trust & security in new social media

Oxford: big data ethics; cyber security centre

Queen's Belfast: trust; e-commerce; online buying behaviour

Southampton: AI; autonomous systems; meaningful consent

Strathclyde: internet law

UCL: mobile systems; cybersecurity

Warwick: novel service business models; attitudes towards data security, trust & privacy

Wolverhampton: online behaviour

UK PD&T research: expertise & location

[Map of the UK marking the locations of the universities listed above]


£12M each (£2M partnership funding)

dot.rural, Aberdeen

SiDE (Social Inclusion through the Digital Economy), Newcastle & Dundee

Horizon, Nottingham

3 Digital Economy Research Hubs


Notes from Derek McAuley’s presentation [1]

Mac spoke about the work of Horizon, and the lifelong contextual footprint of a person – the digital traces created explicitly and implicitly. He said there were opportunities and challenges in personal data and Horizon was exploring these through lots of projects (of 9-12 months each). Topics are refined, but work is done in short bursts. He gave the example of “smart meters” – the data collected and the privacy implications of knowing, per second, people’s energy usage and the implied knowledge that is created from this (e.g., “I know you didn’t have a shower this morning …”)

Condition monitoring of domestic appliances would be good (and enabled by smart meters), but the privacy issue raises its head. What will actually get deployed in the UK? Only one reading a day (or opt out to a minimum frequency of once a month, or opt in to once every half hour). But has the privacy issue been solved? No. And we've lost the ability to do some great applications.

In my view this is simple to resolve: it's small data and it should be processed in isolation. The reason we have lost this great opportunity is the privacy concern and the lack of value recognition. We need distributed computing to do this properly. Distribute the code and the computation instead of doing "big data". There is academic work that proves you can do distributed processing to extract value without sending personal data to a single point
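As a minimal sketch of this edge approach (all names and numbers are illustrative, not from the Horizon work), each home reduces its own per-second trace to coarse statistics locally, and only those summaries are sent for aggregation:

```python
from statistics import mean

def local_summary(readings_kw):
    """Runs on the home device: reduce per-second meter readings to
    coarse statistics, so the raw trace never leaves the house."""
    return {"mean_kw": mean(readings_kw), "peak_kw": max(readings_kw)}

def aggregate(summaries):
    """Runs centrally: combine per-home summaries without ever
    seeing an individual household's raw readings."""
    return {
        "homes": len(summaries),
        "avg_mean_kw": mean(s["mean_kw"] for s in summaries),
        "max_peak_kw": max(s["peak_kw"] for s in summaries),
    }

# Two homes share two numbers each instead of their full traces.
home_a = local_summary([0.2, 0.3, 2.1, 0.2])
home_b = local_summary([0.5, 0.4, 0.6, 3.0])
print(aggregate([home_a, home_b]))
```

The privacy gain is that an inference like "you didn't have a shower this morning" needs the fine-grained trace, which under this scheme is only ever processed inside the home.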

We also did work to evaluate the simplicity or otherwise of terms and conditions of usage, to examine what counts as “consent”. We published work in 2013; bettingexpert.com analysed all the betting sites’ T&Cs and found you have to be 23 years old to understand the English, let alone the legal position. This work was widely cited
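Reading-age findings of this kind typically come from readability formulas; the sketch below uses the Flesch-Kincaid grade level with a crude vowel-group syllable counter (an assumption for illustration, not the study's actual method):

```python
import re

def count_syllables(word):
    """Crude estimate: one syllable per group of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)

plain = "We hold your data. You can ask us to delete it."
legalese = ("Notwithstanding the foregoing, the licensee hereby irrevocably "
            "acknowledges that the aforementioned obligations survive termination.")
print(fk_grade(plain), fk_grade(legalese))  # the legalese scores a far higher grade
```

A grade level maps roughly onto school years, so a grade in the high teens corresponds to a reading age in the twenties.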

61


Notes from Derek McAuley’s presentation [2]

Should we come up with T&C templates? Possibly. Government responded positively with the British Standards Institute.

Let’s see where that goes from here.

Final example – we have to teach people how to design these privacy and trust conditions into the products and services

they are designing. Ideation Cards (not a new idea) can be used here. In most design processes, data protection and

privacy isn’t high on the list of priorities – it is usually retrofitted at the end of the process. So we use ideation cards – a

location-based truancy tracker was an example design we thought about. There are lots of murky issues around that

service concept. Using the cards really helps people to think about them

What is "legitimate interest"? Is it good enough that "it's legal"? Possibly not. There's a Catch-22 about having to track
people to check that they don't want to be tracked, for instance

62

TL;DR

Derek McAuley

11th March 2015

• What is/has Horizon been doing

• Three examples

– Technology

– Policy

– Design

• Topic du jour

The talk

64

The lifelong contextual footprint

• The footprint – the digital traces we create explicitly and implicitly as we go about our everyday lives at home, at work and at leisure.

• The contextual – these digital traces enable personal technologies to infer our activities and provide appropriate services.

• The lifelong – an inclusive digital society must consider how these digital footprints are handled throughout our lives, literally from cradle to grave.

The opportunity

65

Broad range of applications, core topic:

“Lifelong Contextual Footprint”

Eye test - the current projects

66

Rollout across UK by 2020

..but…

Readings once a day, or opt in to 30 mins or opt out to 1 month.

Privacy and smart meters

The way of small data: analyze at edge and then aggregate

• Privacy & performance & scalability

Data on the edge

68

[Diagram contrasting the two flows: mainstream thinking aggregates into the cloud and then analyzes; the edge approach analyzes at the edge and then aggregates]

Solve for voltages…

Examples…

69

DAR [1]

[1] “Teletraffic Science”, ITC12. Elsevier, 1989.

Simplicity

70

Policy impact

71

May 2013

Luger, E., Moran, S., and Rodden, T. (2013). Consent for all: Revealing the hidden complexity of terms and conditions. In Proceedings of ACM CHI 2013: Human Factors in Computing Systems (pp. 2687-2696). ACM.

Policy impact

72

May 2013

www.bettingexpert.com uses it

to rank T&Cs for betting sites!

Policy impact

73

March 2014

Written evidence to Social Media

Inquiry by Select Committee

Policy impact

74

June 2014

Verbal evidence to

Select Committee

Policy impact

75

Nov 2014

Report with strong

links to our input

Policy impact

76 Nov 2014

Widespread

coverage

& The University of Nottingham

The design process

• Define/constrain design problems within broader problem space (Golembewski &

Selby, 2010)

• To surface human values (Friedman & Hendry, 2012)

• Support intra/inter-familial communication (Mackay, 2004)

• Encourage thought around security threats (Security Cards, Washington)

• Support use of creative methods within design process (IDEO)

• Support exploration of issues around online privacy (Privacy Game)

Ideation Cards in Use

Tracking Truancy

Truancy is a key problem in urban deprived areas and is costly to the state. The commissioning body (a government department) wants a location-based social 'tracking' system that will allow parents and teachers to 'track' truants. This system makes use of location data.

Limited Connection

The system should be

able to operate with

limited/sporadic

connectivity

Children

Children and

adolescents

Explicit Consent

Data should only be collected where a user

has been given information about the

nature of collection, and then specifically

and explicitly agreed to it.

The form of that information or how it is to

be delivered is not defined. Highlighting

risks to users and enabling

negotiation/withdrawal with the system

over data collection is a challenge.

Consent is not a static concept (e.g. given once does not mean given forever).
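The card's point that consent is specific, informed and revocable can be sketched as a record with an expiry and a withdrawal flag (the names and fields here are purely illustrative, not from any of the projects presented):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConsentRecord:
    """Consent to collect one category of data, for one purpose, for a limited time."""
    subject: str
    category: str        # e.g. "location"
    purpose: str         # e.g. "truancy tracking"
    granted_at: datetime
    valid_for: timedelta
    withdrawn: bool = False

    def is_valid(self, now):
        """Consent holds only while it is unexpired and not withdrawn."""
        return not self.withdrawn and now < self.granted_at + self.valid_for

    def withdraw(self):
        """Consent given once does not mean given forever."""
        self.withdrawn = True

c = ConsentRecord("pupil-42", "location", "truancy tracking",
                  granted_at=datetime(2015, 3, 11), valid_for=timedelta(days=90))
print(c.is_valid(datetime(2015, 4, 1)))   # within the window and not withdrawn
c.withdraw()
print(c.is_valid(datetime(2015, 4, 1)))   # withdrawal takes effect immediately
```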

Topic du jour

83

http://www.horizon.ac.uk

Questions?

[email protected]


Notes from Paul Watson’s presentation [1]

We focus in our hub on those who are socially excluded – it could be up to 20% of the population, including the old, those

with disability, those without skills or jobs…

We felt that there was potential for digital technology to make a difference to the socially excluded. We have run many
projects over five years, and many of them have had a security or trust issue. I will pick three examples that illustrate
things I have found:

Digital technologies can give insights as well as solutions: you get a view into people’s lives that you can use to solve

their problems. Trust of older people in technology is lower than it is for younger people, and this is an issue for

government as it moves services online. We explored, using clickstream analysis, how people navigate around websites; it

varies by age. We researched the nature of trust and how it relates to the Web. Older people’s trust is often based on

brand – we were able to test this through research. Eye-tracking technology was used to better design web pages that you

want people to trust. “Certification” of the brand on the web page is really difficult to do; web page design aesthetics were

more important than brand

Target socially excluded people: for example, victims of domestic violence are often subject to abusers seeing what they
do online. Deleting search history etc. is only partially successful because it arouses suspicion. You need to be much more
subtle than that: routines that selectively clear search history or weblogs, etc.

85


Notes from Paul Watson’s presentation [2]

Design for scalability: this generates security issues. Example of healthcare and wearables; IT systems choice (the

cloud for no risk; internal IT for anything with risk – this inhibits cheap, fast, scalable service development). So partition

applications to make use of cloud resources – e.g., only send anonymous data to the cloud; keep attributed data internally

Create multidisciplinary teams – security, domain and systems experts, coproducing

86

Paul Watson, Newcastle University

The result of related factors that prevent individuals or groups from participating fully in the economic, social & political life of society

Social Exclusion

How can Digital Technologies transform the lives of excluded people & communities?

Lesson 1: Digital technologies can give insights, as well as solutions

Ex. Older people’s trust of the Web

Lesson 2. Target socially excluded people & communities

Ex. Victims of Domestic Violence

Lesson 3: Design for scalability

Ex. Healthcare

[Workflow diagram: Read Patient Data (s0) → Anonymize (s1) → Analyze (s2) → Write Results (s3); attributed input ("A. Smith, 378456729") becomes anonymous results (p = 30%, q = 27.4, r = 34)]

Options: run the whole application on the public cloud, or on internal IT if there is risk

Or can we partition applications, splitting the workflow between a public cloud and a private cloud?

New Method Generates secure Partitioning options
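A minimal sketch of the partitioning idea (the stage names and tagging scheme are illustrative; the actual method is more sophisticated): tag each stage by whether it handles identifiable data, keep everything up to and including the last such stage private, and let the downstream stages use the public cloud:

```python
# Pipeline stages in order, tagged True if they handle identifiable data.
STAGES = [
    ("read_patient_data", True),   # s0: attributed records (name, patient number)
    ("anonymize", True),           # s1: sees identifiers while stripping them
    ("analyze", False),            # s2: anonymous values only
    ("write_results", False),      # s3: aggregate statistics
]

def partition(stages):
    """Keep every stage up to the last identifiable one on the private side;
    later stages only ever see anonymized data, so they may use the public cloud."""
    last_private = max(i for i, (_, identifiable) in enumerate(stages) if identifiable)
    return {name: ("private-cloud" if i <= last_private else "public-cloud")
            for i, (name, _) in enumerate(stages)}

print(partition(STAGES))
```

This captures the lesson in the notes: cheap, scalable cloud resources can still be used, because attributed data never crosses onto the public side.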

Lessons we’ve learnt

L1: Digital technologies can give insights, as well as solutions

L2: Target socially excluded people & communities

L3: Design for scalability

L4: Create multidisciplinary teams:• security, domain & systems experts … and users

www.side.ac.uk


Notes from Pete Edward’s presentation [1]

We are focused on rural challenges – not just rural broadband coverage, but much wider. In almost all our projects, there

have been issues that emerged relating to personal data and trust

I will talk about the themes that emerged across our projects that are relevant to this debate. Others were also looking at
specific relevant things (Facebook data, the care.data fiasco, selling of personal data by individuals to the highest bidder…)

• Keep it user-centric – attitudes vary by demography, the kind of data, the context of the data usage … so we look at the

issues through an attitudinal lens – trust, risk, transparency and control (I think risk is sometimes forgotten)

• Example of work using smartphone data for rural bus planning – this raises anonymity issues because in rural areas,

individuals could be traced

• Mobile devices and wearables to support people with chronic disease in rural areas – attitudes to personal health data

sharing. We looked at different categories of personal health data (e.g., exercise regime, diet, mood). There is a

massive variation based on age, health etc…. Exercise data is not considered private; mood data is considered very

sensitive. Also looked at who people would be happy to share with – e.g., people don’t trust government, universities,

companies; they do trust their GP

102


Notes from Pete Edward’s presentation [2]

• “Conducting privacy impact assessments code of practice” (from the ICO); identifying privacy-related risk – individuals

don’t understand risk, and especially risk in the digital world. This is a huge problem we all face. “Trusted Zone” idea for

personal health data – what about leaking the data outside the zone – even if it’s good and valid to do so; how can we

build a model that is flexible enough to cope with this and what controls need to be in place to mitigate perceived risk?

• Managing inferential risk – people generally don’t think about this. Social networks have data flying around – way

beyond where people expect it to go; also, sharing can result in unexpected risk – e.g., releasing two or three pieces of

data to different people that can be put together
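One standard way to illustrate this linkage risk (a textbook measure, not a dot.rural method) is k-anonymity: count how many records share each combination of quasi-identifiers. In a sparse rural area a record may be unique on (age band, postcode district), so anyone holding those two pieces of data can re-identify the person:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest group size across all quasi-identifier combinations;
    k = 1 means at least one record is uniquely identifiable."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

records = [
    {"age_band": "30-39", "postcode": "AB15", "condition": "asthma"},
    {"age_band": "30-39", "postcode": "AB15", "condition": "diabetes"},
    {"age_band": "70-79", "postcode": "IV27", "condition": "asthma"},  # rural, unique
]

print(k_anonymity(records, ["age_band", "postcode"]))  # 1: one record stands alone
```

Neither attribute is identifying on its own; it is the combination, released to different people, that creates the risk.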

• Control – how to allow users a measure of control over their data. How do you represent controls? Make them simple

and effective and understandable (like informed consent issue). Too much control can introduce new risks – in the event

of an accident, say, do you want your health data shared in a way that’s outside your normal “policy”?

103

myData with Attitude

Peter Edwards

dot.rural Digital Economy HubUniversity of Aberdeen

Personal Data Landscape

Trust Transparency

Control

Risk

Influenced by the foundational principles

of Privacy by Design.

“Keep it User-Centric”

Attitudes

Attitudes to Data Sharing

[Chart: % of respondents rating each card as not sensitive, quite sensitive or highly sensitive. Cards: your primary mental health, your adherence and compliance, your alcohol intake, your contact information, your criminal record, your exercise level, your medical history, your medication, your mood, your personal characteristics, your reproductive health, your specialist mental health, your substance abuse, your test results]

335 NHS users surveyed

Recruited through market research (Research Now)

Online card-sort exercise to identify sensitivity attitudes and sharing preferences of health-related information

Opinions about sensitivity of personal data items vary dramatically

Attitudes to Data Sharing #2

Risk

Trusted Zone

[Diagram: owner A and requester B inside the trusted zone; B passes A's data to recipient C outside the zone]

How to protect information when sharing is desirable, but policies are incomplete?

Trusted individuals may need to share our data with unknown third parties

What’s the perceived risk for A, in allowing B to share with C?

How does A’s trust in B, and the sensitivity of the data, influence this risk factor?

How can controls (such as payments, monitoring or reputation) mitigate some of this perceived risk?

Identifying privacy-related risks is a key part of the PIA process

There is a considerable body of evidence that individuals do not understand or appreciate personal data risk

Managing Inferential Risk

What is the probability that users' data become available to others in a networked context?

How may data owners manage unsafe inferences that others may draw whilst sharing data?

How to assess the benefits of sharing (utility) vs possible risks?

Controls

• Allowing users a degree of control over their data
– helping users specify their wishes
– monitoring behaviour of data accessors

• Controls can mitigate the perceived risk of sharing data
– Simple and effective controls? Must be clear to data owners and accessors
– The right level of controls? Too much control may introduce a new risk of data being unavailable in critical situations.

Transparency

Who controls a device and has access to the data generated?

For what purpose are the data collected?

Assurance of behaviour?

Making IoT device capabilities and behaviours (data provenance) transparent to users.

Thanks

Acknowledgements: Stanislav Beran, Chris Burnett, Liang Chen, David Corsar, Paul Gault, Tim Norman, Edoardo Pignotti, Karen Salt …



The future of the network
Dr Matt Stroud, Head of Personal Data & Trust, Digital Catapult


Where next? 115

• Future regular meetings

• Build digital presence

• Form community interest groups

• Create research roadmap

• Grow membership base


Future regular meetings 116

• We intend to hold regular (quarterly?) meetings, physically bringing together practitioners and researchers in Personal Data & Trust

• In addition there will be thematic meetings around the country

• We may use the "National Virtual Incubator" teleconferencing system to make the London meetings accessible from around the country

• Would your institution like to host a physical or virtual gathering?


Build digital presence 117

• Our new website:

PDTN.org

• Will be complemented by a quarterly PDTN Review journal

• Covering the Network's activities

• Features on members' activities

• Expert articles


Form community interest groups 118

Some will be sector specific “verticals” such as banking and others will be “horizontal” such as security or psychology. By way of illustration, two early groups are:

• Privacy Working Group: working to define a "best practice" privacy standard that companies can be certified against

• PIMS Provider Forum: collective challenges and opportunities faced by providers of personal information management services

These are the first of many. If there is a personal data and trust related topic which you feel would benefit from an open working group and which you would like to establish, please let us know.


Create research roadmap 119

• Bring together Industry and the Research Councils to create a

research roadmap which drives economic and social growth

• Spearheaded by a number of events run by the KTN and the Digital

Economy Hubs around the country

• SMEs and corporates will be invited to contribute views and work to identify key challenges

• Output will be fed into the Research Councils and IUK to guide future

research and calls.


Grow membership base 120

• We will grow the network by reaching out to our networks & social media and

working with media brands:

• We are writing white papers and working with the media to drive interest

• We are informing the organisations who have worked with us, IUK, Catapult,

KTN, Research Councils and Digital Economy Hubs

• We are Posting, Blogging & Tweeting

• Your colleagues, customers and collaborators will derive value too...

…let them know!


Your community, be part of it! 121

• Want to write a blog for the website?

• Want to write an article for the PDTN Review journal?

• Got an idea for a "community interest group"?

• Want to join or lead a group?

• Want to host an event?

• Got an idea for the research roadmap?

Then e-mail us: [email protected]

[email protected]


Thank You

Personal Data & Trust Network
www.pdtn.org
#PDT