TRANSCRIPT
2015 “Privacy In Action” Speaker Series
January 28, 2015
Innovative Technologies to Increase Privacy
Dial-in: 1-855-767-1051 and Conference ID: 34619328
PRIVACY AND RECORDS MANAGEMENT
Administrative Items
• Do not use your computer microphone to participate in this meeting. Lync will be used only as a display. Please dial in using the following information:
– Phone number: 1-855-767-1051
– Conference ID: 34619328
• Please mute your computer microphone and speakers. This will eliminate feedback on the line and make it easier for you and your colleagues to hear the presentation.
• The presenters will address questions during the Q&A session at the end of each presentation. For those online, please feel free to type your questions into the Lync Instant Messenger.
• Send technical issues to [email protected].
Welcome and Introduction of Speakers
LaShaunne’ David, Director, VA Privacy Service
U.S. Department of Veterans Affairs
REPORT TO THE PRESIDENT
BIG DATA AND PRIVACY: A TECHNOLOGICAL PERSPECTIVE
May 2014
President’s Council of Advisors on Science and Technology (PCAST)
• Charge from President Obama: January 17, 2014 speech requesting analysis of big-data implications for policy
• Scoping study, focusing on the wider economy and society
• PCAST report to inform and accompany White House report
Objectives of the PCAST report:
• Assess current technologies for managing and analyzing big data and preserving privacy
• Consider how such technologies are evolving
• Explain what technological capabilities and trends imply for design and enforcement of public policy to protect privacy in big-data contexts
PCAST Working Group Members & Staff
Susan Graham, Co-Chair, UC Berkeley
William Press, Co-Chair, University of Texas
S. James Gates, Jr., University of Maryland
Mark Gorenberg, Zetta Venture Partners
John P. Holdren, OSTP Director
Eric Lander, Broad Institute of Harvard and MIT
Craig Mundie, Microsoft Corp.
Maxine Savitz, National Academy of Engineering
Eric Schmidt, Google, Inc.
Marjory S. Blumenthal, PCAST Executive Director
Michael Johnson, OSTP (NSIA Assistant Director)
Other PCAST Members
Rosina Bierbaum, University of Michigan
Christine Cassel, National Quality Forum
Christopher Chyba, Princeton University
Shirley Ann Jackson, RPI
Chad Mirkin, Northwestern University
Mario Molina, UC San Diego
Ed Penhoet, Alta Partners
Barbara Schaal, Washington University
Daniel Schrag, Harvard University
Changing Technological Contexts
• Privacy history conditioned on “small data”
– Collection of data and development of data sets used with conventional statistics
– Context of a personal relationship (e.g., personal physician, local shop)
• Big data attributes
– Quantity and variety of data available to be processed (the “3 Vs”)
– Scale of analysis that can be applied to those data (“analytics”)
– Expansion of metadata
• Laws have not always kept pace with technological realities
People Emit Data Continuously . . .
• Born digital: generated for computer(s)
– Clicks and taps, GPS, cookies
• Born analog: byproduct of the physical world
– Sensors collect (often invisibly)
• Over-collection? Digital convergence?
• Big-data analytics create new information
– Data mining and machine learning
– Data fusion and integration (data from different sources)
– Image/speech recognition
– Social-network analysis (self-censorship won’t help…)
The Cloud as Dominant Infrastructure
• Easy ingestion, access, and use of data
• Replication and distribution
• Infrastructure for mobility (e.g., smart-phone apps)
• Potential security benefits from automation, procedures, and oversight
• Democratization of analytics
Cybersecurity and Privacy: Distinctions and Dependency
• Cybersecurity: technologies enforce policies for computer use and communication
– Systems to protect identity and to authenticate (are you who you say you are?)
• Privacy policy is harder to codify for technological implementation
• Poor cybersecurity is a threat to privacy, but violations of privacy are possible with no failure in computer security
– Misuse of data, fusion of data
• Cybersecurity: necessary but not sufficient
Technologies and Strategies for Privacy Protection
• Cryptography and encryption
• Anonymization and de-identification
• Data deletion and ephemerality
• Notice and consent
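To make the de-identification strategy on this slide concrete, here is a minimal illustrative sketch of keyed pseudonymization, one common de-identification technique. The key, field names, and sample values are invented for illustration and are not drawn from the PCAST report.

```python
# Keyed pseudonymization: replace a direct identifier with a stable token
# that cannot be reversed without the secret key.
import hmac
import hashlib

# Hypothetical key for illustration; in practice, kept in a key-management system.
SECRET_KEY = b"example-key-not-real"

def pseudonymize(identifier: str) -> str:
    """Map a direct identifier (e.g., an SSN) to a stable keyed token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"ssn": "000-00-0000", "diagnosis": "J10"}
# The clinical field is preserved; the identifier is replaced by a token.
deidentified = {**record, "ssn": pseudonymize(record["ssn"])}
```

Because the same key always yields the same token, records for one person can still be linked for analysis without exposing the identifier itself, though re-identification remains possible for anyone holding the key.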
Areas of Concern: Examples
• Healthcare: personalized medicine (including genetic information); mobile devices that monitor
• Education: new online platforms collect masses of data and enable longitudinal datasets
• Home: more ways of collecting, storing, and communicating
What Might the Future Look Like?
• Taylor Rodriguez packs for a trip and leaves her suitcase outside her home for pick-up
– A camera on a streetlight watches the suitcase, which has an RFID tag (anti-theft)
• Her suitcase is picked up at night by a delivery company
– The shipper knows Taylor’s itinerary and plans
• A self-driving car arrives, its instructions for her itinerary delivered by the cloud
• No boarding passes or queues at the airport
– Everyone is tracked by phone, facial recognition, gait, emotional state, and RFID tags
• In this world, the cloud and robotic aides are trustworthy with respect to personal privacy
– Improvements in the convenience and security of everyday life become possible . . .
• Not an endorsement, just food for thought!
PCAST Perspectives and Conclusions
• New sources of big data are abundant, and new analytics tools will emerge
• New data aggregation and processing can bring enormous economic and social benefits
• Unintentional leaking of data and deliberate systemic attacks on privacy are potential risks
• Privacy-sensitive data cannot always be recognized when collected: sensitivity may emerge only with analytics, and it may be possible to home in on the moment of particularization to an individual
• Government has a role in preventing breaches of privacy that can harm individuals and groups
– Technology plus law/regulation to generate incentives and contend with the measure-countermeasure cycle
• Data collectors, data analyzers, and users of analyzed data are different actors
– Policy can intervene at various stages of this value chain
– Attention to collection practices may reduce risk, but use is the most technically feasible place to apply regulation
• Technological feasibility matters
Recommendation 1: Policy attention should focus more on the actual uses of big data and less on its collection and analysis
Any adverse consequences of big data arise from a program/app interacting with raw data or information refined via analytics
Policies focused on the regulation of data collection, storage, retention, a priori limitations on applications, and analysis (absent identifiable actual uses of the data or products of analysis) are unlikely to yield effective strategies for improving privacy
It is not the data themselves that cause the harm, nor the program itself (absent any data), but the confluence of the two
Recommendation 2: Policies and regulation should not embed particular technological solutions, but rather should be stated in terms of intended outcomes
• Technology alone is not sufficient to protect privacy
• To avoid overly lagging the technology, policy concerning privacy protection should address the purpose (the “what”) rather than prescribe the mechanism (the “how”)
• Controlling the use of personal data is more effective than regulating technologies of data collection, storage, and retention (these may evolve rapidly)
Recommendation 3: With support from OSTP, the NITRD agencies should strengthen U.S. research in privacy-related technologies and in the relevant areas of social science that inform the successful application of those technologies
• Some of the technology for controlling uses already exists
• Research and research funding are needed for (1) technologies that help to protect privacy, (2) social mechanisms that influence privacy-preserving behavior, and (3) legal options that are robust to changes in technology and create appropriate balance among economic opportunity, national priorities, and privacy protection
Recommendation 4: OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection
• Career paths for professionals (e.g., digital-privacy experts both on the software-development side and on the technical-management side)
• Programs that provide education leading to privacy expertise are essential and need encouragement
Recommendation 5: The United States should adopt policies that stimulate the use of practical privacy-protecting technologies that exist today. It can exhibit global leadership both by its convening power and by its own procurement practices
• Nurture the commercial potential of privacy-enhancing technologies through U.S. government procurement and through the larger policy framework
• Promote the creation and adoption of standards
• Cloud computing offers positive new opportunities for privacy
Privacy-Preserving Cloud Services?
PCAST is not aware of more effective innovation or strategies being developed abroad
White House/“Podesta” Policy Recommendations
1. Advance the Consumer Privacy Bill of Rights
2. Pass National Data Breach Legislation
3. Extend Privacy Protections to Non-U.S. Persons
4. Ensure Data Collected on Students in School is used for Educational Purposes
5. Expand Technical Expertise to Stop Discrimination
6. Amend the Electronic Communications Privacy Act
Lucia Savage, Chief Privacy Officer
ONC Update and Data Segmentation for Privacy (DS4P) Update
Department of Veterans Affairs Data Privacy Day
January 28, 2015
10-Year Interoperability Vision (Fall 2014)
• Leverage health IT to increase health care quality, lower health care costs and increase population health
• Focus on supporting health broadly, including but not limited to health care delivery
• Build incrementally over time from current technology – multiple methods of exchange required
• Establish best minimum possible interoperability for all; create opportunities for innovation
• Empower and maintain focus on individuals
http://healthit.gov/sites/default/files/ONC10yearInteroperabilityConceptPaper.pdf
Vision for the Decade Ahead – Improvements Due to the Sharing of Interoperable Data
• 2017 – Connect Care: ensure providers and individuals can send, receive, find, and use a basic set of essential health information
• 2020 – Connect for Health: expand sources and users of information; continue improving quality and lowering cost; increase automation; scale broadly
• 2024 – Learning Health System: precision medicine; reduce time from evidence to practice; virtuous learning cycle
Core technical standards and functions
Certification to support adoption and optimization of health IT products & services
Privacy and security protections for health information
Supportive business, clinical, cultural, and regulatory environments
Rules of engagement and governance
Data Segmentation for Privacy (DS4P)
• Goal: develop technical standards, develop a use case, and pilot the use case, testing whether:
– Patient choice to disclose, or not, information regulated by 42 CFR Part 2 (substance abuse treatment at a federally regulated facility) can be captured, documented, and persisted electronically.
Office of the National Coordinator for Health Information Technology
Types of Privacy Metadata Used by DS4P
• Purpose of Use: defines the allowed purposes for the disclosure (e.g., Treatment, Emergency Treatment, etc.)
• Obligations – Refrain Codes: specific obligations placed on the receiving system (e.g., do not re-disclose without consent)
• Confidentiality Codes: used by systems to help convey or enforce rules regarding access to data requiring enhanced protection. Uses a “highest watermark” approach.
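The “highest watermark” approach can be sketched in a few lines. This is an illustrative sketch, not DS4P reference code: it assumes the ordering of the HL7 confidentiality code system (U, L, M, N, R, V, from unrestricted to very restricted), and the function name is invented.

```python
# HL7 confidentiality codes, least to most restrictive:
# U (unrestricted), L (low), M (moderate), N (normal), R (restricted), V (very restricted)
CONFIDENTIALITY_ORDER = ["U", "L", "M", "N", "R", "V"]

def highest_watermark(section_codes):
    """Label a document with the most restrictive code of any of its sections."""
    return max(section_codes, key=CONFIDENTIALITY_ORDER.index)

# A document with one normal section and one 42 CFR Part 2 restricted section
# is labeled restricted as a whole:
print(highest_watermark(["N", "R"]))  # prints "R"
```

The design intent is conservative: a receiving system that only reads the document-level code can never under-protect a sensitive section buried inside.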
Selected Standards
27
STANDARD: HL7 Implementation Guide: Data Segmentation for Privacy (DS4P), Release 1 (includes Content Profile, Profile for Direct, Profile for Exchange)
Capability: Metadata Vocabularies (for Transport and/or Document Metadata). Standards/profiles used by the HL7 DS4P R1 standard, with their specific usage:
• HL7 RefrainPolicy: conveys specific prohibitions on the use of disclosed health information (e.g., prohibition of redisclosure without consent)
• HL7 PurposeOfUse: conveys the purpose of the disclosure of health information (e.g., treatment, research, emergency)
• HL7 BasicConfidentialityCodeKind: represents confidentiality codes associated with disclosed health information (e.g., restricted) as specified in the HL7 Healthcare Security Classification standard (HCS)
• HL7 ObligationCode: conveys specific obligations associated with disclosed health information (e.g., encryption)
• HL7 ActPolicyType: conveys a type of policy
• HL7 SensitivityPrivacyPolicy: conveys the sensitivity level of a specific policy
DS4P Standards
• HL7 normative standard, approved by ANSI in May 2014
• The standards facilitate tagging at the document and section level
• ONC pilots tested at the document level
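To show how a receiving system might act on this metadata, here is a hypothetical sketch. The record layout and function are invented for illustration; the codes used (TREAT, NORDSCLCD, R) are drawn from the HL7 vocabularies named above.

```python
# A disclosed document carrying DS4P-style privacy metadata (structure is
# illustrative, not the DS4P wire format).
document = {
    "content": "...",
    "metadata": {
        "purpose_of_use": {"TREAT"},      # HL7 PurposeOfUse: treatment
        "refrain_codes": {"NORDSCLCD"},   # HL7 RefrainPolicy: no redisclosure without consent
        "confidentiality": "R",           # restricted (e.g., 42 CFR Part 2 data)
    },
}

def may_redisclose(doc, patient_consent: bool) -> bool:
    """Block redisclosure when a no-redisclosure refrain applies and consent is absent."""
    if "NORDSCLCD" in doc["metadata"]["refrain_codes"]:
        return patient_consent
    return True
```

The point of the metadata is exactly this kind of check: the patient’s choice travels with the data, so the receiving system can enforce it without a side channel back to the sender.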
DS4P Pilot Accomplishments – Data Segmentation for Privacy Initiative
NETSMART Pilot:
• The Netsmart DS4P Part 2 solution has been implemented with the community services referral network in Tampa Bay (2-1-1 system), helping it manage restricted data associated with programs regulated by 42 CFR Part 2.
Pilot Accomplishments
VA/SAMHSA Pilot:
• The pilot was successfully tested and demonstrated in multiple venues, including the Interoperability showcase at HIMSS 2013 and the HL7 Plenary meeting in Baltimore, September 2013.
• VA has extended the DS4P capabilities to demonstrate utilization of FHIR for DS4P (demonstrated at HL7 in January 2014, in real time, using resources from Australia, Canada, and the USA).
HITPC Recommendations re: Incorporating Standards into EHRs
• Context: ONC is contemplating expanding the certification program to “voluntary” certification of EHRs for Behavioral Health and Long-Term and Acute Care
– No MU incentives
– Aim of promoting exchange of data with primary care providers
Questions?
Thanks for Attending!
• Thank you for attending the first of four 2015 VA Privacy Service “Speaker Series” sessions.
– We value your feedback, opinions, and comments!
– After this session, you will receive a short questionnaire via email. Please take a moment to complete it upon receipt.
• To self-certify Lync Meeting attendance in the Talent Management System (TMS), search:
– Item Title: Privacy In Action - Speaker Series 2015: Innovative Technologies to Increase Privacy
– TMS ID: 3901065
• Visit the new VA Privacy Service website at http://vaww.oprm.va.gov/privacy/ to learn more about Privacy within VA.