
SAA-ACRL/RBMS Joint Task Force on the Development of Standardized Holdings Counts and Measures for Archival Repositories and Special Collections Libraries
__________________________________________________________________________________________________

Katy Rawdon
Coordinator of Technical Services
Special Collections Research Center
krawdon@temple.edu
@lemurchild | #saa15 #s204

What is SAA-ACRL/RBMS JTF-HCM?

__________________________________________________________________________________________________

• Stands for: SAA-ACRL/RBMS Joint Task Force on the Development of Standardized Holdings Counts and Measures for Archival Repositories and Special Collections Libraries.

• Task force put together jointly by the Society of American Archivists and the Rare Books & Manuscripts Section (RBMS) of the Association of College and Research Libraries (ACRL), which is a division of the American Library Association.

• A second task force was assembled at the same time: SAA-ACRL/RBMS Joint Task Force on the Development of Standardized Statistical Measures for Public Services in Archival Repositories and Special Collections Libraries.

• A third joint task force has more recently been convened: SAA-ACRL/RBMS Joint Task Force on Primary Source Literacy.

Task Force Membership

_____________________________________________________________________

Officers
• Martha O’Hara Conway, Co-Chair, ACRL/RBMS, University of Michigan
• Emily R. Novak Gustainis, Co-Chair, SAA, Harvard University

Membership
• Alvan Bregman (ACRL/RBMS), Queen's University, Canada
• Adriana Cuervo (SAA), Rutgers University
• Rachel D'Agostino (ACRL/RBMS), Library Company of Philadelphia
• Lara Friedman-Shedlov (ACRL/RBMS), University of Minnesota
• Angela Fritz (SAA), University of Arkansas Libraries
• Lisa Miller (SAA), Hoover Institution Archives, Stanford University
• Katy Rawdon (ACRL/RBMS), Temple University
• Cyndi Shein (SAA), University of Nevada, Las Vegas Libraries

What is the task force doing?

__________________________________________________________________________________________________

• Archivists and librarians are increasingly aware of the importance of assessment, but we lack standardized measures.

• Task Force’s official charge is to “Develop a set of guidelines -- metrics, definitions, and best practices -- for quantifying holdings of archival repositories and special collections libraries, paying particular attention to both the wide range of types and formats of material typically held and the different ways in which collection material is managed and described.” (see SAA microsite for full description).

• Task force is convened for two years through the 2016 SAA Annual Meeting, with an option for one additional year.

Work so far
__________________________________________________________________________________________________

• Meet primarily via conference call, and in person at SAA Annual Meeting, RBMS Conference, and ALA Midwinter (approx. 17 meetings to date).

• Reviewed how we count holdings in our own collections, as well as a “landscape review” of existing categories, vocabularies, and ways of counting – also, reasons for counting.

• Posted a call for survey instruments, worksheets, methodologies – and discussed results.

• Discussed (at great length) categories of materials and their definitions – particularly born-digital materials.

Final product
__________________________________________________________________________________________________

• Guidelines on how to count and measure various types of materials.

• Definitions of and guidance for assigning materials to different categories.

• Tiered approach: Minimum, Optimum, Added Value.

• Possibly (and possibly later…) provide tools for assisting with counting.

Further resources
__________________________________________________________________________________________________

• You’re invited to our meeting! The Holdings Counts and Measures AND the Public Services task forces meet on Friday at 1pm, Convention Center Room 14

• RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage, Fall 2012; 13 (2). Special issue on assessment

• SAA microsites for joint task forces (contain descriptions, announcements, members, minutes):

Task Force on the Development of Standardized Holdings Counts and Measures

Task Force on the Development of Standardized Statistical Measures for Public Services

Task Force on Primary Source Literacy

___________________________________________________________________________________________________________________________________________

Tourism Dollars: Evaluating Your Archives' Impact

David Carmicheal

• Georgia Archives
• Conducted multiple times: 2002, 2006, 2008

Two surveys

Survey 1: Who visits?
• Genealogists: 76%
• In-state: 69%
• Out-of-state: 31%

Realization

Survey 2: Tourism

Benefits
• Provided hard data
• Allowed us to extrapolate the economic impact of archives visitors
• Suggested areas of economic development (reunions)
• Encouraged tourism officials to consider archives
• Provided user feedback
• Became baseline for future comparison
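The "extrapolate the economic impact" step is simple arithmetic on the survey shares. A minimal sketch, using the out-of-state percentage from Survey 1; the annual visit count and per-visit spending figure below are placeholder assumptions, not Georgia Archives data:

```python
# Sketch of extrapolating tourism impact from survey shares.
# annual_visits and avg_spend are illustrative assumptions only.

annual_visits = 10_000      # assumed annual on-site visitor count
out_of_state_pct = 31       # Survey 1: out-of-state visitors (%)
avg_spend = 150             # assumed tourist spending per visit (lodging, meals, travel), USD

out_of_state_visits = annual_visits * out_of_state_pct // 100
estimated_impact = out_of_state_visits * avg_spend

print(f"Estimated out-of-state visits: {out_of_state_visits}")
print(f"Estimated tourism impact: ${estimated_impact:,}")
```

Repeating the same calculation for each survey year gives the baseline for future comparison mentioned above.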

Metrics and Change Management
Fynnette Eaton

Goals of Change Management include:
• Ensuring normal work continues
• Building and maintaining momentum
• Dealing with the human side of change to minimize resistance
• Managing the transition to the new way of performing the work

Measuring Success
• Are staff producing the required number of descriptions or responding to reference requests at the same level while change is being implemented?
• Are more staff speaking positively about the changes being discussed?
• After the changes have taken place, is the staff as productive as you had predicted?

Baseline Audit from a System Perspective Using MIT's Drupal TRAC Tool

Courtney C. Mumma, MAS/MLIS, Artefactual
@archivematica @accesstomemory

SAA 2015, Cleveland, August 2015

The Trustworthy Repositories Audit & Certification (TRAC) checklist is an auditing tool to assess the reliability, commitment, and readiness of institutions to assume long-term preservation responsibilities, as described by the OAIS ISO 14721:2012 standard.

Internal TRAC review in Drupal
➢ Frame for iterative review + internal and external audit
  ○ V1.0 @ ICPSR: 2007 TRAC
  ○ V2.0 @ MIT: 2012, ISO 16363 incorporated
➢ Download from Artefactual website:
  ○ www.archivematica.org/wiki/Internal_audit_tool
  ○ Populated with Archivematica data
➢ Staff-only action items and notes
➢ Audit login with notes field for peer/external audit support
➢ Incremental, cumulative ratings and results
➢ Revision tracking
➢ Natural language question field

Accumulate results
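The "incremental, cumulative ratings" idea, keeping every audit round's rating per criterion rather than overwriting it, can be sketched as follows. The criterion IDs and integer ratings here are illustrative assumptions, not the Drupal tool's actual schema:

```python
# Illustrative sketch of incremental, cumulative audit ratings with a
# revision trail, as an internal TRAC/ISO 16363 review might keep them.

from collections import defaultdict

# criterion ID -> list of (iteration, rating), oldest first
ratings_history = defaultdict(list)

def record_rating(criterion, iteration, rating):
    """Append a rating without discarding earlier audit rounds."""
    ratings_history[criterion].append((iteration, rating))

def current_results():
    """Latest rating per criterion; history stays available for revision tracking."""
    return {c: hist[-1][1] for c, hist in ratings_history.items()}

record_rating("3.1.1", iteration=1, rating=2)
record_rating("3.1.1", iteration=2, rating=3)  # improved after remediation
record_rating("4.2.5", iteration=1, rating=1)

print(current_results())
```

Keeping the full `(iteration, rating)` history is what makes the review iterative: each internal or external audit pass adds to, rather than replaces, the previous results.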

Roles

Stakeholders
➢ Senior management

➢ Coordination group

➢ Operations group

➢ Information technology

➢ Administration: Finance or HR

➢ Acquisitions

➢ Preservation

➢ Dissemination

➢ Rights management

➢ External advisory group

Assign responsibilities

Thanks! Questions and more info:
➢ Dr. Nancy McGovern & Matt Bernhardt
➢ Archivematica discussion list:
  ○ groups.google.com/forum/#!forum/archivematica
➢ Internal review questions unrelated to tool: NDSA Standards and Practices Working Group
  ○ digitalpreservation.gov/ndsa/working_groups/standards.html

© 2015 Grant Thornton LLP | All rights reserved

Information Governance Assessment Tools

Elizabeth W. Adkins

Society of American Archivists Annual Meeting Presentation

August 20, 2015


Generally Accepted Recordkeeping Principles
AKA "The Principles" – Issued by ARMA International
http://www.arma.org/r2/generally-accepted-br-recordkeeping-principles

ACCOUNTABILITY Senior leader held accountable for information governance

INTEGRITY Reasonable and suitable guarantee of authenticity and reliability of information

PROTECTION Reasonable level of protection for personal or other sensitive information

COMPLIANCE Comply with applicable laws and other binding authorities, as well as internal policies

AVAILABILITY Ensure timely, efficient, and accurate retrieval of information

RETENTION Retain information for an appropriate time, based on legal, regulatory, & business analysis

DISPOSITION Disposition of information in accordance with policies, applicable laws, & regulations

TRANSPARENCY Document policies, processes and activities in a manner that is easily available & understood


Information Economics Process Assessment
Issued by the Compliance, Governance & Oversight Council
https://www.cgoc.com/files/CGOC_Information_Economics_Process_Assessment_Kit.pdf

• Assess 18 information lifecycle processes for Legal (6 processes), RIM (1 process), Business (2 processes), Privacy (2 processes), and IT (4 processes)

• Assess each process against four levels of maturity:
  o Level 1: Ad hoc, inconsistent
  o Level 2: Silo'ed, manual
  o Level 3: Silo'ed, consistent & instrumented
  o Level 4: Integrated, instrumented enterprise processes


Comparing Assessment Tools
ARMA's Next Level IG Assessment vs. CGOC's Information Economics Process Assessment

ARMA
• Issued by professional association
• Driven primarily from RIM perspective
• $4,995 to $5,995 for 12-18 month license
• 66 questions covering the 8 principles
• Lots of professional jargon in the questions
• Hosted tool permits benchmarking, raises security concerns

CGOC
• Issued by forum sponsored by IBM
• Reflects input from IT, RIM, and Legal professionals
• Free to registered site users
• 18 processes to be rated, on a scale of 1 to 4
• Simple, easy to use
• Manually generate a risk heat map and process score card
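The CGOC-style workflow of rating each process on a 1-to-4 maturity scale and then manually building a risk heat map and score card can be sketched as below. The process names and the risk thresholds are illustrative assumptions, not the content of the actual CGOC kit:

```python
# Sketch of a maturity score card: rate each process 1-4, then bucket
# low-maturity processes into risk bands for a heat map.
# Process names and band thresholds are illustrative only.

maturity = {                 # process -> maturity level (1 = ad hoc, 4 = integrated)
    "Legal hold": 2,
    "Retention schedule": 3,
    "Privacy review": 1,
    "IT disposal": 4,
}

def risk_band(level):
    """Lower maturity implies higher risk (assumed banding)."""
    return "high" if level <= 1 else "medium" if level <= 2 else "low"

heat_map = {process: risk_band(level) for process, level in maturity.items()}
avg_maturity = sum(maturity.values()) / len(maturity)

print(heat_map)
print(f"Average maturity: {avg_maturity:.2f}")
```

With the full 18 processes rated, the same dictionary comprehension yields the complete heat map, and per-domain averages (Legal, RIM, Business, Privacy, IT) give the score card rows.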

Measure Up: Assessment Tools and Techniques from the Field
How to Assess Your Archives Using Digital Preservation Capability Metrics

Lori Ashley – Tournesol Consulting

Digital Preservation Assessment Options

• Digital Preservation Capability Self-Assessment and DPCMM
  • http://www.DigitalOk.org/
  • http://www.securelyrooted.com/dpcmm
• European Ladder
  • http://trusteddigitalrepository.eu/
• Data Seal of Approval
  • http://datasealofapproval.org/en/
• ISO 16363
  • http://public.ccsds.org/publications/archive/652x0m1.pdf
  • http://www.iso16363.org/
• TRAC (Trustworthy Repositories Audit & Certification)
  • http://www.crl.edu/sites/default/files/attachments/pages/trac_0.pdf
• Self-audit against ISO 16363 (e.g., Nestor Seal for Trustworthy Digital Archives)
  • http://www.langzeitarchivierung.de/Subsites/nestor/EN/nestor-Siegel/siegel_node.html

Digital Preservation Capability Maturity Model© (DPCMM)

How DPCMM Works

• 15 Key Process Areas (KPAs) from Digital Preservation Policy to Access

• 5 Capability Levels from Nominal to Optimal

• Combined into an easy-to-use scorecard
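The scorecard idea, each Key Process Area rated on the five capability levels and the ratings combined into one view, can be sketched as follows. Only a few of the 15 KPAs are shown, and the intermediate level names between Nominal and Optimal are assumptions for illustration:

```python
# Sketch of a DPCMM-style scorecard: each KPA gets a capability level
# 0-4; levels map to names from Nominal to Optimal (intermediate names
# assumed). KPA ratings below are invented sample values.

LEVELS = ["Nominal", "Minimal", "Intermediate", "Advanced", "Optimal"]

kpa_scores = {                          # 3 of the 15 KPAs, sample ratings
    "Digital Preservation Policy": 3,
    "Preservation Metadata": 2,
    "Access": 4,
}

def scorecard(scores):
    """Map numeric ratings onto named capability levels."""
    return {kpa: LEVELS[level] for kpa, level in scores.items()}

print(scorecard(kpa_scores))
print(f"Overall: {sum(kpa_scores.values())}/{4 * len(kpa_scores)}")
```

Summing the per-KPA ratings against the maximum gives the single benchmark number that makes the scorecard easy to track over time and compare across organizations.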

DPCMM v2.7
• Framework for assessing and benchmarking digital preservation capability across governance, policy, processes, and systems
• Draws together requirements from ISO 14721 and ISO 16363
• Helps to establish priorities and communicate requirements to stakeholders
• Road tested and proven at organizations worldwide, including US State and Territorial Archives (CoSA) and ICA

DPCMM v2.7 White Paper: http://www.securelyrooted.com/dpcmm/

Register for the DPC Self-Assessment: http://www.DigitalOK.org/

Features Common to Maturity Models
1. Commitment to perform
2. Ability to perform
3. Activities performed
4. Measurement and analysis
5. Verifying implementation

Build and Sustain Capabilities

[Diagram: capability Levels 1 through 5, spanning Robust Measurement & Analysis, Digital Preservation Infrastructure, Digital Preservation Services, and Preservation Repository]

The views expressed herein are those of the author and should not be attributed to the IMF, its Executive Board, or its management.

Information Governance at the IMF: possible solution

AUGUST, 2015

Salvador Barragan

Definition of Information Governance
The establishment of enterprise-wide policies and procedures, and the execution and enforcement of these, to control and manage information as an enterprise resource.

Why is Information Governance Important at the IMF: some key areas
• Discover information not used and secure highly sensitive information
• Control classification and declassification of all content
• Apply retention and disposition to content regardless of location

Implementation of an Information Policy Hub as a way to address Information Governance

The Archives and Records Management section (ARM) launched a proof of concept in February 2015 to evaluate and prove the validity of an Information Policy Hub. Three key areas were addressed:

Data Map – Inventory of the repositories (SharePoint, File Shares and Exchange)

Categorization of content

Applying retention, disposition & declassification policies

Data Mapping
Dashboard with different reports

Categorization of content
Categorized documents based on the training text or training documents

Automatic clusters in 2D visual display

Apply Policies

Policy rules: Delete with review

Items awaiting review

Apply Policies

Policy rules: Put Hold

Policy executed and content locked
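The two policy rules shown above behave differently: "delete with review" queues items for a reviewer rather than deleting outright, while "hold" locks content against disposal. A minimal sketch of that behavior; the class and field names are illustrative, not the Hub's actual API:

```python
# Hypothetical sketch of the two policy rules from the proof of concept.
# Item fields and rule names are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Item:
    name: str
    on_hold: bool = False
    awaiting_review: bool = False

def apply_policy(items, rule):
    for item in items:
        if rule == "delete_with_review":
            item.awaiting_review = True   # queued for review, not deleted outright
        elif rule == "hold":
            item.on_hold = True           # content locked against disposal

docs = [Item("budget_2014.xlsx"), Item("old_memo.docx")]
apply_policy(docs, "delete_with_review")  # both docs queued for review
apply_policy([docs[0]], "hold")           # first doc also locked

print([(d.name, d.on_hold, d.awaiting_review) for d in docs])
```

The key design point is that both rules change state on the item wherever it lives, which is what lets the Hub apply policy to content regardless of location.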

Results and Next Steps
The proof of concept was successful: it proved the validity of the Information Policy Hub for unstructured content. However, we did not test how well it would do for structured content. (It did capture the metadata for each object.)

Move from a proof of concept to full implementation

Test structured-content databases
