Validation plan

Grant Agreement Number ECP-2008-EDU-428045

OpenScienceResources: Towards the development of a Shared Digital Repository

for Formal and Informal Science Education

Validation Plan

Deliverable number D 6.1

Dissemination level Public

Delivery date July 2010

Status Final

Author(s)

Claudio Dondi, Michela Moretti, Chiara Picco / Menon Network

Marlen Goldschmidt, Franz Bogner / UBT

eContentplus

This project is funded under the eContentplus programme (OJ L 79, 24.3.2005, p. 1), a multiannual Community programme to make digital content in Europe more accessible, usable and exploitable.


Summary

This report represents the first deliverable in the framework of work package 6, Trials and Validation; it aims to provide a detailed description of all aspects of the validation of the OSR project, its approach and its methodology in a wide variety of usage contexts. The validation will be very comprehensive, including all major user groups interacting with the system in all foreseeable interactions and contextual settings. These trials are not only meant for validation purposes (focusing on technological issues and user experiences), but also for involving users (museum staff and museum visitors) so that they can provide direction to the project.

This document describes in detail all validation activities that will be undertaken throughout the lifetime of the project. The validation plan includes details on the theoretical approach, on the examples of user scenarios prepared for the OSR portal, and on the validation methods. Further, the report presents an overview of the tools that will be used for the validation, as well as the exact duration and purposes of the project's validation phases.

Finally, the last chapter of this document describes a detailed operational plan. The implementation of the validation process will aim to ensure user acceptance of the proposed methodologies and tools, optimal adaptation to the local environments and a realistic evaluation of the effects of use, since, as already mentioned, the OSR consortium aims to deploy a heavily user-centred approach.


Table of Contents

1 Validation Context
   1.1 VALNET framework and methodology in OSR
   1.2 Multi-aspect dimensional analysis in OSR
      1.2.1 Pedagogical
      1.2.2 Organisational/institutional
      1.2.3 Technological
      1.2.4 Economic
      1.2.5 Cultural
2 Validation Objects and Criteria
   2.1 OSR objects for validation
      2.1.1 OSR portal as a whole
      2.1.2 Pedagogical model
      2.1.3 Organisational model
      2.1.4 Learning Objects
      2.1.5 Educational pathway
      2.1.6 Content-sensitive search and retrieval tools
      2.1.7 Educational metadata authoring system
      2.1.8 Social tagging system
      2.1.9 Community of users
      2.1.10 Learning experience
      2.1.11 Authoring/teaching experience
   2.2 Quality criteria for validation: key criteria and key measures
   2.3 External – what the user perceives
      2.3.1 Searchability
      2.3.2 Relevancy
      2.3.3 Quality
   2.4 External – what both the user and the museums perceive
      2.4.1 Usability
      2.4.2 Reliability
      2.4.3 Portability
      2.4.4 Innovation
      2.4.5 Belonging, engagement and ownership
      2.4.6 Cost
   2.5 External – what the museums perceive
      2.5.1 Sustainability
      2.5.2 Transferability
      2.5.3 Productivity
      2.5.4 Added value
      2.5.5 Internal variables
      2.5.6 Capacity
      2.5.7 Cycle time
      2.5.8 Conformance to standards
      2.5.9 Security
3 Methods and Tools
   3.1 Introduction
   3.2 Validation: perspectives to take into consideration
      3.2.1 Typology of access to the museum resources
      3.2.2 Frequency of the visit
      3.2.3 User typology
      3.2.4 Type of learning
      3.2.5 Process initiators
      3.2.6 Driver of the process
      3.2.7 Reason for accessing resources
   3.3 Different users & different perspectives/reasons/motivations (user scenarios)
   3.4 Validation typologies
   3.5 Validation of macro-protocols
   3.6 Training workshops – Validation Phase I
   3.7 In-service seminars (summer school)
   3.8 Training workshops – Validation Phase II
   3.9 Web-based evaluation
   3.10 Qualitative validation
4 Validation Sources
   4.1 OSR partners (OSR pedagogical/technical experts and museum educators)
   4.2 Science teachers
   4.3 Students (formal learning scenarios)
   4.4 Schools, training centres, universities, third-age universities
   4.5 User groups & communities: individual learners, families, science groups and socio-cultural associations
   4.6 Affiliated science centres and museums, publishing houses dealing with science resources
   4.7 Ministries of education, research institutions (private & corporate)
   4.8 International standards bodies (for example, ISO, SCORM, IMS, HR-XML Consortium, IEEE-LOM, relevant to metadata and eLearning model development)
5 Validation Timeframe
   5.1 In-service seminars (summer schools)
   5.2 Training workshops – Validation Phase I
   5.3 Training workshops – Validation Phase II
   5.4 Validation timetable
6 Validation Outcomes
7 Literature


Introduction

This document aims to provide a comprehensive validation plan describing all validation activities that will be undertaken throughout the lifetime of the project. The plan includes details on the theoretical approach and methodology, the types of trials, the types of participants, and the exact duration and aims of each validation activity. It describes the OSR validation activity in a very comprehensive way, including all major user groups interacting with the system in all foreseeable interaction and contextual settings. The validation trials are not only meant for validation purposes (focusing on technological issues and user experiences), but also for involving users (museum staff and museum visitors), so that an OSR community of users is created and can provide direction to the project. Indeed, the OSR validation activities are based on a user-centred approach. Moreover, this validation plan offers a platform for the realisation of the examples of user scenarios that are explained in detail in chapter 3 of this report. The validation activity is designed for monitoring and evaluating the progress of the project, collecting data and analysing the results, involving project target groups and stakeholders in the implementation of the project, and disseminating the project itself.

The plan reflects the general iterative mode of continuous interaction and exchange between the different OSR work packages, realised in consecutive cycles, namely definition of Educational Designs (WP2), user requirements (WP3), technology integration and customisation (WP4) and validation activities (WP6).

The validation strategy is based on different steps that aim to answer the following questions:

- "Why" it is important to conduct evaluation activities (validation aims)
- "What" has to be validated and on the basis of what criteria (validation objects and criteria)
- "How" the evaluation activities will be conducted (validation methods and tools)
- "Who" will provide the required information and data (validation sources)
- "When" the evaluation activities will be conducted (validation timeframe)
- "How" the outcomes of validation will be used


The first chapter of the document defines the approach, the methodology and the aims of all the validation activities. In this chapter, a detailed explanation of the VALNET methodology is provided. Briefly, the VALNET methodology allows the OSR Consortium to adopt a multidisciplinary approach in order to analyse the results and the innovation value of the proposed processes and educational scenarios. The main innovation aspects will be identified taking as a focus for the analysis the following five dimensions: a) Pedagogical, b) Organisational/institutional, c) Technological, d) Economic and e) Cultural.

The second chapter provides a detailed definition of the validation objects. This is one of the main parts of the document: before starting any evaluation procedure it is essential to define clearly what has to be validated. The validation starts by considering the OSR portal as a whole, and then proceeds to each distinct, meaningful and innovative part of the OSR portal, as well as the purposes of the OSR project. It is important to underline that these objects are based on the descriptions and definitions stated in two previous OSR deliverables, namely D 2.1 Educational Design and D 4.1 System Specifications and Technical Design. The eleven objects for validation are the following:

1. OSR portal as a whole
2. Pedagogical model
3. Organisational model
4. Learning Objects
5. Educational Pathways
6. Content sensitive search and retrieval tools
7. Educational metadata authoring system
8. Social Tagging system
9. Community of users
10. Learning experience
11. Authoring/teaching experience

Moreover, after the detailed explanation of each object, criteria have been defined and matched with the above-mentioned objects. Indeed, in order to achieve performance validation, an important step has been to identify key criteria at an external level (relating to both the users of OSR and the museums) as well as at an internal level, where the internal performance of the system itself is validated.

The third chapter describes the methods through which the validation activities will be implemented, and it also provides an overview of the tools which will be used for monitoring and collecting data (the detailed tools will be designed in the next deliverable, D 6.2). This chapter presents some user scenarios, which represent a "model categorisation framework" of the different perspectives/reasons that OSR users/audience may have for accessing the OSR portal. The selected set of scenarios identifies and defines some of the more common ones that would be anticipated. Moreover, this chapter presents the list of tools which will be used for each phase of the validation activity and a first description of results acquired by using Google Analytics for the evaluation of some web aspects of the OSR portal.

The fourth chapter defines in detail the actors that the validation activities will involve for each phase, their characteristics, their level of involvement in the OSR project and their roles in the validation activity itself.

Finally, the fifth chapter presents the operational planning of the validation activity, defining the periods and timing during the OSR project lifetime, and the sixth and final chapter defines the outcomes which are expected as a result of the validation process.


1 Validation Context

The validation is a very important part of the overall scheme of the project; the validation plan is based on the following two main principles:

- Validation must be performed on the basis of a common methodological framework and according to defined procedures;
- Validation and evaluation methodologies should be focused on providing new solutions and facilitating the dissemination of best practice and the exploitation of results.

Due to the complexity and the ambitious nature of OSR goals, the validation plan must take into account different variables, such as:

- usage contexts (for instance, different science centres and museums);
- learning objects (such as all the learning materials provided by OSR content providers);
- target/user groups (such as teachers, students, general visitors, etc.).

In the OSR project the validation activity plays a very important role. Indeed, the OSR project investigates the benefits of enriching digitised scientific objects, which are currently dispersed in European science museums and science centres, with well-defined semantic metadata along with social tags, so that they become more widely and coherently available, and better searchable and usable in a variety of learning occasions. The main outcome of the project will be the OSR Portal, a set of customisable learning-oriented discovery services, reliably offered through the websites of science centres and museums, school portals, visualisation environments and other online education publishing services.

The validation activity in OSR aims to contribute to the following main process outcomes:

- Provide the means and methods for data collection and analysis of the results of the OSR project;
- Accompany project management by providing updated information on project achievements and emerging critical aspects vital for the continuation of OSR;
- Generate "learning" among the parties involved.

A good validation plan should have two main characteristics in order to reach its goals and to be useful and efficient:

- It should be based on a user-centred approach, because it is necessary to start from, and take into account, the needs and points of view of the final users on the products and results that must be evaluated and validated;
- It should be linked and fitted to the contextual setting where the validation activities take place.

The OSR project has a wide European partnership that covers 12 different European countries, plus 3 partners outside Europe (2 of them from the USA and one from Taiwan). For this reason it is essential to use a methodology that also allows the examination of the different settings, cultural and linguistic aspects, and backgrounds of the users. Indeed, the validation involves all major user groups interacting with the system in all foreseeable interaction and contextual settings. It is important to underline that the validation purpose is not only focused on technological issues and user experiences, but also on the ways of involving users (museum staff and museum visitors) so that they provide direction to the project and its technological and usability results. One of the main aims of the OSR validation trials, and of the analysis of the data gathered, is to evaluate the added value that the OSR project wants to provide: the exploration of social tagging and folksonomy as an accessory strategy to the educational content of the science centres and museums. The project tries to perform extended comparisons between the access points in existing collections documentation and the terms that are supplied when visitors describe an exhibit along with the relevant physical phenomena.

The OSR validation framework is based on a formative and summative approach supported by a multi-dimensional analysis:

1) Formative approach means accompanying the development of the project and suggesting changes whenever a problem that can be improved is identified through evaluation. It also means generating learning among all the actors involved, to better achieve the task at hand. The formative approach will focus on the ongoing development and implementation of services and tools; this allows the gathering of systematic, structured feedback from partners/experts and potential/actual users as part of an iterative development cycle during the building of the prototype.

2) Summative approach means checking whether the expected technical and methodological objectives have been achieved and an impact has been produced. In this case, the focus is on the objectives dealing with the technical functionalities, with the impact being on the prototype and, by extension, the ongoing development of the global system.

3) Multi-aspect dimensional analysis is necessary because the validation does not concentrate only on the technical functionalities but also on other dimensions that cannot be ignored if the consortium wants to build a successful longer-term portal based on the prototype presented. The five inter-related dimensions in the analysis are: Pedagogical; Technological; Organisational/institutional; Economic; Cultural/linguistic.

1.1 VALNET framework and methodology in OSR

The OSR consortium will adopt the VALNET evaluation framework, which offers the tools to identify the different multicultural dimensions of the project results, products and typologies of target groups and users. This methodology will help the consortium to identify the innovative aspects and the differences and commonalities through the localisation process. Indeed, the main innovative elements in OSR are also related to the presence of linguistic and cultural diversities, thanks to the activation of transnational user groups and to the adoption of transnational linguistic approaches for fostering the transferability of services, products, approaches and good practices.

The VALNET methodology allows the OSR Consortium to adopt a multidisciplinary approach in order to analyse the initial results and the innovation value of the proposed processes and educational scenarios. Moreover, thanks to the VALNET matrix, for each evaluation object (learning objects, educational pathways, metadata, social tagging, community of users, etc.) the main innovation aspects will be identified taking as a focus of analysis the following five dimensions: a) Pedagogical, b) Organisational/institutional, c) Technological, d) Economic and e) Cultural.
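The object-by-dimension pairing behind the VALNET matrix can be sketched as a simple cross-tabulation. The encoding below is purely illustrative (the object and dimension names come from this plan, but VALNET itself prescribes no particular data structure, and the recorded note is an invented example):

```python
# Illustrative sketch of a VALNET-style matrix: each evaluation object is
# assessed against the same five dimensions, yielding one cell per pair.

DIMENSIONS = ["Pedagogical", "Organisational/institutional",
              "Technological", "Economic", "Cultural"]

OBJECTS = ["Learning objects", "Educational pathways", "Metadata",
           "Social tagging", "Community of users"]

def empty_valnet_matrix(objects, dimensions):
    """Return an object -> dimension -> list-of-notes mapping, initially empty."""
    return {obj: {dim: [] for dim in dimensions} for obj in objects}

matrix = empty_valnet_matrix(OBJECTS, DIMENSIONS)

# An evaluator records an innovation aspect in the relevant cell, e.g.:
matrix["Social tagging"]["Technological"].append(
    "Folksonomy complements curated metadata")

# Every object is assessed on every dimension.
assert all(len(row) == len(DIMENSIONS) for row in matrix.values())
```

The point of the matrix form is that no object/dimension pair can be silently skipped: an empty cell is visible as an open evaluation task.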

1.2 Multi-aspect dimensional analysis in OSR

Multi-aspect dimensional analysis starts from pointing out the innovation aspects of the OSR project according to the five dimensions mentioned above.

1.2.1 Pedagogical

The OSR project addresses pedagogical innovation from many points of view, such as:

a. Promoting and supporting learning (formal and informal learning)
b. Fostering the implementation of innovation in science centres and museums
c. Supporting the development of reusable learning activities based on sharable learning resources


a. Promoting and supporting learning (formal, informal learning)

One of the main objectives of the OSR project is to foster science education and the attractiveness of science studies through the development of the portal, where different and varied digital science contents (learning objects) are available and easily retrievable from different European science museums. One of the innovative aspects of the OSR portal is the opportunity it gives all users to mix and link formal and informal learning settings. Indeed, the OSR portal offers not only the exploration and retrieval of digital science contents (learning objects) but also the exploration of this content along determined routes (educational pathways), so that the exploration itself acquires an educational sense, purpose and meaning. Even if this exploration occurs as a "formal" learning experience in connection with school or in science museums, the possibility for users to explore on their own (through the website or virtual visits) and to create their own personal pathways allows them to cross from an informal learning experience (simply discovering and exploring the learning objects) to a formal learning experience (where there is a conscious intention to organise learning objects in order to achieve educational relevancy and pertinence).

b. Fostering the implementation of innovation in science centres and museums

The contexts of use of the OSR service are organised into the following three categories:

- In school (combined with one of the following two categories)
- In the science museum/centre (physical visit)
- On the web (virtual visit)

As a consequence, while schools already have instruments, structures and methodologies and will integrate the OSR portal into school activities, for science museums this introduction represents a challenge to tackle by introducing new ways of organising visits and attracting visitors. The introduction of the OSR portal, as well as of PDA mobile devices, in science museums involves changes in the organisation of work and activities. For instance, new ways of organising school visits in terms of stronger collaboration with teachers (planning a shared and mutual methodology in order to support the continuity of the educational learning experience) may be implemented. Also, museum educators may need to acquire new skills (updating their knowledge of new methodologies for teaching science), and new structures and instruments to apply in the museums may be acquired.

c. Support the development of reusable learning activities based on sharable learning resources

The use of the OSR portal for disseminating and teaching science, the way in which the uploaded learning objects are organised and made available, and the creation of a community of users (and thus the opportunity to share knowledge and experience among visitors, teachers and students): all these aspects bring an important pedagogical impact in terms of the attractiveness of science education and learning, and in terms of different ways of teaching science. The innovative aspect of reusing learning pathways or other activities (set up by someone on the basis of resources that are dispersed across different museums but united in the portal) is that they allow mutual learning among the different actors involved in educational activities (such as teachers, museum educators, students, etc.) and the setting up of educational pathways that become more and more appropriate and relevant thanks to the inputs of the different users.

1.2.2 Organisational/institutional

The OSR project addresses organisational/institutional innovation from the following points of view:

a. Developing new instruments to apply in museums (through web connection to the OSR portal and PDAs)
b. Fostering new services available to visitors
c. Promoting new skills and new jobs among the museum staff

a. Developing new instruments to apply in museums (through web connection to the OSR portal and PDA)

The development of new instruments is one of the most important drivers of innovation in science museums. Indeed, in order to enable visitors to use these instruments, some changes in the organisation of the museum in terms of providing and applying new structures may be implemented, for instance the set-up of internet points in front of science experiments or the provision of mobile devices to visitors.

b. Fostering new services available for visitors

Strongly linked to the point above: if new instruments are introduced in the museums, new services must also be provided, such as wireless connections or IT assistance points.

c. Promoting new skills and new jobs among the museum's staff

Finally, if new services are to be provided, new competencies will need to be developed among the museum's staff in order to handle any problems or requests for assistance arising from visitors' use of these new instruments. It may also become necessary to train the museum's educators to familiarise them with the new instruments.

1.2.3 Technological

The OSR project addresses technological innovation from the following points of view:

a. Applying ICT instruments for science learning and education


b. Fostering a folksonomy of digital science contents through social tagging

a. Applying ICT instruments for science learning and education

Even though the use of ICT instruments to enhance learning experiences is already widely accepted, it still remains a great challenge to apply them in education and training, in both formal and informal learning settings. The use of the OSR portal and the PDA, together with the availability of organised digital science learning contents, not only brings ICT instruments into science education but also helps teachers, educators, students and lifelong learners to develop digital skills.

b. Fostering a folksonomy of digital science contents through social tagging

The possibility for users to add tags to the contents they explore is one of the main innovative aspects of the OSR project. Indeed, the added value that the OSR project wants to provide is the exploration of social tagging and folksonomy as an additional access strategy to the educational content of science centres and museums. The consortium aims to perform extended comparisons between the access points in existing collections documentation and the terms that visitors supply when describing an exhibit and the relevant physical phenomena. Use of the OSR portal can thus yield a folksonomy of social tags on digital science contents, making these resources more searchable, reusable and recoverable for all.
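The comparison described above, between the access points in existing collections documentation and the terms supplied by visitors, could be sketched roughly as follows. This is an illustrative sketch only, not project code: all terms, data and function names are invented for the example.

```python
# Illustrative sketch (not OSR project code): comparing the vocabulary of
# visitor-supplied tags with curator-assigned metadata terms for one exhibit.
# All example terms below are invented.

def normalise(terms):
    """Lower-case and strip terms so 'Pendulum' and 'pendulum ' match."""
    return {t.strip().lower() for t in terms}

def vocabulary_overlap(curator_terms, visitor_tags):
    """Return shared terms, tags new to the folksonomy, and Jaccard similarity."""
    curated = normalise(curator_terms)
    tagged = normalise(visitor_tags)
    shared = curated & tagged
    novel = tagged - curated          # access points only the folksonomy provides
    union = curated | tagged
    jaccard = len(shared) / len(union) if union else 0.0
    return shared, novel, jaccard

curator = ["Pendulum", "oscillation", "period", "gravity"]
visitors = ["pendulum", "swing", "Galileo", "gravity", "clock"]
shared, novel, jaccard = vocabulary_overlap(curator, visitors)
print(shared)               # terms both groups use
print(novel)                # tags that add new access points
print(round(jaccard, 2))    # overall vocabulary overlap
```

A low overlap score would support the hypothesis, discussed in section 2.1.8, that visitor tags open access points that the formal metadata does not cover.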

1.2.4 Economic

The OSR project addresses economic innovation by:

a. Allowing users to access and share different science learning resources free of charge

The OSR project aims to collect the digital science contents currently scattered across the websites and premises of individual science museums. Making these contents available as learning objects on a single portal allows users to visit and explore them from one place and without incurring any costs.

1.2.5 Cultural

The OSR project addresses cultural innovation from the following points of view:

a. Providing sharable science learning in different languages

One of the common difficulties in finding and recovering digital contents is caused by linguistic barriers. In the OSR portal, by contrast, the uploaded learning objects and contents are available in the eight languages of the OSR Consortium partners. This not only gives the content a European added value, but also makes it more understandable for all.


b. Fostering the creation of virtual meeting places and supporting the sharing of knowledge and cultural backgrounds among people with different roles

The OSR portal enables different users to share their knowledge, the educational pathways they have created and their tags with one another. This makes it possible to create a virtual community and a form of mutual/peer learning that turn the simple exploration of contents into an informal learning experience. In addition, this virtual meeting place can offer a platform to share, compare and open users' minds to the different ways of learning, teaching and studying science within and outside Europe.

2 Validation Objects and Criteria

OSR is a complex project: it aims to develop ambitious services, such as the OSR portal, which offers not a simple collection of digital science contents but classified and categorised resources able to transform a simple exploration into a learning experience. This chapter describes the objects of validation, starting with the OSR portal as a whole and then proceeding to each distinct, innovative part of the portal as well as to the purposes of the OSR project itself. It goes without saying that the objects of validation include both the technological side of the OSR portal and the organisation and relevance of the resources uploaded to it. After the definition of the objects, the quality criteria for validation are described in detail.

2.1 OSR objects for validation

It is important to underline that these objects are based on the descriptions and the definitions stated in the two previous OSR deliverables, D 2.1 Educational Design and D 4.1 System Specifications and Technical Design. The eleven objects for validation are the following:

1 OSR portal as a whole
2 Pedagogical model
3 Organisational model
4 Learning Objects
5 Educational Pathways
6 Content sensitive search and retrieval tools
7 Educational metadata authoring system
8 Social Tagging system
9 Community of users
10 Learning experience
11 Authoring/teaching experience

2.1.1 OSR portal as a whole

As defined in the previous deliverable D 2.1 Educational Design, the OSR project sets out to explore the opportunities offered and challenges posed by the enrichment of digital science learning resources with standardised and social educational metadata. This is part of an agenda focused on bridging formal and informal science learning contexts in order to make science learning opportunities more accessible and appealing to learners across the lifelong learning spectrum. A particular interest of the project lies in the potential of such an approach for developing synergies between school science education and the informal science learning experiences offered by science museums and centres. The main outcome of the project is a validated portal able to fulfil this purpose, which is why the first object of the validation activities shall be the OSR portal as a whole. Following a holistic approach, validation must start from users' first impression of the OSR portal in its totality: its interface, user-friendliness, ease of understanding, exploration of the system, and so on. Indeed, the first impression is always the first step in developing the motivation and curiosity to go further in the discovery experience. The OSR portal shall foster this motivation and curiosity in exploring the system, its contents and its resources.

2.1.2 Pedagogical model

One of the added values of the OSR project is to create a bridge between formal and informal learning, by setting up educational tools and methods able to foster collaboration among science educators and teachers and to turn a simple exploration of the OSR portal, or a visit to a museum site, into a learning experience. In order to reach this objective, a series of tools has been developed, such as educational pathways, educational metadata and social tagging. All these tools are based on the theoretical, pedagogical and didactic approaches of inquiry-based learning and resource-based learning (see the previous deliverable D 2.1 Educational Design). Before validating the individual educational tools and methods, it is necessary to gain a wide perspective on the pedagogical model underlying the whole process. This means evaluating whether the chosen pedagogical models (inquiry-based learning and resource-based learning) achieve the learning objectives, are in line with teachers' and educators' didactic methods, and support the users' learning and educational process. In other words, it is important to validate the model in order to be sure that this is the right pedagogical pathway, one that matches users' needs, the portal's purposes and scope, the learning objects and the educational pathways.

2.1.3 Organisational model

While the previous object focuses on the pedagogical aspect of organising digital contents and learning objects, strongly related to their educational meaning and scope, the organisational model addresses both the structured organisation of the digital contents in the portal (such as uploading and combining learning objects from different museums) and the structural changes required in the museums themselves (such as creating long-term collaborative relationships among museums, both within and outside the project consortium). The organisational model could be seen as an issue of minor interest because it seems rather remote from the overall objective of the project; instead, it is one of the pillars on which the project rests. The use of the OSR portal involves so many organisational changes that these must be taken into account in the validation activity: new instruments and solutions, new collaborative attitudes both among museums and between museums and schools, new standardised practices of exploration, and so on. The development of new services always implies structural changes that are extremely relevant to the successful achievement of the goals.

2.1.4 Learning Objects

As described in the previous deliverable D 2.1 Educational Design, Learning Objects (LOs) represent a new way of thinking about learning content, developed to support technology-enhanced learning processes. Learning objects are defined by the IEEE Learning Technology Standardization Committee (LTSC) as "any entity, digital or non-digital, that can be reused for learning, teaching or training" (IEEE LOM, 2002). In general, any digital resource that can be reused to support learning can be considered a learning object (Wiley, 2002). Learning objects include, but are not limited to, simulations, animations, tutorials, diagrams, audio and video clips, quizzes and assessments. The rationale of the OSR project is to make available in one place the learning objects published on the websites of the different science centres and museums. It goes without saying that one of the main challenges is classifying and categorising them in a user-friendly way while avoiding duplication and linguistic and size problems. It is therefore important to pay attention to several aspects of the LOs in the validation phase: for example, it is necessary to evaluate the appropriateness of the LOs, the degree to which they match the expectations and preferences of the users, and how much they support the growth of knowledge and the development of competences during the learning experience.
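As a purely illustrative sketch of what a learning-object description along IEEE LOM lines might look like, the fragment below uses a small subset of LOM element names with invented values. The actual OSR metadata schema is specified in D 4.1; none of the field values here come from it.

```python
# Illustrative sketch: a minimal learning-object record using a subset of
# IEEE LOM (2002) element names. All values are invented examples; the
# actual OSR schema is defined in deliverable D 4.1.

learning_object = {
    "general": {
        "title": "Foucault pendulum simulation",
        "language": "en",
        "keyword": ["pendulum", "Earth rotation", "physics"],
    },
    "technical": {
        "format": "text/html",                           # MIME type of the resource
        "location": "https://example.org/lo/pendulum",   # placeholder URL
    },
    "educational": {
        "learningResourceType": "simulation",
        "context": "school",          # formal vs informal learning setting
        "typicalAgeRange": "12-16",
    },
    "rights": {
        "cost": "no",                 # OSR resources are free of charge
    },
}

def keywords(lo):
    """Keywords are one natural entry point for search and social tagging."""
    return lo["general"]["keyword"]

print(keywords(learning_object))
```

Even such a minimal record illustrates why classification matters for validation: the keywords and educational fields are exactly what the search tools and pathway authoring described below rely on.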

2.1.5 Educational pathway

The main purpose of any educational process is to initiate, develop and achieve a change in the learners as the final result of the learning process. This change can take the form of acquired knowledge, attitudes, behaviour or competences, or all of these in the same learning setting. This basic theory guides the whole OSR project: the purpose of the project is not only to make digital science contents more available, but also to organise them into learning pathways through which the educational process can start and then bring about a change in the final users. This is the theoretical approach on which the OSR portal is based. It is thus not only a matter of uploading science content to the portal, but also of organising this content into educational pathways. As mentioned before, the validation activities will focus both on the technological part of the OSR portal and on how the content is organised inside it. As defined in D 2.1, the concept of Educational Pathway (EP) in OSR reflects the priority given by the project to responding to the needs of the diverse communities of potential users of the OSR services. An Educational Pathway in the OSR project thus describes the organisation and coordination of various individual science learning resources into a coherent plan, so that they become a meaningful science learning activity for a specific user group (e.g. teachers, students, other museum visitors) in a specific context of use.
Further, Educational Pathways directly serve the priority assigned by the project to integrating resources scattered across various science museums/centres into the same learning experience, rather than merely selecting resources from a single museum or science centre. Following this definition of what an EP is and what its scope is, the validation activities shall focus on how this way of organising content fits the needs and expectations of the users and supports their learning experience and educational process. Moreover, it is important to evaluate whether the way EPs are classified and categorised makes it easy for users to create new ones and to recover those already uploaded. In the validation phase it is also important to take into consideration the learning process that lies behind the definition of an educational pathway, in order to monitor and evaluate its pertinence. At the same time it is important to evaluate the structure of the educational pathways:

1. They should motivate: curiosity, ambition, desires, habits.

2. They should be a real experience involving abstract concepts: linked with previous experiences but also new, deepening and improving knowledge.

3. They should lead to a synthesis: integration of knowledge, understanding, reflection and further questions.

4. They should lead to a valorisation: of knowledge, of innovation, of use.

2.1.6 Content sensitive search and retrieval tools

As defined in the previous deliverable D 4.1 System Specifications and Technical Design, the search interface is probably the most crucial element for the success of the portal. The objective is for users to find easily what they are looking for, thanks to a user-friendly interface. Four different search interfaces have been implemented in order to satisfy every need, each with distinct characteristics: a text search interface, semantic search, a tag cloud, and linked vocabulary terms. In addition, in order to avoid linguistic problems, a standardised translation key will be created for all important science terms. According to these specifications, it is important to evaluate the different aspects involved in searching for contents (learning objects and educational pathways) and, of course, the ways of retrieving them. First of all, it is necessary to determine to what extent the query language of the search mechanism is user-friendly and how well these search mechanisms match the OSR purposes and users' needs. Secondly, as multilingualism is an important challenge to tackle, it is equally essential to evaluate the usability of the translation system and how well it matches users' expectations.
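As a rough, hypothetical illustration of how two of the four search modes (free-text search over titles and matching against user-supplied tags) might be combined into a single ranking, consider the toy sketch below. The data, field names and scoring weights are all invented and do not reflect the actual design in D 4.1.

```python
# Toy sketch (invented data and weights): combining free-text title search
# with tag matching, two of the four OSR search modes, into one ranking.

RESOURCES = [
    {"title": "Foucault pendulum simulation", "tags": ["pendulum", "rotation"]},
    {"title": "Solar system tour", "tags": ["planets", "astronomy"]},
    {"title": "Pendulum clock history", "tags": ["pendulum", "time"]},
]

def score(resource, query):
    """One point per query word found in the title, two per exact tag match."""
    words = query.lower().split()
    title = resource["title"].lower()
    s = sum(1 for w in words if w in title)
    s += sum(2 for w in words if w in resource["tags"])
    return s

def search(query):
    """Return titles of matching resources, best score first."""
    ranked = sorted(RESOURCES, key=lambda r: score(r, query), reverse=True)
    return [r["title"] for r in ranked if score(r, query) > 0]

print(search("pendulum"))
```

The point of the sketch is the validation question it makes concrete: whether weighting tag matches above plain text matches actually returns what users consider relevant is exactly the kind of issue the trials should measure.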

2.1.7 Educational metadata authoring system

Educational metadata are defined as meta-information attributed to educational digital content; they represent the educational characteristics of a learning object, such as the target group it addresses or the thematic area it concerns. The OSR portal is set up in a way that allows all OSR portal users to add materials and contents and to classify them with specific keywords and educational metadata. As this possibility adds great value in making the portal efficient, effective and sustainable, during the validation phase it is important to establish how appropriate the defined keywords and metadata are for describing, classifying and selecting the learning objects and educational pathways, how user-friendly this system is, and how well it respects the IPR issues that could arise.

2.1.8 Social Tagging system


The added value that the OSR project wants to provide is the exploration of social tagging and folksonomy as an access strategy to the educational content of the science centres and museums. The project tries to perform extended comparisons between the access points in existing collections documentation and the terms that visitors supply when describing an exhibit and the relevant physical phenomena. Given this purpose, the evaluation and validation of the mechanism that allows users to add tags is a key point that the validation activity shall address. The main points of attention shall be how, and to what degree, social tagging helps users find what they are looking for and, secondly, the effectiveness and efficiency of social tagging with respect to the educational metadata system.

Social tagging allows the development of a user-based taxonomy reflecting the broad range of needs and perspectives of users. In addition to the formal metadata description of the resources, the OSR portal will therefore acquire a folksonomy contributed by the general public. Other projects have discovered a semantic gap between the formal descriptions (metadata) created by specialists and the common language used by visitors: users approach the content from points of view other than the specialists'. Social tagging and the resulting multilingual folksonomy can bridge this gap and can support metadata documentation, content distribution, content management, search and retrieval (Trant, 2009). For instance, the tags can supplement and complement the metadata collection of the OSR portal and offer many additional access points for search. Social tags are user-generated, user-initiated content and represent the points of engagement between general users and the educational content. Furthermore, social tagging offers a quick, low-investment way for visitors to make contact with the content. The points to be evaluated concern the utility of the social tags in enhancing access to the OSR resources and the suitability of the tool for engaging visitors to look at and think about the educational content. The research questions concerning the impact and utility of social tagging are the following:

Do user tags differ from terms used for the metadata description of the educational content? If so, how?

If user tags differ from metadata terms, it could be assumed that social tagging provides additional access points and thus improves, for instance, the searchability of the OSR portal.

Do OSR visitors find user tags useful for searching the repository?

If the usefulness of user tags for searching is proven, i.e. the ability of naive users to provide helpful descriptions, social tagging can be regarded as an additional effective characterisation method for online resources.

Is social tagging conducive to engaging visitors of the OSR portal to look at and think about the educational content?

If social tagging is an easy and quick method of giving useful and meaningful descriptions to the educational content, it will help to engage visitors and to establish a fast-growing OSR user community.

2.1.9 Community of users

The OSR portal supports the creation of virtual communities in order to allow user interaction and the exchange of ideas. This "community building" will have strong synergies with the social tagging model, in particular thanks to the possibility of adding personal comments and tags to the learning objects and educational pathways explored. The system supports different functionalities to let users interact and participate in the portal. The creation of a virtual community is an added value of the learning experience that occurs during and after the exploration of the OSR portal. For this reason it is important to evaluate and validate the efficiency and effectiveness of these community-building functionalities and, of course, the quality and efficiency of the community itself. It is necessary, for example, to evaluate whether the OSR portal is attractive, whether it fosters and supports a sense of belonging perceived by the users, and how users' behaviours can influence participation in the community, the sharing of materials and knowledge, etc.

2.1.10 Learning experience

As described above, the OSR project has a strong pedagogical aspect and purpose: the OSR portal and its functionalities aim to support a smooth shift from informal learning activities to a formal learning experience in the learning and teaching of science. In order to evaluate the learning experience, it is important to understand the learning experience as an educational process. The main purpose of any educational process is to initiate, develop and achieve a change in the learners. This change can take the form of acquired knowledge, attitudes, behaviour or competences, or all of these in the same learning setting. This basic theory guides the whole OSR project: the purpose of the project is not only to make digital science contents more available but, at the same time, to organise them into learning pathways through which the educational process can start and bring about a change in the final users. Following the flow of the educational process, it can be assumed that the learning objects are well organised if an exploration of the portal, based on an open as well as a structured educational pathway, brings about a change in the learner and final user. If this happens, we can assume that a learning experience has occurred.

All this can be summarised in a simple diagram (rendered here in text form): before the exploration, learners bring their existing knowledge, attitudes, behaviour and competences; during the exploration, the learning process generates new knowledge, attitudes, behaviour and competences; after the exploration, a sustainable and relevant change has occurred in the learners. The whole sequence runs from context through process to outcomes.

Of course, the outcomes of any educational process are strictly linked to the setting and context in which the process takes place.

Context in OSR

The context of a learning process is very important in determining the outcomes. Each learning experience occurs in a specific context and, at the same time, the context affects the learning experience. The learning experience can be shaped and influenced by different aspects of the context, for example how the learning setting encourages the acquisition of knowledge, the sense of collaboration, self-learning and autonomy, etc. In the OSR project the context is composed of the typology of users and the learning places (outside or inside the museum; virtual or real visit).

Process in OSR


The process is the way in which a learning experience takes place. This means the formal and informal learning processes supported by the OSR portal.

Outcomes in OSR

As mentioned before, the outcomes are all the changes that have occurred in the learner after the educational process. These changes can be expected or unexpected, desired or undesired.

The learning experience is one of the most important objects of evaluation and validation. During the validation phase it is necessary to pay attention to how efficient, pertinent and effective the learning experience is. Also to be considered are how the learning objects are presented so as to foster appeal and curiosity, and how all of this promotes and develops the motivation to discover new things and improve knowledge. Finally, it must be assessed whether one of the great challenges of the OSR project, bridging the gap between formal and informal learning, is tackled and overcome.

2.1.11 Authoring/teaching experience

In order to ensure that users enjoy a learning experience, it is necessary to plan the didactic methods, tools and objectives; that is, a teaching model that develops and promotes the educational process must be used. Following this basic principle, the validation activities will observe how this teaching experience occurs and how the OSR portal, its learning objects and its pathways match the needs of educators and teachers. It is important to bear in mind that in the OSR project users are not passive consumers but active actors who take advantage of the portal's possibilities and, at the same time, can create and update contents. Teachers and museum educators are the main actors involved in uploading resources (both learning objects and educational pathways). This is why it is important to validate how well the tools available for uploading and creating resources (in particular educational pathways) fit their needs, and how simple, smooth and teacher-friendly these processes are. The validation activity will therefore evaluate how the use of these new tools changes the teaching process and the planning of each teaching session, and how teachers can adapt these tools to their own contexts and teaching strategies. Moreover, the degree to which users' ownership of the content is recognised is another aspect that the validation activities take into account.

2.2 Quality criteria for validation: Key criteria and key measures

Not only should the principles and the approach for the validation be clearly established so as to increase the efficiency and accuracy of data collection, but there should also be task-oriented protocols, "descriptive scenarios", to give meaning to this collection. These scenarios will allow the diverse actors and stakeholders involved to evaluate the different functions and services of the OSR system based on system-activated realities.

These scenarios show the interrelated modular functioning and services of OSR. The validation of these elements demonstrates how the system works (its performance) in as realistic a manner as possible. By using realistic measures as the yardstick of validation, the programme allows the key functions to be clearly seen and measured so that they can, if and when needed, be improved for the global benefit of OSR.

To achieve performance validation, we have identified key criteria at an external level, which relate to both users of OSR and the Museums, as well as at an internal level, where the internal performance of the system itself is validated.

The following paragraph describes the primary criteria together with some of the key measures that will guide data collection.

External – what the user perceives: Searchability; Relevancy; Quality (effectiveness and efficacy of the learning experience)

External – what the museum perceives: Sustainability; Transferability; Productivity; Added value

External – what both the user and the museum perceive: Usability; Reliability; Portability; Security; Innovation; Belonging and ownership; Cost

Internal (OSR system): Capacity; Cycle time; Conformance to standards; Security; Cost

2.3 External - What the user perceives

By external, we mean the important criteria that the end user (teachers, students, lifelong learners, etc.) will consider when testing whether the OSR system meets their requirements.

2.3.1 Searchability

In OSR, this refers to the system's search features, which enable users to locate items using an efficient "query language" to access the product or service in a simplified way, rather than requiring the user to browse through a number of different menus to get the needed information. The system should work irrespective of the size of the learning resource and be capable of working across a variety of systems.

Key Measures include:

The system’s cross-referencing capacity (flow of information across different databases/technical modules internally and across reference systems and/or models externally)

The ease with which the user knows and handles the “query language” of the search mechanisms: text search, semantic search, tag cloud, social tagging, rating, etc.

The degree to which the resources (learning objects and learning pathways) are classified and/or categorized and made available to the user in a simple and straightforward manner.
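One of the search mechanisms listed above, the tag cloud, lends itself to a small worked example. The sketch below computes display weights for a tag cloud using logarithmic scaling, a common approach for keeping very frequent tags from dwarfing the rest; the tag counts, size range and function name are invented and do not come from the OSR specifications.

```python
# Illustrative sketch (invented data): mapping tag usage counts to tag-cloud
# font sizes with logarithmic scaling, so a tag used 40 times is not
# literally 40 times larger than one used once.

import math

def tag_cloud_weights(tag_counts, min_size=10, max_size=32):
    """Map each tag's usage count to a font size between min_size and max_size."""
    lo = math.log(min(tag_counts.values()))
    hi = math.log(max(tag_counts.values()))
    span = hi - lo or 1.0  # avoid division by zero when all counts are equal
    return {
        tag: round(min_size + (math.log(n) - lo) / span * (max_size - min_size))
        for tag, n in tag_counts.items()
    }

counts = {"pendulum": 40, "gravity": 12, "galileo": 3, "clock": 3}
print(tag_cloud_weights(counts))
```

How intuitively such a weighting conveys "what this repository is about" at a glance is itself a searchability question the user trials can probe.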

2.3.2 Relevancy

In OSR this refers to the degree to which the system allows users to find, within the database of learning objects and educational pathways provided, the educational and training materials that best fit their preferences and needs.


Key Measures include:

How well OSR matches queries via text search, semantic search, tag cloud and social tagging,

How accurately the keywords appear in the page title or meta tags of the resources,

How well the matches reflect the user’s needs and preferences,

How well social tagging, tag cloud and rating help users to find what they are looking for

2.3.3 Quality

A number of variables that provide a measure of quality are identifiable, some of which will unavoidably cut across other variables related to performance.

Key measures include:

Diversity and completeness of OSR features (personalization options, variety of search modes, interactivity etc),

The extent that user expectations are fulfilled in terms of what the user asks for and what the OSR system is capable of offering (user satisfaction). This can be seen as the sum of all other measures, i.e. the overall perception of the system,

To what extent the OSR resources / learning pathways adopt pedagogical models which support teachers and educators in their didactics

Effectiveness of the learning experience:

To what extent the OSR resources / learning pathways adopt pedagogical models which support the achievement of the stated learning objectives

Quality of the LOs for supporting learning, increasing knowledge and developing competences

Quality of the EPs for supporting learning, increasing knowledge and developing competences

To what extent resources are offered to users in a way which supports curiosity, enjoyment and motivation,

To what extent OSR supports the learning process, in both formal and informal learning,

Effectiveness of the learning experience: to what extent the learning results achieved by the user were actually the best or most suitable ones available through OSR (or whether other ways of organising and proposing resources/learning paths would have been more effective for learning).


2.4 External – what both the user and the museums perceive

2.4.1 Usability

The official definition of Usability according to ISO is: “The effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments”. According to IEEE 90, it is “the ease with which a user can learn to operate, prepare inputs for, and interpret outputs from a system or component”. Using these two definitions, we can cite various relevant measures of OSR usability.

Key measures include:

User-friendliness (ease of navigation, aesthetically pleasing interfaces, menu organization and personalization of user presence in OSR…)

The degree of appropriateness of the learning resources recommended to the user according to his/her preferences

The agility with which learning objects and educational pathways are created and updated

The usability of the multilingualism system

Range and degree of interactivity with the OSR system interface

Time Lag: the amount of time it takes for the user to get responses from the OSR system

Quantity and suitability of tools and methods for searching and finding resources (from the point of view of the user)

Quantity and suitability of tools for uploading resources (LOs and EPs) to the OSR system (from the point of view of the museums and authors)

The degree of appropriateness of metadata for selecting LOs and EPs

Relevance and utility of the user tracking and data mining system

2.4.2 Reliability

In OSR, the term Reliability indicates the system’s capacity to fulfil functional expectations, including the time interval between software failures.

Key Measures include:

The degree to which a fault-tolerant server or a clustered server exists

The extent to which the server, as designed for the prototype and forecasted for future usage, addresses the time between software failures

The appropriateness of tools and procedures for dealing with potential system malfunction, and the speed of recovery when a malfunction occurs.

2.4.3 Portability

This refers to the ability of OSR applications to be easily transferred (“ported”) to another platform, or to interact with different platforms.

Key measures include:

With respect to code development, the degree to which the application is implemented portably, so that it is suitable for all target systems

The portability of OSR (e.g. OSR’s capacity to be accessed and used from different platforms, from home, work, museums, PDAs and remote centres, via multiple delivery means)

2.4.4 Innovation

This is a variable which relates both to the way the resources (learning objects and educational pathways) are presented and offered to learners, and to the way museums are changing how they organise, store and manage their resources.

Key measures include:

Quantity and suitability of tools and methods for creating educational pathways (both open and structured) for users

Quantity and suitability of tools and methods for classifying, storing, searching and retrieving resources, with particular reference to Tag Cloud; Social tagging, Rating etc.

Quantity and suitability of tools and methods for combining resources from different museums in a way that can accommodate learning needs and preferences.

Suitability of those approaches adopted for managing users’ ownership of the content

Effectiveness and efficiency of the OSR social tagging with respect to the Metadata system

Mission fulfilment: OSR’s capacity to create a bridge between formal and non-formal learning (the pedagogical model)

2.4.5 Belonging, engagement and ownership

This criterion refers to copyright and intellectual property rights issues, as well as to “soft” aspects which are fundamental for assuring effective user participation in the OSR community and the “affiliation” of learners.


Key measures include:

The degree to which users’ ownership of the content is recognised

Social reputation and the rating system

Sense of belonging to the OSR community

The effectiveness of the process of creating, storing, sharing and exploiting knowledge and practices emerging from the virtual communities

The extent to which users feel engaged in exploring the OSR portal and its resources and tools (blog, news, community)

The extent to which the rules (implicit and explicit) and the behaviours of the community members support participation, sharing, learning and collaboration.

2.4.6 Cost

This is an undeniable determinant in customer behaviour and satisfaction. In this regard, it refers to the costs involved in obtaining the global services of OSR, from the point of view of both the users and the museums.

Key measures include:

Equipment and operating costs (Internet costs, electricity, PC)

Indirect costs (such as the time and effort it takes the user to connect to and maintain the education and training service)

Direct and Indirect costs for Museums when they offer, modify and promote resources (learning objects and educational pathways) in the system

2.5 External – what the museums perceive

2.5.1 Sustainability

Key Measures include:

The degree to which long-term, mutually beneficial collaborative relationships amongst museums and other OSR partners are established

The extent to which a large number of users become affiliated to OSR

The extent to which the OSR system is recognised within Education, Training, LLL and museum sectors

The degree to which OSR is able to anticipate future trends and act on them

The degree to which people and resources are rewarded through involvement with, or use of, OSR


2.5.2 Transferability

Key measures include:

The degree to which the pedagogical model for managing resources (learning objects) and creating learning/educational pathways can be adopted as standard practice by the science museums participating in OSR (the organisational model)

The degree to which the pedagogical model for managing resources (learning objects) and creating learning paths can be adopted by science museums which are not involved in OSR, as well as by other providers.

2.5.3 Productivity

In OSR, this refers to the relationship between different outputs and inputs, and is viewed both as adding value (see Added Value and Sustainability) and as optimizing the quality of the process. Productivity is a total concept that addresses the key elements of competition, i.e. innovation, cost, quality and delivery.

Key measures will include:

The return on “investment and efforts” in terms of museum users

The costs and efforts of launching and maintaining the production of the OSR portal and associated services

The degree of integration that exists amongst the various project partners that will support transfer and assimilation of knowledge in the context of the building and maintenance of OSR

The “Soft” competitive benefits that exist for the project partners (the value of improved knowledge management and transfer processes within the museums).

2.5.4 Added Value

Considerations of added value are contemplated in the raison d’être of the system’s design. As such, we include this variable, referring specifically to the projected enhancement/development of the museums’ resources, even before these resources (learning objects) are offered to customers via the OSR portal and its software capabilities, at each stage of production or service.

Key Measures include:

Access to distribution channels: the extent to which OSR dissemination channels can provide a competitive advantage to museums

Economies of scope: the efficiencies gained when the investment on OSR can support many supplementary activities (data mining, for example)

Product complementarity: OSR’s capacity to combine resources belonging to different museums so as to create more extensive learning paths that respond to users having different needs and expectations.

2.5.5 Internal Variables

By Internal, we mean the important variables that project partners and future participants will consider when testing whether the OSR prototype and system meet their technical requirements.

2.5.6 Capacity

Capacity relates to the amount of work (processes and services) that the OSR system can accommodate. This can be measured either as traffic or information load at peak periods, or over a span of time to obtain an average. This is an important issue: for the OSR system to be successful, it must operate without significant delays – both for the users and for the museums themselves.

Key Measures include:

Number of museum operators and users that the system can support

Number of supportable technical operations

Rate of basic services in terms of receiving, storing, retrieving, manipulating and displaying data for a variety of functions

The scalability of the system: its ability to significantly expand – or reduce – its capacity to operate, while avoiding any significant disruptions or excessive costs
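Capacity and scalability measures of this kind could be approximated during the technical trials with a simple concurrent-load harness. The sketch below is illustrative only: the simulated operation is a hypothetical stand-in for a real OSR request (storing, retrieving or displaying a resource), and the figures it produces say nothing about the real system.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_operation():
    """Stand-in for a real OSR request (store/retrieve/display a resource)."""
    time.sleep(0.005)
    return True

def throughput(n_operations, concurrency):
    """Run n_operations at the given concurrency; return completed ops per second."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(lambda _: simulated_operation(), range(n_operations)))
    elapsed = time.perf_counter() - start
    return sum(results) / elapsed

# Comparing two concurrency levels hints at how the system scales.
print(f"1 worker:   {throughput(20, 1):.0f} ops/s")
print(f"10 workers: {throughput(20, 10):.0f} ops/s")
```

Repeating such runs at increasing concurrency levels would give a first indication of peak-load behaviour and of where delays become significant.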

2.5.7 Cycle Time

In OSR, this refers to the time interval between the start of a process and its completion. A wide range of processes is under consideration here: e.g. completing or updating the user profile; searching for and finding learning objects; updating or modifying user data; the system’s verification of a user’s identity; and the system’s ability to incorporate the contribution of new learning objects or educational pathways.

Key Measures include:

Processing time for performing each step

Waiting time between the completion of one step and initiation of another

Dependencies between steps

Bottlenecks and their correction
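Cycle-time measures of this kind could be collected in the trials with a simple timing harness. In the sketch below the step names are purely illustrative (they mirror the processes listed above, not actual OSR functions); the harness records the processing time of each step and the total cycle time.

```python
import time

def measure_cycle(steps):
    """Time each step of a process.

    `steps` is a list of (name, callable) pairs; returns a dict of
    per-step durations in seconds plus the total cycle time.
    """
    timings = {}
    cycle_start = time.perf_counter()
    for name, action in steps:
        start = time.perf_counter()
        action()
        timings[name] = time.perf_counter() - start
    timings["total_cycle"] = time.perf_counter() - cycle_start
    return timings

# Hypothetical steps standing in for real OSR processes.
def update_profile(): time.sleep(0.01)
def search_learning_objects(): time.sleep(0.02)

report = measure_cycle([
    ("update_profile", update_profile),
    ("search_learning_objects", search_learning_objects),
])
for step, seconds in report.items():
    print(f"{step}: {seconds:.3f}s")
```

Comparing per-step durations against the total cycle time makes waiting times between steps, and hence bottlenecks, visible.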

2.5.8 Conformance to Standards

We refer here to compliance and concordance with legal, technical and industry standards in the areas of intellectual copyright and metadata for learning objects (e.g. IEEE/IMS, HR-XML and SCORM), as well as data schemes and identifiers.

Key Measures include:

Level of compliance to de facto or de jure standards regarding intellectual property

2.5.9 Security

This refers to the degree to which the OSR system and its services are not vulnerable to unauthorized usage, sabotage or criminal activity.

Key Measures include:

Sufficient intelligent procedures that ensure authorization, verification, and accuracy of information flows to prevent unauthorized access

Methods for checking the accuracy of inputs, identifying errors and correcting them

Methods for system recovery in the case of major malfunctions

Methods for securely tracking participants, while ensuring their confidentiality

In particular, the following table matches the validation objects with key measures:

VALIDATION OBJECTS AND KEY MEASURES

OSR portal as a whole:

The system’s cross-referencing capacity

Diversity and completeness of OSR features (personalization options, variety of search modes, interactivity etc),

User-friendliness (ease of navigation, aesthetically pleasing interfaces, menu organization and personalization of user presence in OSR…)

Range and degree of interactivity with the OSR system interface

Time Lag: the amount of time it takes for the user to get responses from the OSR system

Relevance and utility of the user tracking and data mining systems

The degree to which a fault-tolerant server or a clustered server exists


The extent to which the server, as designed for the prototype and forecasted for future usage, addresses the time between software failures

The appropriateness of tools and procedures for dealing with potential system malfunction, and the speed of recovery when a malfunction occurs.

With respect to code development, the degree to which the application is implemented portably, so that it is suitable for all target systems

The portability of OSR (e.g. OSR’s capacity to be accessed and used from different platforms, from home, work, museums, PDAs and remote centres, via multiple delivery means)

Equipment and operating costs (Internet costs, electricity, PC)

The extent to which the OSR system is recognised within Education, Training, LLL and museum sectors

The degree to which OSR is able to anticipate future trends and act on them

The extent to which a large number of users become affiliated to OSR

The return on “investment and efforts” in terms of museum users

The costs and efforts of launching and maintaining the production of the OSR portal and associated services

The degree of integration that exists amongst the various project partners that will support transfer and assimilation of knowledge in the context of the building and maintenance of OSR

The “Soft” competitive benefits that exist for the project partners (the value of improved knowledge management and transfer processes within the museums).

Number of museum operators and users that the system can support

Number of supportable technical operations


Rate of basic services in terms of receiving, storing, retrieving, manipulating and displaying data for a variety of functions

The scalability of the system: its ability to significantly expand – or reduce – its capacity to operate, while avoiding any significant disruptions or excessive costs

Processing time for performing each step

Waiting time between the completion of one step and initiation of another

Dependencies between steps

Bottlenecks and their correction

Sufficient intelligent procedures that ensure authorization, verification, and accuracy of information flows to prevent unauthorized access

Methods for checking the accuracy of inputs, identifying errors and correcting them

Methods for system recovery in the case of major malfunctions

Methods for securely tracking participants, while ensuring their confidentiality

Pedagogical model:

The extent that user expectations are fulfilled in terms of what the user asks for and what the OSR system is capable of offering (user satisfaction)

To what extent the OSR resources / learning pathways adopt pedagogical models which support the achievement of the stated learning objectives

To what extent the OSR resources / learning pathways adopt pedagogical models which support teachers and educators in their didactics

To what extent OSR supports the learning process, in both formal and informal learning,

Quantity and suitability of tools and methods for searching and finding resources (from the point of view of the user)


Quantity and suitability of tools and methods for creating educational pathways (both open and structured) for users

Mission fulfilment: OSR’s capacity to create a bridge between formal and non formal learning (the pedagogical model)

Organisational model:

The agility with which learning objects and educational pathways are created and updated

Quantity and suitability of tools to combine resources from different museums in a way that can accommodate learning needs and preferences.

Suitability of the approaches adopted for managing users’ ownership of the content

The degree to which long-term, mutually beneficial collaborative relationships amongst museums and other OSR partners are established

The degree to which the pedagogical model for managing resources (learning objects) and creating learning/educational pathways can be adopted as standard practice by the science museums participating in the OSR (the organisational model)

The degree to which the pedagogical model for managing resources (learning objects) and creating learning path can be adopted by those science museums which are not involved in OSR as well as other providers

Access to distribution channels: the extent to which OSR dissemination channels can provide a competitive advantage to museums

Economies of scope: the efficiencies gained when the investment on OSR can support many supplementary activities (data mining, for example)

Equipment and operating costs (Internet costs, electricity, PC)

Indirect costs (such as the time and effort it takes the user to connect to and maintain the education and training service)


Product complementarity: OSR’s capacity to combine resources belonging to different museums so as to create more extensive learning paths that respond to users having different needs and expectations.

Direct and Indirect costs for Museums when they offer, modify and promote resources (learning objects and educational pathways) in the system

Learning Objects:

The degree to which LOs are classified and/or categorized and made available to the user in a simple and straightforward manner.

The degree of appropriateness of the learning resources recommended to the user according to his/her preferences

Quantity and suitability of tools for uploading resources (LOs) to the OSR system (from the point of view of the museums and authors)

Quality of the LOs for supporting learning, increasing knowledge and developing competences

Educational Pathways:

The degree to which EPs are classified and/or categorized and made available to the user in a simple and straightforward manner.

The degree of appropriateness of the learning resources recommended to the user according to his/her preference.

Quantity and suitability of tools for uploading resources (EPs) to the OSR system (from the point of view of the museums and authors)

Quality of the EPs for supporting learning, increasing knowledge and developing competences

Content-sensitive search and retrieval tools:

The ease with which the user knows and handles the “query language” of the search mechanism

How well OSR matches text search, semantic search, tag cloud, social tagging

How well the matches reflect the user’s needs and preferences

How well social tagging, tag cloud and rating help users to find what they are looking for


The usability of the multilingualism system

Quantity and suitability of tools and methods for classifying, storing, searching and retrieving resources, with particular reference to Tag Cloud; Social tagging, Rating etc.

Educational metadata authoring system:

The degree of appropriateness of metadata for selecting LOs and EPs

How accurately the keywords appear in the page title or meta tags of the resources

Level of compliance to de facto or de jure standards regarding intellectual property

Social Tagging system:

How well OSR social tagging helps users to find what they are looking for

Effectiveness and efficiency of the OSR social tagging with respect to the Metadata system

Community of users:

The degree to which people and resources are rewarded through involvement with, or use of, OSR

The extent to which a large number of users become affiliated to OSR

The extent to which the rules (implicit and explicit) and the behaviours of the community members support participation, sharing, learning and collaboration

Sense of belonging to the OSR community

The effectiveness of the process of creating, storing, sharing and exploiting knowledge and practices emerging from the virtual communities

Learning experience:

Effectiveness of the learning experience through the use of OSR

To what extent resources are offered to users in a way which supports learning, curiosity, enjoyment and motivation

The extent to which users feel engaged in exploring the OSR portal and its resources and tools (blog, news, community)


To what extent OSR supports the learning process, in both formal and informal learning

Authoring/teaching experience:

Quantity and suitability of tools for uploading resources (LOs) to the OSR system (from the point of view of the museums and authors)

The agility with which learning objects and educational pathways are created and updated

The degree to which users’ ownership of the content is recognised

Social reputation and the rating system

3 Methods and tools

3.1 Introduction

To adequately cover the broad range of aspects involved in the OSR project and associated system development, multiple validation methods applicable to different objects will be implemented, along with differing tools and criteria.

The work will involve:

Establishing defined validation protocols

Developing project-specific validation tools

Carrying out project-specific activities in close cooperation with internal partners and external stakeholders to meet specific validation needs and requests

3.2 Validation: perspectives to take into consideration

Before presenting the validation protocols1 which will be followed during the different validation activities, it is important to define some “variables” on the basis of which the different “User scenarios” are designed. At the present time, the following 7 potential variables are defined:

1 This report includes the macro-design of the validation protocols. A detailed description of the protocols (a step-by-step description) will be included in the deliverable D 6.2 “Development of validation & feedback tools”, together with the tools which will be administered during the different validation phases.

3.2.1 a) Typology of access to the museum resources:

1) Inside the museum (direct access) + web/virtual access (distance access), and vice versa
2) Solely web/virtual access (distance access)
3) Solely inside the museum (direct access), which is outside the scope of the project

3.2.2 b) Frequency of the visit:

1) Single visit
2) Multiple visits

3.2.3 c) User typology:

1. Students:
a. Kindergarten (<6)
b. Primary (6-10)
c. Middle School (11-14 yrs)
d. High School (15-18 yrs)
e. College/University (>18)
f. Senior students (>60)

2. Teachers/Instructors:
a. Newly employed
b. Years of teaching >10
c. Years of teaching <10
d. Retired

3. Groups: small groups or communities, e.g. families; Art/Language/History/Society groups (like gender-related groups); university groups/staff (like working groups on scientific history/innovations); company working groups (like Siemens on the history of electricity); groups with a multilingual, multicultural background

4. Individuals (adults, seniors, teenagers, etc.)

38/95

Page 39: 1_deliverables_as_per_dow.doc - ea.gr - 6.1 Validation Plan.docx  · Web viewThis methodology will help the consortium to identify the innovative aspects and the differences and

Validation plan

3.2.4 d) Type of learning:

Formal learning (structured and planned)
Informal learning (open and free exploration)

1. On open tasks
2. On specific/pre-determined tasks
3. All combinations possible

3.2.5 e) Process initiators:

1. School of all levels and kinds / University
2. Museum / Centres
3. Informal learning organisation (e.g. adult learning centres, NGOs)
4. Individual
5. Companies

3.2.6 f) Driver of the process:

1. Teacher / supervisors / trainers
2. Students / pupils
3. Museum educators / staff
4. Designated persons within a group / organisation / team
5. Individual

3.2.7 g) Reason for accessing resources:

1.a Specific task/aim fulfilment:
1.1 Acquisition of general (basic) knowledge
1.2 Acquisition of specific knowledge
1.3 Increase/expand knowledge
1.4 Update knowledge

1.b Personal development or personal satisfaction (self-driven):
1.5 Curiosity
1.6 Enjoyment (science’n’fun)
1.7 Belonging
1.8 Personal development
1.9 Professional development (e.g. career-specific requirements)


The variables and their values can be summarised as follows:

A. Type of access:
A1a. Direct access + web/virtual access
A1b. Web/virtual access + direct access
A2. Solely virtual access

B. Frequency of the visit:
B1. Single visit
B2. Multiple visits

C. User typology:
C1. Students in compulsory school (a, b, c, d, e)
C2. Teacher
C3. Groups
C4. Individual

D. Type of learning:
D1. Formal learning
D2. Informal learning
D3. Combination D1 + D2

E. Process initiator:
E1. School of all levels and kinds / University
E2. Museum / Centres
E3. Informal learning organisation
E4. Individual
E5. Companies

F. Driver of the process:
F1. Teacher / supervisors / trainers
F2. Students / pupils
F3. Museum educators / staff
F4. Designated persons within a group / organisation / team
F5. Individual

G. Reason for accessing:
G1. Acquisition of general (basic) knowledge
G2. Acquisition of specific knowledge
G3. Increase/expand knowledge
G4. Update knowledge
G5. Curiosity
G6. Enjoyment (science’n’fun)
G7. Belonging
G8. Personal development
G9. Professional development (e.g. career-specific requirements)
G10. Generic interest


3.3 Different users & different perspectives/reasons/motivations (User scenarios)

This paragraph presents some user scenarios which represent a “model categorisation framework” of the different perspectives/reasons that OSR users/audiences may have for accessing the OSR portal.

Before presenting a set of macro-protocols which guide the two defined validation phases, it is important to describe the perspective (a so-called “pair of glasses”) that users might have when accessing and operating within the OSR system. Each legitimate perspective will influence the way OSR users access, act within and evaluate the OSR portal (services and tools) together with the existing resources. Amongst the endless list of possible user scenarios which can be drawn by matching the above-mentioned variables, the following set of scenarios identifies and defines some of the more common ones that would be anticipated.
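As a rough illustration of the size of this scenario space, a minimal sketch (using hypothetical short-codes for the variables defined in section 3.2, and restricting itself to variables A to D for brevity) enumerates the raw combinations from which scenarios could be drawn:

```python
from itertools import product

# Hypothetical short-codes for four of the scenario variables of section 3.2.
access = ["A1a", "A1b", "A2"]       # typology of access
frequency = ["B1", "B2"]            # frequency of the visit
user_type = ["C1", "C2", "C3", "C4"]  # user typology
learning = ["D1", "D2", "D3"]       # type of learning

combinations = list(product(access, frequency, user_type, learning))
print(len(combinations))  # 3 * 2 * 4 * 3 = 72 candidate scenario skeletons
```

Even with only four variables the space already contains dozens of combinations (and many more once initiators, drivers and reasons are added), which is why only a curated subset of common, plausible scenarios is presented below.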


No. 1

Title of scenario: A Web visitor inserts a keyword on a specific science topic into an Internet search engine (e.g. Google) and comes across the OSR portal. S/he enters the portal in order to find information on this particular science topic.

A. Type of Access: Solely virtual (OSR portal)

B. Frequency of the visit: Single visit

C. User typology: Individual

D. Type of learning: Informal

E. Initiator of the process: Individual

F. Driver of the process: Individual

G. Reason: General interest, curiosity


No. 2

Title of scenario: A teacher wants to prepare a lesson for his/her students in high school. The teacher wants to retrieve visual material (the learning object) to be used at the beginning of the lesson in order to increase motivation, interest and attention with respect to the particular topic.

A. Type of Access: Solely virtual (OSR portal)

B. Frequency of the visit: Single visit

C. User typology: Teacher

D. Type of learning: Formal

E. Initiator of the process: Teacher

F. Driver of the process: Teacher

G. Reason: Develop interest and curiosity in his/her students; acquisition of specific knowledge


No. 3

Title of scenario The teacher wants to prepare an educational pathway on a specific theme in order to provide the necessary background knowledge to students before accompanying them on a museum visit

A. Type of Access Web/virtual access + direct access

B. Frequency of the visit Multiple visits

C. User typology Teacher and Students

D. Type of learning Formal + Informal

E. Initiator of the process Teacher

F. Driver of the process Teacher + Students

G. Reason Acquisition of generic knowledge by his/her students and the creation of curiosity and interest in the particular subjects.


No. 4

Title of scenario The teacher notices that the majority of his/her students encounter difficulties in understanding a particular phenomenon or concept, and he/she is looking for a suitable simulation in order to improve specific knowledge of the phenomenon/concept. The teacher provides the initial stimulus/impetus, and students may or may not continue their explorations independently of the teacher once they have been provided with suitable instruction on portal usage.

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single / Multiple visits

C. User typology Teachers and Students

D. Type of learning Formal + Informal

E. Initiator of the process Teacher

F. Driver of the process Teacher + Students

G. Reason Acquisition of specific knowledge


No. 5

Title of scenario The teacher wants to look for resources in order to expand his/her knowledge on a specific scientific theme

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single / Multiple visits

C. User typology Teacher

D. Type of learning Informal

E. Initiator of the process Teacher

F. Driver of the process Teacher

G. Reason Expand knowledge, career development


No. 6

Title of scenario A university teacher is going to give a speech at a European conference. S/he is looking for a video to show during his/her presentation in order to support the presentation, attract attention or stimulate discussion.

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single visit

C. User typology Teacher

D. Type of learning Informal

E. Initiator of the process Teacher

F. Driver of the process Teacher

G. Reason Enjoyment, expand knowledge


No. 7

Title of scenario A teacher wants to find resources to update his/her knowledge on ongoing developments with regard to a particular subject area and discuss the specific issue with a particular interest group to which they belong, such as other science teachers

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Multiple visits

C. User typology Individual

D. Type of learning Informal

E. Initiator of the process Individual

F. Driver of the process Individual

G. Reason Update knowledge, Belong to a Community of Interest


No. 8

Title of scenario A retired school science teacher searches for resources in order to update his/her knowledge

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single/ Multiple visits

C. User typology Individual

D. Type of learning Informal

E. Initiator of the process Individual

F. Driver of the process Individual

G. Reason Enjoyment, update knowledge


No. 9

Title of scenario A secondary school student searches the repository in order to prepare an essay on a pre-determined subject for his/her science class.

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single/ Multiple visits

C. User typology Student

D. Type of learning Formal: specific/pre-determined tasks.

E. Initiator of the process Student

F. Driver of the process Student

G. Reason Acquisition of specific knowledge.


No. 10

Title of scenario A high school student searches the Repository in order to increase his/her knowledge on a scientific topic which was presented by the teacher during a science lesson

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single / Multiple Visits

C. User typology Student

D. Type of learning Informal learning

E. Initiator of the process Student

F. Driver of the process Student

G. Reason Acquisition of basic knowledge


No. 11

Title of scenario A PhD student is preparing his/her thesis in science. S/he is looking for a specific resource or resources to be included in the thesis

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Multiple visits

C. User typology PhD Student

D. Type of learning Informal

E. Initiator of the process Student/Individual

F. Driver of the process Student/Individual

G. Reason Update knowledge, career development.


No. 12

Title of scenario An individual looks for material related to a topic which he/she heard about by reading a newspaper/watching TV etc

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Single visit

C. User typology Individual /Adult

D. Type of learning Informal

E. Initiator of the process Individual

F. Driver of the process Individual

G. Reason Curiosity, acquisition of general knowledge


No. 13

Title of scenario An animator for a group of people belonging to a Third Age University, who is used to planning group visits (and has some basic ICT skills), accesses the repository in order to find materials related to the museum which the group plans to visit. S/he would like to find material related to the existing / permanent exhibitions provided by the museum.

A. Type of Access Web/virtual access + direct access

B. Frequency of the visit Single / Multiple Visits

C. User typology Group /individual

D. Type of learning Informal

E. Initiator of the process Animator

F. Driver of the process Individual + Groups

G. Reason Interest, curiosity, acquisition of general knowledge


No. 14

Title of scenario Parents of young primary school children, after having visited the museum, are looking to find materials to improve the knowledge of their children on what they have seen during the visit.

A. Type of Access Direct access + web/virtual access

B. Frequency of the visit Single visit

C. User typology Family / group

D. Type of learning Informal

E. Initiator of the process Family (parents)

F. Driver of the process Parents+ Children

G. Reason Interest, curiosity, acquisition of general knowledge


No. 15

Title of scenario A cultural association is planning an evening or a set of events on science topics in order to increase knowledge, create interest and increase the number of its associated members.

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Multiple

C. User typology Generic audience

D. Type of learning Informal

E. Initiator of the process Association educator

F. Driver of the process Association educator

G. Reason Curiosity, general interest, fun, acquisition of generic knowledge


No. 16

Title of scenario A museum visitor tags various museum exhibits about which they are interested in finding further information

A. Type of Access Direct access + virtual access

B. Frequency of the visit Multiple access

C. User typology Museum visitor

D. Type of learning Informal

E. Initiator of the process Museum visitor

F. Driver of the process Museum visitor

G. Reason Interest


No. 17

Title of scenario A museum combines the LOs available in the OSR portal in order to prepare a structured educational pathway (introduction, pre-visit, visit and post-visit)

A. Type of Access Web + direct access

B. Frequency of the visit Multiple access

C. User typology Museum Educator

D. Type of learning Formal-Informal

E. Initiator of the process Museum educator

F. Driver of the process Museum educator

G. Reason Acquisition of knowledge, Service Development


No. 18

Title of scenario A research centre manager regularly uses the OSR portal as a reference source for their particular area of research or for training new staff

A. Type of Access Solely Virtual in OSR portal

B. Frequency of the visit Multiple access

C. User typology Research centre manager

D. Type of learning Formal

E. Initiator of the process Research centre manager

F. Driver of the process Research centre manager

G. Reason Availability of a single, comprehensive reference repository (acquire and update knowledge)

3.4 Validation typologies

The following table shows the different typologies of evaluation that are foreseen within the Validation Programme:

1. Self-evaluation: for project partners and OSR museum educators (this can be thought of as an internal self-review process)

2. Peer review: for Ministries of Education, Research Centres and International Standard Bodies / Agencies

3. External evaluation: for the remaining user categories

Actors / Typology of evaluation

Project Partners and OSR Museum Educators: Self-evaluation

Affiliated Science Centres and Museum Educators, Publishing Houses dealing with scientific resources: Peer evaluation

Ministries of Education and Research Institutions: Peer review

International Standards Bodies / Agencies: Peer review

Science teachers: External evaluation

Students: External evaluation

Schools, Training Centres, Universities, Third Age Universities: Peer evaluation

User Groups & Communities (Individual Learners, Families, Science Groups and Socio-Cultural Associations): External evaluation

The OSR validation programme foresees different tool typologies. They are articulated taking into consideration the degree of openness of the answers required (from the most structured and codified, e.g. checklists, to the most open and subjective, e.g. the observation grid) and whether the answers are given individually or at a group level.

a) Checklists (C)

b) Structured Validation Tools: 1b) Questionnaires, 2b) Structured Individual Interviews, 3b) Structured Group Discussions (S)

c) Focus Groups (F)

d) Observation Grids (O)

e) Tests (behavioural, technical) (T)

3.5 Validation of macro-protocols

When the validation activity takes place, data on the users’ perspective (user scenarios: please refer to the previous chapter) will be collected in order to match it to the feedback provided. In other words, in each validation phase a pre-defined set of “Use Case” scenarios will be provided for the in-service seminars, Training workshops/Piloting phase I/Validation Phase I and Training workshops/Piloting phase II/Validation Phase II, so that they can be suitably tested / evaluated. The “Use Case” scenarios are derived from

1 The following macro-validation protocols are defined on the basis of what has already been stated in Chapter 2 “Validation Objects” and Chapter 4 “Validation Sources”, which are included in this deliverable, together with the “Use Cases” presented in chapter 2.4 of deliverable D 4.1 “System Specifications and Technical Design”.

2 A detailed description of the protocols (step by step description) will be included in the deliverable D6.2 Validation tools, together with the tools which will be administered during the different validation phases.


deliverable D.4.1, “System Specifications and Technical Design”. The same set of “Use Case” scenarios (e.g. 2.4.3 Retrieve Learning Object) will be tested by the people involved in each phase of the validation, but understanding and knowing the perspective of the validators will allow the OSR consortium to contextualise the results achieved. Therefore, at the beginning of each validation phase (through the appropriate use of the validation tools provided), the validators involved will be asked to describe their profiles and their user perspective. Furthermore, in some cases, validators will also be asked to adopt an alternative user perspective while performing some scenarios, in order to provide an added degree of flexibility with regard to the perspective being adopted by different validators. A validator adopting an alternative perspective to the one they would normally assume may reveal different, new or innovative angles and approaches which will allow the OSR consortium to conduct a more effective and comprehensive validation.

3.6 Training workshops - Validation phase I

When the validation trials will take place

M12-M18

Duration of the Training Workshops

2 days

Duration of Validation phase I 6 months

Who will organise the Training Workshops

Each OSR Content Provider together with Museum educators

Location of the Training Workshops

Seven (7) OSR participating museums in:

Munich, Paris, Helsinki, Budapest, Athens, Lisbon, Milan

Who will take part in the Training Workshops

Museum educators working in the 7 OSR participating museums and a representative sample of teachers from amongst those who took part in the elicitation workshops

How will the Training Workshops be structured

In the first part of the training workshop an introduction to the OSR project and system will

3 In addition, during the project implementation the consortium will try to organise additional moments for training and validation activities, such as the Winter School that took place in Helsinki in January 2010.

1 Details on the Training Workshops’ organisation and the foreseen activities will be defined in deliverable D 6.3.


be given to the participants in order to present them with the overall picture and scope of the project / trial.

In the second part of the workshop, the museum educators will follow a detailed protocol/path (please refer to the “User Case” scenarios)

In the third part of the workshop, the subsequent validation steps of the validation programme will be presented and a discussion will follow. The aims of this part of the workshop are to prepare OSR museum educators so that they can effectively organise the subsequent validation activities.

What will be validated

o The OSR Portal
o The Pedagogical Model
o The Organisational Model
o The Learning Objects
o The Educational Pathways
o The “Content Sensitive” Search and Retrieval tools
o The Educational metadata authoring system
o The Social Tagging system
o The Authoring experience
o The Learning experience
o The Virtual community
o The Voting system

Criteria which will guide the validation

Please refer to the table reported in Chapter 2


“Use Case” scenarios to be tested

2.4.1 Registration of a new user

2.4.2 Login

2.4.3a Retrieval of Learning Objects (from amongst a selection of already available LOs with which they are provided).

2.4.3b Retrieval of Learning Objects by means of the Text Search Interface (key words), Semantic Search, Tag Cloud (from a selection of already available LOs provided to them).

2.4.4a Retrieval of Educational Pathways (from a selection of already available EPs with which they are provided).

2.4.4b Retrieval of Educational Pathways by means of the Text Search Interface (key words), Semantic Search, Tag Cloud. (from an already available selection of EPs provided to them.)

2.4.5. Uploading of Learning Objects

2.4.6. Uploading of Education Pathways

2.4.7. Social Tagging of Learning Objects / Educational Pathways (normal flow)

2.4.8 Social Tagging of Learning Objects / Educational Pathways (PDA).

2.4.9 Multilingualism

2.4.10 Virtual Communities

How the validation exercise will be organised

1. Participants will be asked to make annotations during the testing of the “User Case” scenarios.

2. A questionnaire to be completed will be given to each participant.

3. A focus group will be organised during the final part of the validation session.

4. Validation of online discussion

1 Use cases, sometimes called user scenarios, are narratives or flow diagrams that describe how users will interact with a Web site. Some people also refer to them as task analyses or user flows. Regardless of what you call them, the idea is the same: illustrate the scenarios or specific tasks which users will perform.


5. Weekly Voting Polls

Tools to be used for Validation

Checklists, Questionnaires, Focus Group, Group Interview Grid, Weekly Voting Polls

Documentation referenced

The specific Validation Protocols for the Training Workshops

A Validation Dossier (containing reference procedures and associated tools, together with raw result data)

A Validation Report


3.7 In-service seminars (summer school)

When the validation phase will take place

In M13 and M25

Duration of the In Service Seminars

One week

Who will organise the In Service Seminars

The OSR content manager (EA)

Location of the In Service Seminars

Crete

Who will take part in the In Service Seminars

Science Teachers, educators working in science museums and science centres not directly involved in the OSR project

How will the In-Service Seminars be structured

In the first part of the training workshop an introduction to the OSR project and system will be given to the participants in order to present them with the overall picture and scope of the project / trial

In the second part of the trial the participants will follow a detailed protocol/path (please refer to the “User Case” scenarios)

What will be validated

o The OSR Portal
o The Pedagogical Model
o The Learning Objects
o The Educational Pathways
o The “Content Sensitive” Search and Retrieval Tools
o The Educational metadata authoring system

Criteria which will guide the validation

Please refer to the table reported in Chapter 2

“Use Case” scenarios to be tested

2.4.1 Registration of a new user


2.4.2 Login

2.4.3 Retrieval of Learning Objects by means of the Text Search Interface (key words), Semantic Search, Tag Cloud (from amongst an already available selection of LOs with which they have been provided)

2.4.4. Retrieval of Educational Pathways by means of the Text Search Interface (key words), Semantic Search, Tag Cloud. (from amongst an already available selection of EPs with which they have been provided)

2.4.5. Uploading of Learning Objects

2.4.6. Uploading of Education Pathways (following “Normal Flow” and “Alternative Flows”)

How the validation exercise will be organised

1. Participants will be asked to make annotations during the testing of the “User Case” scenarios.

2. A questionnaire to be completed will be given to each participant

3. In the final part of the validation session a Focus group will be organised.

Tools to be used for Validation

Checklists, Questionnaires, Focus Group form

Documentation referenced

The specific Validation Protocols for the In-Service Seminars

A Validation Dossier (containing reference procedures and associated tools, together with raw result data)

A Validation Report

3.8 Training workshops - Validation phase II

When the validation phase will take place

M24 - M30

Duration of the Training Workshops

2 days

Duration of Validation Phase II 6 months

Who will organise Validation Phase II

Each content provider, together with Museum educators belonging to the 7 OSR Science Museums and, in general, OSR users

Location of the Validation Phase II

OSR partner Museums in Munich, Paris, Helsinki, Budapest, Athens, Lisbon, Milan, and in other places through the Open Science Days initiatives

Who will take part in the Validation Phase II

A representative sample of teachers from amongst those who took part in Validation Phase I together with teachers who took part in the “In-Service Seminars”

How will Validation Phase II be structured

The participants will access the OSR portal

What will be validated

o The OSR Portal
o The Pedagogical Model
o The Learning Objects
o The Educational Pathways
o The “content sensitive” Search and Retrieval tools
o The Educational metadata authoring system
o The Social Tagging system
o The Authoring experience
o The Virtual community
o The Voting system
o The Learning and Teaching experience
o The Watch list of Users

Criteria which will guide the validation

Please refer to the table reported in Chapter 2

“User Case” scenarios to be tested

2.4.2 Login

2.4.3 Retrieval of Learning Objects by means of the Text Search Interface (key words), Semantic Search, Tag Cloud.


2.4.4. Retrieval of Educational Pathways by means of Text Search Interface (key words), Semantic Search, Tag Cloud

2.4.5. Uploading of Learning Objects

2.4.6. Uploading of Education Pathways (following “Normal Flow” and “Alternative Flows”)

2.4.7. Social Tagging of Learning Objects / Educational Pathways - WEB (normal flow, alternative flow 1, alternative flow 2, alternative flow 3)

2.4.8 Social Tagging of Learning Objects / Educational Pathways (PDA).

2.4.9 Multilingualism

2.4.10 Virtual Communities

How the validation exercise will be organised

1. Participants will be asked to make annotations during the testing of the “User Case” scenarios.

2. A questionnaire, to be completed after registration, will be given to each participant

3. When a user searches for a resource (Text Search Interface (key words), Semantic Search, Tag Cloud, etc.), a pop-up window will provide a link to a short questionnaire.

4. When a user selects a LO or EP, a pop-up window will provide a link to a short questionnaire.

5. When a user uploads a LO or EP, a pop-up window will provide a link to a short questionnaire.

6. When a user inserts or looks for a tag, a pop-up window will provide a link to a short questionnaire.

7. All user activity will be tracked and recorded electronically

8. In the final part of the validation session a focus group will be organised.

Tools to be used for Validation

Checklists, Short Questionnaires, Focus Group

Documentation referenced

The specific Validation Protocols for Piloting Phase II

A Validation Dossier (containing reference procedures and associated tools, together with raw result data)

A Validation Report

3.9 Web-based evaluation

In addition to the user scenarios defined above, the validation activities also include a web analysis of portal usage, which offers a complete overview of all aspects related to the portal. The web analysis of the OSR portal will be facilitated by Google Analytics, an online tool that collects data from all OSR portal users. For example, it gives insight into the frequency of access, the loyalty of users and the number of uploads and downloads.

Google Analytics is a free web analysis tool provided by Google Inc. The on-site web analytics tool works with a tracking code that collects visitor data. These data are then sent back to Google servers, where they are processed hourly and made available to the Google Analytics user. A cookie is placed on each visitor’s computer that facilitates identifying visitors and thus tracking whether a visitor has been to the site before. In this way, Google Analytics offers information on visitors’ home countries, the timestamp of the current visit, the pages visited and more. The data are presented on the Google Analytics website (cf. Figure 1) and can be downloaded in different data formats such as XLS or PDF.

UBT (University of Bayreuth) is responsible for the evaluation activities concerning the use of the OSR portal. UBT will observe the data collection with Google Analytics and analyse the received information. The results will be presented in the validation report (D6.4, at the end of the project).


Figure 1: Output document giving an overview of website data provided by Google Analytics.

Visitor data will be collected during the whole duration of the project. This web-based evaluation will address the following questions:

o How many visitors are accessing the OSR portal?
o How frequently is the OSR portal used?
o How loyal are the users? (How often do the users return?)
o How long do visitors stay on the OSR portal?
o What pages are visitors most interested in?
o Which tools and features are visitors using most?
o Does the OSR portal access conform to the Law of Surfing (HUBERMAN et al., 1998)?
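Most of these questions reduce to simple aggregations over per-visit records. As an illustrative sketch only (the field names and sample values below are invented for the example and do not reflect the actual Google Analytics export format), the following Python snippet shows how visit counts, user loyalty and average visit length could be computed from exported session data:

```python
from collections import Counter

# Hypothetical session records, standing in for data exported from
# the web analytics tool. Field names are illustrative assumptions.
sessions = [
    {"visitor_id": "v1", "pages_viewed": 5, "seconds_on_site": 300},
    {"visitor_id": "v1", "pages_viewed": 2, "seconds_on_site": 60},
    {"visitor_id": "v2", "pages_viewed": 8, "seconds_on_site": 540},
    {"visitor_id": "v3", "pages_viewed": 1, "seconds_on_site": 20},
]

visits_per_visitor = Counter(s["visitor_id"] for s in sessions)

total_visits = len(sessions)               # how frequently is the portal used?
unique_visitors = len(visits_per_visitor)  # how many visitors?
returning = sum(1 for n in visits_per_visitor.values() if n > 1)
loyalty = returning / unique_visitors      # share of users who come back
avg_stay = sum(s["seconds_on_site"] for s in sessions) / total_visits
avg_pages = sum(s["pages_viewed"] for s in sessions) / total_visits

print(total_visits, unique_visitors, loyalty, avg_stay, avg_pages)
```

In the actual evaluation, the same aggregations would be run over the full exported data set rather than this toy sample.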

The Law of Surfing (HUBERMAN et al., 1998; ADAR & HUBERMAN, 2000) summarises the common patterns which the surfing behaviour of website users follows. HUBERMAN et al. (1998) developed a model “that assumes that users make a sequence of decisions to proceed to another page, continuing as long as the value of the current page exceeds some threshold”, which “yields the probability distribution for the number of pages that a user visits within a given website”. This probability distribution is given by the following equation:

P(L) = \sqrt{\frac{\lambda}{2\pi L^{3}}} \exp\left(-\frac{\lambda (L - \mu)^{2}}{2\mu^{2} L}\right)

L equals the number of pages a user viewed, λ is a scale parameter and µ is the mean of the distribution.

According to HUBERMAN et al. (1998) the law could be used to simulate the surfing patterns of users on a given web topology and therefore make accurate predictions of page hits.
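To make the distribution concrete, the density above can be sketched in a few lines of Python (this is an illustration only, not part of the validation plan; the parameter values are arbitrary examples), together with a coarse numerical check that it integrates to one over L > 0:

```python
import math

def surfing_probability(L: float, mu: float, lam: float) -> float:
    """Law of Surfing density (an inverse Gaussian):
    P(L) = sqrt(lam / (2*pi*L**3)) * exp(-lam*(L - mu)**2 / (2*mu**2*L)),
    where L is the number of pages viewed, mu the mean of the
    distribution and lam a scale parameter."""
    return math.sqrt(lam / (2 * math.pi * L ** 3)) * math.exp(
        -lam * (L - mu) ** 2 / (2 * mu ** 2 * L)
    )

# At L == mu the exponent vanishes, so P(mu) = sqrt(lam / (2*pi*mu**3)).
p_at_mean = surfing_probability(3.0, 3.0, 1.0)

# Coarse Riemann-sum check that the density integrates to ~1 over L > 0.
step = 0.01
total = sum(surfing_probability(i * step, 3.0, 1.0) * step
            for i in range(1, 200_000))
```

With values of µ and λ fitted to the portal's page-depth data, the observed distribution of pages per visit could be compared against P(L) to answer the last question in the list above.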

3.10 Qualitative validation

For the qualitative validation of the OSR project, interviews will be conducted during the OSR workshops and summer schools. To assess the users’ appreciation of the OSR project and to gain deeper insight into the user groups’ experiences with the OSR portal, it is important to collect individual statements.

UBT will conduct interviews with workshop and summer school participants. To ensure a homogeneous data collection, UBT will create an interview grid of fixed items (which will be defined in D 6.2). The interviews will be evaluated using qualitative content analysis (MAYRING 2008). According to MAYRING, qualitative content analysis aims at analysing communication, normally oral speech, which is recorded and fixed in written texts. This analysis should proceed systematically and according to definite rules and theories, and thus facilitate conclusions on certain aspects of the communication.


4 VALIDATION SOURCES

The principal validators involved in the envisaged validation trials for the OSR project can be broadly categorised into two main groups:

1. The OSR Project Partners (pedagogical / technical experts and Museum educators directly involved in the project)

2. The OSR User Group, external to the project. This is comprised of:

a. Users involved in the requirements elicitation, the in-service seminars and Validation Phase I,

b. Users who will be involved in Validation Phase II, and

c. Users who will take part in the dissemination events

Glossary:

In order to avoid misunderstanding while reading this chapter, the following terms are defined.

Validation Master Plan
This document aims to outline the overall validation rationale and strategy, together with an overview of all relevant operational tasks to be undertaken, those responsible for conducting and/or coordinating the relevant activities described, and a programme that summarises when and in what order the activities should occur. The Validation Master Plan provides an overview of the complete Validation Programme and makes general reference to the related documents and tools to be used in the relevant validation protocols, which are included in Deliverable 6.2. The Validation Master Plan, along with the specific documents, protocols and procedures necessary to conduct the Validation Trials, will be refined by taking into consideration the results from the different validation phases. Once finalised, the VMP becomes a stand-alone document and the only additions or updates are usually related to the executive summary report when the validation is complete.

Validation Programme: An operative and coherent programme comprising a set of validation steps or phases (e.g. in-service seminars, training workshops/piloting phase I/Validation Phase I and training workshops/piloting phase II/Validation Phase II) which constitute the concrete part of the validation plan. It is described in the Validation Master Plan.


Validation Trials: The individual validation exercises which will take place within the different phases of the validation programme, in accordance with the requirements specified in the relevant Validation Protocols.

Validation Protocols: These describe the specific steps to be undertaken in the relevant phase or sub-phase of the validation exercise. They document in detail the specific objectives and requirements of the particular exercise, the pre-requisites, the steps to be followed during the exercise, the records to be taken and the acceptance criteria applicable to a successful validation.

Validation Dossier: A document/folder in which all procedures, tools and relevant documentation (Part 1), together with the raw data (report sheets) from the individual trials and related validation activities (Part 2), are compiled in an ordered and self-referencing manner.

Validation Report: This document summarises the results of the individual validation trials, together with relevant summaries of their outcomes. The Validation Report also includes a Validation Summary Report that details the overall outcome of the Validation Programme, together with relevant conclusions.

The Validation Protocols, Validation Dossier and Validation Reports should all cross-reference each other and be ordered in an identical or coherent manner to allow easy cross-referencing and retrieval of information as and when required.

The following table cross-checks the available Validation Sources against the principal validation phases/steps in which they are envisaged to be involved. As the adopted weighting scheme (X, XX, etc.) makes evident, the level of contribution/involvement of the different validation sources varies across the phases of the Validation Programme. This ensures that each critical phase is suitably supported through the availability of the appropriate sources and that effective coherence exists throughout the overall Validation Programme.


Validation Sources In-service Seminars

Training workshops Validation Phase I

Training workshops Validation Phase II

Dissemination Events

Project Partners and OSR Museum Educators

OSR Pedagogical experts X X XX

OSR Technical experts X X XX

OSR Museum Staff directly involved in the project

X X XX

OSR user group Science teachers

o Newly employed

o With >10 years of teaching experience

o With <10 years of teaching experience

o Retired Teachers

X X X X X X X XX

Students

o Primary School, age 6-10 yrs

o Middle School, age 11-14 yrs

o High School, age 15-18 yrs

o College/University Students, >18 yrs

o Senior Students (>60 yrs)

X X X X X

Schools, Training centres, Universities, Third age Universities

X X

User Groups & Communities : Individual Learners, Families, Science Groups and Socio-cultural Associations.

XX X X


Affiliated Science Centre and Museum educators, Publishing Houses dealing with science resources

X XX XX

Ministries of Education and Research Institutions

X X

International Standards Bodies, for example ISO, SCORM, IMS, the HR-XML Consortium and IEEE-LOM, relevant to metadata and e-learning model development.

X XX


The following sections present in detail the validation sources, the validation questions which are relevant to them, the expected numbers of persons involved and how they will be involved in the different steps/phases of the validation programme. The specifications of the trials (methods and tools) are reported in chapter 3.

4.2 OSR Partners (OSR Pedagogical / Technical Experts and Museum Educators)

Who are they? OSR pedagogical and technical experts, together with museum educators, who belong to OSR partner organisations and are involved in many project activities on an on-going basis. Suitable examples of these on-going project activities include:

- Organisation of the user requirements elicitation workshop

- Design of the learning objects and educational pathways to be uploaded onto the OSR portal

- Activities related to defining the metadata systems to be used and to the development of the OSR portal

Questions relevant to validation source

o Does OSR contribute to our image and positioning in the online learning environment/market?

o Does OSR help us to gain attention and an enhanced reputation among new target users?

o Does OSR facilitate the evaluation and collection of feedback on newly developed online learning resources (LOs) under ‘real-life’/realistic conditions or contexts?

o To what extent does OSR help us to improve the quality of products and services?

o Does participation in OSR allow us to learn or improve our way of categorising, storing and retrieving online learning resources (LOs)?

o Does participation in OSR allow us to improve the competences of staff/educators?

o Does participation in OSR help us to learn new ways to create educational pathways?

o To what extent does OSR allow resources from different museums to be combined in a way that can accommodate the learning needs and preferences of a wider audience?

o Does the OSR Project generate new project partnerships, strategic alliances, etc.?

o To what extent is the OSR system recognised within the Education, Training and LLL sectors?

o To what extent is OSR able to anticipate future trends and act on them?

o Does OSR provide any additional benefits beyond those previously identified?

How will these Validation Sources be involved in the validation?

The project envisages the organisation of a number of training workshops targeted at OSR museum educators. The training workshops will not only prepare the museum educators for the upcoming trials, but will also provide an opportunity for refining/fine-tuning the validation programme itself.

During the Training Workshops these contributors will act as the principal validation sources, while in the subsequent trial activities planned to take place in the museums after the workshops they will act as local coordinators of the associated validation activities.

How many will be involved?

The number of people involved will depend on the staff available from the seven museums participating in the OSR project.

Duration of the trials / Methods and tools

Please refer to chapter 3

4.3 Science Teachers

Who are they? The sample of science teachers who are to participate in the different trials will be chosen to ensure broad representation of teachers from different school typologies, as well as representation from the different grades/levels within those typologies. By the end of the project it is envisaged that a large group of teachers will have accessed, used (downloaded or uploaded LOs or educational pathways) and tested the OSR Portal. The OSR consortium will pay particular attention to involving a very diverse group of teachers in order to ensure that the OSR prototype is tested against needs that are well defined and highly differentiated.

Initially, the sample of teachers asked to take part in Validation Phase I will be those who have already taken part in the elicitation workshops. Validation Phase II will target both the teachers involved in Validation Phase I and those who took part in the “In-Service Seminars”. As the validation programme progresses, the sample will be supplemented and extended by additional groups of teachers selected on the basis of their familiarity with the museums in question and their expressed or demonstrated interest in the use and application of the OSR portal.

4.3.1 Questions relevant to validation source

o To what extent does OSR support us in developing new or existing competencies and improving our knowledge of our students?

o To what extent does OSR allow us to select and retrieve resources which fit our students’ specific needs, interests and requirements?

o Does OSR offer access to quality digital resources?

o To what extent does OSR offer us services and tools to support our teaching processes?

o To what extent does OSR offer us quality services and tools to plan lessons and scientific learning pathways (both inside and outside the museum milieu)?

o Does OSR contribute to our students’ motivation for learning science?

o To what extent are OSR services and tools easy to use?

o To what degree is it advantageous, useful, interesting and relevant to be a member of the OSR community?

o To what extent is the OSR community managed in a way that allows participation and knowledge sharing?

o To what extent does the OSR community recognise my existing or future competences and knowledge?

o To what degree does OSR manage ownership of content?

o To what extent does OSR offer us something (in terms of services and learning resources) that can be considered of added value compared to the existing museum websites?


How will these Validation Sources be involved in the validation?

The science teachers will be involved in all trials throughout the overall Validation Programme. In particular:

a) The “in-service seminars” (summer schools, two to be held over the lifetime of the project). These will offer teachers the opportunity to gather together in order to reflect, learn and exchange experiences and viewpoints on teaching and learning science through the use of digital resources. During the “in-service seminars” participants will have an extensive opportunity to access, navigate and use the OSR prototype in order to retrieve or develop new learning objects/educational pathways and, in so doing, will test the suitability of the system.

b) Validation Phase I: this phase involves teachers who took part in the requirements elicitation workshops. During this phase teachers will be asked to test the OSR portal and to provide their feedback to the OSR consortium.

c) Validation Phase II: this phase involves not only those teachers who took part in Validation Phase I and those who attended the “in-service seminars”. The user scenarios to be tested will be more complex, and by this stage the OSR portal is expected to have undergone considerable improvement based on the results of Validation Phase I. This phase will involve an “open” group of users, in particular teachers who are users of the seven museums that constitute the project partner group. Moreover, OSR partners will offer any teachers interested in science the opportunity to take part in this validation phase. The OSR consortium will endeavour to involve as many teachers as possible.

4.3.2 How many will be involved?

Depends on the organisation

Duration of the trials / Methods and tools

Please refer to chapter 3


4.4 Students (Formal Learning scenarios)

Who are they? Students, together with science teachers, represent key stakeholder categories and final beneficiaries of the OSR project. The students at whom the project is addressed may be from any school (and any grade/level within these schools) within Europe, and therefore the sample of students involved in the trials should progressively increase as the Validation Programme proceeds from Validation Phase I to Validation Phase II. It should be noted that secondary school or university students involved in the validation are required to provide their feedback on the OSR system directly to the designated project coordinators, while in the case of primary or lower secondary school students, feedback will be collected through their respective teachers.

Questions relevant to validation source

o To what extent can I find my way around OSR?

o To what extent does OSR support us in learning science?

o To what extent can I find in OSR exactly what I’m looking for?

o To what extent do the available resources in OSR fit my specific needs, interests and requirements?

o Does OSR offer access to interesting and relevant resources that support my learning requirements and the completion of associated school tasks?

How will these Validation Sources be involved in the validation?

It is implicitly understood that in the validation phases the students will indirectly test the OSR LOs and educational pathways: teachers will select resources to be delivered during their students’ lessons and will report the students’ feedback.

How many will be involved?

Depends on the organisation

Duration of the trials / Methods and tools

Please refer to chapter 3


4.5 Schools, Training centres, Universities, Third age Universities

Who are they? They are the institutions to which teachers, educators and informal learning operators belong. Their involvement is implicit through the involvement of their staff members. Nevertheless, in order to facilitate the participation of a large group of teachers, particular effort will be devoted to promoting OSR through the involvement of school headmasters and full professors at university level.

Questions relevant to validation source

o To what extent can the OSR prototype be considered a “reference model” or a “point of reference” in fostering LLL?

o To what extent can the methodological approach for creating educational pathways adopted in OSR be considered a “reference model”?

o To what extent can OSR guarantee quality in learning (in terms of resources and services)?

o To what extent does OSR foster a bridge or link between formal and informal learning?

How will these Validation Sources be involved in the validation?

They will be approached in order to identify teachers who can participate in the validation programme. Furthermore, the dissemination events (OSR annual conferences, “Open Science Days”, presentations at conferences and events) will constitute excellent occasions / opportunities for promoting OSR and for collecting view points and feedback from these project stakeholders.

How many will be involved?

Depends on the organisation

Duration of the trials / Methods and tools

Please refer to chapter 3

4.6 User Groups & Communities: Individual Learners, Families, Science Groups and Socio-cultural Associations

Who are they? This group of Validation Sources consists of various members of the OSR online user community and museum visitors.

Questions relevant to validation source

o Does OSR motivate users to learn science?

o Does OSR offer access to quality resources?

o To what extent does OSR allow us to select and retrieve resources which fit our specific needs, interests and requirements?

o To what extent does the use of tag clouds, ratings and social tags support users in finding what they are looking for?

o To what extent does OSR offer us services to support our learning processes?

o To what extent is being a member of the OSR community useful, interesting and relevant?

o To what extent is the OSR community managed in a way that supports participation, knowledge sharing and recognition?

o To what extent can OSR be easily accessed and used from anywhere?

o To what extent does OSR offer us something (in terms of services and resources) that can be considered to represent added value compared to the existing museum websites?

How will these Validation Sources be involved in the validation?

As visitors to the OSR museums or as online users (“cybernautes”), and through the dissemination events, they will be invited to enter and test the OSR portal. In particular, museum staff will organise validation sessions to be held in the museums in order to create a link between the museum exhibitions and the OSR digital resources. The idea is to expand the interest of museum visitors beyond the physical confines of the museum and to stimulate individuals and groups within the population at large to access the OSR portal remotely.

How many will be involved?

The number involved will be derived from Google Analytics data.

Duration of the trials / Methods and tools

Please refer to chapter 3

4.7 Affiliated Science Centres and Museums, Publishing Houses dealing with science resources

Who are they? They are science centres and museums which are not directly involved in the OSR project. They might be interested in joining the consortium later on.

Questions relevant to validation source

o To what extent does OSR represent an interface between the supply and demand sides, and what are the concrete benefits?

o Does OSR contribute to the image and positioning of the museums which are part of the OSR consortium within the online market?

o Does OSR facilitate the evaluation and collection of feedback on newly developed online learning resources (LOs) being used in ‘real-life’ scenarios?

o Does OSR allow the staff of the participating museums to learn or improve their way of categorising, storing and retrieving online learning resources (LOs)?

o Does participation in OSR improve the competences of staff/educators?

o Does participation in OSR help museum educators to learn new ways to create educational pathways?

o To what extent does OSR allow the combination of resources from different museums in a way that can accommodate the learning needs and preferences of a wider audience?

o Does the OSR Project generate new project partnerships, strategic alliances, etc.?

How will these Validation Sources be involved in the validation?

They will be involved in Validation Phases I & II. Staff of science centres and museums external to the OSR project may also have the opportunity to participate in the “in-service seminars”.

How many will be involved?

Depends on the organisation

Duration of the trials / Methods and tools

Please refer to chapter 3

4.8 Ministries of Education, Research Institutions (Private & Corporate)

Who are they? These categories of validators are project stakeholders who have an interest in project results and in their future exploitation.

Questions relevant to validation source

o Does OSR foster Lifelong Learning, in particular with respect to the existing efforts by Ministries of Education that aim to support the harmonisation of standards for education and training progression/recognition?


o To what extent can the OSR prototype be considered a “reference model” or a “point of reference” in fostering LLL?

o To what extent can OSR guarantee quality in learning (in terms of resources and services)?

How will these Validation Sources be involved in the validation?

During the project life-cycle, a series of dissemination events is envisaged, aimed at fostering and valorising the OSR intermediate and final results as well as promoting interest in learning and teaching science. These activities will be specifically undertaken to identify and forge links with relevant institutions, bodies and agencies whose activities are related to, or complement, the scope and objectives of the OSR project. The dissemination events will range from ad hoc to embedded actions: the organisation of OSR dissemination and research workshops, OSR annual conferences, “Open Science Days”, and presentations at conferences and events. On such occasions, inputs and feedback from the relevant participants will be sought, collected and included in the subsequent validation reports.

4.8.1 How many will be involved?

Depends on the organisation

Duration of the trials / Methods and tools

Please refer to chapter 3

4.9 International Standards Bodies (for example ISO, SCORM, IMS, the HR-XML Consortium and IEEE-LOM, relevant to metadata and e-learning model development)

Who are they? They are working groups and consortium members active in the definition of standards for categorising, storing and retrieving online resources.

Questions relevant to validation source

o Does OSR help to establish quality standards regarding metadata?

o Does the OSR project help support the harmonisation of standards within the education and training sectors?

o To what extent does the social tagging system provide inputs to the metadata model?


o Does OSR introduce new methodological and technological solutions and contribute to the elaboration of standards for identifying, classifying, storing and retrieving digital resources?

How will these Validation Sources be involved in the validation?

During the project life-cycle, a series of dissemination events is envisaged, aimed at fostering and valorising the OSR intermediate and final results as well as promoting interest in learning and teaching science. These activities will be specifically undertaken to identify and forge links with relevant institutions, bodies and agencies whose activities are related to, or complement, the scope and objectives of the OSR project. The dissemination events will range from ad hoc to embedded actions: the organisation of OSR dissemination and research workshops, OSR annual conferences, “Open Science Days”, and presentations at conferences and events. On such occasions, inputs and feedback from the relevant participants will be sought, collected and included in subsequent validation reports.

How many will be involved?

The project partners will try to involve as many as possible.

Duration of the trials / Methods and tools

Please refer to chapter 3

The table below matches each validation source with the main validation objects, in order to simplify, focus and direct the efforts for collecting relevant information, data and opinions without committing unnecessary resources or requiring that individual sources devote an inordinate amount of time to commenting on all the innovation aspects.


Actors OSR portal as a whole

Pedagogical model

Organisational model

Learning Objects

Educational Pathways

Content-sensitive search and retrieval tools

Educational metadata authoring system

Social Tagging system

Community of users

Learning experience

Authoring/teaching experience

Project Partners and OSR Museum Educators

Science teachers

Students

Schools, Training centres, Universities, Third age Universities

User Groups & Communities: Individual Learners, Families, Science Groups and Socio-cultural Associations

Affiliated Science Centre and Museum educators, Publishing Houses dealing with science resources

Ministries of Education and Research Institutions

International Standards Bodies


5 Validation Timeframe

This chapter presents the timetable describing when the different validation activities will be undertaken throughout the lifetime of the project. As described in the previous chapters, the validation activity is composed of three main phases:

1. In-service seminars (summer schools)
2. Training workshops – Validation Phase I
3. Training workshops – Validation Phase II

5.1 In-service seminars (summer schools)

These are scheduled once per year in Crete (Greece), for a total of two summer schools during the lifetime of the OSR project. The summer schools will be addressed to teachers and museum educators and will aim to involve a large number of participants through the OSR partners’ networks. The first summer school will be organised in M13 and will last one week, from 5 July 2010 to 10 July 2010. In this event, the participants will take part in several trials (see chapter 3) to allow them to familiarise themselves with the first prototype of the portal and to create and develop their own pathways. The second summer school will be organised in M25 and will focus on the results and achievements of the first two years of the project.

5.2 Training workshops – Validation Phase I

The first session of the validation activity will occur during the first summer school in M13 of the project lifetime, where a first collection of data related to the use and functionalities of the OSR portal will be acquired. Different training workshops will then be organised in the seven leading science centres and museums in Italy, France, Greece, Germany, Finland, Portugal and Hungary, where the proposed trials will be realised. They will be based on the organisation of communities of user groups consisting of teachers and museum educators of the participating museums, and on the implementation of the user scenarios defined in this document (see chapters 2 and 3), plus those which will be designed in WP5. Each user scenario will, of course, be adapted to the different contexts and environments. Thanks to the previous task of delivering questionnaires and organising workshops to analyse the users’ requirements (WP3), a minimum number of communities of user groups have already been identified. The idea of this first phase is to help all actors involved familiarise themselves with the approach, the metadata attribution process and the related tools/interfaces that aim to introduce new ways of interacting with digital science content. Training workshops will prepare the relevant groups (teachers and museum experts) for the upcoming trials and scenarios of use. The main part of this task will be the organisation and management of the user-centred activities in the participating museums and science centres.


The selected user groups will be involved in educational activities according to the OSR scenarios of use. This phase will last six months (M12–M18), and each museum will organise these activities following the training workshop template defined in the next deliverable, D 6.2 Validation Tools. Users’ reactions to the proposed scenarios, as well as to the OSR approach to the organisation of science content, will be monitored and analysed in detail. During this extended period of implementation the different components of the system (mainly the interfaces) will be upgraded and adapted based on the users’ feedback. In addition, the data from the testing will be analysed to provide information on the usability of the system, which will be contained in the evaluation report; an extended validation and usability evaluation effort will thus evolve in parallel with the implementation activities.

5.3 Training workshops – Validation phase II

The second phase will start around M24 and will build on the results and analysis gained during the first phase of implementation. After the appropriate modifications to both the deployed technologies and the proposed scenarios of use, the partnership will implement the project in the participating museums during a second six-month cycle. This time, the user experience will consist of more complex scenarios of use, or sequences of extended field trips among different exhibits. In this phase the “real” validation will enter into force: the selected tagging methodologies and related tools, as well as the usage pathways, will be exposed to large numbers of real users in real settings, with the aim of validating the findings of the pilots through feedback from, and observations of, real (no longer deliberately selected) users in museums and science centres. New user communities will be developed in the process, extending the network of user groups by dynamically involving more groups as a result of the organisation of Open Science Days in additional science museums (outside the consortium museum partners).

Finally, the results of all stages of testing and validation will be integrated into the final validation report.


5.4 Validation timetable

Months 12–36:

- In-service seminars (Summer Schools): M13 and M25
- Validation Phase I: M12–M18
- Evaluation report, modifications of the system and new scenarios of use: between Validation Phases I and II
- Validation Phase II: approximately M24–M30
- Validation report: following each validation phase, with the final Validation Report concluding the programme

6 7


8 Validation outcomes

As mentioned in chapter 2, the OSR project will adopt both formative and summative evaluation approaches. In particular, the formative approach ensures that the results emerging from the workshops and in-service seminars will be used to improve the validation plan and to provide the first suggestions for improving the OSR portal. These results are also key inputs for the subsequent improvement of the OSR portal prior to the implementation of Validation Phases I and II, in which a large group of users will be involved.

In other words, at the end of M13 (after the first summer schools), at the end of M18 (after Validation Phase I) and again at the end of Validation Phase II (M30), the partners involved in validation activities will prepare intermediate validation reports presenting the results achieved and identifying the areas and aspects that need to be improved in order to better meet the requirements and expectations of OSR users. The partners involved in the educational and technical design will then be in a position to modify the OSR system (tools and functionality) on the basis of these inputs before the extended validation activity (Phase II) takes place. The OSR partners will design new validation scenarios (WP5) in order to validate the Proof of Concept studies, verify whether the improvements introduced in the OSR portal respond to user demands, and identify which aspects should be further refined.

The extended validation activity (Phase II), which will involve a larger number of users, will also allow the project consortium to conduct a summative evaluation in order to assess whether the project's objectives have been met and to what extent the OSR project has contributed to the eContentplus programme priorities.

