
Special Theme:

Information Security

European Research Consortium for Informatics and Mathematics
www.ercim.org

Number 49 April 2002


CONTENTS

KEYNOTE

3 European Research Area – and Beyond by Philippe Busquin, Commissioner for Research

JOINT ERCIM ACTIONS

4 The Norwegian University of Science and Technology is now a Member of ERCIM

6 Kick-off Meeting on Image Understanding by Eric Pauwels, CWI

6 Euro-Legal by Heather Weaver, CLRC

7 Working Group Matrix Computations and Statistics – Second Workshop by Bernard Philippe, INRIA

7 ERCIM Sponsored Events

7 Cor Baayen Award 2002

SPECIAL THEME

8 Information Security — Introduction by Michael Waidner, IBM Zurich Research Lab, Switzerland

9 Methods and Tools against Hacker Attacks by Andreas Wespi, IBM Zurich Research Lab, Switzerland

10 Regulating Access to Web-published Data by Pierangela Samarati, University of Milan, Italy

11 An.on — Privacy Protection on the Internet by Hannes Federrath, Dresden University of Technology, Germany

12 Privacy Protection and Trust Models by Olle Olsson, SICS

13 idemix for Internet Anonymity by Endre Bangerter, Jan Camenisch, Els Van Herreweghen and Michael Waidner, IBM Zurich Research Lab, Switzerland

14 Towards Digital Credentials by Stefan Brands, McGill University, School of Computer Science, Canada

16 The Cryptography Group at Aarhus University by Ronald Cramer, Aarhus University, Denmark

17 New Prime Factorisation Record obtained using the General Number Field Sieve by Friedrich Bahr, Jens Franke and Thorsten Kleinjung, University of Bonn, Germany

18 The Heuristic Evolution of Security and Insecurity by John Clark and Jeremy Jacob, University of York, UK

19 Cryptography using Hyperelliptic Curves by Norbert Göb and Georg Kux, FhG-Institute for Industrial Mathematics

20 Trusted Logic’s Approach to Security for Embedded Systems: Innovation and Pragmatism by Dominique Bolignano, Daniel Le Métayer and Claire Loiseaux, Trusted Logic SA, France

21 Verification of Cryptographic Protocols used in Fixed and Mobile Networks by Tom Coffey, Reiner Dojen and Tomas Flanagan, University of Limerick, Ireland

22 Security and Safety through Static Analysis by Chris Hankin and Thomas Jensen, INRIA/IRISA

23 Security: Formal Methods at Work by Stefano Bistarelli, Fabio Martinelli and Marinella Petrocchi, IIT-CNR

25 CORAS - A Framework for Risk Analysis of Security Critical Systems by Theo Dimitrakos, Juan Bicarregui, CLRC and Ketil Stølen, SINTEF Group, Norway

26 SMM – Assessing a Company’s IT Security by Holger Kurrek, FhG-Institute for Software and Systems Engineering

27 MICOSec: CORBA as a Secure Platform for Mobile Applications by Rudolf Schreiner and Ulrich Lang, ObjectSecurity Ltd, UK

29 Security for Distributed and Mobile Active Objects with the ProActive Library by Isabelle Attali, Denis Caromel and Arnaud Contes, INRIA

30 Mobile IP Security in Foreign Networks by Sami Lehtonen, VTT

31 Security Issues underpinning Large Virtual Organisations by Theo Dimitrakos, Brian Matthews and Juan Bicarregui, CLRC

32 Information Systems Security at CLRC by Trevor Daniels, CLRC

34 Secure Collaboration in Global Computing Systems by Christian Damsgaard Jensen, Trinity College Dublin

35 Integrating Biometric Techniques with Electronic Signature for Remote Authentication by Luca Bechelli, Stefano Bistarelli, Fabio Martinelli, Marinella Petrocchi and Anna Vaccarelli, IIT-CNR

36 Secure Resale of Tangible and Digital Goods by Harald Häuschen, University of Zurich

37 Managing Authorisations by Babak Sadighi Firozabadi and Mads Dam, SICS

38 The Role of Smart Cards in Practical Information Security by Javier López, Antonio Maña, Pedro Merino and José M. Troya, University of Málaga, Spain

40 Realizing Trust through Smart Cards by István Mezgár and Zoltán Kincses, SZTAKI

RESEARCH AND DEVELOPMENT

42 Phytoplankton Dynamics — the Struggle for Light by Ben Sommeijer, CWI

43 Virtual and Interactive Environments for Workplaces of the Future by John Wilson, University of Nottingham, UK

44 Blind Image Analysis helps Research in Cosmology by Emanuele Salerno, Luigi Bedini, Ercan Kuruoglu and Anna Tonazzini, IEI-CNR

45 Relativistic MHD Computation of Gamma-ray Bursts by Barry Koren, CWI

46 The CORVAL2 Contribution to achieve Confidence in Middleware by Ina Schieferdecker, Axel Rennoch and Dorota Witaszek, FhG-FOKUS

47 COVAX: an Internet Access to Libraries, Archives and Museums by Luciana Bordoni, ENEA/UDA, Italy

TECHNOLOGY TRANSFER

48 A Cluster of European Projects on Agents and Middleware Technologies by Massimo Busuoli and Emanuela Rasconi, ENEA/UDA, Italy

49 Jalios: master your Content by Vincent Bouthors, Jalios, France

EVENTS

51 SOFSEM 2001 - 28th Conference on Current Trends in Theory and Practice of Informatics by Gabriela Andreijkova, SRCIM

51 W3C European Interoperability Tour 2002

52 Workshop on Current Research Directions in Computer Music by Leonello Tarabella, Graziano Bertin and Gabriele Boschi

52 Announcements

55 IN BRIEF

Cover image: illustration by Orc Lönn, SICS

KEYNOTE

European Research Area – and Beyond
by Philippe Busquin, Commissioner for Research

At the Lisbon summit in 2000, Heads of State and of Government adopted a new strategic objective for the European Union for the next decade, which is to make Europe the most competitive and dynamic knowledge-based economy in the world.

At the same time it was clear that such an ambitious objective would only be achieved if the Union were to undertake major initiatives. In this context the Commission has proposed measures to make Europe much more active and attractive in the field of research, new technologies and innovation.

When I became Research Commissioner, just over two years ago, I proposed that research should be recognised as a vital area of Union policy. This was accepted, and in Lisbon the highest EU representatives endorsed the creation of a European Research Area.

As a result, there is a new awareness that the knowledge-based society requires a much more pro-active, coherent and encompassing vision of the way Europeans manage research. At Union level, research needs a consistent policy with structured, forward-looking objectives.

The Sixth Framework Programme 2002-2006 was consequently designed above all to be a structuring instrument for realising the European Research Area. As a result it features two major innovations.

The first one is that concentration on a few priorities (such as genomics, information society and sustainable development) has been accepted as a key principle to achieve a critical mass of finance and knowledge.

The second important change concerns new implementing instruments, namely networks of excellence and integrated projects of a much larger order of magnitude than current Community research projects, and the possibility of providing European support for joint research initiatives by several Member States. These innovations will allow the Framework Programme to act as a catalyst within the European Research Area.

In addition, I have recently laid two strong ideas on the table of the Spanish EU Presidency. The first is quantitative: upon my initiative, the Commission has proposed that the Union set itself the objective of increasing its global research expenditure to 3% of GDP by 2010, instead of less than 2% today.

We need to catch up with the United States and Japan. I stress that this is not a matter of increasing public expenditure but of stimulating private sector investments in research: annual R&D investment by European companies currently lags behind that of their US competitors by 43%, or almost €71 billion. We cannot claim to become the most dynamic knowledge-based economy while producing far less knowledge than our competitors.

The second objective is qualitative. Creating a knowledge-based Europe clearly hinges on both research and education policies. The dual mission of universities in this respect illustrates clearly to what extent these two fields are linked. Together with my colleague Viviane Reding, the Commissioner for Education and Culture, we are going to put forward an integrated research and education strategy to foster the European Area of Knowledge.

I consider it vital that the European Union achieve these objectives if it is to fulfil the Lisbon goals, and I call on representatives of the science and technology community, such as ERCIM, to take up these objectives in all the countries and regions of Europe.

Philippe Busquin



JOINT ERCIM ACTIONS

The Norwegian University of Science and Technology is now a Member of ERCIM

NTNU, the Norwegian University of Science and Technology, now represents the Norwegian research community in informatics and mathematics, including the relevant departments at SINTEF, the University of Oslo, the University of Bergen, the University of Tromsø and the Norwegian Computing Center. Membership was established with support from The Research Council of Norway. Until recently Norway was represented by SINTEF.

The research at NTNU is structured mainly through the basic organisation and five prioritised strategic research areas. The responsibility for ERCIM activities rests with the Faculty of Information Technology, Mathematics and Electrical Engineering (http://www.ime.ntnu.no/eng/). Some of the research activities will be co-ordinated within the framework of the strategic research area of Information and Communication Technology (ICT) (http://www.ntnu.no/satsingsomraader/ikt/english.htm).

NTNU is the major centre for technological education and research in Norway. Traditions in natural sciences and technology are interwoven with broadly based expertise in the classical university disciplines of the humanities, medicine and the social sciences. The Norwegian Institute of Technology (NTH), founded in 1910, is now an integral part of NTNU.

A specific ambition is to promote cross-disciplinary interplay between all forms of human intellectual activities, the arts, the natural and social sciences and technology. This is considered to be of particular importance when exploiting the opportunities of ICT in promoting the information society and innovations for industrial development.

Membership of ERCIM strengthens our position in contributing to European and international development and in providing Norway with an internationally competitive level of technological know-how.

Professor Arne Sølvberg, Dean of the Faculty of Information Technology, Mathematics and Electrical Engineering, is heading the ERCIM activities at NTNU. He is a member of the Board of Directors. Professor Finn Arve Aagesen, Head of the Department of Telecommunications, is the representative on the Executive Committee.

NTNU key information:
• 7 faculties with a total of 79 departments
• 453 professors
• 1,770 other academic staff and doctoral students
• total number of employees: 3300 (37% women)
• at any given time, about 130 academics are on sabbatical
• 20,000 students.

NTNU’s research staff is continuously engaged in some 2000 R&D projects. In addition, 20-30 major scientific conferences are hosted by NTNU every year. NTNU has bilateral agreements concerning student exchanges with more than 200 foreign universities.

NTNU co-operates closely with SINTEF, a major independent European research institution with about 2,000 employees. SINTEF was established by the university and is located on the university campus. About 25% of the R&D projects of SINTEF are highly relevant to the ERCIM community.

The Faculty of Information Technology, Mathematics and Electrical Engineering has 270 academic staff and doctoral students and is responsible for around 20% of the educational activities at NTNU. The five strategic research areas are:
• Information and Communications Technology (ICT), with special focus on Web technology and ICT and learning
• Materials Technology
• Medical Technology and MR
• Energy and Environment
• Marine and Maritime Technology.

NTNU main campus with Trondheim city center in the right-hand background.


Focus Area ICT-Web Technology
NTNU’s prioritisation of ICT will ensure access to competent professionals for the ICT industry and for all enterprises and other activities in society that make use of ICT. Our plans are developed in co-operation with business and industry.

The academic diversity at NTNU makes it possible to cover most aspects of ICT in research and teaching. The main areas are:
• computer supported co-operation
• information resources
• user interface
• software and system services
• information transport and networks
• electronics and hardware.

NTNU’s research within electrical engineering, computer and information science, mathematics, etc., is not limited to the above-mentioned fields within ICT-web technology. For information about the entire range of subjects, see the outline of subjects under the Faculty of Information Technology, Mathematics and Electrical Engineering: http://www.ime.ntnu.no/eng/. Leader of the focus area ICT-web technology is Professor Arne Sølvberg.

Focus Area ICT and Learning
The strength of this central field lies in the collaboration between pedagogical and technical experts and academic staff who work with the development of knowledge from many different angles. The central field benefits from the collaboration with business and industry and with system suppliers of telecommunications. Important issues in this field of research are:
• the integration of ICT and learning in teaching at NTNU
• the role of ICT in education in primary, secondary and supplementary education
• interdisciplinary communication
• the development and evaluation of ICT-supported methods of teaching
• experimentation with virtual laboratories and distance learning over the Internet.

The Laboratory for ICT and Learning and the NTNU forum for ICT and Learning are central to the research activities. Leader for ICT and Learning is Professor Bjørn Sørenssen, Department of Art and Media Studies, NTNU.

Links:
NTNU main website: http://www.ntnu.no/indexe.php
Faculty of Information Technology, Mathematics and Electrical Engineering: http://www.ime.ntnu.no/eng/
Strategic research area ICT: http://www.ntnu.no/satsingsomraader/ikt/english.htm

Please contact:
Tore R. Jørgensen, NTNU, Faculty of Information Technology, Mathematics and Electrical Engineering
Tel: +47 73598035
E-mail: [email protected]

SonoWand, an ultrasound-based navigation system for image-guided keyhole surgery, was developed by MISON, a spin-off company from NTNU and SINTEF. MISON is the first company to resolve a compact integration of high-quality 3D ultrasound and neuronavigation. They have achieved international recognition for their accomplishments in ultrasound-guided surgery. In December 2001 they were awarded the prestigious European IST Grand Prize for the best IT product in Europe.

Neuronavigation systems have become very common in neurosurgery in recent years. Conventional systems offer the capability of navigating surgical instruments into the brain based on MR or CT images that are several days old. As the operation proceeds, such images can no longer be trusted: ‘the map does not correspond with the terrain’.

SonoWand enables the surgeon to easily update the map during surgery. It takes only a few seconds to scan the brain with high-quality 3D ultrasound. The surgeon can then navigate tiny instruments down to the tumor and perform image-guided surgery with high precision (approximately 1 mm). Ultrasound images of similar quality as MR images simplify identification of small remaining tumor structures towards the end of the surgery, and the high-precision navigation system simplifies localization and removal through a smaller opening in the normal brain.


Kick-off Meeting on Image Understanding
by Eric Pauwels

To focus the efforts of several ERCIM partners in high-level vision and image processing, a meeting on Image Understanding was organized in February at CWI.

The meeting drew participants from INRIA, KTH (Stockholm), UTIA (Prague), and TCD, as well as endorsements from CNR, SZTAKI, and FORTH. It served as a kick-off meeting for an official ERCIM Working Group, formal approval being expected later this Spring.

Motivation
Due to a confluence of various technologies, the capturing, storage and manipulation of images or video has become inexpensive and straightforward. As a consequence, vast collections of image data and video streams are being amassed in digital libraries rapidly expanding in scope and size. Unfortunately, this evolution might fall victim to its own success, as the sheer size and complexity of these digital collections hamper efficient data mining. There is a growing consensus that this new bottleneck can only be alleviated by the development of content-aware processing methodologies that can automatically extract metadata with high semantic value and adapt the processing accordingly. The name of the working group, Image Understanding, has been chosen because it is a broad term that seems to cover all of the above-expounded goals.

Research Themes
The working group intends to organise its activities around three subthemes which by necessity share a lot of common ground. Below we give for each subtheme a non-exhaustive subject list:
• Indexing and retrieval: content-based image and video retrieval, (semi-)automatic generation of semantic metadata, cross-modality data mining (eg, combining images and text)
• Information integration: fusion of different image modalities, content-aware image enhancement
• Visual decision and control: visual inspection and expert systems, visual decision and control systems (eg, auto-pilots for cars), intelligent surveillance.

Methodologies
Progress in the above applications will call for methodological advances in research areas such as:
• generic probabilistic models for spatial structure detection and reasoning
• generic mathematical models for spatial processing (eg, morphology, PDEs, wavelets)
• computational intelligence (eg, simulation, Bayesian nets, evolutionary computing).

A scientific workshop will be held as part of the ERCIM meetings on Thursday 6 June in Vienna. For expressions of interest, as well as further information and suggestions, please use the contact details below.

Link:
http://www.cwi.nl/ERCIM/WG/Image_Understanding/

Please contact:
Eric Pauwels, CWI
Tel: +31 20 592 4225
E-mail: [email protected]

Euro-Legal

News about legal information relating to Information Technology from European directives, and pan-European legal requirements and regulations.

Network and Information Security
The recent EU Communication on Network and Information Security has as its key requirement the need for interoperability between IT systems and security solutions. Member States are responsible for incorporating effective information security into their e-procurement and e-government activities and in so doing offer a lead to private sector organisations. As technology evolves, the use of managed security service providers that offer integrated security solutions is increasingly being considered in an effort to stay ahead of illegal hackers.

Cybercrime
The Communication on cybercrime, under discussion within the EU, recommends that a European forum be established to identify security problems and appropriate pan-European solutions to deal with cybercrime. The law across European jurisdictions at the present time varies considerably and lacks consistency. The demand for security solutions and associated legal advice will increase as awareness of the issues surrounding cybercrime grows.

Data Protection
Article 17 of the Data Protection Directive 95/46/EC requires Data Controllers and Data Processors to ensure that security measures are in place to protect personal data, and that those measures are appropriate to the nature of the data and the risks involved with the processing. The ‘appropriate measure’ in respect of sensitive data, ie data about racial or ethnic origin, religious beliefs, sexual orientation, trade union membership, political opinions, physical or mental health or condition, criminal offences, proceedings or convictions, will require market-leading security devices. At the present time there are no international standards for such security solutions.

e-learning
The research firm IDC revealed in their recent study ‘European Corporate Business Skills Training Market Forecast and Analysis 2000-2005’ that e-learning represented just 3% of the business skills training market in 2001. According to IDC, only as recently as last year were European training organisations offering a mix of instructor-led and e-learning services. The use of e-learning techniques for IT training was slightly higher than for traditional business training, but even so e-learning in IT training only accounted for 6% of the market. IDC estimates a growth rate of 69% in e-learning for IT skills training between 2000 and 2005; the equivalent growth rate in the business skills sector was estimated at 108%.

by Heather Weaver, CLRC
Tel: +44 1 235 446151
E-mail: [email protected]


ERCIM Working Group Matrix Computation and Statistics — Second Workshop
by Bernard Philippe

The second workshop of the Working Group on Matrix Computation and Statistics was held in Rennes, France, on 14-15 February 2002.

The Working Group ‘Matrix Computation and Statistics’ aims to find new topics of research emerging from several statistical applications that involve the use of linear algebra methods. The members are particularly interested in very large problems requiring the design of reliable, robust and fast procedures.

Second Workshop
The second workshop of the working group was hosted by INRIA in Rennes. Thirty participants attended fifteen presentations, including two invited talks.

Nick Higham gave the first invited lecture, on a problem of linear algebra encountered in finance for the stock correlation study. The second invited speaker, Eric Moreau, presented the state of the art for Independent Component Analysis, a popular problem in signal processing. Both talks were video recorded and are available at http://www.irisa.fr/bibli/videos/.

Thirteen other speakers presented various problems encountered in computational statistics, statistical signal processing and statistical communications. A number of presentations focused on ill-conditioned linear systems arising in large least-squares problems and considered associated techniques such as the SVD or rank-revealing factorisations. Three talks were related to regression models, and finally one talk was devoted to adaptive detection in mobile communication. The workshop programme, the abstracts and the slides of each presentation are available at the group’s website.
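The truncated-SVD technique mentioned above can be illustrated in a few lines of NumPy. The sketch below is not taken from any of the workshop presentations; the matrix, threshold and data are invented purely for illustration of solving an ill-conditioned least-squares problem by discarding tiny singular values.

```python
# Minimal sketch: truncated-SVD solution of an ill-conditioned least-squares
# problem, instead of the numerically unstable normal equations.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
A[:, 4] = A[:, 3] + 1e-10 * rng.standard_normal(100)   # nearly dependent column
b = rng.standard_normal(100)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = 1e-8 * s[0]                        # drop directions with tiny singular values
s_inv = np.where(s > tol, 1.0 / s, 0.0)
x = Vt.T @ (s_inv * (U.T @ b))           # truncated-SVD (minimum-norm) solution

print("condition number:", s[0] / s[-1])
print("truncated-SVD solution:", x)
```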

The third workshop will be held in Neuchâtel, 9-10 November 2002.

Links:
Website of the group:
http://www.irisa.fr/aladin/wg-statlin/
http://www.unine.ch/iiun/matrix/index.htm
Videos of the invited talks:
http://www.irisa.fr/bibli/videos/

Please contact:
Bernard Philippe, INRIA
Tel: +33 2 99 84 73 38
E-mail: [email protected]

Cor Baayen Award 2002

The 5000 Euro Cor Baayen Award for the most promising researcher in computer science and applied mathematics was created in 1995 to honour the first ERCIM President. The award is open to any young researcher having completed their PhD thesis in one of the ‘ERCIM countries’, currently: Austria, Czech Republic, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Norway, Slovakia, Sweden, Switzerland, the Netherlands and the United Kingdom. The selected fellow will be invited to the ERCIM meetings in autumn.

Rules for Nomination
Nominations for each country are made by the corresponding ERCIM Executive Committee member (also referred to as the ‘national contact’). Those who wish a particular candidate to be nominated should therefore contact the ERCIM Executive Committee member for their country. Nominees must have carried out their work in one of the ‘ERCIM countries’ and they must have been awarded their PhD no more than two years prior to the date of nomination. Each ERCIM institute is allowed to nominate up to two persons from its country. A person can only be nominated once for the Cor Baayen Award. The selection of the Cor Baayen Award winner is the responsibility of the ERCIM Executive Committee.

How to Nominate
To propose a nomination to your national contact, fill out the Cor Baayen Award Nomination Form available at the ERCIM website.

Deadline
30 April 2002: nominations are to be received by the national contacts. Further information can be obtained from your national contact or from the ERCIM Cor Baayen Award coordinator Lubos Brim.

Links:
The ERCIM Cor Baayen Award: http://www.ercim.org/activity/cor-baayen.html
National contacts: http://www.ercim.org/contacts/execom/

Please contact:
Lubos Brim, CRCIM
Co-ordinator for the Cor Baayen Award
E-mail: [email protected]

ERCIM Sponsored Events

ERCIM sponsors up to fifteen conferences, workshops and summer schools per year. The funding for all types of events is in the order of 2000 Euro.

Forthcoming Events Sponsored by ERCIM:
• Eurocrypt 2002 - Annual Conference on Cryptology Research, Amsterdam, 28 April - 2 May 2002
• MFCSIT2002 - Second Irish Conference on the Mathematical Foundations of Computer Science and Information Technology, National University of Ireland, Galway, Ireland, July 18-19, 2002
• ISSTA - International Symposium on Software Testing and Analysis, Rome, 22-24 July 2002
• CONCUR 2002, Brno, Czech Republic, 20-23 August 2002
• IEEE Joint International Requirements Engineering Conference (RE'02), Essen, Germany, 13-20 September 2002
• HCI02 - Human-computer Interaction for Mobile Devices, Pisa, Italy, 18-20 September 2002
• SOFSEM 2002 - 29th Annual Conference on Current Trends in Theory and Practice of Informatics, Milovy, Czech Republic, 24-29 November 2002

For detailed information about ERCIM event sponsorship, see: http://www.ercim.org/activity/sponsored.html


SPECIAL THEME: INFORMATION SECURITY


Information Security — Introduction
by Michael Waidner

Information security is one of the cornerstones of the Information Society. Integrity of financial transactions, accountability for electronic signatures, confidentiality within a virtual enterprise, privacy of personal information, dependability of critical infrastructure: all depend on the availability of strong, trustworthy security mechanisms. Ensuring the availability of these mechanisms requires solving several substantial R&D problems.

Security Engineering
Building secure systems has to evolve from an art into a security engineering discipline, with well-defined methods for constructing secure systems out of secure subsystems and basic components, and for assessing and formally validating security. Security-preserving notions of composability need to be developed, combining techniques from software engineering, secure hardware design, formal methods and cryptography. European researchers in particular have made a lot of progress in this direction, but nevertheless security engineering is still closer to black magic than to science: the number of new vulnerabilities is still growing at a dreadful rate.

Protecting Privacy
Protecting privacy in the information society requires good security, but more than security is needed: tools that empower ordinary, non-technical users to control the information they reveal about themselves; business models and processes that balance the personal information they consume against the customer value they generate; tools that enable enterprises to define and enforce their privacy practices, and to manage the identity and profile information given to them in a trustworthy and responsible way. The development of enterprise privacy technologies in particular has only just started.

Intrusion Tolerance
One of the most important R&D topics is the development of intrusion-tolerant systems: such systems work securely and safely even if some subsystems have been successfully attacked and maliciously corrupted, which is inevitably the case for most large systems. Such systems might even react to detected intrusions by reconfiguring themselves into a less corrupted state. Over the last few years a lot of work has been done on developing intrusion-tolerant systems, in particular in the context of secure group communication and service replication. More work will be needed, in particular towards intrusion tolerance for systems based on large dynamic and ad-hoc groups. New trustworthy, intrusion-tolerant means of authentication, authorization and management for such systems need to be developed.

Intrusion Detection
A closely related topic is that of intrusion detection: how to detect intrusions or, more generally, high-risk situations? Many sensors have been developed that watch for specific situations, along with tools to correlate alarms generated by different sensors in order to get a better picture. But the main problems remain open: intrusion detection systems generate far too many false alarms, and rarely suggest effective reactions to true alarms. More R&D is needed to improve the quality and meaningfulness of alarms, eg, by considering semantically richer layers and specific applications.

This special issue points to some of the existing European R&D in information security. Although far from being a complete collection, it gives a good impression of where the European research community is putting its efforts today, and where one can expect, or at least hope for, more results in the future.

Please contact:
Michael Waidner, IBM Zurich Research Lab
Tel: +41 1 724 8220
E-mail: [email protected]
http://www.zurich.ibm.com/~wmi


Methods and Tools against Hacker Attacks
by Andreas Wespi

The research work of the Global Security Analysis Lab (GSAL) Zurich is dedicated to ensuring that the benefits and convenience of networked computing continue to outweigh the risks of operating in an open networked environment.

Increasingly refined intrusion-detection techniques allow users to operate with confidence, in spite of the vast number of attacks that threaten computer systems. The Zurich lab supports IBM’s security consulting practices and managed security services by developing methodologies and tools for the detection, prevention and analysis of recurring hacker attacks.

A key component of the GSAL’s work on intrusion detection is its Vulnerability Database (VulDa), a comprehensive database of computer system weaknesses. Developed by the GSAL and now maintained by IBM’s Managed Security Services (MSS) organisation, it contains all known attacks, hacks and countermeasures. It therefore provides a powerful tool for continued intrusion-detection research and, at the same time, supports IBM’s security offerings by allowing carefully controlled access to the data by IBM consultants as well as IT architects and developers. VulDa is unique with respect to its size and the quality of the data it contains. It is a product of the continuous monitoring of information that is publicly available on the Internet (hacker Web sites, newsgroups, mailing lists, ftp sites) and of data obtained from confidential IBM sources. The filtering and classification methods developed by GSAL provide unique options for accessing the data. Employing three different search engines has optimised the efficiency of information retrieval from the more than 40 gigabytes of compressed data contained in VulDa. In addition to a classical full-text search engine, advanced search techniques involving document clustering and vulnerability profiles vastly improve the accuracy of searches.

The GSAL research activities concentrate on distributed intrusion-tolerant intrusion detection systems and are structured along three main axes: (i) development of new ID (Intrusion Detection) sensors, (ii) development of an ID management console, and (iii) research on the application of the dependability paradigm to intrusion detection. The third item constitutes the core of a three-year European project, which started in January 2000 and involves six partners. The Zurich laboratory is co-leader of this project.

Some results of the first two axes constitute the core of the Tivoli Risk Manager product and of a novel ID sensor called ‘Web IDS’, targeted at detecting attacks against Web servers. Tivoli Risk Manager enables organisations to centrally manage attacks, threats and exposures by correlating security information from various intrusion detectors. The solution enables administrators to eliminate clutter such as false positives, while quickly identifying the real security threats. This helps administrators respond with adaptive security measures. Web IDS is a real-time intrusion-detection system. It addresses penetration of the system, denial-of-service attacks, legal but undesirable activity, existing server vulnerabilities, and policy violations. This is required technology for e-business, because network intrusion-detection systems have a limited ability to detect Web attacks. The Web intrusion-detection system is designed specifically for content-based attacks on URLs using http or https.

Furthermore, three other new ID sensors have been developed and are currently being used. The first one is called a ‘sniffer detector’, which, as its name implies, is designed to detect so-called sniffers (passive intruders) in a network by simulating network traffic using intentionally false information as bait. Any reuse of this information indicates that a system has been compromised. This can also help locate an intruder within the network.

The second prototype is a so-called behaviour-based approach to intrusion detection. It monitors the behaviour of a system and sends an alert when a deviation from normal behaviour occurs. This behaviour-based approach is used for processes running on UNIX machines. It has broken new ground by applying the ‘Teiresias’ algorithm, originally used for DNA sequencing, to intrusion detection.
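The sketch below illustrates the general behaviour-based idea rather than the GSAL prototype itself: normal behaviour is modelled as the set of short system-call subsequences observed during training, and a live trace is scored by how many of its subsequences were never seen before. Fixed-length n-grams are used here only as a simple stand-in for the variable-length patterns that the Teiresias algorithm discovers; the traces and threshold are invented.

```python
# Minimal behaviour-based anomaly detection sketch over system-call traces.
from typing import Iterable, List, Set, Tuple

def ngrams(trace: List[str], n: int) -> Set[Tuple[str, ...]]:
    # All consecutive subsequences of length n in the trace.
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

class BehaviourModel:
    def __init__(self, n: int = 3):
        self.n = n
        self.normal: Set[Tuple[str, ...]] = set()

    def train(self, traces: Iterable[List[str]]) -> None:
        # Record every subsequence observed during normal operation.
        for trace in traces:
            self.normal |= ngrams(trace, self.n)

    def anomaly_score(self, trace: List[str]) -> float:
        # Fraction of subsequences in the trace never seen in training.
        grams = ngrams(trace, self.n)
        if not grams:
            return 0.0
        return len(grams - self.normal) / len(grams)

model = BehaviourModel(n=3)
model.train([["open", "read", "write", "close"],
             ["open", "read", "read", "close"]])
print(model.anomaly_score(["open", "read", "write", "close"]))   # 0.0 -> normal
print(model.anomaly_score(["open", "mmap", "execve", "close"]))  # 1.0 -> alert
```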

The third prototype, called RID (‘routing intrusion detection’), has been developed primarily by a group of the Zurich Lab’s Communications Systems department, based on its core competency in routing algorithms. The GSAL is contributing to this joint project by providing expertise on the ID front. The principle of RID is to monitor a network for significant deviations from its normal behaviour. Intrusion detection is crucial for providing active security in a network. As a by-product, it also provides a means of automatically detecting potential system misconfigurations or errors that may affect overall network operation. An example of a routing intrusion is a reachability attack: an intruder floods false reachability information to hijack calls or to generate a denial-of-service attack. An RID application prototype which detects reachability attacks in OSPF and PNNI has been implemented and is operational.

Please contact:
Andreas Wespi, IBM Zurich Research Lab, Switzerland
Tel: +41 1 724 8264
E-mail: [email protected]


Regulating Access to Web-published Data
by Pierangela Samarati

The overall goal of the FASTER project (Flexible Access to Statistics Tables and Electronic Resources) is the development of a flexible and open system for controlled dissemination of statistical information. The system includes two major security components: a Statistical Disclosure component, for the sanitization of sensitive tables, and an Access Control component, allowing enforcement of protection requirements on published data.

Today’s society places great demand on the dissemination and sharing of information. With the development and widespread use of the Internet and the World Wide Web, organizations in the private and public sectors are increasingly required to make their data available to the outside world. A growing amount of data is being collected by statistical agencies and census bureaus for analysis and subsequent distribution to the general public or to requesting organizations (eg, research institutions, government offices). Data producers can release their data directly, as in the case of national statistical institutions, or through the mediation of archive institutions (data publishers) that collect data from various sources for subsequent distribution.

This data distribution process is clearly selective: data cannot just be released to anybody. For instance, certain sensitive data can only be released to authorised individuals and/or for authorised purposes (eg, health data). Some data is subject to time restrictions and can only be released to the general public after a certain period; some data can be released only for non-commercial purposes; other data can only be released on payment. These few examples already give an idea of the variety of protection requirements that may have to be enforced. There is thus the need for a powerful and flexible access control system able to enforce the different requirements that the data producers (or publishers) may want to impose on data access.

In the context of the FASTER project, we have developed an Access Control System for specifying and enforcing protection requirements on published data, such as statistical tables that have already undergone a statistical disclosure control process, survey results, etc.

The access control component is based on a simple, expressive language for the specification of protection requirements. The approach has the following features (a minimal sketch of how such rules could be evaluated follows the list):
• Abstractions/classifications support: this supports access rules based on the typical abstractions used by data producers and publishers, which can define categorizations of users, purposes of use, types of operations, and data objects.
• Metadata-dependent access rules: the authorization language allows the expression of access rules based on conditions on metadata describing (meta)properties of the stored data and the users, which can be represented through system-maintained profiles. For instance, an access control rule could grant EU citizens access to all census data more than 20 years old, where both the citizenship of the requesters and the age of the data are represented as metadata.
• Dynamic condition support: access to certain data may depend on conditions that can only be evaluated at run-time, possibly via interaction with the user. Examples of such conditions, which must be associated with procedural calls executing the necessary actions, are agreement acceptance (which can be as simple as clicking an ‘ok’ button in a pop-up window), payment fulfillment, registration, or form filling.
• Expressiveness: the language is based on flexible rules specifying the accesses to be granted via boolean expressions evaluating properties of the requestor (metadata), of the data being accessed, as well as of the context (dynamic conditions). It also allows the expression of two kinds of access rules: authorizations and restrictions. Authorizations correspond to traditional permissions specifying sufficient conditions for an access, whereas restrictions make it possible to express conditions necessary for an access. The combined support of the two kinds of rules provides a natural fit for the types of protection requirements examined in real-world scenarios known to the partners.
• Declarative: the language also has a simple declarative form, making it easy to use for non-specialists in the field.
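As a rough illustration of how authorizations (sufficient conditions) and restrictions (necessary conditions) could be combined at evaluation time, here is a minimal sketch in Python. It is not the FASTER language or implementation; the metadata fields ('eu_citizen', 'category', 'age_years') are hypothetical and merely mirror the census example above.

```python
# Minimal sketch of combining authorizations (sufficient) and restrictions
# (necessary) over requester and data metadata.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

Condition = Callable[[Dict[str, Any], Dict[str, Any]], bool]

@dataclass
class Policy:
    authorizations: List[Condition] = field(default_factory=list)  # sufficient conditions
    restrictions:   List[Condition] = field(default_factory=list)  # necessary conditions

    def permits(self, requester: Dict[str, Any], data: Dict[str, Any]) -> bool:
        # Grant access only if at least one authorization holds
        # and every restriction holds.
        granted = any(rule(requester, data) for rule in self.authorizations)
        allowed = all(rule(requester, data) for rule in self.restrictions)
        return granted and allowed

# Hypothetical rule mirroring the example in the text: EU citizens may access
# census data, but only if it is more than 20 years old.
policy = Policy(
    authorizations=[lambda r, d: bool(r.get("eu_citizen")) and d.get("category") == "census"],
    restrictions=[lambda r, d: d.get("age_years", 0) > 20],
)

print(policy.permits({"eu_citizen": True}, {"category": "census", "age_years": 30}))  # True
print(policy.permits({"eu_citizen": True}, {"category": "census", "age_years": 5}))   # False
```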

The Access Control System has been implemented and integrated in the FASTER architecture, which has been developed in a collaboration between the Data Archive at Essex University (UK), the Information Technology Dept. of the University of Milan (Italy), the Norwegian Social Science Data Services (Norway), the Dansk Data Arkiv (Denmark), the Centraal Bureau voor de Statistiek (Netherlands), the Central Statistics Office (Ireland), the Statistisk Sentralbyrå (Norway), and the Centre National de la Recherche Scientifique (France). The access control language and component developed at the University of Milan are being adopted by the project partners to express and enforce protection requirements on the data they make available.

The approach used to develop an access control system for web publishing is now being extended to the support of credentials and certified statements (instead of requiring them to be stored at the server as metadata) and to policy composition. Policy composition refers to the controlled combination of access constraints independently specified by different authorities (eg, data respondent, publisher, producer, and privacy advocates and regulators).

Links:
FASTER project: http://www.faster-data.org/
Security Group at the Information Technology Department, University of Milan: http://seclab.dti.unimi.it

Please contact:
Pierangela Samarati, University of Milan, Italy
E-mail: [email protected]


An.on — Privacy Protection on the Internet
by Hannes Federrath

An.on is a joint project of Dresden University of Technology and the Privacy Commissioner of Schleswig-Holstein, Germany. Its aim is to enable every user to protect his privacy on the Internet.

Using Internet services nowadays means leaving digital traces. More and more companies try to use these traces to create individual profiles of Internet users. Moreover, people searching for help or advice on the Internet in particular do not want others to gain knowledge of their problems or diseases; just imagine drug-related advisory services or medical information services. Even in the field of e-commerce, anonymity plays a big role, because no one is happy receiving spam email as a result of his Internet activities.

The An.on project wants to help everyone to protect his e-privacy. The open-source software developed within the project tries to reach this goal. The client software JAP provides anonymous and unobservable communication on the Internet. Upon this basis any privacy-related Internet service could be built. JAP runs on the Java platform and is easy to install and use, to enable even greenhorns among Internet users to protect their privacy. Two scenarios using JAP are conceivable:
• JAP helps to protect the personal privacy of a single user. JAP can be installed on the user’s computer to protect his Internet activities.
• JAP helps to protect the privacy of an organization. JAP can also be installed on a dedicated machine, for example on a proxy or firewall. There JAP serves as a privacy gateway for the entire company, and there is no need to install software on the users’ workstations. This might be very useful for companies that want to hide their transactions and/or research activities on the Internet from observation by competitors or from other spying activities. (Note that a single user within the organization is traceable by the organization.)

JAP acts as a local proxy between the browser and the insecure Internet. All requests for Web pages go directly to JAP, and are multiply encrypted there. Instead of directly fetching the requested pages, the request is forwarded through a chain of multiple intermediate servers (named Mixes by the inventor of the theoretical background, David Chaum) offered or initiated by the An.on project. The encrypted requests travel through this chain of Mixes to the final destination on the Internet. The web server’s responses are returned along the same route. Figure 1 illustrates the architecture of the system. The multiple layers of encryption protect all messages. A Mix samples messages in a batch, changes their coding (removes one layer of encryption) and forwards them all at the same time, but in a different order. All messages have the same length.

Future research in the An.on projectconcentrates on integrating a paymentfunction in JAP and research on services(eg pseudonymous email) that might bebuilt upon the infrastructure An.onalready provides. We are always lookingfor partners - ISPs, IT security compa-nies, networking companies, privacycommissioners - who are willing tooperate a Mix and would like to supportthe idea of providing a world wideanonymity service. We are open to part-ners who want to discuss commercializa-tion of our service.

An.on is a joint project from DresdenUniversity of Technology and the PrivacyCommissioner of Schleswig-Holstein,Germany. From 2001 to 2003 the projectis sponsored by the German FederalMinistry of Economics and Technology.

Link: An.on project: http://www.anon-online.org/

Please contact:Hannes Federrath, Dresden University of Technology, Germany Tel: +49 351 463 38247E-mail: [email protected]

An.on is a joint project from Dresden University of Technology and the PrivacyCommissioner of Schleswig-Holstein/Germany. Its aim is to enable every user toprotect his privacy on the Internet.

An.on — Privacy Protection on the Internetby Hannes Federrath

Figure 1: Architecture of the system.

User A

User B User C User D

User E

Figure 2: JAP user interface.


Privacy Protection and Trust Models
by Olle Olsson

The SAITS project, a cooperation between SICS, Stockholm University and others, will investigate technical aspects of privacy protection. Trust models are one of the techniques to be evaluated.

We are investigating the problem of how computational components can acquire and use knowledge about the reliability of other components. This problem domain is conceptualised in terms of agents that interact with each other, where agents (consumers) may need services, and where other agents (producers) can offer such services. Consumers need to select which producer to enter into a contract with, and they should make this choice with the aim of optimising their accumulated utility, in a short-term or long-term perspective. In open computational environments, a consumer is only able to observe the external behaviour of producers, ie, selection of a producer can only be done on the basis of hard facts about the past performance of the producers.

How well does this model match properties of real-world problems? One example is electronic shopping on the Web. We consumers have to select which retailer to use, and often we only encounter these retailers on the web. Which one should we choose? Experiences, good or bad, may indicate that some retailers could be preferred while others should be avoided. But it is also important to take into account the potential gains we may make: cheap but dubious vs. expensive but reliable. Furthermore, we can make our decisions based on our own experiences, as well as on what we know about other consumers’ experiences of the producers. This scenario is an example of a wider class of applications. Important characteristics of this class are that components can profit from using services provided by ‘alien’ components, that such alien components have externally observable behaviour, that they may deliver services of a priori unknown quality, and that the quality of services can be established after delivery.

Computational Models of Trust
Practical methods for building systems that use trust as a component in decision making are still in an immature state. The aim of this research is to develop a rational approach to trust-based systems, and to take initial steps towards an engineering methodology for such systems.

A rational approach to trust concerns identification of the theoretical basis of trust. Key elements are incorporated from related disciplines, eg, decision theory and utility theory. As trust is a concept with meaning in a societal context, the theory of social choice offers important scientific underpinnings. Theoretical frameworks such as those mentioned provide alternative models of fundamental concepts, as well as proofs of the limits of what can be achieved.

A core problem is the ontological status of the concept of trust. Agents that communicate trust knowledge must conceptualise trust in compatible ways. To some extent trust can be avoided as an explicit concept, by making agents disseminate information about their experiences in terms of their concrete interactions with other agents. The drawback of this approach is that an agent may thereby disclose private information, information that could compromise the privacy of the agent. By abstracting experiences into statements about trust, an agent can prevent personally sensitive information from becoming publicly accessible. An intermediate solution is to identify ‘regions of trust’, where the amount of communicated detailed information depends on the proximity of the trust regions to which the sender and the receiver belong.

A practical engineering methodology must take into account a number of practical issues, eg, how to choose between alternative algorithmic methods for making trust-based decisions. As there are infinitely many such algorithms, it is important to understand in what way a specific class of algorithms contributes to the utility of the user. There may be real-world properties that have a critical impact on how well certain algorithms succeed in optimising the utility of the user, and, from an engineering point of view, it is critical to understand how such environmental factors may influence the usefulness of specific trust-based methods.

We have developed a generic model of trust, on the level of generic problem-solving methods. A workbench has been developed, based on this model, in which systems built on configurable and parametric trust models can be simulated. Preliminary results have been obtained in terms of sensitivity analysis, eg, how strong or weak the correlation is between utility and some environmental factor.
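As an illustration of what one such configurable, parametric trust model might look like, here is a minimal sketch (not SICS code): each consumer keeps a trust value per producer, updates it after every interaction with a tunable learning rate, and chooses the producer with the best trust-weighted offer. The producer names and delivery probabilities are invented for the example.

```python
# Minimal sketch of a parametric trust model: trust is updated from observed
# outcomes and used to weight advertised offers when choosing a producer.
import random
from collections import defaultdict

class TrustModel:
    def __init__(self, learning_rate: float = 0.2, initial_trust: float = 0.5):
        self.alpha = learning_rate
        self.trust = defaultdict(lambda: initial_trust)

    def update(self, producer: str, outcome: float) -> None:
        # outcome in [0, 1]: 1 = service fully delivered, 0 = total failure.
        self.trust[producer] += self.alpha * (outcome - self.trust[producer])

    def choose(self, offers: dict) -> str:
        # offers maps producer -> advertised value; weight it by current trust.
        return max(offers, key=lambda p: self.trust[p] * offers[p])

model = TrustModel()
offers = {"cheap_but_dubious": 1.0, "expensive_but_reliable": 0.7}
for _ in range(50):
    chosen = model.choose(offers)
    # Hypothetical environment: the dubious producer delivers only 30% of the time.
    delivered = random.random() < (0.3 if chosen == "cheap_but_dubious" else 0.95)
    model.update(chosen, 1.0 if delivered else 0.0)
print(dict(model.trust))   # accumulated trust in each producer
```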

The trust model will be evaluated within a project focussing on privacy in the information society (SAITS). From a social point of view, individual citizens may protect themselves by being co-operative members of their society, and by informing their peers about their experiences from earlier interactions with others. This ‘gossiping’ model is fundamentally about establishing a societal knowledge base of trust experiences, and about the use of this in individual decision making. Important questions are: how far can privacy protection be strengthened through such means, how can ‘exchange rates’ between different value/preference domains be established, and what are the threats to such societal mechanisms (eg, how to prevent knowledge bases from being compromised or manipulated)?

Please contact:
Olle Olsson, SICS
Tel: +46 8 633 1500
E-mail: [email protected]


Modern forms of electronic communi-cation and commerce are such that prac-tically any transaction leaves a data trailin the global Internet, ie, informationabout who conducted which transactionwith whom and when. Novel in thisrespect is not the fact that these datatrails exist but rather the ease with whichlarge amounts of data from manydiverse sources can be gathered,combined, and exploited in a widevariety of ways.

A scenario to illustrate what this canmean for each one of us is quicklygiven: Anyone who books a hotel roomwill probably be registered in an elec-tronic system with his or her name,home address, and length of stay. Thisin itself may have numerous advantages,also for the hotel guest, for example interms of frequent-flyer mileage. On theother hand, a malicious hotel employeecould supply the guest's home address toa complice for a low-risk burglary in theabsence of the owner. A regularcustomer of an on-line bookshop mightappreciate receiving reading sugges-tions based on his or her specific prefer-ences, but will be less than pleased ifpersonal data is passed on to a thirdparty, and the reading preferences areexploited for other purposes.

Using the Internet in a variety of waysincreasingly puts personal data at risk.But just as modern technology is a threatto one's privacy, it also provides manytools for its protection: Making consis-tent use of cryptographic encoding whensaving and transmitting data protectspersonal data from uninvolved thirdparties. A conscientious check of a busi-ness partner’s identity, for exampleusing digital signatures and public-keyinfrastructures (PKIs), can help toensure that personal data are only givento partners deemed trustworthy. Finally,

various technical tools exist to specifythe privacy preferences of an individual,for example, which data one is willing toreveal for what purpose and to whom.Moreover, many organizations thatprocess data publish details of theirprivacy policy, such as which data aregathered, and to whom and for whatpurpose they may eventually beforwarded.

All these technologies help to prevent anaccidental transfer of personal data or atransfer under unclear conditions. Butthey cannot prevent data being passedon by, for example, a malicious hotelemployee or, probably a much morefrequent occurrence, being accessed bya curious employee or accidentallyrevealed because of technical problems.

This is why researchers at IBM’s ZurichResearch Laboratory in Rüschlikon,Switzerland, go one step further andinvestigate concepts that embrace ‘dataparsimony.’ The basic concept is verysimple: personal data is best protected ifnot revealed at all, ie, if the amount ofdata revealed is kept to a minimum. Theidea is not new, many laws on theprotection of personal data contain datafrugality as an implicit guideline. Thequestion then is how ‘minimum’ isdefined.

Anybody who rents a car has to produce a valid driver's license, thereby – whether voluntarily or involuntarily – revealing a wealth of personal data. Actually, the car rental agency only needs the name and address of the person renting a car in the event of an emergency. As long as there is no accident, it would suffice to know that the person renting a car is in possession of a valid driver's license. In this case, the data minimum could be quite easily achieved: the name and address in the driver's license could simply be replaced by a randomly chosen artificial name, a pseudonym, provided that in an emergency the name hidden by a pseudonym could be retrieved.

The 'idemix' system developed in Rüschlikon uses precisely such pseudonyms for e-commerce transactions: today, anybody who subscribes to an on-line service has to register with the service using a user name and a password which he or she has to produce each time he or she wants to access the service. Under 'idemix' a user would first select a pseudonym, then register using this pseudonym and receive the corresponding credentials with an electronic signature. If later the user wants to access the service, he or she need only provide proof to the service that the corresponding, digitally signed credentials are in his or her possession.

Of course, a user could merely present his or her pseudonym and the credentials; however, in many cases this would invalidate the desired data-protection

The protection of data and privacy in the Internet is an issue of increasing concern and importance. Such protection requires not only that general standards and laws be agreed upon but also that technical measures be taken. An example of the latter is the 'idemix' project at IBM's Zurich Research Laboratory in Rüschlikon.

idemix for Internet Anonymity
by Endre Bangerter, Jan Camenisch, Els Van Herreweghen and Michael Waidner

In the left scenario, the organizations are the motor vehicle administration, an insurance company, and a sports car rental agency. Alice on the left wants to rent a car. To do so, she needs to show the car rental agency her driver's licence and an insurance policy. Conventionally, Alice would just send such documents to the rental agency, which would then check them for validity. However, Alice thereby has to reveal to the rental agency all kinds of unnecessary information, such as her name, her address, and the details of her insurance policy. Idemix allows Alice to convince the rental agency that she owns a driver's license and an insurance policy without actually sending them, and thus the car rental agency gets to know only the information required but nothing more.


Often at least one of the parties in a transaction needs to know whether the other party is authorized to perform a certain action. Typically, authorization is granted on the basis of a person's privileges, personal characteristics, reputation, identity, membership of a group, willingness to provide value in exchange, and so on. In all these cases, the verifying party must rely on the inspection of one or more tangible objects issued by trusted third parties. Examples of such 'credentials' are coins and bank notes, stamps, medical prescriptions, cinema tickets, voting ballots, membership cards, access tokens, diplomas, passports, and drivers' licenses.

Physical credentials are increasingly prone to counterfeiting, however, and cannot be transferred by mobile devices, personal computers, and chipcards. Well-known cryptographic techniques,

Digital Credentials are the digital equivalent of paper documents and other tangible objects traditionally used for establishing a person's privileges, characteristics, identity, and so on. Since they are just cryptographically protected binary strings, they can be electronically transferred and can be verified with 100 percent accuracy by computers. Digital Credentials preserve the key privacy properties of paper-based documents and plastic tokens, but offer much greater security. They are entirely feasible, as R&D work during the 1990s has shown. Several academic prototypes have been developed recently, and a new Canadian company, Credentica, is now commercializing Digital Credentials.

Towards Digital Credentials
by Stefan Brands

advantages in that the on-line service could monitor when and how a user uses the service, which in turn could result in an involuntary de-anonymization. In addition, the on-line service would often even know who owns the pseudonym, for example, for billing purposes.

By employing modern cryptographic techniques, so-called zero-knowledge proofs, researchers at IBM have succeeded in resolving this issue: the pseudonym and credentials are given to the on-line service only in encrypted form. Although the on-line service cannot decrypt the information, it can still employ a clever interaction tactic with the user to verify the authenticity of the encrypted pseudonym and that the user does indeed own correct, digitally signed credentials. In an equally secure manner the user can supply credentials received from another organization to the on-line service. The car rental agency in our example could in this way receive proof of possession of a valid driver's license from the authorities, and of a valid credit card from a bank.
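The flavour of such zero-knowledge techniques can be conveyed by a toy Schnorr-style proof of knowledge of a discrete logarithm, sketched below in Python. It is only an illustration of the principle (a verifier is convinced the prover knows a secret that is never transmitted), not the actual idemix protocol, and the group parameters are far too small for real use:

    import secrets

    # Public group parameters: g has prime order q in Z_p^* (tiny toy values).
    p, q, g = 607, 101, 64

    x = secrets.randbelow(q)         # prover's secret
    y = pow(g, x, p)                 # public value; proving knowledge of x with y = g^x

    # Prover commits
    r = secrets.randbelow(q)
    t = pow(g, r, p)

    # Verifier challenges
    c = secrets.randbelow(q)

    # Prover responds
    s = (r + c * x) % q

    # Verifier checks the proof without ever learning x
    assert pow(g, s, p) == (t * pow(y, c, p)) % p
    print("proof accepted; the secret x was never revealed")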

A user can in principle present his or her credentials any number of times in this way. Because a new encryption is used every time, the repeated use is hidden from the on-line service, ie, the user is not re-identified and thus, so to speak, can act completely anonymously. However, for many applications this total anonymity is undesirable: for example, if a rented car is not returned, the identity of the person who rented the car has to be retrievable. Therefore, the idemix system also has provision for a designated authority who can uncover such an identity. In the case of an 'anonymized' driving license, it could for example be the office that issued the license; in a business context it could be a third party trusted by both business partners.

The IBM researchers have also found a solution for another potential problem inherent in such data-protection measures: total anonymity could entice a 'generous' user to share his or her pseudonym and credentials for an expensive on-line service with friends and acquaintances. To render this as unattractive as possible, idemix is set up such that all pseudonyms and credentials of a user are interleaved in a clever and secret manner. If a user allows a friend to use one of his or her credentials, idemix is configured such that this is equivalent to granting permission to use all of his or her credentials. Sharing a password for a magazine subscription would then be virtually equivalent to sharing the secret PIN code of one's ATM card and the secret code for the safety deposit box at the bank. By using smartcards to implement parts of idemix, this sharing of pseudonyms and credentials can be further precluded.

Currently the idemix developers are building a prototype system to guarantee anonymity in the Internet. Decisive factors are not only the functionality but also the speed of the system. It is estimated that transactions using idemix take about five times as long as transactions executed in the traditional manner. In practical use, however, idemix will not lead to a noticeable slowing down of transactions, because the actual processing of a transaction typically takes only milliseconds.

Link:
idemix project: http://www.zurich.ibm.com/security/idemix/

Please contact:
Endre Bangerter, Jan Camenisch, Els van Herreweghen and Michael Waidner, IBM Zurich Research Lab, Switzerland
E-mail: {eba,jca,evh,wmi}@zurich.ibm.com


such as digital identity certificates and message authentication codes, are not a solution:
• they encourage large-scale identity fraud and other devastating abuses of security holes that are inevitably caused by fundamentally relying on the central storage and management of sensitive information
• they ignore the privacy rights of individuals: all their actions can be linked and traced automatically and instantaneously, by a multitude of parties
• they do nothing to discourage participants from using each other's credentials.

To overcome these fundamental drawbacks, a transition to cryptographically protected digital forms of credentials is inevitable. (See S. Brands: "Rethinking Public Key Infrastructures and Digital Certificates; Building in Privacy," with a foreword by Ronald L. Rivest, MIT Press, August 2000, ISBN 0-262-02491-8, http://www.credentica.com/technology/book.html; and "A Technical Introduction to Digital Credentials," International Journal on Information Security, ed. Moti Yung, to appear in the August 2002 issue, http://www.credentica.com/technology/overview.pdf.) It is entirely feasible to design Digital Credentials that address the complete spectrum of security, liability, and privacy risks for all parties involved. Digital Credentials preserve the key privacy properties of paper-based documents and plastic tokens, but offer much greater security and functionality:
• they are just sequences of zeros and ones, and so they can be transferred electronically and can be verified with 100 percent accuracy by computers

• the holder of a Digital Credential can selectively disclose a property of the data that the issuer encoded into it, without revealing anything else. For example, the holder of a 'privacy-enhanced' national ID card can prove that her Digital Credential specifies an identity that is NOT equal to one of several identities on a list of criminal suspects, without revealing her identity

• lending of a Digital Credential can be discouraged by encoding confidential data into it, such as a credit card number of the applicant. Even though the applicant can hide the confidential data when using the Digital Credential himself, it is not possible to use the Digital Credential without actually knowing the confidential data

• a limited-show Digital Credential can contain a built-in identifier that can be uncovered by a central party only if the Digital Credential is shown more than a predetermined number of times. See the Figure for an example, and the toy sketch after this list. This property can be used to design secure electronic coins, stamps, gift certificates, and other value tokens

• they can be issued to, or embedded in, low-cost chipcards or other tamper-resistant devices. This provides an additional layer of protection against loss, theft, extortion, lending, copying, and discarding of Digital Credentials, and can prevent other kinds of unauthorized behaviour.
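One classical mechanism behind limited-show tokens (a toy illustration of the idea, not necessarily the construction used in Digital Credentials) is to make each show of a one-show credential reveal a point on a secret line whose slope encodes the holder's identity: a single point reveals nothing, but a second show lets the central party solve for the identity. A minimal Python sketch, with assumed toy parameters:

    import secrets

    q = 2**61 - 1                           # a prime modulus (toy choice)

    identity = secrets.randbelow(q)         # encoded in the credential at issuing time
    blinding = secrets.randbelow(q)         # fresh secret fixed when the credential is issued

    def show(challenge):
        # One 'show': reveals a point on the line y = identity*x + blinding.
        return (identity * challenge + blinding) % q

    c1 = secrets.randbelow(q)
    r1 = show(c1)                           # a single show: r1 is uniformly random, leaks nothing

    c2 = secrets.randbelow(q)               # (with overwhelming probability c2 != c1)
    r2 = show(c2)                           # a second show of the same one-show credential...

    recovered = (r1 - r2) * pow(c1 - c2, -1, q) % q
    print("identity recovered after double-show:", recovered == identity)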

Digital Credentials are suitable for any communication or transaction system that needs to protect the transfer and storage of information. Examples are access control systems (for VPNs, subscription-based services, Web sites, databases, buildings, and so on); privacy-enhanced national ID cards; public transport tickets; electronic voting; e-health systems; financial securities trading; digital copyright protection; road-toll pricing; and electronic money.

The practicality of Digital Credentials has been well established. Notably, last year Zeroknowledge Systems in Montreal developed a wireless prototype for RIM's Blackberry as well as a software development toolkit suitable for a wide variety of applications. Also, from 1993 until 1999, CAFE and OPERA, two major European consortiums co-funded by the European ESPRIT Programme, implemented and extensively tested a chipcard-based electronic cash system based on the Digital Credentials technology. (See Rafael Hirschfeld: "OPERA - Open Payments European Research Association", ERCIM News No. 30, July 1997, http://www.ercim.org/publication/Ercim_News/enw30/hirschfeld.html.) Furthermore, several computer science graduates in Europe and America have independently developed academic prototypes.

In January 2002, a new company called Credentica was founded to commercialize Digital Credentials. Incorporated in Quebec, Credentica's mission is to deliver multi-party secure solutions for applications that involve the electronic transfer of sensitive information. Credentica leverages the Digital Credentials technology and its specialized R&D expertise to develop tailored software and services for the providers of Internet, wireless, and chipcard applications.

Link:
Credentica: http://www.credentica.com

Please contact:
Stefan Brands, McGill University, School of Computer Science, Canada
Tel: +1 515 985 4111
E-mail: [email protected]

Alice fraudulently uses the same one-show Digital Credential at both a medical office and a business. Although each separate transaction cannot be traced by anyone to her identity, the aggregate information is sufficient to enable the central authority to find out her identity.


While cryptography historically is merely about transporting data securely from A to B, the widespread use of computers in today's society poses new challenges to data security solutions based on cryptography. This goes far beyond virus protection and the like. In fact we need secure implementations of quite complicated tasks that were earlier handled by exchange of paper documents. Examples of this are electronic commerce and payments, electronic elections, etc. In this way, cryptography has become an enabling technology that underpins, for instance, the security of countless homebanking systems across the world. The Cryptography Group at Aarhus University specializes in research on solutions of this kind, so-called cryptographic protocols. These can be based on public-key cryptography such as RSA, but may also be based on security that is unbreakable, for instance using secure channels protected by quantum cryptography.

When modern cryptography started, in the 70s and early 80s, there was a large gap between theory and practice: whereas solutions did exist that could be analyzed and whose security could be proved rigorously, these constructions were way too inefficient for commercial use; on the other hand, no good analysis methods were known for the systems that practitioners were happy with. One of our main goals in the Aarhus crypto group is to contribute to bridging this gap and provide efficient constructions that are also provable. Several of our results, for instance in the design of digital signatures and public-key encryption schemes, are in this direction. Often our research is conducted in close collaboration with other international universities and research labs.

We mention here briefly a couple of concrete lines of research that we have been involved in recently. The first of these is centred around secure electronic implementation of elections. Any election procedure needs to ensure that only people authorized to vote can actually do so, that the results correctly reflect the votes cast, that the privacy of voters is protected, and finally that the result can be verified after the election is over. Particularly the last two concerns may appear to be contradictory: if we can verify that all votes are counted and no one voted twice, it may seem to be necessary that each vote is linked to an individual voter, and so privacy would be violated. Fortunately, this is not the case: votes can be cast in encrypted form, after which all votes are combined to form an encryption of the result, which can then be decrypted and made public. But since this can take place without decrypting any single vote, privacy can still be protected. Several of us have contributed to this area recently, for instance in designing protocols that scale well to large elections, or in providing formal security proofs for election protocols. Through our cooperation with Cryptomathic, an Aarhus-based company in security, we are involved in an EU-supported project 'EVote' that aims at making such systems commercially available.

In a more general direction, we have also been active in the design of general multi-party computation: an election may be seen as a game where a number of players (the voters) have inputs (how they want to vote) and we wish to compute some function on the inputs (the result of the election) securely, ie, the result must be correct and we must protect the privacy of the inputs. It is in fact possible to compute any desired function in this way, and one of our main goals has been to provide as efficient as possible solutions of this general type.

A second trend goes towards providing so-called unconditional security: despite progress in the analysis of practical systems, all such schemes in use today are ultimately based on the assumption that certain problems, for instance factoring large integers, are difficult to solve for an attacker.

Unfortunately, we do not know with certainty that any concrete problem really is sufficiently hard. One way to solve these problems is to use quantum communication: we send information encoded in the state of very small physical systems; typically single elementary particles are used. The behaviour of such small systems is governed by quantum physics, and this has some unexpected consequences: information sent in this way cannot be eavesdropped without damaging it in such a way that this can be detected by the receiver. By exploiting this fact properly, we can build channels with security that no amount of computing power can break. These facts have been known since the early 1980s, and the first experimental implementation is from 1990. In collaboration with the Physics Department in Aarhus, we have built a fully operational quantum cryptography prototype and analyzed its security against realistic attacks.
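Why eavesdropping is detectable can be seen even in a classical simulation of the statistics of a BB84-style key exchange: measuring a transmitted bit in the wrong basis randomizes it, so an eavesdropper who measures and resends introduces roughly 25 percent errors on the positions the legitimate parties later compare. The following Python sketch simulates only these statistics and has nothing to do with the Aarhus prototype:

    import secrets

    def bb84(n_qubits=2000, eavesdrop=False):
        # Sender: random bits and random bases (0 = rectilinear, 1 = diagonal)
        bits = [secrets.randbelow(2) for _ in range(n_qubits)]
        bases = [secrets.randbelow(2) for _ in range(n_qubits)]
        channel = list(zip(bits, bases))

        if eavesdrop:
            # The eavesdropper measures each qubit in a random basis and resends it.
            tapped = []
            for bit, basis in channel:
                eve_basis = secrets.randbelow(2)
                eve_bit = bit if eve_basis == basis else secrets.randbelow(2)
                tapped.append((eve_bit, eve_basis))
            channel = tapped

        # Receiver measures in random bases; a wrong basis gives a random outcome.
        recv_bases = [secrets.randbelow(2) for _ in range(n_qubits)]
        recv_bits = [bit if basis == rb else secrets.randbelow(2)
                     for (bit, basis), rb in zip(channel, recv_bases)]

        # Publicly compare bases, keep matching positions, estimate the error rate.
        kept = [i for i in range(n_qubits) if bases[i] == recv_bases[i]]
        errors = sum(bits[i] != recv_bits[i] for i in kept)
        return errors / len(kept)

    print("error rate without eavesdropper:", bb84(eavesdrop=False))   # ~0
    print("error rate with eavesdropper:   ", bb84(eavesdrop=True))    # ~0.25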

The research group in Cryptography at Aarhus University consists of the following members: head of group Ivan Damgård, senior researchers Ronald Cramer and Louis Salvail, and PhD students Jesper Buus Nielsen, Mads Jurik, Maciej Koprowski, Jens Groth, Kasper Dupont, Kirill Morozov and Jesus Fernandez.

Links:
Group Home Page: http://www.brics.dk/Activities/Cryptology/
Center for Quantum Information Processing: http://www.cki.au.dk
Electronic Voting: http://www.cryptomathic.com/news/tech_frame_evote.html

Please contact:
Ronald Cramer, Aarhus University, Denmark
Tel: +45 89 42 3476
E-mail: [email protected]

The Cryptography Group at Aarhus University specializes in research on so-called cryptographic protocols.

The Cryptography Group at Aarhus University
by Ronald Cramer


Since the security of the RSA cryptosystem depends on the difficulty of factoring general integers into primes, the selection of key size for this cryptosystem depends on the state of the art in this area of algorithmic number theory and on likely improvements in the future. Using a new implementation of the general number field sieve (GNFS), we have factored a 158-digit divisor of 2^953-1, establishing a new record for the factorisation into primes of general numbers without small divisors.

The previous record was a 155-digit RSA challenge number factored by a team of mathematicians led by CWI in 1999.

Integer factorisations by GNFS start with a massively parallel polynomial selection step. This is followed by the sieving step, which is also massively parallel and takes most of the CPU time. Both steps are usually carried out using the idle time of ordinary office PCs or workstations. Our new contribution to this step is a new algorithm for lattice sieving, which results in a considerable improvement in speed compared to CWI's implementation.

The most time-consuming part of post-processing the siever output can be thought of as a linear algebra problem over the field F2 with two elements, on a sparse matrix of a few hundred million rows and columns. In a first filtering step, the size of this matrix is reduced to a few million rows and columns. This condensed matrix is then solved using a block Lanczos algorithm for sparse matrices over F2. For the previous record, the filtering step was done on a large workstation and the Lanczos algorithm was run on a Cray 90 supercomputer. We wrote parallel implementations for both the filtering and the Lanczos step, running them on a Linux cluster built by the scientific computing department at the Institute for Applied Mathematics in Bonn.

Conclusion
Due to improvements in the algorithms and their implementation, as well as improvements in computer hardware since CWI broke the 512-bit threshold in 1999, factoring a 512-bit RSA key within a year now costs less than 3000 Euro. Extrapolation of the development of the GNFS in the 1990s makes it likely that 768-bit keys will become vulnerable at comparable cost within the decade starting in 2010. For adversaries with the resources of large or medium-sized governments, they may become vulnerable within this decade.

We are indebted to the scientific computing department at the Institute for Applied Mathematics and to the Mathematical Institute at Bonn University for providing much of the computer hardware used for this record.

Links:
http://www.crypto-world.com/announcements/c158.txt
http://www.loria.fr/~zimmerma/records/gnfs158
http://wissrech.iam.uni-bonn.de/research/projects/parnass2

Please contact:
Friedrich Bahr, Jens Franke, Thorsten Kleinjung, University of Bonn, Germany
Tel: +49 228 73 29 52
E-mail: [email protected]

by Friedrich Bahr, Jens Franke and Thorsten Kleinjung

A team of researchers at the University of Bonn established a new record in the art of factoring general integers into primes on 18 January 2002. This has implications for public-key cryptography.

New Prime Factorisation Record obtained using the General Number Field Sieve

Selected polynomials:
16915642778160*X^5
-756958519686871038*X^4
-13302674486415639704432*X^3
89395931439544311110799193*X^2
81521488422532239989865771400*X^1
-664290154060829787211338433347600*X^0
and
X - 74760919650729255820151370977

The number:
39505874583265144526419767800614481996020776460304936454139376051579355626529450683609727842468219535093544305870490251995655335710209799226484977949442955603

The prime factors:
3388495837466721394368393204672181522815830368604993048084925840555281177
and
11658823406671259903148376558383270818131012258146392600439520994131344334162924536139


Heuristic search techniques such as simulated annealing and genetic algorithms are a major success story of computer science. Their use in cryptology, however, has remained very low key. Most work has been concerned with breaking simple classical ciphers and there has been little application to modern-day cryptological problems. Modern-day crypto-algorithms do not lend themselves to cryptanalysis via heuristic search in the way that classical ciphers often do. This may well have led to heuristic search being discounted as a serious tool for cryptologists. We aim to show that its cryptological power is greatly underestimated and outline below some successes so far. The work ranges from the design of fundamental components, through cryptanalysis of identification schemes, to the automated synthesis of provably correct security protocols.

The Design of Cryptographic Building Blocks
Boolean functions (mappings of vectors of Boolean inputs to vectors of Boolean outputs) are at the heart of modern digital cryptography. Block ciphers such as the Data Encryption Standard and Advanced Encryption Standard, for example, can be regarded as (rather complex) Boolean functions (mapping plaintext input to ciphertext output in a manner determined by a secret key). Modern algorithms use simpler Boolean functions as components. These are carefully constructed to resist modern cryptanalytic attacks. Various resilience properties have been proposed as desirable, but the field is subtle and engineering functions with excellent properties is surprisingly difficult. Mathematical construction remains the primary derivation technique, but for the small single-output case our search approaches, based around simulated annealing, have been able to produce functions with more extreme values of properties than hitherto known. Counter-examples to several conjectures in the literature concerning achievable bounds have been demonstrated (often within a few seconds or less). When functions satisfying many criteria are sought, our approaches have equalled, and in many cases bettered, the best results theoretical construction has provided to date. A feature of optimisation-based approaches is that they extend naturally to cater for additional desirable criteria (the cost functions used are appropriately amended). What counts as desirable, however, depends on who you are. Though the focus of our work is the attainment of functions with excellent profiles of positive properties, we have evidence to show that trapdoor properties may easily be incorporated too. Understanding how, why and when the techniques work effectively remains a major challenge.

Exploiting the Computational Dynamics of Search Techniques
Work over the past decade has shown that timing and power consumption characteristics of a cryptosystem can be exploited to reveal the secret keys used. The injection of hardware faults (even transient ones) into a cryptosystem can be similarly exploited. But it is not widely realised that search techniques such as simulated annealing also have computational dynamics that can be exploited in a cryptological context (using such characteristics we have successfully attacked instances of NP-complete problems underpinning certain identification protocols). The important point here is that annealing is a form of guided search. The way in which the solution moves from a random start point to its eventual destination may provide substantial information about the underlying solution to the problem instance under attack. The order in which solution elements attain their final values (ie, the order in which those elements get stuck at those final values during a search) may be related to the actual sought solution. This may be regarded as a form of timing channel. On a different tack, we have obtained excellent results by warping the problem instance under attack in many different ways (in analogy with fault injection) and have used the sets of final solutions obtained by the searches to locate the solution to the original problem.

Profiling is a crucial component of such attacks, and cryptosystems lend themselves to such approaches. Every secret key, for example, may define a representative instance. We can generate many representative instances, apply heuristic-search-based techniques aimed at finding the underlying secrets, and investigate how the actual secrets sought are related to the final solutions of various search runs and to the trajectories taken to such final solutions. This knowledge can then be applied to instances where the secret key is not known. This has already been demonstrated in the context of some identification schemes (as indicated above) and we now seek to attack block ciphers and public-key algorithms.

Evolving Secure Protocols
A protocol is a series of message exchanges, each message designed to achieve some goal, typically making subsequent messages possible or meaningful. The notion of progress towards goals seems inherent to the concept of a protocol. We have sought to exploit this and have created a prototype framework for the automated synthesis of security protocols (eg key distribution protocols) based around the belief logic of Burrows, Abadi and Needham. We

by John Clark and Jeremy Jacob

Researchers at the University of York in England aim to show that heuristic search techniques are powerful tools for modern-day cryptology.

The Heuristic Evolution of Security and Insecurity


carry out a guided search over the space of feasible protocols. Each protocol achieves something. The closer that something is to what we want, the fitter that protocol is deemed to be. The guided search approach (using simulated annealing or genetic algorithms) allows increasingly fit protocols to be evolved, eventually leading to one (or more) that meets all the intended goals. The formal nature of the belief logic used allows the evolution of protocols whose abstract executions are in fact proofs of their own correctness.

Heuristic search is a powerful tool for professional modern-day cryptology. Its power has barely been tapped.

Please contact:
John A Clark and Jeremy L Jacob, University of York, UK
Tel: +44 1904 433379
E-mail: {jac,jeremy}@cs.york.ac.uk

Nowadays the most popular public-key cryptosystem is RSA. Its security relies on the fact that factoring a large integer known to be the product of two primes of similar size is supposed to be a computationally difficult problem. Nevertheless, good progress has been made in factoring due to the so-called number field sieve algorithm. The current factoring record stands at 158 decimal digits for the composite number (see page 17), which means that 512-bit RSA keys can no longer guarantee security.

Considering this development it is important to have alternatives available. Many proposals have been made, most of which are not comparable with RSA in terms of running time and security per key length. Elliptic curves, however, have proved to be a good choice for building public-key cryptosystems which can seriously compete with RSA. Their security relies on the problem of computing logarithms in the group of points of an elliptic curve. Since this is supposed to be hard for relatively small parameters, they offer high-grade security even for small keys and are therefore the optimal choice for smart cards and other environments that provide only limited storage space. Hyperelliptic curves are generalisations of elliptic curves, ie, they are of a higher genus (elliptic curves have a genus equal to one). Not as much is known about them, but they can be used to construct secure and efficient cryptosystems comparable to RSA.

In our project we investigate how the parameters must be chosen to achieve this goal. Certainly this implies consideration of many theoretical results. Contrary to an elliptic curve, the set of points on a hyperelliptic curve alone does not allow a mathematical group structure. In this case one has to generalise somewhat to find a group that provides similar features. This is called the Jacobi group. To offer reasonable security it is of great importance to know the exact number of elements of this group, the so-called class number. This cannot simply be read off from the curve equation and there is still no known efficient algorithm to compute it for a general hyperelliptic curve. For special classes of curves, however, there exist methods for determining the class number.

Our current research concentrates on finding construction methods for secure Jacobi groups. These include specifying algorithms to test whether the curve resists all known attacks. Another task is to speed up the arithmetic needed in order to improve the running times of the corresponding cryptosystem.

Link:
http://www.itwm.fhg.de/mab/competences/Crypto/

Please contact:
Norbert Göb, FhG-ITWM
Tel: +49 631 303 1861
E-mail: [email protected]

Georg Kux, FhG-ITWM
Tel: +49 631 303 1865
E-mail: [email protected]

The Fraunhofer Institute for Industrial Mathematics in Kaiserslautern is developing a cryptosystem based on hyperelliptic curves. The main goals of this project are the establishment of the algorithmic foundations and the implementation of a prototype.

Cryptography using Hyperelliptic Curves
by Norbert Göb and Georg Kux

A hyperelliptic curve over the real numbers. For cryptographic purposes one considers only curves over finite fields.


IT security is an area with great opportunities thanks to the ever increasing use of embedded components in everyday life (payment cards and terminals, GSM SIM cards, electronic purses etc). It also involves technical challenges because of the very high expectations of the market regarding security (confidentiality, integrity etc) and the scarce resources of devices such as smart cards and small terminals. One of the reasons for Trusted Logic's success is a pragmatic approach based on the combination of two complementary ingredients:
• the definition and the application of rigorous methodologies, and
• the design and use of appropriate innovative tools.

Because security is by its nature a global concern, this combination of methodologies and tools covers all the steps of the design, development and validation of IT products. Here we provide some examples of key components of the security chain and briefly outline Trusted Logic's solutions.

System Design: from Security Analysis to Design and Development
Relying on its experience in security evaluation, Trusted Logic has put forward a methodology covering the complete IT product lifecycle: from the risk analysis to the definition of the security architecture and the development of the product. This methodology is based on a precise definition of the model of the IT environment (including roles, assets to be protected, threats, etc) and of the different refinement levels of the product (from the functional specification to the implementation). Most importantly, this approach is in line with the Common Criteria, which is the international standard for the evaluation of IT product security. In some sense, it can even be seen as an interpretation of the Common Criteria based on Trusted Logic's complementary expertise in security, formal methods and software development.

Trusted Logic has developed a tool supporting its methodology for stepwise refinement. This tool, called TL-FIT, provides a variety of functionalities including modelling, model consistency checking (both internal and refinement) and Common Criteria document generation. TL-FIT illustrates the pragmatic approach stressed in the introduction because it makes it possible to adapt the description style (informal, semi-formal, formal) and the associated verifications to the level of assurance stemming from the risk analysis. It is also representative of the need for innovation in this area, since no other instrumented interpretation of the Common Criteria has been proposed so far and the proper integration of formal and semi-formal methods is still a very active research area.

Validation: Test, Analysis and Verification
Validation often boils down to testing in traditional software engineering practice. Due to the very high requirements in the area of secure embedded systems, it is necessary to use a wider range of techniques and, most importantly, to justify their use and motivate their complementarity. First, testing itself has to be conducted in a systematic way. Trusted Logic's offer includes both security testing and functional testing. The first category is based on the potential vulnerabilities identified for the product and the second relies on its functional specification. Functional testing is supported by a tool, called TL-CAT, which provides different levels of automation (from the direct description of test suites in a high-level language to the automatic generation of test cases). The other validation techniques used by Trusted Logic are program analysis and program verification. Program analysers have been designed and developed by Trusted Logic to prove various properties of Java Card programs, from standard Java bytecode type correctness to specific security policies. Depending on the technical and economical context, such analysers can run either on the smart card itself, on hardware security modules, or on mundane workstations. The most ambitious verifications, which cannot be performed automatically by a program analyser, can be conducted within an interactive theorem prover. As an illustration, Trusted Logic has used Coq, the prover developed by INRIA, to check the correctness of its bytecode verifier and other key components of a Java Card virtual machine.

Conclusion: Innovation and Pragmatism
Trusted Logic's original market was the banking sector and smart card industry, because this is where a strong need for secure embedded systems first appeared. This market is currently expanding to include terminal manufacturers, GSM, telecommunication operators, application providers etc, and Trusted Logic has become a leading actor in this area. Its reference list includes major players such as Visa, MasterCard, Schlumberger, Gemplus, Ingenico, France Telecom, Sun etc. We believe that the key factors for this success are the following:

1. Innovation: embedded systems have evolved extensively during the last decade and the rate of progress is not expected to slow in the next few years. One of the key technical factors here is the advent of open systems, especially through the rolling out of Java-based solutions. This facility introduces further

by Dominique Bolignano, Daniel Le Métayer and Claire Loiseaux

Trusted Logic is a start-up company stemming from INRIA which was created in January 1999. Its core business is the development and validation of secure embedded systems.

Trusted Logic’s Approach to Security for Embedded Systems: Innovation and Pragmatism


challenges in terms of security, and so far we have seen only a small number of the new possibilities offered by open systems. To meet these challenges, Trusted Logic devotes a substantial part of its manpower to innovation (as an illustration, most of the techniques sketched in this paper are patented and licensed) and this effort will continue to increase in the future.

2. Pragmatism: the first motto could be "the use of the appropriate technique for the job". The choice of a technique (method or tool) should ultimately be motivated by the risk analysis and by the identification of the key security components which warrant the strongest validation efforts (the top level being verification using a theorem prover). Another crucial issue is compatibility with standards: as mentioned above, all the methodologies and tools described here are in line with the Common Criteria. This compatibility is of prime importance both from a business and a technical point of view; in particular, it makes it easier to factorise efforts and share a common body of knowledge and methods within the security community. Among the other standards which are at the core of Trusted Logic's activities, we should mention the languages of the Java family (in particular Java Card), the GlobalPlatform services for secure card management and the emerging profiles for small terminal interoperability (STIP and MIDP).

In conclusion, we believe that it is necessary for Europe to foster new collaborative efforts between research centres and innovative companies on security for embedded systems. Trusted Logic already collaborates in different ways with several European research centres, including INRIA, and intends to reinforce and multiply these links in the future.

Link:
http://www.trusted-logic.fr/

Please contact:
Daniel Le Métayer, Trusted Logic S.A., France
Tel: +33 1 30 97 25 14
E-mail: [email protected]

Network security encompasses cryptographic protocols and algorithms used to ensure secure communication in a hostile environment. Such secure communication is indispensable in areas such as the Internet, e-commerce and mobile communications. Fixed and mobile networks are vulnerable to a variety of attacks, such as message replay, parallel session, data interception and/or manipulation, repudiation and impersonation. Before trusting security protocols, it is necessary to have some degree of assurance that they fulfil their intended objectives.

Traditionally, general-purpose cryptographic protocols have been designed and verified using informal and intuitive techniques. However, the absence of formal verification of these protocols can lead to flaws and security errors remaining undetected. For example, the public-key authentication protocol of Needham and Schroeder was considered secure for over a decade. Verification of the Needham-Schroeder protocol using formal logics exposed this protocol's vulnerability to replay attacks. This highlights the fact that even comparably simple protocols are difficult to verify by intuitive techniques.

Formal verification aims at providing a rigid and thorough means of evaluating the correctness of a cryptographic protocol so that even subtle defects can be uncovered. Protocol verification methods include state-space searching, the use of process algebras and logic-based analysis. Unfortunately, these formal verification methods are highly complex and cannot be easily applied by protocol designers.

The work undertaken by the Data Communications Security Laboratory at the University of Limerick includes the development of a logical technique which can be used to reason about public-key cryptographic protocols. The technique combines the modal logics of knowledge and belief. Axioms are used to model the low-level properties of cryptographic communication systems such as: (i) data encryption and decryption, (ii) the transmission and reception of network data and (iii) the inability of a principal to decrypt a message without knowing the appropriate private key. These properties can be combined using inference rules to allow analysis of a wide range of public-key cryptographic protocols. The constructs of the logic are general purpose, enabling users to deduce their own theorems, thus allowing for increased flexibility.

by Tom Coffey, Reiner Dojen and Tomas Flanagan

The research undertaken by the Data Communications Security Laboratory at the University of Limerick includes: cryptographic algorithms and protocols for open hostile environments, non-repudiation protocols, global-wide identification, authentication and access control schemes, formal verification of security protocols using logical techniques, and the developing area of steganography and information hiding.

Verification of Cryptographic Protocols used in Fixed and Mobile Networks


With its use of programmable smartcards and payment via the Internet, electronic commerce must guarantee the confidentiality and integrity of the data involved in transactions. The ever-increasing presence of software in these applications means that verifying that this software conforms to such security requirements becomes an all-important task which is far from trivial. The IST project 'Secsafe' has as its aim the development of methods for validating the safety and security of software that apply to both domains. This has led to the project focussing a substantial part of its efforts on the Java programming language and its dialect Java Card, dealing with both source-level and bytecode-level applications. The project is divided into five activities: specification of security properties, semantics, static analysis, algorithms and tools. The partners in the project are Imperial College (coordinator), the Technical University of Denmark, INRIA and the SME Trusted Logic SA.

The primary thrust of the project has been the security of multi-application smart cards programmed using the Java

Security and Safety through Static Analysis by Chris Hankin and Thomas Jensen

Static analysis of programs is a proven technology in the implementation of compilers and interpreters. Recent years have begun to see the application of static analysis techniques in areas such as software validation and software re-engineering. 'Secsafe' is an IST project that aims at demonstrating that static analysis technology facilitates the validation of security and safety aspects of systems based on the Internet and on smart cards.


Protocol validation using this logic can be accomplished by deductive reasoning based on the application of valid rules of inference to sets of valid axioms. Once the logical syntax, rules of inference and axioms have been specified, the deduction process proceeds as follows:
• Formally express the protocol steps in the language of the logic.
• Formally express the desired protocol goals.
• Starting with the initial protocol assumptions, build up logical statements after each protocol step using the logical axioms and inference rules.
• Compare these logical statements with the desired protocol goals, to see if the goals are achieved.

It is important that the protocol goals be correctly formulated. If the desired goals have not been achieved, then this generally points to a missing hypothesis in the protocol assumptions or the presence of some protocol flaw. If a protocol goal is inaccurate or unaccounted for, then the verification cannot succeed. In this way, a formal analysis not only assesses the reliability of a protocol, but also compels a designer to formally and unambiguously state the protocol assumptions and desired goals.

Future work is aimed at developing a tool-suite to simplify the formal verification process, because current techniques are highly complex and their application is error prone. Major components in this work include:
• automating the application of the logic, which requires the translation of the axioms, theorems and inference rules of the logic used into the language of some proving engine
• assistance in specifying protocols in the language of the logic, which involves providing a user-friendly interface to simplify the specification of the goals, assumptions and protocol steps.

Such a formal verification tool will offer communication security protocol designers, for both fixed and mobile networks, a significant advantage in the world ICT marketplace. It will enable them to prove the security and trustworthiness of a cryptographic communication protocol at an early stage of its design. Designers will then be able to prove, and improve, the security of the cryptographic protocols before the implementation phase begins, thereby reducing costs and increasing productivity. In addition, this formal verification will significantly add to user confidence in the end product.

In the current economic climate, where security poses a most serious obstacle to the continued growth of e-commerce and m-commerce, the requirement that security protocols be formally verified cannot be overstated.

Link:
http://www.ece.ul.ie/Research/DataComms

Please contact:
Tom Coffey, University of Limerick, Ireland
Tel: +353 61 202268
E-mail: [email protected]

Protocol Verification Process.


Card language. Results so far include a list of security properties that Java Card applets should satisfy. Among other things, these properties are concerned with the allocation of memory, the handling of exceptions and the flow of information between applets. Another tangible outcome of the project is the definition of the Carmel intermediate language and its semantics. The Carmel language is distilled from the Java Card bytecode language with the aim of providing a small set of bytecodes that retains all the features of Java Card. This language has been given an operational semantics that specifies in detail the action of each Carmel instruction. In particular, it describes the workings of the firewall that separates the applets installed on a multi-application Java Card.

The verification of security properties is done by a static analysis that will build a correct approximation of the dynamic behaviour of the applets. This approximation will include a description of how data and objects flow between applets, which methods are called by a given applet and which instructions might lead to a security exception being raised. The project aims at developing a family of analyses such that the precision and cost of an analysis can be adjusted to the verification problem at hand. The flow logic framework for specifying static analyses has been chosen as an appropriate technology for ensuring this flexibility. It will be combined with powerful constraint resolution techniques in order to implement the analyses in an efficient manner.

One of the major problems in the field of multi-application smart cards is how to download applets dynamically onto a card in a secure fashion. Due to the limited resources on the card, only a certain amount of verification can be done on-card. The Secsafe project contributes to overcoming this difficulty by investigating how analyses can be done as cost-effectively as possible and in a modular fashion. A first and important step towards this goal was achieved when Trusted Logic managed to design a bytecode verifier for Java Card that is efficient enough (in particular memory-wise) to be executed on-card. For this achievement, Trusted Logic won the 2001 European IST Prize. Extending this approach to other kinds of security verifications requires the development of techniques for modelling and analysing fragments of code that will support the inference of secure interfaces for applets. These interfaces will list the properties that a foreign applet must satisfy in order to ensure that its loading will not jeopardise the overall security of the card. The theoretical tools employed in this task include modular abstract interpretation and novel approaches to the semantics of open systems.

Link:
Project homepage: http://www.doc.ic.ac.uk/~siveroni/secsafe/

Please contact:
Thomas Jensen, IRISA/CNRS, France
Tel: +33 2 99 84 74 78
E-mail: [email protected]

Cryptography has long been regarded as the main practical means to protect the confidentiality of information traveling on communication networks. It is now also being adopted in many more complex applications, where the correctness of the algorithm does not guarantee 'per se' the correctness of the application. Procedures that apply cryptography are largely being used at the moment for message authentication, personal identification, digital signatures, electronic money transfer and other critical applications. Even if we assume that the cryptography in such procedures is completely reliable, weaknesses may result from the way in which it is used and assembled in the communication protocol. Noteworthy examples of this range from academic cryptographic protocols, such as the Needham-Schroeder public key protocol (1978), which was believed to be correct for several years until shown to be flawed by Lowe in 1996 (using formal techniques), to industrial applications, such as the Java programming language (which was found to have type flaws leading to security holes) and the recently announced security holes in Netscape Navigator and Internet Explorer. Many of these could conceivably have been prevented by a careful formal design and analysis.

The detection and prevention of bugs are in fact two of the main reasons for using formal methods and related approaches: the specification of a system is an essential tool for analysis, and may help to discover many design errors. If the specification is given in an executable language, system execution can be simulated, making it easier to verify certain properties (early prototyping). Other reasons to use formal specifications typically include the need to express user requirements unambiguously, and to produce a reference guide for the system implementer.

Security is becoming a crucial issue in economic and social activities that involve electronic transactions. The 'Istituto di Informatica e Telematica' (IIT-CNR) is conducting several activities in the field of computer security. In particular, one group is involved in the definition and application of correct and rigorous formal methods for the analysis of network and system security aspects.

Security: Formal Methods at Work
by Stefano Bistarelli, Fabio Martinelli and Marinella Petrocchi


In the formal analysis approach, a security protocol (or architecture) is commonly described as a process in an executable specification language. This process is designed to act in a hostile environment, usually represented as another process of the language (the attacker). In the worst-case analysis scenario, the attacker has complete control over the communication network, ie, it can intercept, fake and eavesdrop on all communications. The entire system can be analyzed by applying specific techniques. For instance, security is sometimes analyzed by comparing the state-space resulting from the execution of the protocol with and without the attacker. The differences may represent possible attacks that have to be carefully studied. It is worth noting that the attacker is able to deduce new messages from the messages it has received during a computation. The basic algebraic features of cryptographic functions are represented as rewriting rules for terms of a language that denote cryptographic messages. This means that there may be rules that allow an attacker to discover a message encrypted with a certain key when the attacker also holds the correct decryption key.
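The attacker's deduction capability can be made concrete with a small closure computation over symbolic messages. The Python sketch below is a minimal illustration of the idea (it models only pair-splitting and decryption with known keys, and is unrelated to the IIT-CNR tool itself):

    # Messages are tuples ("pair", a, b), ("enc", key, body), or atomic strings.
    def closure(knowledge):
        known = set(knowledge)
        changed = True
        while changed:
            changed = False
            new = set()
            for m in known:
                if isinstance(m, tuple) and m[0] == "pair":           # split pairs
                    new.update({m[1], m[2]})
                if isinstance(m, tuple) and m[0] == "enc" and m[1] in known:
                    new.add(m[2])                                      # decrypt with a known key
            if not new <= known:
                known |= new
                changed = True
        return known

    intercepted = {
        ("pair", "kAB", ("enc", "kAB", "secret")),   # a key sent alongside a ciphertext
        ("enc", "kCD", "other-secret"),              # ciphertext under a key never revealed
    }
    derived = closure(intercepted)
    print("secret" in derived)         # True: the attacker learns it
    print("other-secret" in derived)   # False: kCD was never disclosed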

Analysis methods of this type can also be implemented in automated software tools. These tools can be used by (reasonably) non-expert people and, hopefully, by the end-user of a security application in order to achieve a better comprehension of the security mechanisms offered by the application itself.

Our current and future activities in the field of formal analysis of computer security can be summarized as follows:

Theoretical: Our goal is to develop new and more efficient analysis techniques for security protocols and open systems. Recent advances concern the simulation of possible attacks using symbolic techniques to represent the state-space of the system under attack more succinctly. Other techniques aim at defining quality measures with respect to the relevance of possible attacks on security protocols, by enabling assessment of the relative merits of the protocols.

Applicative: We are now developing and testing a software tool (PaMoChSA, or Partial Model Checking Security Analyzer) implementing our analysis techniques. Features of the current implementation include:
• the possibility to check a number of security properties, eg confidentiality, message and entity authentication, integrity
• no specification needed for the attacker
• an underlying theory that is almost parametric with respect to the set of term rewriting rules for modelling cryptography
• a compiler translating from the common (and ambiguous) notation for security protocols used in the literature to a more accurate notation based on formal description techniques.

We are now applying our verification tool to real-life case studies. For example, we have performed a conceptual analysis of some procedures of the open source software OpenCA, which is basically a set of procedures for running a Certification Authority issuing X.509 digital certificates. We have also analyzed some security mechanisms of the Simple Certificate Enrollment Protocol (SCEP).

Several national agencies and institutions support our research, for instance the Italian National Research Council (CNR), the Italian Ministry for the University and Scientific and Technological Research (MURST), and the Center of Excellence for Research, Development and Demonstration of Advanced Information and Communication Technology (CSP).

Link: http://www.iat.cnr.it/attivita/progetti/progetti.html

Please contact:
Fabio Martinelli, IIT-CNR
Tel: +39 050 315 3425
E-mail: [email protected]

PaMoChSA graphical interface.


A proper understanding of the limitations of the existing infrastructures is an important prerequisite for designing new services with a satisfying degree of security. In our opinion, an improved methodology for risk analysis is a necessary first step towards verifying and/or improving the security of such systems.

Ideally, risk management should be applied across all aspects of dependability. However, the increasing complexity of information systems urges the improvement of existing design and analysis methods in order to increase the likelihood that all possible threats are taken into consideration. More particularly, there is a need for combining complementary security risk analysis methods with respect to the system architecture. We are not aware of an already developed integrated approach to system design and risk analysis, where the architecture expressed in the information system model is used to guide the combined application of risk analysis techniques. This need is being addressed in the European project CORAS for the area of security risk analysis.

An Overview of CORAS
The overall objective for the CORAS project is to develop a practical framework for model-based security risk assessment by exploiting the synthesis of risk analysis methods with semiformal specification methods supported by an adaptable tool-integration platform. As illustrated by the following figure, the CORAS framework has four main anchor-points.

The CORAS risk assessment methodology integrates aspects of HazOp analysis, Fault Tree Analysis (FTA), Failure Mode and Effect Criticality Analysis (FMECA), Markov Analysis as well as CRAMM. It is model-based in the sense that it gives detailed recommendations for the use of UML-oriented modelling in conjunction with assessment. It employs modelling technology for three main purposes:
• to describe the target of assessment at the right level of abstraction
• as a medium for communication and interaction between different groups of stakeholders involved in risk assessment
• to document risk assessment results and the assumptions on which these results depend.

The core risk analysis segment of the CORAS risk management process consists of three sub-processes ('identify risks', 'analyse risks', 'risk evaluation'), grouped together at the top layer of the figure. The CORAS risk management process consists of instantiations of abstract patterns given in the CORAS framework, using different risk analysis methods in order to analyse different parts of the system. The choice of risk analysis method upon which the abstract pattern is instantiated depends on the viewpoint in which the part to be analysed appears, and the detail incorporated in the context of the analysis depends on the phase in the development lifecycle. The specific instances of the CORAS risk management process that are used throughout the system lifecycle depend on the target (sub)system and the context of the analysis.

As the system description becomes more elaborate, any combination of refinement and decomposition results in a propagation of the risk analysis from the composite object to the components, guided by the system architecture.

The CORAS Integration Platform
The CORAS platform is based on data integration implemented in terms of XML technology. The platform is being built around an internal data representation formalised in XML/XMI (characterised by XML schema). Based on XSL, relevant aspects of the internal data representation are being mapped to the internal data representations of other tools (and the other way around).

by Theo Dimitrakos, Juan Bicarregui and Ketil Stølen

CORAS is a European research and technological development project developing a tool-supported framework for model-based security risk assessment.

CORAS - A Framework for Risk Analysis of Security Critical Systems

Figure 1: The CORAS framework for model-based risk assessment.


Security is a matter of trust. Today, an enterprise cannot afford to lose its reputation and consequently its clients because of insufficient system security. IT security measures should therefore be transparent, complex and purposeful, but also easy to implement and cost-effective. The question is: how can we accurately define IT security? Since IT systems are strongly influenced by human behaviour, the answer lies far beyond purely technical solutions. The Fraunhofer ISST assesses the 'maturity' of a company's IT security with its specially developed 'Security Maturity Model'. After determining the current safety level, a concept is composed defining all the measures necessary to reach the next level. With this step-by-step model, the expected expenditure can be calculated more thoroughly. Through a goal-directed deployment of all means (within a level), even investments will become economically efficient.

Analysis
The analysis examines technical as well as organisational components and their integration in the corporate culture. The latter two fields have been minimally considered by previous procedures. Therefore, SMM offers a real opportunity for assessment.

Technology
All possible components are tested on their state of technological development and effective usage. The focal points of this assessment are capacity, quality and integration. Costs of purchase and operation as well as future capacity are also considered.

Organization
Technology is, however, only effective if organisational concepts use it effectively. Besides the definition of an IT security policy and its corresponding responsibilities, the focus of interest lies in a holistic IT security concept. Connections with suppliers, customers and partners are of central importance. We therefore examine particularly the contracts and the stated parameters in the Service Level Agreements. Furthermore, we will assess the precise predefinitions of authorisations and responsibilities.

Corporate Culture
Technical and organisational measures must be deeply integrated into the corporate culture if a satisfactory effect is to be produced. Only decisive daily behaviour from both employees and management, also understood in the long term as a quality assurance process, will guarantee success. An atmosphere of attention and care can improve IT security considerably and can be very cost-effective. This is not only an assessment of IT experts and people in charge of IT security, but of all employees working with IT applications.

Apart from psychological aspects, such as the sense of responsibility, we also assess existing know-how and qualifications. A significant factor is purposeful staff development and its appropriate planning.

Assessment
The results of the analysis are then evaluated. Rather than being looked at separately, the various factors are observed in such a way as to take into account how they combine with each other. This allows us to accurately determine the current level of security and ensure the appropriate procedure. Thus, one can reconstruct how this particular level has been attained. Existing measures and actions necessary to reach the next level will also be identified, to enable a goal-directed investment in those components which are still missing.

Are you investing your money for IT security in a holistic solution, or are you guarding your frontage while leaving the back doors open? The Fraunhofer Institute for Software and Systems Engineering (ISST) has developed a 'Security Maturity Model' (SMM) to assess a company's IT security.

SMM – Assessing a Company's IT Security
by Holger Kurrek


This allows the integration of system design case-tools with analysis tools or tools for vulnerability and threat management, as shown in the following figure. Standard XML commodity component tools provide much of the basic functionality.
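To picture the kind of mapping the platform performs, the sketch below converts a hypothetical internal risk record into a hypothetical tool-specific format. Element and attribute names are invented; the real CORAS platform formalises its internal representation with XML schema and expresses such mappings in XSL, whereas this illustration uses Python's standard XML library.

# Illustrative only: not the CORAS schemas or transformation rules.
import xml.etree.ElementTree as ET

internal = ET.fromstring(
    '<riskAssessment target="payment-service">'
    '<threat id="T1" description="session hijacking" '
    'likelihood="medium" consequence="high"/>'
    '</riskAssessment>'
)

def to_tool_format(assessment):
    """Map the internal representation to another tool's (hypothetical) input format."""
    out = ET.Element('FaultTreeInput', system=assessment.get('target'))
    for threat in assessment.findall('threat'):
        ET.SubElement(out, 'BasicEvent',
                      name=threat.get('id'),
                      label=threat.get('description'),
                      frequency=threat.get('likelihood'))
    return out

print(ET.tostring(to_tool_format(internal), encoding='unicode'))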

Conclusions
CORAS aims to support the design process by developing an innovative tool-supported risk analysis methodology and process integrating:
• methods for risk analysis
• semiformal description methods – in particular, state-of-the-art methods for viewpoint- and object-oriented modeling (UML, MSC, RM-ODP)
• tool-integration technology supporting openness and interoperability.

The main innovations of the CORAS project stem from its emphasis on integrating risk analysis tightly into a UML and RM-ODP setting, supported by an iterative process, and underpinned by a platform for tool-integration targeting openness and interoperability.

Link:
Project website: http://www.nr.no/coras

Please contact:
Theo Dimitrakos, CLRC
Tel: +44 1235 44 6387
E-mail: [email protected]

Ketil Stølen, SINTEF Group, Norway
Tel: +47 22067897
E-mail: [email protected]


Often large investments into technical components can be significantly reduced through organisational measures and the normal and necessary development of corporate culture.

The figure shows the four levels of the model. The starting point is level 0, 'blind trust', ie, the complete renunciation of IT security. By the use of minimal means, level 1 can be reached. A coordination of measures leads to level 2. The highest level can be attained through the usage of continuous processes (in detail these levels are much more complex and are specified in comprehensive measure catalogues).
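The step-by-step idea behind the levels can be caricatured in a few lines; the measure names and level contents below are invented for illustration and bear no relation to the real SMM catalogues.

# Toy illustration of a maturity-level decision; not the SMM catalogues.
CATALOGUE = {
    1: {'virus scanner deployed', 'backups performed'},
    2: {'written security policy', 'coordinated patch process'},
    3: {'continuous audit process', 'security training programme'},
}

def current_level(implemented):
    """A level is reached only when it and all lower levels are fully covered."""
    level = 0
    for lvl in sorted(CATALOGUE):
        if CATALOGUE[lvl] <= implemented:
            level = lvl
        else:
            break
    return level

done = {'virus scanner deployed', 'backups performed', 'written security policy'}
print(current_level(done))      # 1: level 2 is only partially covered
print(CATALOGUE[2] - done)      # the measures still missing for the next level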

Advantages
Using SMM is a fast way to identify the current state of a company's IT security. A cost-effective realisation is made possible through the optimisation of expenditure and concentration on the necessary focal points. With the bundling of all measures, synergies arise. Thus, goals are reached faster and expenditure is simultaneously reduced. The comprehensive model avoids discrepancies within the measures, which could otherwise cause gaps in IT security. The identification of the goals of the following level allows a strategic approach and improves financial control.

Services
Based on SMM, the Fraunhofer ISST offers the following services:
• evaluation of the current situation, eg during Technical Due Diligence
• formulation of measures to increase IT security
• quality assurance attendance of IT security measures.

To realise the measures, the Fraunhofer ISST relies on a procedural model and reference architectures.

Outlook
SMM is constantly being developed, since customers regularly test it in practice (small businesses and also large-scale enterprises and public administrations). In this way, not only is the analysis refined but the measures are also repeatedly rated on the current state of technology and can thus keep pace with its highly dynamic development.

An important focus is the comparability with other companies, not only to classify the competition but also to attain a continuously high level of security between partners. All developments will be added to the procedural model and reference architectures. Collaborations also exist between the institutes of the Fraunhofer-Gesellschaft, especially in the field of encryption and its necessary infrastructures (Public Key Infrastructure, PKI).

Links:
FhG-ISST: http://www.isst.fhg.de/

Please contact:
Holger Kurrek, FhG-ISST
Tel: +49 30 243 06 355
E-mail: [email protected]

The four levels of the Security Maturity Model.

MICOSec (www.micosec.org) is an Open Source implementation of the CORBA Security Services (CORBASec), which was designed and implemented by ObjectSecurity Ltd. (www.objectsecurity.com) as part of Deutsche Telekom T-Systems Nova's Secure CORBA project. The original main goal of the project was to evaluate the usability of CORBASec for developing telecommunications applications, eg, for service provisioning or network management. Because no CORBASec implementation that met the specific requirements of the project was initially available, it was decided to implement the complete CORBASec 1.7 level 2 specification.

by Rudolf Schreiner and Ulrich Lang

MICOSec is an Open Source implementation of the CORBA Security Services, which was developed by ObjectSecurity Ltd. as part of T-Systems Nova's Secure CORBA project. It was successfully used to develop CORBA-based applications, which comprised mobile clients on Compaq iPAQ Pocket PCs.

MICOSec: CORBA as a Secure Platform for Mobile Applications


MICOSec currently supports IIOP for unprotected communications, SSLIOP based on SSL Version 3, extended attributes for X.509 certificates, extended level 1 interfaces, authentication and message protection, Security Domain Membership Management, domain-based auditing (with flat file, Unix syslog and Postgresql as data storage), and domain-based access control. In addition to the MICO ORB, MICOSec uses OpenSSL as a crypto-library and Postgresql as a database for storing audit events. The latest MICOSec version is based on Portable Interceptors, so the Security Services can easily be ported to other standard-conformant CORBA ORBs.

The very high deployment costs for future mobile communications networks make a quick return on investment a key survival factor for many communications carriers. Only attractive applications and services beyond the level of simple voice communication can motivate customers to spend more for mobile communication. To develop these applications quickly and economically, a standard application platform is needed. Since many of these applications will be security sensitive, the platform should also provide security functionality.

CORBA middleware is a sound base for the development of complex and distributed applications, but is normally regarded as too big and too complex to meet the requirements of mobile devices with limited resources. The first attempts to port CORBA ORBs to popular PDAs proved this prejudice to be true, when several groups tried to port the MICO ORB to the Palm Pilot with only limited success. The ORB had to be stripped down to minimal client-side functionality, because the Palm Pilot's CPU speed, memory and electric power were not sufficient to support a full CORBA ORB.

As part of our project, MICOSec was successfully used for the development of CORBA-based mobile applications on a Compaq iPAQ Pocket PC, a new generation of mobile devices. Porting MICOSec to a Compaq iPAQ Pocket PC running under Linux was straightforward, and it was only necessary to set up an appropriate cross-development environment. Then MICOSec, the underlying crypto-library and some demos could be compiled without problems. Porting existing CORBA applications was also very simple. The main issue was the user interface, since the Pocket PC has no keyboard and only a small display. However, the performance of MICOSec on the Pocket PC is more than satisfying. The main bottleneck is the RSA authentication at the beginning of the SSL handshake, which takes less than one second with a 1024 bit key.
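To get a feel for why the RSA operation dominates the handshake, one can time a single 1024-bit private-key operation. The sketch below uses the Python 'cryptography' package purely as an illustration; it has nothing to do with the MICOSec or OpenSSL code path and the message content is invented.

# Rough illustration of the cost of one RSA private-key operation,
# the step that dominates the SSL handshake described above. Not MICOSec code.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

key = rsa.generate_private_key(public_exponent=65537, key_size=1024)
message = b'client hello ... key exchange'

start = time.perf_counter()
signature = key.sign(message, padding.PKCS1v15(), hashes.SHA256())
elapsed = (time.perf_counter() - start) * 1000
print(f'one 1024-bit RSA signature: {elapsed:.1f} ms')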

MICOSec is currently used at Deutsche Telekom and ObjectSecurity for various projects (eg, for the development of a secure Parlay platform), for several demo applications to prove the viability of the concept, and for a research project in Ubiquitous Computing. It is anticipated that MICOSec will be used for commercial applications. The main foci of the further development of MICOSec are the secure interoperability with EJB by implementing the Common Secure Interoperability Version 2 protocol, integration into enterprise-wide security infrastructures, and secure CORBA components.

CORBA on the Pocket PC has proven to be a very useful platform for mobile applications, as it leverages all the advantages of CORBA and allows a seamless integration of mobile devices into existing applications. For example, it is possible to easily implement a graphical user interface with a CORBA-based geographical information system. It is also possible to use this platform for security-sensitive applications, since the CORBA security services provide the necessary functionality for the enforcement of various security policies.

MICOSec on mobile platforms will benefit from the further enhancements planned for MICOSec on standard systems, since the code base is the same (eg, in the future it will be possible to deploy secure CORBA components on the Pocket PC). The mobile device will then be a first-class part of a distributed application, rather than just a simple client with limited functionality. We anticipate that this platform will prove valuable for the rapid development of a broad range of applications.

Links:
ObjectSecurity Ltd.: http://www.objectsecurity.com
MICOSec: http://www.micosec.org

Please contact:
ObjectSecurity Ltd, UK
E-mail: [email protected]

Ameneh Alireza, T-Systems Nova GmbH, Germany
E-mail: [email protected]

CORBA security features.


The classical approach in information systems security requires a partitioning from the point of view of the organisation, geography and structure of information, in terms of their level of sensitivity and their domain. With the development of telecommunications, however, a system can be distributed all over the world: data and code can be distributed and shared. This tremendous evolution prevents the classical approach from being used.

Existing security techniques (PGP, C-SET, SSL, X.509, etc) are known as standards working on a particular aspect of security and risks. We propose a complementary approach at the application level, possibly based on the cited standard techniques.

Besides existing system-level and network-level mechanisms, we believe that it is necessary to provide application-specific configurable techniques. Further, information transfer in object systems is enriched and more typed, compared with non-object systems (requests, replies, mobility of agents, remote object creation). These exchanges often require the definition of security attributes (authentication, integrity, confidentiality). Finally, in the setting of a given distributed application, some computers will play a specific role and, as a consequence, require specific rights and protections (eg, two secured servers versus access to a portable computer).

It seems of interest to organise in a hierarchical manner the different computers participating in a given distributed application, and to associate specific rights with these hierarchies. Our work can be seen as the creation of a Virtual Private Network at the application level. Another issue is secured meta-computing: how to use a federation of computing resources in a secure manner. A security policy for an application is specified in a declarative manner. An example of a security policy file is given in Figure 1.

A prototype has been implemented in Java with the ProActive library (over the standard RMI layer). This prototype is made of two parts:
• interpretation and verification of the consistency between different security policies, and policy negotiation between hosts

by Isabelle Attali, Denis Caromel and Arnaud Contes

In the domain of distributed applications, networks, and mobile agents, this research – started in spring 2001 in the Oasis Team at INRIA Sophia Antipolis – aims to develop mechanisms for specifying a security policy at a high level of abstraction for a given application.

Security for Distributed and Mobile Active Objects with the ProActive Library

Figure 1: Example of a security policy file for an application.

A user defines in this policy two abstract domains (INRIA, ETCA) and expresses that communications – Queries and Replies – have to be authenticated between all computers of INRIA and ETCA (+A). Moreover, mobile agents (M) are allowed only from ETCA to INRIA, and must be in mode authentication, confidentiality, and integrity (+A, +I, +C).

Domains
INRIA = *.inria.fr;
ETCA = *.etca.fr;

Policy
INRIA <-> ETCA: Q,R # [+A,+I,?C];
ETCA -> INRIA: M # [+A,+I,+C];
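A Python re-implementation of the matching step for rules like those above might look as follows. This is illustrative only: the rule encoding, the handling of the optional ?C attribute and the deny-by-default choice are assumptions, and ProActive's actual checker is part of the Java library.

# Illustrative re-implementation of matching a communication against Figure 1-style rules.
from fnmatch import fnmatch

DOMAINS = {'INRIA': '*.inria.fr', 'ETCA': '*.etca.fr'}

# (source domain, destination domain, interaction types, required attributes);
# the optional confidentiality of the Q,R rule (?C) is omitted for simplicity.
RULES = [
    ('INRIA', 'ETCA', {'Q', 'R'}, {'+A', '+I'}),
    ('ETCA', 'INRIA', {'Q', 'R'}, {'+A', '+I'}),   # <-> expands to both directions
    ('ETCA', 'INRIA', {'M'},      {'+A', '+I', '+C'}),
]

def required_attributes(src_host, dst_host, interaction):
    for src_dom, dst_dom, kinds, attrs in RULES:
        if (fnmatch(src_host, DOMAINS[src_dom])
                and fnmatch(dst_host, DOMAINS[dst_dom])
                and interaction in kinds):
            return attrs
    return set()   # no matching rule: here, treat the interaction as unconstrained

print(sorted(required_attributes('oasis.inria.fr', 'lab.etca.fr', 'Q')))   # authentication, integrity
print(sorted(required_attributes('lab.etca.fr', 'oasis.inria.fr', 'M')))   # +A, +C, +I for mobile agents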

Figure 2: Deploying and monitoring a ProActive application using a graphical interface (IC2D) that makes it possible to 'drag and drop objects over the world'.


The aim of the project was to design and implement additional security features for a standard Mobile IP environment. The main prerequisite was to leave the Mobile Node (MN) and Home Agent (HA) unmodified. When the MN is communicating with nodes in the same foreign network, the traffic will be routed unnecessarily through the public Internet. This causes a major problem that had to be solved. Another significant objective was to provide a mechanism for the mobile user and the users in the foreign network to dynamically allow and deny connections between them.

Normally, when optimizing routes, the MN requests the HA to send a Binding Update to all CNs. In this project the triangular routing between the Home Agent (HA), the Correspondent Node (CN) and the MN (via the FA, Foreign Agent) was solved using a Binding Update message sent to the particular CN by the FA. Only the nodes located in the foreign network were altered, thus leaving the MN and the HA unmodified.

An additional application, which we call the Mobile Firewall Daemon (MFWD), was designed to provide the dynamic connection control. The FA notifies the MFWD whenever an MN arrives in its network. All traffic to and from the MN would go through the MFWD. The MFWD would then listen on its web user interface for local and mobile users requesting changes to the firewall rules.
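The rule handling could look roughly like the sketch below, which emits Linux iptables commands when the FA reports an arriving MN. The addresses, the default-deny stance and the interface are invented for illustration; the actual MFWD implementation is not reproduced here.

# Rough sketch of dynamic per-MN firewall control in the spirit of MFWD.
def mn_arrived(mn_ip):
    """Called when the FA reports a Mobile Node; start by denying all traffic to and from it."""
    emit('iptables', '-A', 'FORWARD', '-d', mn_ip, '-j', 'DROP')
    emit('iptables', '-A', 'FORWARD', '-s', mn_ip, '-j', 'DROP')

def allow(mn_ip, peer_ip):
    """Called from the web interface when a local or mobile user opens a connection."""
    emit('iptables', '-I', 'FORWARD', '-s', peer_ip, '-d', mn_ip, '-j', 'ACCEPT')
    emit('iptables', '-I', 'FORWARD', '-s', mn_ip, '-d', peer_ip, '-j', 'ACCEPT')

def emit(*cmd):
    # On a real gateway these commands would be executed (eg via subprocess);
    # here they are only printed to show the generated rules.
    print(' '.join(cmd))

mn_arrived('10.0.5.17')
allow('10.0.5.17', '10.0.5.42')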

The designed mechanisms were implemented on Linux-based systems with the intent to create a 'proof-of-concept' trial network. The function of this trial network has been demonstrated numerous times since its initial setup.

The success of the trial network setup showed that the project achieved reasonable results. However, many improvement ideas have arisen, and we are planning to take this concept through another iteration. A major enhancement would be to implement the MFWD in an Active Network environment, which would make it possible to offer more flexible traffic control and supporting services.

Please contact:
Sami Lehtonen, VTT
Tel: +358 9 456 7240
E-mail: [email protected]

Mobile IP itself does not offer security features such as access restrictions in foreign networks. In the project WLANSecu, started in 2000, a group of scientists at VTT – Kimmo Ahola, Sami Lehtonen, and Sami Pönkänen – researched the security of mobile users visiting foreign networks.

Mobile IP Security in Foreign Networks
by Sami Lehtonen


• handling of these policies using public, private and symmetric keys; we are using the SPKI public key infrastructure for encoding communications.

Examples have already been specified and executed, and early performance measures have shown that this approach is viable. For instance, only 25 ms were required for the full treatment of a secured message including encryption, transfer, decryption, and the checking of the sender certificate and digital signature.

Our approach proposes security features at the application level, especially in the setting of distributed objects with mobility.

Compared to related work, ProActive security can be characterised by four main advantages:
• we use standard (non-modified) tools (eg, standard JVM, standard SPKI infrastructure); this gives a wide portability to our applications
• the declarative language allows easy definition and composition of security policies, potentially in a decentralised manner
• object and activity mobility is now possible in distributed applications and opens up new solutions for information protection
• security can be finely tuned with respect to information sensitivity and threat levels by treating each application separately; this would allow the high cost of cryptographic mechanisms in a local network to be avoided.

Future work will include better control of security policies in the presence of mobile code and mobile agents. We will also study the composition of security policies (hierarchy, compatibility, static verification, etc). Another issue is to study the centralisation of all (or part) of a security policy. Lastly, we wish to formally model our approach in order to prove that policies are respected, in the presence of migration and variable placements of objects on machines.

Link:
http://www.inria.fr/oasis/ProActive/

Please contact:
Isabelle Attali, INRIA
Tel: +33 4 92 38 79 10
E-mail: [email protected]


GRID technologies define a powerful new computing paradigm by analogy to the electric power grid. Based on the Internet, the GRID seeks to extend the scope of distributed computing to encompass large-scale resource sharing, including massive data-stores and high-performance networking, and shared use of computational resources, be they supercomputers or large networks of workstations. The GRID concept has been generalised to cover a virtual organisation, defined as any dynamic collection of individuals and institutions which are required to share resources to achieve certain goals.

Currently the applications driving the development of this infrastructure are large-scale scientific collaborations, such as the Information Power GRID (www.ipg.nasa.gov) and the European DataGRID Project (www.eu-dataGRID.org), which have a clear need for the collaborative use of resources, both data and computational, and established communities which can pool their resources for common goals. Tools are appearing to support the GRID concept, notably those developed in the projects Globus (www.globus.org), Condor (www.cs.wisc.edu/condor) and Legion (www.cs.virginia.edu/~legion). In the near future, the GRID concept will find applications in commerce and industry supporting distributed collaborative design and engineering, or distributed supply chains.

Grid technology will be used to allow enterprises to outsource computing resources, and the ad-hoc creation of Virtual Organisations (VOs) within commercially available computing grids will allow for an effective management of computing resources at a global scale.

Security management is a major obstacle to overcome on the route to commercialising GRID infrastructures. The traditional GRID infrastructure, such as the GRID Security Infrastructure (GSI) from Globus, using X.509 certificates as its authentication mechanism, depends on interfaces at the protocol level to provide the security infrastructure. However, this approach has concentrated on authentication and does not cover all aspects of security management. In particular there is little support for authorisation management, the specification and enforcement of security policies, or the treatment of cases where the (agents managing the) collaborating resources have no prior knowledge of each other (or their certifying authorities).

In the paper 'Building Trust on the GRID – Trust Issues Underpinning Scalable Virtual Organisations', we identified the need to supplement this infrastructure by raising the level of trust within a GRID architecture. This builds upon the established literature in trust analysis, which provides a framework for analysing how trust should be transmitted between agents in distributed systems, especially dealing with how to propagate trust between agents with little or no prior knowledge of each other. The basis of this infrastructure is the explicit declaration and publication of trust policies by participating resources on the GRID, using an appropriate policy specification exchange language. Agents wishing to utilise resources would be able to present their credentials, policies and requirements to the participating resources, and an automated process would verify the credentials, possibly referring to trusted third parties, to establish identity, deduce authorisation based upon supplier and consumer policies, and authorise (or revoke) access under the specified conditions of use. To support this, one would need to add at least the following:
• resource brokerage services to facilitate resource discovery and allocation in compliance with a contractual realisation of QoS requirements
• a means of publishing, negotiating and exchanging policy statements
• appropriate trust support services, such as networks of trust authorities, and an infrastructure allowing for the dynamic formation of certification chains
• a trust management framework able to cope with the complexity and uncertainty underpinning most interactions in open dynamic systems such as GRID architectures; this will need to draw a distinction between perceived and actual security, relate trust to enterprise objectives and weigh it against transaction risk
• a policy-driven security management system, which is able to support the dynamic formation of collectives of entities which are required to share resources to achieve certain goals.

In a paper entitled 'Policy-Driven Access Control over a Distributed Firewall Architecture' (in Proceedings of the IEEE Policy Workshop 2002, IEEE Press, 2002), we propose to address the latter by bringing together two current lines of research:

by Theo Dimitrakos, Brian Matthews and Juan Bicarregui

GRID computing has emerged as a new approach to a high-performance distributed computing infrastructure within the last five years. The GRID concept has been generalised to cover a virtual organisation, defined as any dynamic collection of individuals and institutions which are required to share resources to achieve certain goals. In this article we provide an overview of ongoing research towards building GRID-aware security and trust management solutions.

Security Issues underpinning Large Virtual Organisations


• policy-driven access control, where policies are identified as first-class data objects in their own right, which can be negotiated and tailored to particular groups of clients
• a distributed firewall architecture augmented with the concept of Closed User Groups (CUG), which has the benefit of facilitating P2P collaboration whilst maintaining the integrity of systems through central administration of the security policies.
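One way to picture the combination is sketched below, with invented group names, users, resources and policy entries; it is not the architecture proposed in the cited paper, only an illustration of a policy-driven, CUG-aware access decision.

# Sketch of a Closed-User-Group-aware, policy-driven access decision (names invented).
CUG_MEMBERS = {
    'climate-collab': {'alice@example.org', 'bob@partner.example'},
    'engine-design':  {'carol@example.net'},
}

# Policies are first-class objects that can be negotiated and tailored per group.
POLICIES = {
    ('climate-collab', 'dataset:ocean-temps'): {'read'},
    ('engine-design',  'dataset:ocean-temps'): set(),
}

def permitted(user, group, resource, action):
    """Allow an action only to members of a CUG whose policy grants it."""
    return (user in CUG_MEMBERS.get(group, set())
            and action in POLICIES.get((group, resource), set()))

print(permitted('alice@example.org', 'climate-collab', 'dataset:ocean-temps', 'read'))  # True
print(permitted('carol@example.net', 'engine-design', 'dataset:ocean-temps', 'read'))   # False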

So far we have focused on the realisation of policy-driven access control management for CUG-aware distributed firewalls that can easily be adapted to facilitate a GRID-based implementation. In the following months we plan to test the applicability of this architecture in a GRID test-bed, in areas such as e-Science (within the CLRC e-Science programme, http://www.escience.clrc.ac.uk) or e-Business (within GRASP, a forthcoming European project aiming to explore an advanced infrastructure for Application Service Provision based on GRID technology).

Links:
CLRC e-Science programme: http://www.escience.clrc.ac.uk/
GRASP: http://www.bitd.clrc.ac.uk/Activity/GRASP/

Please contact:
Theo Dimitrakos, CLRC
Tel: +44 1235 44 6387
E-mail: [email protected]

The main business of CLRC is to promote research, to support the advancement of knowledge and to promote public understanding in science, engineering and technology. This involves close collaboration with a wide variety of academic and research institutes and technological companies world-wide, and a free exchange of information with both these organisations and the general public is a fundamental part of most of CLRC's work.

The facilities offered by the Internet are nowadays essential to meet these requirements for collaboration and dissemination, but they can only be employed effectively by operating a relatively open Internet security policy. Often the operating regime within an international collaboration is determined by that collaboration, and it is not possible to impose security standards which might prevent that collaboration inter-working effectively.

Furthermore, because most parts of the laboratory are involved in such collaborations, a very large number of servers of various kinds must be visible to the Internet, yet the staff involved in maintaining those servers are scientists, not security experts.

Maintaining adequate security under these conditions requires flexibility and a high degree of expertise to configure and maintain the several protection mechanisms deployed in the firewalls: these must keep intruders out yet not impede the work of the laboratory. How has this been achieved? Initially we adopted two main techniques to limit the exposure of CLRC computers to intruders.

First, we divided our computers into those that needed to provide externally visible services (let us call these Class A computers) and those that did not (Class B), and we assigned IP addresses from a specific range to Class A computers. This enabled us to block incoming connections which attempted to contact Class B computers easily and efficiently in the routers connecting our LAN to the Internet. This effectively hid over 90% of our computers from Internet intruders without limiting the ability of those machines to initiate connections themselves.

Second, we determined what precise services each Class A computer needed to provide, and limited connections to those machines to the specific port numbers which were required to deliver those services.
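In effect the router filters implement a table like the one sketched below. The addresses (from a documentation range) and service ports are invented; CLRC's actual filters are configured in its border routers, not in Python.

# Sketch of the Class A / Class B filtering decision described above (values invented).
CLASS_A_SERVICES = {
    '192.0.2.10': {80, 443},     # public web server
    '192.0.2.11': {25},          # mail relay
    '192.0.2.12': {22},          # collaboration data gateway
}

def admit_incoming(dst_ip, dst_port):
    """Allow an inbound connection only to a registered Class A service port."""
    return dst_port in CLASS_A_SERVICES.get(dst_ip, set())

print(admit_incoming('192.0.2.10', 443))   # True: a published service
print(admit_incoming('192.0.2.99', 80))    # False: a Class B machine stays hidden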

Coupled with the requirement that the system administrators of Class A computers must maintain their systems to specific standards, these relatively simple techniques provided adequate security for most of the period up to the end of 2000, and met the objective of not interfering with the normal work of the Laboratory. However, during early 2001 the continually increasing number of vulnerabilities being actively exploited by intruders, the widening variety and increasing effectiveness of their attacks, and the appearance of automatically propagating worms necessitated further measures.

Maintaining adequate security against the proliferating threats from Internet hackers is difficult enough for an organisation which needs only occasional access to the Internet; achieving it when continuous Internet access is absolutely essential to carrying on the most important parts of the organisation's mission, as it is for research laboratories involved in international collaborations, is a continual challenge to technical staff. We describe in non-technical terms some of the approaches adopted by CLRC over the last three years to meet this challenge.

Information Systems Security at CLRC
by Trevor Daniels


Because worms are able to propagate very rapidly, it is no longer effective to rely on the manual application of patches to systems to prevent infection. The time for a propagating worm to probe the entire Internet is measured in hours or even minutes with optimal search techniques, yet reactive system patching at best takes several hours, and this extends to a day or two at weekends. To successfully combat worms it is necessary to anticipate their characteristics and deploy generic preventative measures in advance.

Most worms propagate via either email or websites. It is therefore essential to be able to intercept all email and all web browsing at the site periphery in order to screen out network packets carrying worms. This requires forcing all email to first pass through a single logical receipt service and all external web browsing to be conducted via a proxy server. During 2001 both of these measures were enforced by appropriate blocks in the main site routers.

Once all the web and email traffic is being routed through specific machines, it is possible to install screening services on those machines. These screens take two forms. The first is a standard virus checker, updated automatically at least daily and more frequently manually as necessary. This measure prevents known and established worms and viruses gaining access to the site by these routes. However, this alone is still not effective against new and rapidly propagating worms for which no signature is yet available in the virus scanners. To reduce the exposure to these it is necessary to screen out generically those file types which are likely to carry executable content. This is difficult for those file types which are likely to be transmitted as part of the normal business of the Laboratory, but a number of them, eg those used for screen savers, are not essential to business and may be blocked.
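The generic file-type screen can be as simple as the sketch below; the blocked extensions shown are examples only, and a site would tune the list to its own business needs.

# Sketch of generic screening of likely-executable attachments at a mail relay.
BLOCKED_EXTENSIONS = {'.scr', '.pif', '.exe', '.vbs', '.bat', '.cmd'}

def quarantine_attachment(filename):
    """Return True if the attachment should be stripped before delivery."""
    name = filename.lower()
    return any(name.endswith(ext) for ext in BLOCKED_EXTENSIONS)

for f in ('holiday-photos.zip', 'greetings.SCR', 'report.doc'):
    print(f, '->', 'blocked' if quarantine_attachment(f) else 'delivered')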

All these measures were introduced at CLRC during 2001. How successful have they been? In spite of these measures there have been a number of compromised machines within CLRC, but none of the compromises has been serious and none has propagated internally. Very little disruption to the work of the Lab has resulted from either the compromises or the preventative measures taken, and to this extent the adopted approach has been successful. We believe the balance between prevention and working restrictions is about right.

Developments
What is needed in 2002? There are some lessons to be learned from our experience, and some further precautions that will be needed to prevent intrusions by the more sophisticated techniques likely to be employed by intruders in 2002.

The first observation is that the essential security methods are technically complex and subject to human error. Most of the intrusions we have seen would have been prevented by the procedures outlined above, but mistakes, perhaps inevitably, were made by the people responsible for their implementation: filters in routers were installed incorrectly, system administrators failed to patch systems promptly or left services running which were not required, or misinterpreted often complex instructions regarding system patching. The lesson is that a single line of defence is inadequate. As many blocks, detection systems and protective measures as possible must be deployed. Externally facing servers must be combined to reduce their number and therefore also the number of staff involved in their maintenance, so concentrating the technical expertise where it matters.

Second, in addition to installing virus detection in all computers, relays and proxies, it is now essential to install packet filters in multiple routers to safeguard against both human error and the subsequent internal propagation of infections should a system be compromised. This will require the reorganisation of the internal network to provide several levels of protection. In this respect the use of VLANs offers the simplest solution.

Thirdly, specific action needs to be taken to prevent infections of the more vulnerable home machines, visitor machines and laptops propagating should they be connected to the internal networks. Personal firewalls, specifically protected sub-nets and various security procedures all need to be considered and deployed.

All this is in addition to maintaining all the virus checking, router filtering and security procedures already in place.

Conclusions
Computing never stands still, and that maxim applies just as much to hacking techniques and security measures as to any other aspect of computing. Continuous vigilance and continual change are needed to safeguard any system with Internet access. Maintaining adequate technical expertise in networking has never been more important. Achieving the right balance between freedom to exploit all features of the Internet and the restrictions imposed by essential security measures will remain a difficult technical challenge for research laboratories for the foreseeable future.

Please contact:
Trevor Daniels, CLRC
Tel: +44 1235 445755
E-mail: [email protected]


The properties of these systems introduce new security challenges that are not adequately addressed by existing security models and mechanisms. The scale and uncertainty of this global computing environment invalidates existing security models. Instead, new security models have to be developed, along with new security mechanisms that control access to protected resources.

The past decade has seen the globalisation of the information and communication infrastructure. At the same time, distributed systems have grown from company-wide networks to include global federations of independent and separately managed systems, eg, the Internet. Computing and communication capabilities are increasingly embedded into everyday objects; this means that we will soon be able to interact with billions of 'intelligent' devices whose owners we do not know and which we should not necessarily trust. The scale of such global computing systems means that security policy must encompass billions of potential collaborators. Mobile computational entities are likely to become disconnected from their home network, which requires the ability to make fully autonomous security decisions; they cannot rely on a specific security infrastructure such as certificate authorities and authorisation servers. Although a public key infrastructure may be used to reliably establish the identity of other collaborators, this identity conveys no a priori information about the likely behaviour of the principal. Identity alone therefore cannot be used for access control decisions, ie, all participants are virtually anonymous. This fact excludes the use of most access control mechanisms currently in use on the Internet. The dynamism of global computing systems means that computational entities which offer services will be confronted with requests from entities that they have never met before; mobile entities will need to obtain services within environments that are unfamiliar and possibly hostile. A party faced with such a complex world stands to benefit, but only if it can respond to new entities and assign meaningful privileges to them.

The challenges faced by mobile entities in a global computing system are not unlike those faced by human beings confronted with unexpected or unknown interactions with each other. Human society has developed the mechanism of trust to overcome initial suspicion and gradually evolve privileges. Trust has enabled collaboration amongst humans for thousands of years, so modelling trust offers an obvious approach to addressing the security requirements faced by the global computing infrastructure. Trinity College Dublin leads the SECURE project, which aims to develop a new trust-based security model for global computing systems; other partners in the SECURE project are the universities of Aarhus, Cambridge, Geneva and Strathclyde. The aim of the SECURE project is to develop a formal model in which trust relationships may be established on the basis of interaction between entities, together with a security mechanism expressed in terms of the trust model.

Trust is an elusive concept that defies stringent definition. However, we conjecture that a notion of trust can be realised in sufficient detail to be operational for a specific purpose, namely as the underlying principle for a security mechanism applicable in a global context. Trust has been proposed as a mechanism for reducing risk in unknown situations. The explicit use of trust as a defining principle for security models and policy specification makes trust relationships among entities explicit. Trust thus becomes the commodity that allows an entity facing an interaction in an unfamiliar environment to weigh the risks associated with particular actions. Conventional security mechanisms express policy in terms of the privileges allocated to individuals; role-based access control introduces a level of indirection, in which privileges derive from roles, and policy determines which individuals may enter each role. In either case the mapping from the trust model to the risks inherent in the allocation of privileges is implicit. SECURE proposes to establish a trust-based security model in which computational entities interact on a basis of (mutual) trust.

Interaction between entities may take many different forms. It is worth looking at one form of interaction in more detail. Suppose that a mobile entity needs to obtain a service from another entity within an unfamiliar environment. The entity that offers the service can identify the potential client, but its attributes and probable behaviour are unknown. We assume here that the functions of the service are categorised and their integrity protected by role-based access control. The service allows the potential client to enter role(s) on the basis of their mutual trust. The client can then make use of one or more of the functions of the service. This may place the client under an obligation, for example to make a micro-payment. When the interaction is complete, each party records their experience of it, which will include information about the behaviour of the other.

The experience recorded by the service can be used in at least three ways. First, the service performs some function for the mobile entity on the basis of trust alone; the service can learn from the interaction to evolve the mapping between trust and role. Second, the record can be transferred to the mobile client, which can use it as a recommendation when approaching other entities. Finally, the record is available as evidence to modify the reputation of the mobile entity.

by Christian Damsgaard Jensen

SECURE is a newly started IST project, partly carried out at Trinity College Dublin, which addresses secure collaboration among computational entities in emerging global computing systems.

Secure Collaboration in Global Computing Systems


A project is currently under way at IIT-CNR, in collaboration with two industrial partners, which aims at the integration of biometric devices with digital signature technology (in conformity with current Italian standards). The results of the activity are being tested periodically by other CNR Institutes interested in this technology. The main objective of the project is the definition of standards that guarantee:
• confidentiality of non-public information
• authentication of the entities involved in the protocol (smartcard, biometric device and the user)
• integrity of the messages.

All these steps will be necessary to guarantee non-repudiation between authenticated users.

User authentication with biometric techniques does not require 'knowledge' of a secret piece of information, such as the traditional PIN, but requires that the user perform specific 'actions', necessary for a live acquisition of public biometric information.

This is why biometric techniques are used today for physical access controls or for other activities that require the presence of the user. The effective presence of the user during authentication guarantees that the biometric information is not intercepted, stolen or improperly used. Our aim is to perform remote authentication (eg logon to a remote system or unlocking of a smartcard to sign a document) and to guarantee the presence of the user by using special communication protocols between the biometric device and the smartcard.

The main problems with the use of biometric techniques are:

by Luca Bechelli, Stefano Bistarelli, Fabio Martinelli, Marinella Petrocchi and Anna Vaccarelli

Biometric technologies are currently used for physical access controls. Scientists at CNR aim to integrate this technology with certificates, signatures and smartcards to handle remote authentication.

Integrating Biometric Techniques with an Electronic Signature for Remote Authentication

A fingerprint template.

The accumulation of such experience is what allows trust to evolve. Trust is individually formed through an entity's observations of the behaviour of other entities; this allows interaction with unknown entities without prior configuration, a fundamental requirement for security in the global computing environment. In the scenario above we pictured a mobile client interacting with some service, but the essential feature is that the properties of each entity are unfamiliar to the other. The mobile entity will write its own account of the interaction, and may as a satisfied user offer it to the service. That record provides an alternative account of the interaction, and the combination of the two gives a lot of potential information.
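A deliberately naive sketch of this evolution step is shown below; the update rule, the numbers and the role thresholds are invented, and the SECURE project is developing a formal trust model rather than this heuristic.

# Naive illustration of evolving trust from interaction outcomes and mapping it to roles.
ROLE_THRESHOLDS = [(0.8, 'trusted-customer'), (0.4, 'guest'), (0.0, 'untrusted')]

def updated_trust(trust, outcome_good, weight=0.2):
    """Move trust towards 1.0 after a good interaction, towards 0.0 after a bad one."""
    target = 1.0 if outcome_good else 0.0
    return (1 - weight) * trust + weight * target

def role_for(trust):
    for threshold, role in ROLE_THRESHOLDS:
        if trust >= threshold:
            return role
    return 'untrusted'

trust = 0.5                          # initial trust in a previously unknown entity
for outcome in (True, True, False, True):
    trust = updated_trust(trust, outcome)
print(round(trust, 2), role_for(trust))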

Implicitly this scenario presents a rosy picture of a successful interaction, but a lot of things may go wrong. For example, the service may be performed imperfectly, or the client may default on the payment in some way. Worse, the two entities may in fact be in collusion, and present fictitious but consistent accounts of the interaction in order to boost their joint reputation in the world at large.

The research presented above is defined in the context of collaboration among mobile users and intelligent devices in a global computing infrastructure. However, it is equally applicable to all areas with great risk and uncertainty and where it is difficult to establish a meaningful identity of other entities, eg, Internet collaboration, peer-to-peer networks, smart environments and e-commerce.

SECURE is a Future and Emerging Technologies project supported by the European Commission under contract IST-2001-32486.

Please contact:
Christian Jensen, TCD
Tel: +353 1 608 24 59
E-mail: [email protected]


Today more and more resale transactions involving both physical and digital products are taking place between consumers, as well as between businesses and business customers, through electronic marketplaces or directly between users. The problem for the purchaser in these situations is that he/she does not know:
• whether the product exists
• whether the product may legally be traded
• whether the seller actually has all the necessary legal rights to the item offered for resale
• whether the purchaser him/herself will actually acquire all the legal rights to the item once the resale has gone through
• whether the holder might sell the same product to several people.

Clarifying all of these points is not an easy exercise and, most importantly, to date it has not been possible to automate this process. Trust, however, is important for e-commerce. Marketplaces need security support on all levels to increase trust for all transactions. To support secure resale transactions, we started a project in 2001 called eDOT (electronic document of ownership transfer system). The aim of this project is to build a system which allows a secure resale of tangible and digital goods. With the help of this system, rights to physical and digital goods can be electronically documented, the product and its owner are clearly identifiable for all parties, the product and rights can be transferred without problems (ie, even just for a limited time period), and forgeries, as in the case of physical documents or products, are not possible. Moreover, the electronic document of ownership can be used at the same time for authentication and authorisation. The system is a single component, which can be used to extend existing marketplaces.

A secure transfer system for reselling physical and digital goods is being developed at the University of Zurich. The project is unique in its approach of dealing with security issues to support resale transactions between users, with or without electronic marketplaces.

Secure Resale of Tangible and Digital Goods
by Harald Häuschen


• the need to ensure that the only way to authenticate the user is the 'action' and not the 'knowledge' of the biometric information
• the association between biometric information and user identity has to be certified.

If these two constraints are not realised, a malicious user could use his own biometric details together with a third-party identity (responsibility attack), or attach his identity to third-party biometric information (credit attack).

In our work, the biometric information is represented by a fingerprint. During the enrolment phase, a fingerprint template of the user is stored in a secure environment (in our solution, inside the smartcard). For integrity and authenticity purposes, the (hashed) fingerprint is then inserted in an 'attribute certificate' signed by an Attribute Authority. In the same smartcard we also store an X.509 certificate of the user, which will be used to digitally sign documents.

In order to validate the fingerprint-identity pair, two important pieces of information are added to the attribute certificate:
• the serial number of the smartcard (in this way the fingerprint can only be used with that smartcard)
• the serial number of the X.509 user digital certificate (in this way, the fingerprint can only be used together with its owner).
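The binding can be pictured as follows. The field names, serial numbers and encoding are invented for illustration; the project itself uses standard attribute certificates, and the signing step by the Attribute Authority is elided here.

# Simplified picture of binding a fingerprint template to a smartcard and to the
# holder's X.509 certificate. Not the project's actual certificate format.
import hashlib, json

def attribute_certificate(template_bytes, smartcard_serial, x509_serial):
    body = {
        'fingerprint_hash': hashlib.sha256(template_bytes).hexdigest(),
        'smartcard_serial': smartcard_serial,   # template usable only with this card
        'x509_serial': x509_serial,             # template usable only with this holder
    }
    # The Attribute Authority would sign this body; the signature step is omitted here.
    return json.dumps(body)

print(attribute_certificate(b'example minutiae data', 'SC-004711', '1A:2B:3C'))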

This type of solution guarantees:
• that the user can perform classical authentication (with a secret PIN) and use only his X.509 certificate
• the possibility of biometric authentication, and the use of the X.509 certificate for remote authentication and digital signature
• the possibility of only handling the biometric information locally and privately.

In conclusion, biometric techniques work well if the verifier can check two things:
• that the biometric information was supplied at the time of verification (livescan)
• that the biometric information matches the template on file.

If the system cannot do this, then it fails. In fact, biometric data provide unique identifiers, but they are not secret.

We intend to implement the specification described above using the hardware and software products of our industrial partners with template-on-card technologies.

We plan to extend the implementation employing Match-On-Card technologies (where the extraction of the template is performed on the system and the match on the card) and System-On-Card technologies (where match, extraction and storage of the template are performed inside the card).

This work has been carried out within a scientific cooperation between IIT-CNR and BiometriKa, an Italian company.

Link: http://www.iat.cnr.it/attivita/progetti/progetti.html

Please contact:
Stefano Bistarelli, IIT-CNR
Tel: +39 050 315 3438
E-mail: [email protected]


To date, customer uncertainty and lack of confidence have undoubtedly been significant in the context of resale. It is important that potential customers feel sure that a one-off product they buy will actually belong to them once payment has been made and will not be re-sold several times over by the seller. A potential customer also wishes to be sure that the tangible goods or digital product really exists and may in fact be traded. Therefore the system consists of two essential components: firstly, an electronic certificate of ownership which confirms that the product exists and that it is owned by the person currently in possession of this product, and secondly, a component whereby ownership rights may be transferred securely and unequivocally under certain specified conditions.

The ‘ownership certificate’ allows the description of goods and their allocation to an owner. The main task of the transfer component, as a trusted entity, is to ensure the secure transfer of ownership together with the settlement of the corresponding payments. Security is guaranteed at various levels. The system is protected against forgeries after the ownership certificate is created. There is no risk of losing ownership, and the system is clear and transparent, ie, all activities can be checked and understood by those involved.
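A minimal sketch of these two components — an ownership certificate bound to its current owner, and a trusted transfer step tied to payment settlement — might look as follows (the field names and the signing placeholder are illustrative assumptions, not the project's actual design):

```python
from dataclasses import dataclass

@dataclass
class OwnershipCertificate:
    product_id: str          # description/identifier of the goods
    owner: str               # current owner of the product
    issuer_signature: str    # placeholder for the trusted entity's signature

def transfer_ownership(cert: OwnershipCertificate, seller: str, buyer: str,
                       payment_settled: bool) -> OwnershipCertificate:
    """Trusted transfer component: only the current owner may sell, and the
    certificate is reissued to the buyer once payment has been settled."""
    if cert.owner != seller:
        raise PermissionError("seller does not own this product")
    if not payment_settled:
        raise ValueError("payment has not been settled")
    # In the real system the trusted entity would re-sign the certificate here.
    return OwnershipCertificate(cert.product_id, buyer,
                                issuer_signature="resigned-by-trusted-entity")
```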

To validate our approach we made a first implementation for an agent-based marketplace. Agents, on behalf of their originator, take over the task of looking for specific items, negotiating with their seller and concluding a deal. However, agents are not limited to purchasing goods on behalf of users: they can also act as sellers, offering for resale goods that they themselves have purchased. Our first implementation shows that the use of the ownership transfer system could solve all the security problems without significantly changing a marketplace.

The project represents an important step forward that will enable secure and efficient business transactions involving multi-stage (intermediate-level) trading to take place alongside direct sales channels. Implementation is simplified because existing systems only need to be modified, not replaced. At the moment we are defining a new component to support resale via mobile phones.

Please contact:
Harald Häuschen, University of Zurich
Tel: +41 1 635 43 11
E-mail: [email protected]

The Amanda project is the common denominator for a small collection of projects involving Microsoft Research, Cambridge, the Swedish Defence Material Agency, SaabTech Systems, and SICS. Over the past two years, a team of researchers has examined the problems of authorisation, delegation, and authorisation management in distributed systems.

A very topical issue is the establishment of Public-Key Infrastructures (PKI) as a fundamental basis for secure interaction on the web. Whereas the establishment of PKIs at the level of closed systems – individual organisations – is now to a large extent routine, significant problems remain to be solved at the level of open systems, slowing down the wider adoption of PKI in industry. Important factors are concerns related to certificate interoperability, certificate management, authorisation (roles and delegation) and organisational structure.

Important as it is, however, in many Internet applications a strong authentication mechanism is not in itself the goal. Authentication is needed to support other functions such as authorisation and auditing. The development of authorisation and auditing mechanisms for distributed systems is currently a very active field of research. Traditional authorisation mechanisms are based on the Access Control List (ACL) model. The ACL model was originally designed for access control decisions in closed systems, and requires that a server know in advance the identities and the permissions of its clients. Such information is recorded in an ACL that is centrally administered. However, the ACL model is fraught with problems, in particular as systems scale up and become deployed in increasingly open contexts. One set of problems concerns management, relations to organisational structure, and the ability to adapt easily to changes at both individual and group levels. Moreover, in open systems, a server may not know who its next potential client is. Solutions based on the ACL model are therefore not suitable for authorisation mechanisms in Internet applications, and there is a need for a more generic and common framework for authorisation and auditing in open distributed systems.

The research field of ‘trust management’ provides the seed for such a

The problem of authorisation, delegation, and authorisation management in distributed systems has been studied at SICS for the last two years. Our main focus has been the development of a delegation logic based on the idea of delegation as the explicit yet constrained creation of new privileges.

Managing Authorisations
by Babak Sadighi Firozabadi and Mads Dam


The transition from traditional commerce to electronic and mobile commerce is fostered by aspects like convenience, speed and ease of use. However, security issues remain unsolved. Smart cards open new possibilities for the development of security schemes and protocols that can provide security in applications such as electronic payments or software protection, where traditional cryptographic tools are not useful. The GISUM group is involved in several research projects that make use of smart cards. Current applications include a secure electronic forms framework for government–citizen relations, electronic ticketing systems for GSM phones and the Internet, a PDA-based digital signature environment, public transport, access control systems, software protection and banking applications. This report focuses on two recent

projects: the eTicket electronic ticketing project (1FD97 1269 C02 02 (TAP)), a coordinated project with the Carlos III University of Madrid; and the Alcance project, consisting of the development of a secure electronic forms framework for secure Internet-based communication between citizens and the public administration (1FD97 0850 (TIC)).

Electronic Ticketing
Ticketing services represent an attractive application that is both useful and convenient for the user. However, security features and greater flexibility are needed in order to foster their extensive use and popularity. Most of these services use a single value (usually a numeric code) to represent the ticket. In consequence, tickets can easily be copied or forged, it is impossible to represent different types of tickets, no delegation is supported and, finally, the verifier needs to receive a database of the tickets emitted before allowing clients to access the service. Our project therefore focuses on the following two points: the development of a representation for the tickets, and the development of protocols and mechanisms for the use of the tickets.

After a careful analysis of the electronic ticketing requirements, the following goals were defined: versatility, compactness, security, payment, fast offline ticket verification, a minimal number of messages emitted by users, and ticket delegation. Our system is based on cryptographic smart cards which contain a key pair generated inside the card and which ensure that the private key never leaves the card. Each card will also contain a certificate of the public key

The GISUM research group at the University of Málaga is looking at the use of smart cards to increase security in different scenarios. The work is supported by the EU and the Spanish Ministry of Science. Some interesting results are represented by two recent projects. These are the eTicket project, which has defined and implemented a secure electronic ticketing procedure including related protocols and support services, and the Alcance project, which has developed a secure electronic forms framework for secure communication between citizens and the public administration.

The Role of Smart Cards in Practical Information Security
by Javier López, Antonio Maña, Pedro Merino and José M. Troya

model. In trust management the core aspect of authorisation is to answer the following question: does the set of credentials C prove that the request r complies with the set of local policies P? The local policies are the policies of a server that controls access to some resources, and a client provides – directly or indirectly – some credentials to support its request. These credentials will typically take the form of attribute certificates, digitally signed by trusted parties or empowered authorities.
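A rough sketch of this compliance question (the predicate names and the policy encoding are illustrative assumptions, not the Amanda delegation logic itself) is an engine that checks whether the presented credentials entail the requested permission under the local policy:

```python
# Minimal compliance check: do the credentials C prove that request r
# complies with the local policies P?  A policy maps each required attribute
# to the set of issuers the server trusts to assert it.
def complies(request, credentials, policies):
    required = policies.get(request)        # attributes needed for this request
    if required is None:
        return False                        # no applicable policy -> deny
    for attribute, trusted_issuers in required.items():
        if not any(c["attribute"] == attribute and c["issuer"] in trusted_issuers
                   for c in credentials):
            return False
    return True

policies = {"read:report": {"employee": {"HR-Authority"}}}
credentials = [{"attribute": "employee", "issuer": "HR-Authority", "subject": "alice"}]
assert complies("read:report", credentials, policies)
```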

A good delegation logic is a key component in such a framework, since delegation is the central mechanism by which administrative tasks and procedures are broken down into manageable parts. It is also, in our opinion, crucially important that a clear separation be made between administrative and executive powers. It is very easy to conceive of situations where the power to manage some administrative attributes of a given resource should be granted, but direct access to that resource should be denied. An example is outsourced management.

The main focus of the work at SICS has been the development of a delegation logic based on the idea of delegation as the explicit yet constrained creation of new privileges. We have examined the basic principles of such a model, considered the problem of revocation in this context, and produced a number of prototype implementations, including adaptations to the ongoing work on SPKI/SDSI at the IETF.

Links:
SICS Intelligent Systems Lab: http://www.sics.se/isl/pbr
Amanda project: http://www.sics.se/fdt/amanda

Please contact:
Babak Sadighi Firozabadi, SICS
Tel: +46 8 633 1582
E-mail: [email protected]


(signed by the card issuer), the public key of the card manufacturer and some support software.

Ticket merchants are assumed to have access to the public keys of all card issuers whom they wish to accept for the sale of tickets (usually a small number). When the user requests a ticket from the merchant, an ‘open’ ticket (ie, one that is not associated with any user) is sent to him. Once the open ticket is received, the smart card verifies that it is correct, extracts the service identifier and authorisation components and stores them in its memory. The authorisation component includes, among other values, the number of closed tickets that the smart card is allowed to produce from that open ticket. To spend an open ticket the user must first close it, which is achieved by including some identification information for the spending of the ticket. This process must be done in the card of the user who bought the ticket from the merchant. Where the user and the buyer of the ticket are different entities, we say that the ticket is delegated. Ticket delegation is possible because the software contained in the smart card is trustworthy. For the same reason, there is no possibility for the card to produce more closed tickets than are authorised.

The trustworthiness and authenticity of the card software are guaranteed because the card issuer signs the public keys of the cards it sells.

To spend the ticket, the user presents the closed ticket to the verifier, providing evidence that he has received the secret service identifier. This is both a fast proof – the only operation involved is a hash – and a secure one, because it is not feasible for a dishonest user to produce the result of the hash operation without knowledge of the secret service identifier.
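A minimal sketch of this fast offline check (the names and message layout are illustrative assumptions, not the eTicket protocol itself) would be the single hash comparison performed by the verifier:

```python
import hashlib
import os

# Issuer side: the secret service identifier is delivered to the card inside
# the open ticket; here it is simply a random value for illustration.
secret_service_id = os.urandom(16)

def close_ticket(secret_service_id: bytes, spender_id: str) -> dict:
    """Card side: bind the spender's identity to the ticket and prove
    knowledge of the secret service identifier via a hash."""
    proof = hashlib.sha256(secret_service_id + spender_id.encode()).hexdigest()
    return {"spender": spender_id, "proof": proof}

def verify_ticket(closed_ticket: dict, secret_service_id: bytes) -> bool:
    """Verifier side: one hash suffices, and no ticket database is needed."""
    expected = hashlib.sha256(secret_service_id +
                              closed_ticket["spender"].encode()).hexdigest()
    return closed_ticket["proof"] == expected

assert verify_ticket(close_ticket(secret_service_id, "alice"), secret_service_id)
```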

Secure Software Framework
For public organisations, a telematic version of administrative procedures would bring meaningful benefits concerning the accessibility and availability of documents and services, regardless of time, location and quantity. While some implementations exist, a number of technological deficiencies hinder a higher degree of communication between administrations and citizens/companies over the Internet.

It is convenient to define models that allow security properties to be added to applications, as well as to components that have already been developed, and this must be achieved with only small changes to the code.

The area of application of this project was determined by the computerisation of the Junta de Andalusia Sanction Procedures, which are associated with

Scenarios for ticket spending and delegation.

Figure 1: eTicket system architecture.


Trust and confidence are essential for the users of networked systems, as for all members of the Information Society. The lack of trustworthy security services is the main reason for not using electronic and mobile technologies in private, business or public services.

In its basic sense, trust means reliance on some person or thing, or the willingness to allow something to be done without fear of the outcome. Trust falls into different categories, eg, impersonal/structural trust, dispositional trust, and personal/interpersonal trust.

In order to motivate individuals to use a certain information system, users have to be convinced that it is safe to use the system, and that their data will not be modified, lost, or used in ways other than those previously defined. Once convinced, the individual will trust the system and will use it.

Access control (identification), authentication, privacy, and confidentiality are the services that create a sense of trust for a human being. To achieve these services, three basic building blocks of security mechanisms are applied: encryption (for providing confidentiality, authentication and integrity protection), digital signatures (for authentication, integrity protection and non-repudiation), and checksums/hash algorithms (for integrity protection and authentication).

Smart cards can become essential trust elements in a security infrastructure, as they are able to integrate different security mechanisms alongside the current application. They are efficient devices for executing security functions such as digital signatures. The workable interoperability of technical and organisational frameworks and supporting infrastructures is a big problem, as its absence can decrease smart card usability. Software reconfiguration can help to overcome this problem.

In the near future, smart cards will play a more important role than they do today. Multifunctional cards can integrate different applications: identity card, bank card,

Trust from users is a fundamental element in network-based services. The building blocks of trust are different security mechanisms. A smart card (SC) is a device that can integrate different security mechanisms in a handy form, but interoperability problems can decrease its wide usability. Software reconfiguration can be a way to overcome this problem. A project has been started at SZTAKI to develop an ontology-based reference architecture that supports SC reconfiguration.

Realizing Trust through Smart Cards
by István Mezgár and Zoltán Kincses

the Ground Transport Arrangement Law. More specifically, Alcance is being developed inside the Strategia Project, associated with the Works and Transport Council of the Junta de Andalusia. The project involves several independent systems, with the Sanction System being one of the most important. Inside the Sanction System, the main task related to our research project is the design of a module which, by using Web browsers, will allow over fifty thousand private organisations and companies to track and transact the sanction files assigned to them in one or more sanction procedures. The module developed allows private organisations and companies, whose access is controlled by Web digital certificates, to monitor their files independently of location and time. Certificates on smart cards support the authenticity necessary for communication between companies and the Council, and allow the exchange of signed official documents.

We have designed and developed a Form Description Language, called LDF, which is based on XML, and more precisely on XFDL. The use of LDF and related tools brings many advantages compared with the traditional use of HTML. Regarding form status, it is easy to add new components not included in HTML. These new components can be useful for avoiding invalid inputs in the electronic forms, thus achieving more dynamic management. Additionally, automatic data validation can be done without programming specific code for that operation, as the form specification includes the checks themselves. Regarding form management, LDF includes the possibility of visualising forms using a traditional browser (for on-line operations) or an independent application (for off-line ones). Signed forms management is easier, as signers can store on their own hard disk a copy of a partially filled document, which can be opened later for completion using a browser. Moreover,

one or more users can sign forms, which can be encrypted using unconstrained implementations of algorithms.

Regarding communication, a specialised format ensures that the context of the signature is not lost, so that the authenticity of the data is never compromised. Besides this, the document carries its own audit information (persons involved, date of the agreement, etc). In contrast to HTML, LDF provides a data structure and separates the application, presentation and logic levels.

Link:
GISUM research group: http://www.lcc.uma.es/~gisum/

Please contact:
Javier López, Antonio Maña, Pedro Merino, José M. Troya, University of Málaga, Spain
E-mail: {jlm,amg,pedro,troya}@lcc.uma.es


health card, etc. They can be a key component of mobile phones used as mobile personal terminals, ie, a personal trusted device (PTD).

The very broad field covered by the different applications requires different solutions for the expected functions of SCs. The method of identification (for able-bodied or handicapped users) and the different encryption algorithms (strong/weak encryption) need different hardware and software solutions/configurations of the SC. The set of applications of a smart card can change during its life cycle, so the software configuration has to be modified. In these cases applications have to be added or removed; that is, the software of the smart card has to be reconfigured, probably several times. A further demand is for standardised interfaces to handle the different application software (interoperability), and standardised hardware building blocks.

A balanced system means that the functions needed by the current application are realised with the optimal hardware and software. Optimal here means selecting the proper technical parameters in combination with an economical solution. This demands a complex, flexible configuration that can be altered or modified according to the user's actual needs (eg geographic location, new services). Defining this configuration is a genuinely complex task; consequently, there is a need for a structured description of the present (and possible future) requirements of the different applications, and of the hardware and software possibilities. This representation structure can be a theoretically grounded reference architecture.

The development of the reference architecture is taking place within the project ‘The Theoretical Elaboration and Prototype Implementation of a General Reference Architecture for Smart Cards (GRASC)’, supported by the Hungarian Scientific Research Fund (OTKA). The basic goal of the project is to produce a first qualitative version of a theoretically based multi-view, multi-layer, multi-element description of SC functions, software and hardware systems and applications. This representation will be integrated, unified, consistent, and – very importantly – dynamic, as it will also describe the connections among the elements.

The starting point of the research was to develop a smart card ontology (SCO). This specially structured representation guarantees a full description of the entities (applications and system elements), their levels and the logical connections between the levels and the entities. The main characteristics of the smart card ontology are: the formalisation level is structured informal/formal, the purpose of application is interoperability among systems, and it is a domain ontology. As a full description of the SC ontology would exceed the scope of this paper, only a few elements of the SCO are introduced: meta-ontology (defining the basic terms of the ontology), activities, functions, architectures and building blocks, applications, etc. Further subgroups and numerous terms and definitions complete the ontology.

Based on the ontology, the number and content of the dimensions of the reference architecture (RA) can be defined. As a second step, discrete reference models (RM) can be allocated based on the RA. Based on the content of the RA, sets of reference models will be defined. This can be done using the elements of the ontology, taking the logical, functional and derivative connections into consideration. Each RM is in close, strict functional/logical contact with the neighbouring reference models. As a result, the boundary communication between the RMs and the logic (I/O dataflow) of this communication can be defined exactly. Based on this knowledge, exact protocols can be determined. The figure shows a three-dimensional graphical representation of a function–application–realisation architecture with reference models (represented by cubes).

The expected results of GRASC are a structured description of smart card systems from different aspects, easy configuration/reconfiguration possibilities of SCs for different (multi-)applications, and the ability to describe clearly the content and form of communication for different functions and applications.

Please contact:
István Mezgár, SZTAKI
Tel: +36 1 279 6141
E-mail: [email protected]

Zoltán Kincses, SEARCH Laboratory, Budapest University of Technology and Economics
E-mail: [email protected]

A three-dimensional graphical representation of a function–application–realisation architecture with reference models (represented by cubes).


RESEARCH AND DEVELOPMENT

The idea seemed attractive. Collect our scrap iron and feed it to the tiny organisms in the oceans called phytoplankton. They love it, and their population will grow on it. They love carbon dioxide too, extracting it from the atmosphere (the present intake is several gigatons per year). This carbon is transported downward by sinking phytoplankton species, and will eventually end up ‘in the deep bosom of the ocean buried’ – a mechanism called the ‘Biological Pump’ (Figure 1). Thus two birds would be killed with one stone: disposal of industrial waste and reduction of the greenhouse effect. Alas, the idea lasted only as long as it remained qualitative. One of the studies with the mathematical model concerned, which is fairly realistic because it incorporates the most essential mechanisms, explains the recent field observation that iron fertilization fails to enhance carbon export to the deep ocean.

Phytoplankton is a generic name for a great variety of micro-organisms (algae) that live in lakes and oceans. Modelling its dynamics is of great interest, since phytoplankton provides the basis of the food chain. Phytoplankton requires light for photosynthesis. In a water column, light intensity decreases with depth, due to absorption. As a result, the phytoplankton production rate, which is determined by the local light intensity, decreases with depth. Furthermore, mortality rates and transport by turbulent diffusion in a water column (mixing) play a role. Also, phytoplankton species often have a specific weight different from that of water, giving rise to vertical transport in the form of sinking or buoyancy. All these processes are incorporated in an integro-partial differential equation of advection-diffusion-reaction type describing the population dynamics of phytoplankton. On the basis of this model we developed both analytical and numerical tools for several studies. One result is that the long-term survival of a

particular phytoplankton species (‘bloom development’) crucially depends on the mixing rate, the water column depth and the vertical velocity. Another result is that the maximal sinking velocity of phytoplankton that can be sustained is inversely proportional to the water turbidity. Hence the downward flux of phytoplankton biomass is very different for clear and for turbid water, explaining why iron fertilization does not enhance downward carbon transport (see Figure 1). Since there are thousands of different phytoplankton species, slowly drifting in lakes and oceans (Figure 2), a natural extension of the model is to systems in which various species compete. Buoyant species seem to have an advantage over sinking species, since they can easily reside in the well-lit upper zone close to the water surface and hence capture a lot of light. Moreover, they have the additional advantage of shading other species at deeper levels. Nevertheless, as it turns out, the balance between survival and death for the various species is very delicate and depends on the values of other parameters as well, such as background turbidity, incident light intensity, and the specific light attenuation coefficients of the different species.
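For orientation, an advection-diffusion-reaction model of the kind mentioned above can be written, in a generic and simplified form (the symbols and the light integral below are given purely for illustration and are not claimed to be the exact CWI/UvA formulation), as

$$
\frac{\partial w}{\partial t}
= p\!\left(I(z,t)\right)\,w \;-\; \ell\,w \;-\; v\,\frac{\partial w}{\partial z} \;+\; D\,\frac{\partial^2 w}{\partial z^2},
\qquad
I(z,t) = I_{\mathrm{in}}(t)\,\exp\!\Big(-K_{\mathrm{bg}}\,z \;-\; k\!\int_0^z w(\sigma,t)\,d\sigma\Big),
$$

where \(w(z,t)\) is the population density at depth \(z\), \(p(I)\) the light-dependent production rate, \(\ell\) the loss (mortality) rate, \(v\) the sinking or buoyancy velocity, \(D\) the turbulent diffusion coefficient, \(K_{\mathrm{bg}}\) the background turbidity and \(k\) the specific light attenuation of the species. The integral of the population above depth \(z\) (self-shading) is what makes the equation an integro-partial differential equation.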

Link: http://www.cwi.nl/projects/plankton

Please contact:
Ben Sommeijer, CWI
Tel: +31 20 592 4192
E-mail: [email protected]

by Ben Sommeijer

A fairly realistic mathematical model describing the dynamics of phytoplankton was developed at CWI jointly with the research group on Aquatic Microbiology at the University of Amsterdam. One result negates a recent idea to reduce two environmental problems in one action.

Phytoplankton Dynamics — the Struggle for Light

Figure 2: The several thousands of phytoplankton species have very different shapes. Source: Dr Petra Visser, IBED - Aquatic Microbiology, University of Amsterdam.

Figure 1: CO2 is emitted into the atmosphere by several sources. Uptake of CO2 by the ocean is enhanced through stimulation of the Biological Pump by fertilizers. Source: DOE Center for Research on Ocean Carbon Sequestration.


Work of the future, in industry and in commerce, will have an increasing array of technical systems available, of ever-increasing sophistication and improving functionality. These new technical systems, however, will be of no value, and may even prove disruptive or harmful, if they are not implemented with a proper understanding of their capabilities and of the human and organisational issues surrounding their use.

‘Virtual and Interactive Environments for Workplaces of the Future’ (VIEW) is an EU-funded research and development project that addresses these requirements in the particular context of virtual reality technologies and virtual environments. The project started in January 2001 and will be completed at the end of 2003.

The overall aim of VIEW is to develop best practice for the industrial implementation and use of virtual environments, and to integrate Virtual Environments in product development, testing and training for workplaces of the future. The main project objective is to face the numerous questions that arise from the lack of appropriate guidelines and standards on best-practice implementation and use of Virtual Environment (VE) technologies. Moreover, VIEW focuses on initiating and supporting the integration of VEs in product development, testing and training processes, as well as on the broad adoption of VEs in European industries. In this context, the VIEW project aims to develop essential technologies, methodologies and tools that will enable industry to develop and use VEs in an appropriate way. In order to achieve this goal, the project will:
• study and analyse the impact of VEs on their users
• identify potential barriers for industries in making use of VEs
• provide guidelines and strategies to overcome those barriers
• use its findings in practice for designing appropriate VE workplaces
• provide tools and guidelines for industrial users so as to make appropriate use of VEs.

The project will review existing and emerging VE systems and applications, and will establish a user forum to facilitate surveys of VE users in order to identify user requirements. A usability test battery will be developed to capture physiological and cognitive aspects of the impact that VEs may have on end-users. The project will also develop new multi-modal, mobile and multimedia interfaces by combining different input/output (I/O) modalities, as well as new intuitive navigation and manipulation concepts. Those interfaces will be implemented and tested in various pilot applications, which will allow reliable evaluation of the usability test battery, as well as critical assessment of interaction concepts, devices and design guidelines. The evaluation process will cover several topics, such as the ergonomics of VE technology; issues of usability, health and safety; socio-economic impact for users and society; appropriateness of the VE technology to meet user-company needs; and gains in terms of work effectiveness and quality.

The knowledge and experience acquired during the project will be analysed and reported in a code of good practice, which will allow practical and validated guidance to be applied to the selection, design, implementation and use of VE systems in different types of workplaces and for different needs. This code of good practice will be integrated into an interactive design support tool, which will provide the knowledge developed by the project (guidelines, code of good practice, best-use guidance, examples, etc) to decision makers, VE designers, developers and users in an appropriate way and format.

The VIEW of the Future consortium includes three ERCIM members, namely the Fraunhofer Institute for Industrial Engineering (IAO) (with the role of Technical Coordinator), VTT and ICS-FORTH. Other consortium partners are: University of Nottingham, United Kingdom (Administrative Coordinator); University of Stuttgart Institute for Human Factors and Technology Management (IAT), Germany; Suomen WINTEC Oy, Finland; John Deere, Germany; Alenia Spazio, Italy; Volvo, Sweden; PSA Peugeot Citroen, France; Center of Applied Technologies in Mental Health (COAT-Basel), Switzerland; Institute of Communication and Computer Systems of the National Technical University of Athens (ICCS-NTUA), Greece.

Link:
http://www.view.iao.fhg.de/

Please contact:
John Wilson, University of Nottingham, UK
Tel: +44 0 115 9514004
E-mail: [email protected]

by John Wilson

VIEW is an ongoing R&D project, involving three ERCIM members and addressing best practice in integrating Virtual Environments within industrial product development, testing and training for workplaces of the future.

Virtual and Interactive Environments for Workplaces of the Future

Virtual environments for workplaces of the future.


The Planck mission, to be launched in February 2007, will produce a very large amount of data, in the form of maps of the entire celestial sphere in the millimetre and submillimetre wavebands. Its main goal is to map the cosmic microwave background (CMB) anisotropy with unprecedented angular resolution and sensitivity. It is known that the CMB is relic radiation coming from an epoch very close to the big bang. Studying its anisotropy will provide important information on the origins and evolution of the universe.

The antenna temperatures measured by the satellite-borne radiometers are generated both by CMB radiation and by other astrophysical processes, whose effects are superimposed on the maps. Whereas the interest of cosmologists is focused on the CMB, all the processes mapped are of interest in astrophysics and should thus be separated out.

The problem can be briefly formulated as follows: the data set is given by a number of sky maps, each on a different frequency channel; each map is a linear mixture of the maps related to the individual physical processes (source processes); the mixing coefficients depend on frequency through the source emission spectra and the frequency responses of the radiometers over the different measurement channels. Our objective is to separate the individual sources from knowledge of the mixed maps on all the channels.

There are two main reasons why this is difficult. First, the mixing coefficients are normally not known; second, the sensor noise is expected to be particularly strong, because of the very small quantities (tens of microkelvin) that are to be measured, and nonstationary, due to the uneven sky coverage of the Planck telescope scanning strategy.

One approach to this problem assumes that the mixing coefficients are perfectly known. However, this does not guarantee a good result, especially when the assumed matrix is significantly different from the actual matrix and the noise is particularly strong. The ideal solution would come from a totally ‘blind’ approach, ie, from trying to estimate both the source maps and the mixing coefficients from knowledge of the measured data alone. Clearly, this is not possible, since the problem is underdetermined, but the data model (ie, the mixing coefficients) is not the only place where we can exploit prior knowledge. Indeed, the different radiation processes in the sky are very likely to be statistically independent. Moreover, with the exception of the CMB, the different radiations can be modelled as nongaussian processes. The Independent Component Analysis (ICA) principle states that, if a number of independent random processes are linearly combined, a linear operator can be applied to the mixed data in order to obtain independent outputs. If all the source processes, except at most one, are nongaussian, the outputs of the linear operator are copies of the original source processes. On the basis of this principle, or an equivalent one, different blind separation approaches can be adopted. These approaches are characterised by explicit or implicit assumptions on the component distributions and by whether noise is included in the data model. At our Institute, a neural algorithm for blind separation has been tested on data sets simulating the Planck instrument output, giving good results for uniform low-level noise. Subsequently, a fast non-neural algorithm (Noisy fastICA) was applied to separate the sources from the same data sets with a higher, but still uniform, noise level. The latter procedure

by Emanuele Salerno, Luigi Bedini, Ercan Kuruoglu and Anna Tonazzini

Blind image processing techniques are being studied at IEI-CNR, with the aim of extracting useful information from satellite radiometric maps of the celestial sphere. This activity is undertaken on behalf of the European Space Agency Planck Surveyor Satellite programme.

Blind Image Analysis helps Research in Cosmology

An example of separation of two source processes from their mixtures over two measurement channels. Top row: simulated CMB anisotropy (left) and galactic dust radiation maps, assumed as original source processes. Middle row: noisy mixture maps. Bottom row: estimated sources.


A gamma-ray burst hit the Italian-Dutch Beppo-SAX satellite for about 80 seconds on February 28, 1997 (Figure 1). Its monitor accurately measured the position of the burst, and within eight hours the spacecraft's X-ray telescope found a rapidly fading source of X-rays on the spot. Never before had a burst been pinpointed so accurately and so quickly. Subsequent observation with the powerful Hubble Space Telescope in the optical domain revealed a bright spot surrounded by a somewhat elongated background object (Figure 2). The latter is believed to be a galaxy. If so, it must be near the edge of the observable universe, and the gamma-ray burst represented the most powerful explosion observed thus far. Such bursts emit, within the span of minutes or even seconds, more energy than the Sun will in its entire life. Since their (accidental) discovery in the late 1960s, their origin and incredible energy have remained a mystery. One scenario is a collapsing binary neutron star system, where gravitational energy is converted into kinetic energy of an expanding cloud of protons moving at relativistic speeds. Gamma rays are then generated by electrons accelerated by the intense electromagnetic fields occurring in shock waves. These waves result from collisions inside the proton cloud as well as with the surrounding gas. This scenario with shock waves implies that gamma-ray bursts are followed by long afterglows of X-rays and visible light. The burst of February 1997 provides strong evidence for such a tail.

Since experiments are impossible, insight into the nature of gamma-ray bursts can only be attained by solving the full relativistic equations of magnetohydrodynamics (MHD). At CWI the Computational Fluid Dynamics group addresses this problem, with partners at the University of Utrecht (Astro-Plasma Physics) and the Institute for Plasma Physics in Rijnhuizen (Numerical Plasma Dynamics). As a first step, a computational method is being developed for the relativistic equations of gas dynamics, which form a system of hyperbolic partial differential equations. A tailor-made discretization will take into account the different propagation velocities of rarefaction waves, shock waves, and contact discontinuities. An approximate Riemann solver will be derived, as well as a staggered-grid approach. These two methods will be tested against a highly relativistic spherical explosion, for which an exact solution exists, thus serving as a severe numerical benchmark. After having passed this test, and a few others, electromagnetic effects will be incorporated and the second step of solving the relativistic MHD equations will be taken. This system of equations is still hyperbolic, but its wave patterns are more complex. For its numerical solution, approximate Riemann solvers and a staggered-grid approach will be developed as well. CWI's research contribution is crucial for the further development of large-scale computer codes in astrophysics.
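For reference, one common conservative form of the special-relativistic gas dynamics equations in one spatial dimension (written here with the speed of light set to one; this is standard textbook notation given for orientation only, not the specific formulation used at CWI) is

$$
\partial_t\!\begin{pmatrix} D \\ S \\ \tau \end{pmatrix}
+ \partial_x\!\begin{pmatrix} D\,v \\ S\,v + p \\ S - D\,v \end{pmatrix} = 0,
\qquad
D = \rho W,\quad S = \rho h W^2 v,\quad \tau = \rho h W^2 - p - D,\quad W = (1-v^2)^{-1/2},
$$

where \(\rho\) is the rest-mass density, \(p\) the pressure, \(v\) the velocity and \(h\) the specific enthalpy. It is the hyperbolic structure of this system – its characteristic wave speeds – that approximate Riemann solvers exploit.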

Please contact:
Barry Koren, CWI
Tel: +31 20 592 4114
E-mail: [email protected]
http://www.cwi.nl/~barry/

by Barry Koren

Gamma-ray bursts constitute the most energetic events observed in the Universe so far. Their nature is still a mystery. Recent observations enable comparison with model computations. CWI develops numerical methods to solve the relativistic equations of magnetohydrodynamics (MHD) underlying such models.

Relativistic MHD Computation of Gamma-ray Bursts

Figure 1: Gamma-ray burst GRB970228. Beppo-SAX team, Agenzia Spaziale Italiana, ESA.

Figure 2: GRB970228 housed in a galaxy? Team of J. van Paradijs, Hubble Space Telescope, NASA.

is now under experimentation by astrophysical research teams within the Planck collaboration.

We are now focusing on the modelling of nonuniform noise and the estimation of the source statistical distributions. We are experimenting with the recently proposed Independent Factor Analysis (IFA) approach for this purpose. Before estimating the data model, this technique learns a statistical source model, thus enabling a more accurate estimation of both the data model and the original source maps. An interesting feature of this approach is that prior information can be inserted easily into both the data model and the source distributions. This means that the IFA approach can be made only partially blind, since it is able to exploit both simple independence and any additional information available, thus providing a better solution.
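A minimal numerical sketch of the linear mixing model and a blind separation step (using a generic FastICA implementation purely for illustration; the actual Planck pipelines, noise models and the Noisy fastICA/IFA algorithms are considerably more elaborate) could look like this:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Two toy "source processes" (flattened sky maps), deliberately nongaussian.
n_pixels = 10000
cmb_like = rng.laplace(size=n_pixels)
dust_like = rng.exponential(size=n_pixels) - 1.0
sources = np.vstack([cmb_like, dust_like])            # shape (2, n_pixels)

# Frequency-dependent mixing matrix (unknown to the separation algorithm).
A = np.array([[1.0, 0.6],
              [0.8, 1.3]])
observations = A @ sources + 0.01 * rng.normal(size=(2, n_pixels))  # noisy mixtures

# Blind separation: estimate independent components from the mixtures alone.
ica = FastICA(n_components=2, random_state=0)
estimated = ica.fit_transform(observations.T).T       # shape (2, n_pixels)

# The estimates recover the sources only up to permutation and scaling,
# so we inspect the absolute correlations between true and estimated maps.
corr = np.corrcoef(np.vstack([sources, estimated]))[:2, 2:]
print(np.round(np.abs(corr), 2))
```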

Please contact:
Emanuele Salerno, IEI-CNR
Tel: +39 050 3153137
E-mail: [email protected]


Any software user knows about unstable software programs. Today we are software users in many life situations: eg, as customers of a (real or e-commerce) shop or travel agency, users of any communication system, players of electronic games, or even passengers of a (private or public) transport vehicle. The use of this software provides benefits and comfort to its users, as long as they trust the system and/or services. It is not necessary to go into psychological details to understand the increasing stress if software behaves differently from its declared expectations or, more formally, if the software does not conform to its specification.

What gives a user trust in software? First of all, there will be her experience of availability and reliability. But what if she lacks this experience? The situation may be even more difficult: what do users perceive and think if they are exposed to a particular application, eg on the Internet? In almost all cases there is a complex system behind it, which is layered and often distributed, with a set of different services involved. And it seems obvious that end-users may avoid an application if it is embedded in a faulty system. That leads to the view that middleware – the ability of computer programs to make use of any service implemented on heterogeneous hardware or operating systems – is a technology that is crucial for software applications and needs to be verified carefully.

In the telecommunication sector it is well accepted that the effort for testing may cover up to 50% of the complete development process, since service reliability is an extremely critical point. During the last decades an elaborate Conformance Testing Methodology Framework (CTMF) [ISO/ITU-T and ETSI 1997] has been developed and established in this context. But its applicability appears to be much more general.

In January 2000 the EU Commission started funding the R&D project CORVAL2 to enhance confidence in software quality in the realm of middleware. The target of this initiative is the conformance of the Common Object Request Broker Architecture (CORBA)-based infrastructure for interworking in computer networks towards its international requirement specification [CORBA standard V.2.3, OMG 1999]. The idea is to enhance software reliability through the application of highly qualified technical tests.

Due to the nature of the CORBA middleware, any conformance testing of the CORBA infrastructure will focus on the interaction between a service user or provider and the CORBA infrastructure. Furthermore, the communication within the CORBA entities across software and hardware boundaries is subject to testing. The major testing aspects in CORBA are:
• the provision of a specification-conformant user interface (API)
• the verification of the standardised translation of CORBA interface definitions (IDL) into the target language of the Application Programming Interface (API)
• the protocol-conformant data exchange between CORBA Object Request Broker (ORB) entities on different platforms or systems.

Obviously, these aspects cover a large field of conformance aspects and testing technologies: operation-based interface testing, compiler testing, and protocol conformance testing.
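As a simple illustration of the protocol-conformance side, the sketch below checks a few properties of a GIOP message header; it is a minimal example only (real GIOP conformance tests cover far more than the header, and the field layout shown follows the published GIOP format as commonly described, so treat the details as assumptions):

```python
import struct

def check_giop_header(message: bytes) -> list[str]:
    """Very small protocol-conformance check on a GIOP message header."""
    problems = []
    if len(message) < 12:
        return ["message shorter than the 12-byte GIOP header"]
    magic, major, minor = message[:4], message[4], message[5]
    flags, msg_type = message[6], message[7]
    if magic != b"GIOP":
        problems.append("missing 'GIOP' magic")
    if (major, minor) not in {(1, 0), (1, 1), (1, 2)}:
        problems.append(f"unexpected GIOP version {major}.{minor}")
    if msg_type > 7:
        problems.append(f"unknown message type {msg_type}")
    little_endian = bool(flags & 0x01)
    (size,) = struct.unpack("<I" if little_endian else ">I", message[8:12])
    if size != len(message) - 12:
        problems.append("declared body size does not match actual body length")
    return problems

# Example: a well-formed, empty GIOP 1.2 Request header (message type 0).
header = b"GIOP" + bytes([1, 2, 0x00, 0]) + struct.pack(">I", 0)
assert check_giop_header(header) == []
```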

The major outcome of CORVAL2 is the availability of the conformance test suite for CORBA ORB implementations with both C++ and Java API language bindings. The total number of test cases is very large and varies with the conformance-testing aspect to be verified: the numbers for the C++ test suite are, for example, 400 syntax tests and 380 dynamic behaviour tests with reference to the IDL compiler output, 1240 API declaration tests and 470 API semantic behaviour tests, and about 150 tests on GIOP, the protocol definition for inter-ORB communication. The different numbers are due to the different testing purposes. The high number of API declaration tests, for example, is reasonable given the various features an API provides to a user (member functions, class inheritance, etc).

The beta version of the test suite is available for inspection and download via the CORVAL2 project web site. Due to the

by Ina Schieferdecker, Axel Rennoch and Dorota Witaszek

The European Commission has initiated the development of a conformance test suite for the validation of Common Object Request Broker Architecture (CORBA) infrastructure products. The investigation of enhanced techniques for CORBA validation is the core of the CORVAL2 project.

The CORVAL2 Contribution to achieve Confidence in Middleware

CORBA conformance aspects.


IPR ownership, the test suite is publicly available, but the Open Source Licence does not apply. In CORVAL2, the test suite has been applied to the different CORBA ORB products available among the project partners. Some failures were identified during this test campaign, eg, incorrect extraction/insertion of CORBA::Any types or wrong GIOP encodings.

The large volume of the test suite and the level of detail of the tests give reason to trust the quality of a System Under Test (SUT) that has passed the conformance tests. But this technical argumentation might be sufficient only for technical engineers, and not for the end user of CORBA middleware. Therefore a brand programme was started in October 2001 to give vendors the opportunity to prove the application of the tests to their CORBA ORB product. Branded ORB products will receive a certification document and a logo for promotional use. If any deficit is discovered within the SUT (through the application of the conformance tests), the granularity of the test suite allows the failure to be identified and corrected.

The partners of the project are: The Open Group, Reading, UK; Fraunhofer FOKUS, Berlin; IONA - Object Oriented Concepts, Karlsruhe, Germany; Fujitsu - ICL, Dublin; Eric Leach Marketing Ltd., London; Object Management Group, Inc., Needham, MA, USA.

Links:
www.opengroup.org/corval2
www.fokus.fhg.de/tip/corval2/CorbaTests

Please contact:
Ina Schieferdecker, FhG FOKUS
Tel: +49 30 3463 7236
E-mail: [email protected]

The objectives of the EU-funded COVAX project are:
• to build a web service for search and retrieval of contemporary European cultural documents from memory institutions
• to make existing library, archive and museum document descriptions accessible over the Internet
• to assist memory institutions to provide access to their collections, regardless of document type or collection size
• to implement standards and achieve interoperability between retrieval systems operating in the cultural heritage area.

Partners in the project include technology developers and providers (public research organisations and private companies) and content owners (memory institutions). The content owners have collections of varying type and size, catalogued using a variety of library, museum and archiving systems. The project is assessing ways to improve access to these collections by converting samples of existing data into a limited set

of common structured formats, each of which can be expressed using XML (eXtensible Markup Language).

According to the philosophy adopted by the project, future catalogs for libraries, museums and archives will be stored in a variety of XML formats instead of proprietary formats, or formats such as MARC which have not gained wide acceptance outside their development context. Since much material is already described in machine-readable form, the project has worked on developing tools to convert such descriptions to XML and to integrate them with native XML data, in order to build user-friendly websites and data archives.

COVAX is converting existing files into homogeneously encoded document descriptions of bibliographic records, archive finding aids, museum records and catalogs, and electronic texts using XML, adapting the different document type definitions (DTDs) currently used for library cataloguing (MARC), archive finding aids (EAD), museum data (AMICO DTD) and cultural texts

(TEIlite). COVAX is designed to form a network of XML repositories structured as a distributed database, and will act as a meta-search engine offering access to all types of cultural data.

The COVAX system has implemented a multilingual user interface to access different data (catalog records, finding aids, etc) and documents (manuscripts, electronic texts, images, etc). The project is not creating new standards but will adopt existing standards and concepts (XML, existing DTDs, http, etc). The Z39.50 protocol provides a conceptual basis for communication between the multilingual user interface and the meta-search engine, with Dublin Core Metadata Element Set elements as cross-domain access points.
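As a rough sketch of how such cross-domain access points work (the field names and the mapping below are illustrative assumptions, not COVAX's actual crosswalks), heterogeneous records can be projected onto a small set of Dublin Core elements before searching:

```python
# Map source-specific record fields onto Dublin Core elements used as
# cross-domain access points (illustrative crosswalk only).
CROSSWALKS = {
    "marc_like": {"245": "title", "100": "creator", "260": "date"},
    "ead_like":  {"unittitle": "title", "origination": "creator", "unitdate": "date"},
}

def to_dublin_core(record: dict, schema: str) -> dict:
    crosswalk = CROSSWALKS[schema]
    return {dc_field: record[src_field]
            for src_field, dc_field in crosswalk.items() if src_field in record}

def search(records, query_title: str):
    """Meta-search over heterogeneous repositories via the common DC view."""
    results = []
    for record, schema in records:
        dc = to_dublin_core(record, schema)
        if query_title.lower() in dc.get("title", "").lower():
            results.append(dc)
    return results

records = [({"245": "Divina Commedia", "100": "Dante Alighieri"}, "marc_like"),
           ({"unittitle": "Commedia manuscripts", "unitdate": "14th c."}, "ead_like")]
print(search(records, "commedia"))
```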

A comprehensive set of documents for the implementation of the prototype was selected. It contains a wide variety of documents, descriptions, formats and databases: standard and non-standard bibliographic records (including five different MARC formats), four different structures for archive and museum

by Luciana Bordoni

COVAX is testing the use of XML to combine document descriptions with digitised surrogates of cultural documents. The aim is to build a global system for search and retrieval, increasing accessibility via the Internet to the digital collections of memory institutions regardless of their location.

COVAX: Internet Access to Libraries, Archives and Museums


TECHNOLOGY TRANSFER

finding aids, and information in six different languages (Catalan, Italian, English, German, Spanish, Swedish).

COVAX is intended to satisfy the needs of the general public as well as professional users. User requirements are basically structured around these criteria:
• the user must be able to select any or all of the COVAX databases for his/her search
• the user must be able to select any or all document types for his/her search
• the user interface must permit simple searches, suitable for the general public, or more complex searches, for specialised users.

COVAX partners have implemented two different database models: ad hoc XML databases, or existing non-XML repositories. In the latter case, information is retrieved from the original database and transformed into XML format before being presented to users. To summarise, COVAX is not only incorporating XML as a basic standard but also integrating other standards and adapting them to XML.

COVAX partners have implemented XML repositories using two software packages: Tamino from Software AG, a COVAX technical partner, and TeXtML from IXIAsoft. Sites have been established in London, Rome, Salzburg, Graz and Madrid.

COVAX will test the benefits of XML for encoding and processing cultural heritage information, explore the feasibility of converting existing cultural heritage descriptions into XML-encoded information, adapt cultural information systems to user requirements, and contribute to the extension of standards for the presentation and dissemination of cultural heritage.

Ultimately, it will enhance access to cultural heritage (one of Europe's most important competitive advantages) for all citizens.

Link: http://www.covax.org/

Please contact:
Luciana Bordoni, ENEA/UDA, Italy
Tel: +39 06 3048 3503
E-mail: [email protected]

Search in the COVAX Prototype.

EUTIST-AMI started in July 2001 and is intended to create a replication effect in order to push the use of these technologies within European industry. EUTIST-AMI, coordinated by ENEA UDA (Italy) in collaboration with LogOn (Germany), CSIC (Spain), SZTAKI (Hungary) and DFKI (Germany), will last for three years, and its purpose is to improve the efficiency of the management and dissemination of the results deriving from the 13 projects within the cluster. It also helps to emphasise the European dimension of the projects. A key benefit of the cluster is the ability to coordinate dissemination activities. As the results of the individual projects become available, the coordinators will organise Europe-wide dissemination campaigns. These will focus on success stories of the projects, in order to show other potential users the

benefits that these two types of technology can offer.

The effectiveness of the message is also reinforced by the fact that the 13 projects are realised by partners coming from nine different European countries: Italy, France, Hungary, Switzerland, Austria, Germany, the United Kingdom, Sweden and Spain.

European Take-Up of Essential Information Society Technologies – Agents and Middleware (EUTIST-AMI) is a cluster of 13 different application-oriented projects that aims at demonstrating the potential of agent-based systems and middleware technologies when applied to real industrial environments.

A Cluster of European Projects on Agents and Middleware Technologies
by Massimo Busuoli and Emanuela Rasconi


Each project represents a real example in which agents and middleware technologies are used to solve problems relevant to a different industrial area. For example, in MODA-ML (a project realised by a Franco-Italian consortium) the problem of data exchange in the supply chain of the textile sector will be addressed by providing a common language based on XML technology (one of the main languages used to realise middleware tools). This will allow the electronic exchange of data and documents (such as orders, technical documents, and administrative information). This should lead to a five- to ten-fold reduction in costs compared to the processing of paper documents.

In the medical sector, the MOWGLI project (realised by an Italian consortium) will use middleware to process patients' data. This involves the retrieval and exchange of all of the heterogeneous information (such as patients' demographic data, the allocation of diagnostic machines, and diagnostic exams, reports, and images) which is generated in a radiology department by patients from the time they enter the hospital until they leave. The information related to individual patients will be available in electronic format, in a secure and confidential way, through the Web. This will reduce to zero the use of paper for exchanging information between administrative, clinical and diagnostic staff, and will allow real-time feedback from a radiology examination.

A good example of the use of agent technology in public administration is the CASSY project (realised by a German consortium), in which agent technologies are used to create a virtual receptionist that will provide the citizens of a German city with administrative information and personal assistance. The receptionist is able to answer requests made by citizens in natural language, thus easing the problem of navigation within the web site and reducing the level of skills needed to interact with the site.

These are only three examples from the 13 projects; more information about the other projects and the cluster as a whole can be found at the EUTIST-AMI website. EUTIST-AMI is funded by the European Commission within the Information Society Technology initiative of the Fifth Framework Programme.

Link:
EUTIST-AMI: http://www.eutist-ami.org

Please contact:
Massimo Busuoli, ENEA UDA - Agency for Sustainable Growth, Italy
Tel: +39 051 6098 178
E-mail: [email protected]

László Kovács, SZTAKI
Tel: +36 1 279 6212
E-mail: [email protected]

Jalios was created in December 2001. This spin-off from INRIA (Rocquencourt) and Bull develops and provides content management software that is the result of a cooperation between INRIA and Bull, organised through DYADE, a GIE (Economic Interest Group). Jalios is the third company resulting from this fruitful cooperation.

A Project: Pharos
Jalios has its origins in the INRIA project Pharos, which developed a collaborative infrastructure for web knowledge sharing. This project had two goals: the first, at the application level, was to build a collaborative recommendation service; the second, at the technical level, was to design a replication mechanism supporting disconnected replicas.

The Pharos application was primarily targeted at internal use, to make better use of the resources available on the web. Today, websites based on this application are still of great interest and are used both by technical contributors, such as java-channel.org (http://www.java-channel.org/), and by communities of interest, such as educadoc (http://www.educavie.francetelecom.com/), a French thematic index of web resources for teachers, by teachers.

The database replication mechanism was intended to allow any kind of topology and to allow updates to be performed in each replicate even when it is not connected. This is of interest both for nomadic users who want to update their data (on their own replicate) on a laptop while on the road, and, for confidentiality and security reasons, in defense or national security businesses. The work, conducted as part of a thesis, led to the definition of a framework for optimistic replication. The main problem was handling synchronization between replicates: detecting conflicts, solving those that can be solved automatically, and notifying users of the remaining conflicts. Detection of conflicts is handled by the framework; the developer can program conflict resolution either by choosing default algorithms or by developing new ones according to the semantics associated with the data. This main functional innovation involved the INRIA research group SOR (http://www-sor.inria.fr/).
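As a rough illustration of the approach described above, the sketch below shows how a framework might detect conflicts between replicates using version vectors and delegate their resolution to a developer-supplied function. All names and the version-vector scheme are assumptions made for this example; they do not describe the actual Jalios or SOR implementation.

# Hypothetical sketch of optimistic replication with pluggable conflict resolution.
def detect_conflict(local_clock, remote_clock):
    """Two version vectors conflict when neither dominates the other."""
    keys = set(local_clock) | set(remote_clock)
    local_ahead = any(local_clock.get(k, 0) > remote_clock.get(k, 0) for k in keys)
    remote_ahead = any(remote_clock.get(k, 0) > local_clock.get(k, 0) for k in keys)
    return local_ahead and remote_ahead

def last_writer_wins(local, remote):
    """Default resolver: keep the version with the latest timestamp."""
    return local if local["timestamp"] >= remote["timestamp"] else remote

def synchronize(local, remote, resolver=last_writer_wins, notify=print):
    """Merge two versions of one object, resolving or reporting conflicts."""
    if not detect_conflict(local["clock"], remote["clock"]):
        # No conflict: the causally newer version simply wins.
        return max(local, remote, key=lambda version: sum(version["clock"].values()))
    resolved = resolver(local, remote)
    notify("conflict on " + local["id"] + " resolved by " + resolver.__name__)
    return resolved

if __name__ == "__main__":
    laptop = {"id": "doc-1", "clock": {"laptop": 2, "server": 1}, "timestamp": 1020, "body": "draft A"}
    server = {"id": "doc-1", "clock": {"laptop": 1, "server": 2}, "timestamp": 1025, "body": "draft B"}
    print(synchronize(laptop, server)["body"])   # prints "draft B" (last writer wins)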

From Research to the Product
A light object-oriented database was developed to support the application and to prototype the replication protocol. This database is very similar to an explicit persistence mechanism: all basic operations (creation, update, deletion) are stored in a journal in XML format. This native in-memory database appears to offer obvious benefits for developers: it makes programming very intuitive and it drastically reduces the need to perform requests to retrieve data. On the other hand, the memory size and the loading time of the database when rebooting the server are obvious limitations, but not for most applications: multimedia information is stored in the file system, not in the database, and RAM gets cheaper every day. Since the collaborative recommendation application had shown a need to share other kinds of information, we decided to design a generic content management application and to use the light object-oriented database as its kernel.
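The following is a minimal sketch, under stated assumptions, of such a journal-based in-memory store: each basic operation is appended to an XML journal, and the whole state is rebuilt by replaying the journal when the server restarts. The class and file names are hypothetical; the code does not reproduce the actual Jalios kernel.

# Illustrative sketch of a journal-based in-memory store; not the actual Jalios kernel.
import xml.etree.ElementTree as ET

class JournaledStore:
    """Toy in-memory store whose only persistence is an append-only XML journal."""

    def __init__(self, journal_path):
        self.journal_path = journal_path
        self.objects = {}          # the whole database lives in memory
        self._replay()

    def _append(self, op, key, value=""):
        # One self-contained XML element per operation, appended as a single line.
        entry = ET.Element("op", attrib={"type": op, "key": key})
        entry.text = value
        with open(self.journal_path, "a", encoding="utf-8") as journal:
            journal.write(ET.tostring(entry, encoding="unicode") + "\n")

    def _replay(self):
        # Rebuild the in-memory state by re-applying the journal, if one exists.
        try:
            with open(self.journal_path, encoding="utf-8") as journal:
                for line in journal:
                    if not line.strip():
                        continue
                    entry = ET.fromstring(line)
                    if entry.get("type") == "delete":
                        self.objects.pop(entry.get("key"), None)
                    else:  # "create" or "update"
                        self.objects[entry.get("key")] = entry.text or ""
        except FileNotFoundError:
            pass

    def put(self, key, value):
        op = "update" if key in self.objects else "create"
        self.objects[key] = value
        self._append(op, key, value)

    def delete(self, key):
        self.objects.pop(key, None)
        self._append("delete", key)

if __name__ == "__main__":
    store = JournaledStore("journal.xml")
    store.put("article/42", "Jalios: Master your Content")
    print(store.objects)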

There are many competitors in the domain of content management. Like other content management applications, Jalios handles content and presentation independently; it exploits the structure of objects to make multiple contribution, the validation process, diffusion and querying easier.

Replication is the main functional innovation of Jalios. Jalios also provides process innovations: it empowers each category of users contributing to a content management site, letting them fully use their existing tools. People responsible for web design may continue to use their favorite authoring tool, such as Dreamweaver, to edit HTML templates. All other users of the solution can carry out their tasks through their browser: the administrator who gives rights to members, the editor in charge of creating new types of publication and validating them, the writers who publish their articles and modify them after reviews, and the readers who may interact, ask questions or give their advice.

Our first customers have praised Jalios for being a platform that is easy and fast to install, easy to customize, and fully extensible. Some of them selected it because it was a rapid prototyping platform and kept using it because it is an efficient solution for their production needs. Even though we are now launching our commercial operation, we are still interested in further development and are looking for new partnerships to explore new uses and develop dedicated applications. One such application is being developed in a project with Renault (MAGIE) to provide support for innovation, working groups, surveys, convergence meetings and knowledge transfer.

We would like to emphasize two points. First, it is very important for research institutes to provide researchers with support for creating a company to seize business opportunities; in our case, the help from INRIA-Transfert was decisive. Second, we believe that process innovations may be as noble as functional innovations: while much research focuses on the latter, the former are probably more important for industrial success in a competitive business such as the content management market.

Link:
http://www.jalios.com/

Please contact:
Vincent Bouthors, Jalios, France
Tel: +33 1 39 63 51 53
E-mail: [email protected]

www.java-channel.org is an example of a website based on the Pharos application.



EVENTS

SOFSEM 2001 - 28th Conference on Current Trends in Theory and Practice of Informatics
by Gabriela Andrejkova

The SOFSEM (SOFtware SEMinar) is an annual international computer science and computer engineering conference of a generalist and multidisciplinary nature. SOFSEM's roots are in the former Czechoslovakia. SOFSEM 2001, the 28th in the series, was held in Piestany, Slovakia, from 24 November to 1 December 2001. SOFSEM 2001 received support from ERCIM and several other institutions from the IT industry. There were 147 participants from 18 countries, about two thirds from Bohemia and Slovakia.

Initially, the proceedings of the seminar were published in Czech and Slovak. Since 1990 the proceedings have appeared in English, and since 1995 they have been published in the Springer-Verlag Lecture Notes in Computer Science (LNCS) series.

The 2001 session, which was also the second held in Slovakia since the division of the former Czechoslovakia, was organized by the Slovak Society for Computer Science and the Faculty of Mathematics, Physics and Informatics of Comenius University (Bratislava), in cooperation with the Czech Society of Computer Science, the Faculty of Electrical Engineering and Information Technology of the Slovak University of Technology, the Institute of Informatics and Statistics, the Mathematical Institute of the Slovak Academy of Sciences and SOFTEC. It was supported by the European Association for Theoretical Computer Science and by ERCIM (the European Research Consortium for Informatics and Mathematics).

The scientific programme of the seminar was divided into three principal areas: Trends in Informatics; Enabling Technologies for Global Computing; and Practical Systems Engineering and Applications. The Programme Committee was chaired by Peter Ruzicka of Comenius University, Bratislava.

The above areas were covered in 12 invited talks presented by prominent researchers. There were 18 contributed talks, selected by the international Programme Committee from among 46 submitted papers. The conference was accompanied by workshops on Electronic Commerce Systems (coordinated by H. D. Zimmermann) and Soft Computing (coordinated by P. Hajek, ERCIM Working Group). The proceedings of the conference appeared as volume LNCS 2234 of the Springer-Verlag Lecture Notes in Computer Science. The contributed papers presented at the workshop on Soft Computing were published in Neural Network World, No. 6, 2001. The contributed papers of the workshop on Electronic Commerce Systems were published in the local proceedings of the Faculty of Mathematics, Physics and Informatics of Comenius University.

SOFSEM 2002 will be held at Milovy in the Bohemian Highlands (approximately two hours by car from Brno). We hope that it will be as successful as it has been on numerous previous occasions at the same venue.

Links:
SOFSEM 2002: http://www.sofsem.cz/sofsem02
SOFSEM 2001: http://www.sofsem.sk/

Please contact:
Gabriela Andrejkova, SRCIM
E-mail: [email protected]


SPONSORED BY ERCIM

European Interoperability Tour 2002

The World Wide Web Consortium will be holding a series of one-day events around Europe this spring to promote W3C technology recommendations and show how they facilitate interoperability on the World Wide Web.

The mission of the World Wide Web Consortium (W3C) is to ‘lead the Web to its full potential’ by evolving the Web as a ‘robust, scaleable, adaptive infrastructure’. W3C’s main deliverables are its recommendations (specifications) that evolve the Web protocols, plus timely conversion tools, validation systems and checklists.

W3C is hosted by three organizations in three countries: the Massachusetts Institute of Technology (MIT) in the United States, the French National Institute for Research in Computer Science and Control (INRIA) in France, and Keio University in Japan.

All events will feature talks from:
• Daniel Dardailler, Director of W3C Europe
• Ivan Herman, Head of W3C Offices.

Each event will also feature a speaker from W3C and a local W3C member who will provide a case study of the benefits of W3C technologies. Each event will also include a panel discussion, led by the chair of the local W3C Office, on W3C recommendations and how they facilitate interoperability.

Dates and locations of the events:
• 21 May: Paris
• 23 May: Milan
• 28 May: Vienna
• 30 May: Dublin
• 3 June: Brussels

Link:
http://www.w3c.rl.ac.uk/

Please contact:
Marie-Claire Forgue, W3C European Communications Officer
Tel: +33 4 92 38 75 94
E-mail: [email protected]





Workshop on Current Research Directions in Computer Music
by Leonello Tarabella, Graziano Bertini and Gabriele Boschi

A workshop, held in November 2001 at the Audiovisual Institute, Pompeu Fabra University, Barcelona, provided an overview of the first year of activities of the European network MOSART (Music Orchestration Systems in Algorithmic Research and Technology). The event was structured around four main topics: Music Generation, Music Performance, Music Interfaces and Music Sound Modelling.

MOSART is a three-year project of the European Commission within the Research Training Networks programme. The aim is to ‘promote training-through-research, especially of young researchers, both pre- and post-doctoral level, within the frame of high quality trans-national collaborative research projects, including those in emerging fields of research’. MOSART promotes research in the field of Sound and Music Computing, focussing on machine analysis and understanding of musical aspects of sound, such as timbre space and the control and virtualisation of instruments. Issues in the areas of Interactive Musical Performance and Human-Computer Interactive Conducting Tools are studied, with special attention given to the use of computers in music analysis, digital music representation and computer-assisted musical composition and performance.

The institutions involved in the coordination of the MOSART project are DIKU-University of Copenhagen, DAIMI-DIEM-University of Aarhus, DTU-Danish Technical University, Denmark; NTNU-Norwegian University of Science and Technology, Norway; KTH-Royal Institute of Technology, Sweden; University of Sheffield, United Kingdom; NICI-University of Nijmegen, The Netherlands; Austrian Research Institute for Artificial Intelligence, Vienna, Austria; DEI-University of Padua, DIST-University of Genoa, CNUCE/IEI-CNR, Pisa, Italy; LMA-Centre National de la Recherche Scientifique, Marseille, France; and IUA-Pompeu Fabra University, Barcelona, Spain.

The workshop was structured around four main topics: Music Generation (emphasizing algorithmic composition and composition systems); Music Performance (focussing on quantitative models of performance, cognitive models of perception and production, and expressive performance and emotion); Music Interfaces (with attention to gesture-based interaction, mapping strategies and multimodality); and Music Sound Modelling (concentrating on instrument modelling, instrument recognition and content processing).

There were three types of events: presentations of overview and short papers (published in the proceedings), posters/demos allowing in-depth discussions between presenters and participants, and panels which provided the opportunity for more informal discussions on the state of the art and future directions of the topics.

Links:
Workshop website: http://www.iua.upf.es/mtg/mosart/
MOSART Network resources: http://www.diku.dk/research-groups/musinf/mosart/
cART Lab, CNUCE-CNR: http://www.cnuce.pi.cnr.it/tarabella/cART.html

Please contact:
Leonello Tarabella, CNUCE-CNR
Tel: +39 050 315 3012 (office), 2043 (lab)
E-mail: [email protected]

Graziano Bertini, IEI-CNR
Tel: +39 050 315 3125 (office), 3144 (lab)
E-mail: [email protected]


CALL FOR PARTICIPATION

The International Conference on Pervasive Computing – PERVASIVE 2002

Zurich, Switzerland, 26-28 August 2002

The objective of this conference will be to present, discuss, and explore the latest technical developments in the emerging field of pervasive computing, as well as potential future directions and issues. The conference will focus on technical infrastructure and application issues. It will include presentations, panel discussions, short paper sessions, and demos on topics such as:
• system architectures and platforms for pervasive computing
• middleware and pervasive computing infrastructures
• mobile, wireless, and wearable technologies
• innovative small computing and intelligent devices
• emerging applications and mobile business issues
• scenarios for information appliances
• service discovery protocols
• content distribution and delivery
• user interfaces for invisible and embedded computing
• context awareness
• security and privacy issues.

Further information: http://www.pervasive2002.org/




CALL FOR PAPERS

7th ERCIM Workshop ‘User Interfaces for All’
Chantilly, France, 23-25 October 2002

The vision of User Interfaces for All advocates the proactive realisation of the ‘design for all’ principle in the field of Human-Computer Interaction (HCI), and involves the development of user interfaces to interactive applications and telematic services which provide universal access and usability to potentially all users. The emphasis of this year’s event is on ‘Universal Access’; contributions are invited on a broad range of topics, including technological applications and policy developments, aiming to advance the notion of Information Society Technologies that are accessible and acceptable to the widest possible end-user population.

The requirement for Universal Access stems from the growing impact of the fusion of the emerging technologies, and from the different dimensions of diversity that are intrinsic to the Information Society. These dimensions become evident when considering the broad range of user characteristics, the changing nature of human activities, the variety of contexts of use, the increasing availability and diversification of information, knowledge sources and services, the proliferation of technological platforms, etc. In this context, Universal Access refers to the accessibility, usability and, ultimately, acceptability of Information Society Technologies by anyone, anywhere, anytime, thus enabling equitable access and active participation of potentially all citizens in existing and emerging computer-mediated human activities. The user population includes people with different cultural, educational, training and employment backgrounds, novice and experienced computer users, the very young and the elderly, and people with different types of disabilities, in various interaction contexts and scenarios of use. As people experience technology through their contact with the user interface of interactive products, applications and services, the field of HCI has a critical and catalytic role to play towards a universally accessible, usable and acceptable Information Society.

Scientific/technological contributions should be on concepts and tools that advance our understanding of, and contribute towards, Universal Access to the new computer-mediated virtual spaces. Areas of interest include, but are not limited to, future and emerging technologies, novel computing paradigms, computer-mediated virtual spaces, architectures and tools, interaction platforms, interaction metaphors, experimental or empirical studies, etc., which bear an impact on the scope of human access to digital content in an Information Society.

Applications-oriented contributions may address practice and experience in the application of Universal Access principles in critical domains such as health, education, employment, etc. In this context, this year’s workshop encourages contributions that elaborate upon, adopt, apply or validate a Universal Access code of practice in selected application domains.

Finally, contributions on policy developments should discuss the impact of non-technological factors, such as legislation, standardisation, technology transfer, etc., on developing a culture for Universal Access in the Information Society, for all parties concerned and in particular the industry. Policy contributions may cover success stories of the past or lay out prevailing obstacles to be addressed and removed by effective policy interventions.

Areas of Interest
Areas of interest for which papers are solicited include, but are not limited to, the following topics:
• Adaptable and adaptive interaction, user modelling
• Intelligent interface technologies
• Multilinguality, internationalisation/localisation of interactive applications
• Novel interaction techniques, multimedia/multimodal interfaces
• Dialogue design methodologies and approaches
• Universal Access design methodologies and tools
• Interface architectures, development tools, interoperability
• Evaluation techniques and tools
• Universal Access scenarios for ambient intelligence
• Universal Access and novel interaction paradigms (wearable and ubiquitous computing, tangible interfaces, Virtual Reality)
• Interaction metaphors
• User Support
• Cooperation
• Personalised content delivery
• Access to e-services
• National and European policies for e-accessibility
• Standards development for Universal Design and Universal Access
• Advances in legislation for Universal Access
• Accessibility of (public) web sites
• User involvement.

Important Dates
• 24 June 2002: Deadline for electronic submission of all papers
• 30 July 2002: Conditional notification of acceptance (confirmation will be given upon registration)
• 16 September 2002: Deadline for electronic submission of camera-ready papers
• 4 October 2002: Deadline for registration.

Keynote Speakers
• Alfred Kobsa, University of California, Irvine, USA
• Steven Pemberton, CWI.

Special events
Two special pre-ERCIM Workshop events will be organised in cooperation with the IS4ALL project (IST-1999-14101) on 23 October 2002:
• IS4ALL Seminar: half-day (morning) IS4ALL seminar on ‘Universal access: A practitioner’s guide in Health Telematics’.
• IS4ALL Thematic Workshop: half-day (afternoon) workshop entitled ‘Methods and tools for designing universal access’. This will include invited position papers by IS4ALL members and other actors in the fields of Health Telematics and HCI, followed by open discussion.

More details on these events will be published on the IS4ALL project website: http://is4all.ics.forth.gr.

Further information:

http://ui4all.ics.forth.gr/workshop2002/




ERCIM News is the magazine of ERCIM. Published quarterly, the newsletter reports on joint actions of the ERCIM partners, and aims to reflect the contribution made by ERCIM to the European Community in Information Technology. Through short articles and news items, it provides a forum for the exchange of information between the institutes and also with the wider scientific community. ERCIM News has a circulation of over 7500 copies.

Copyright Notice
All authors, as identified in each article, retain copyright of their work.

ERCIM News online edition is available at http://www.ercim.org/publication/ERCIM_News/

ERCIM News is published by ERCIM EEIG,
BP 93, F-06902 Sophia-Antipolis Cedex
Tel: +33 4 9238 5010, E-mail: [email protected]
ISSN 0926-4981

Director: Bernard Larrouturou
Central Editor: Peter Kunz, [email protected]

Local Editors:
CLRC: Martin Prime, [email protected]
CRCIM: Michal Haindl, [email protected]
CWI: Henk Nieland, [email protected]
CNR: Carol Peters, [email protected]
FORTH: Leta Karefilaki, [email protected]
FhG: Michael Krapp, [email protected]
INRIA: Bernard Hidoine, [email protected]
NTNU: Truls Gjestland, [email protected]
SARIT: Harry Rudin, [email protected]
SICS: Kersti Hedman, [email protected]
SRCIM: Gabriela Andrejkova, [email protected]
SZTAKI: Erzsébet Csuhaj-Varjú, [email protected]
TCD: Ann McNamara, [email protected]
VTT: Pia-Maria Linden-Linna, [email protected]

Free subscription
You can subscribe to ERCIM News free of charge by:
• sending e-mail to your local editor
• contacting the ERCIM office (see address above)
• filling out the form at the ERCIM website at http://www.ercim.org/

CALL FOR PARTICIPATION

13th Eurographics Workshop on Rendering

Pisa, Italy, 26-28 June 2002

The Workshop on Rendering is well established as a major international forum for the exchange of ideas and experiences in the area of rendering algorithms and techniques. The workshop is organised this year by the Visual Computing Group of the Institute for Information Sciences and Technologies, CNR, Pisa, in association with Eurographics and within the framework of the Eurographics Working Group on Rendering Activities.

In addition to a first-class programme of full papers, the workshop includes invited talks by well-known experts in the field of rendering. The invited speakers this year are Hans-Peter Seidel (MPI Saarbruecken, Germany) and Doug Roble (Digital Domain, USA).

Further information: http://vcg.iei.pi.cnr.it/egrw02.htm

CALL FOR PAPERS

ECDL 2002 – 6th European Conference on Research and Advanced Technology for Digital Libraries

Rome, Italy, 16-18 September 2002

ECDL 2002 is the sixth conference in the series of European Digital Libraries conferences. The focus of ECDL 2002 is on the underlying principles, methods, systems and tools for building effective digital libraries and making them available to end users.

Important Dates
• 1 May 2002: Deadline for all proposals
• 15 May 2002: Notification of acceptance for tutorials and workshops
• 15 June 2002: Notification of acceptance for papers, panels, demos and posters
• 1 July 2002: Camera-ready papers from the authors.

Scientific Programme
Submissions on all topic areas are welcome and will receive full and equal consideration. Submissions may be full papers, posters, demos, panels, tutorials, or workshops. Although submissions are not restricted in topic or scope, we expect that submissions will fall into one or more of the following broad areas:
• Research: Significant research results on all aspects of digital libraries, focussing on integration of methods, interoperability across different services, data and metadata structures and algorithms, information and text mining, knowledge and multimedia content management, validation also through implementation and use, as well as evaluation.
• Policy: Discussion of significant policy issues related to the design, operation, and economics of digital libraries.
• System: System issues in design, implementation, and building of digital libraries, preferably based on prototypes and strongly backed by practical experience.
• Experience/Evaluation: Analysis of actual implementations of and user interactions with digital libraries in different application areas, possibly including contributions from the humanities, semiotics, and other areas.
• Fundamentals: Studies associating digital libraries with previous areas of thought and discourse. This explicitly includes topics ranging from library/information science to philosophy. However, contributions in this area, as with the other areas, must be accessible to the range of conference attendees, including the more practical outlook of system developers.

ECDL 2002 also provides a forum for discussing applications of digital library concepts and techniques in areas not yet really considered as being part of the Digital Library world, such as education and health care applications, digital earth-sky-law-art and music, humanities, social sciences, environmental monitoring, natural sciences, and historical and scientific archives.

Further information:
http://www.ecdl2002.org



IN BRIEF

The Research Council of Norway has launched an eight-year, 23 million Euro ICT program, ICT-2010. The primary goal of this program is to generate new knowledge within communications technology, distributed IT systems, and large information and software systems. Among other things, the program will emphasize the transfer of expertise and cooperation. The transfer of expertise to users in society in general, industry and applied research will be ensured by training candidates who will move to non-academic positions, and via cooperation between basic research groups and users of research results. Such cooperation also provides valuable feedback and a basis for studying new problems in the basic research environment. Another important aspect of cooperation is that the basic research groups are given the opportunity to profile their research in such a way that it is of relevance to Norwegian user groups. The program wishes research groups to build up strong international contacts. Such contacts are important as antennas for capturing new development trends at an early stage, in an area in which much research takes place abroad. In order to promote such contacts, the program's doctoral students should spend six months to a year abroad.

International Review of UK Research in Computer Science. The International Review of UK Research in Computer Science was conducted by a panel of experts from all over the world on behalf of the Office of Science & Technology, the Engineering & Physical Sciences Research Council, the IEE, the British Computer Society and the Royal Society. A final report published in November 2001 confirms the traditional high quality of UK computer science, but warns that the nation’s position as a world leader is by no means assured. Declines in certain fields are already evident, it claims, and more will follow given current levels of support and the nature of today’s university research environment. Identifying several broad areas in which UK research strength is particularly at risk – including programming language design, artificial intelligence, human-computer interaction and bio-informatics – the panel highlights algorithms and complexity research as key areas of potentially massive future importance where research needs to be encouraged. Given the opportunities that will be created by the recently announced three-year e-Science initiative, it adds, the absence of funding within the scheme for longer-term computer science research seems ‘ill considered’ at a time of declining UK research activity in high-performance scientific computing. See http://www.iee.org/Policy/CSreport/

SZTAKI — Tamás Roska received the Bolyai Prize, Hungary’s highest non-state scientific award, named after the world-renowned mathematician János Bolyai (1802-1860), from Ferenc Mádl, President of Hungary, in the Budapest Convention Centre on 2 March 2002. The Bolyai Prize Foundation was set up in 1998 as a non-governmental initiative. The prize, with a purse of USD 50,000, was first awarded in 2000. Tamás Roska is the head of the Analogic and Neural Computing Research Laboratory, SZTAKI, and Professor and Dean of the Faculty of Information Technology at the Pázmány P. Catholic University, Budapest. He is a co-inventor of the CNN Universal Machine (with Leon O. Chua) and the analogic CNN Bionic Eye (with Frank S. Werblin and Leon O. Chua), both of which are US patents owned by the University of California at Berkeley. Tamás Roska recently coordinated an EU-NSF strategic workshop on Bionics organized by ERCIM.

CNR — Franco Denoth has been appointed as Director of the newly constituted Istituto di Informatica e Telematica, located in the CNR Research Area, Pisa. IIT-CNR has been formed as a result of the merging of the Institutes for Computational Mathematics and Telematic Applications, part of the ongoing restructuring of CNR. The main areas of interest at IIT are: web algorithmics; search engines; safety in shared environments; computational biology; electronic transactions and safety protocols; and advanced and safe systems for the production of network-structured information. Further information can be found at http://www.iit.cnr.it/

INRIA — The new innovation and communication centre (PIC) of the INRIA Rocquencourt unit was inaugurated on 15 January 2002. The centre has a show floor and conference area, including a large amphitheatre named after the famous mathematician Jacques-Louis Lions, in honour of the Institute’s past President from 1979 to 1983. An information area is open to the public and another area is dedicated to development and transfer. The latter will host incubating companies in the field of ICST seeking scientific proximity. The whole building covers an area of nearly 5,000 m2 and was designed by architect Henri Ciriani. The construction was financed by the Government, the Yvelines Departmental Council and the Île-de-France Regional Council.

Hungarian President Ferenc Mádl (left) presents the Bolyai Prize to Tamás Roska.


Fraunhofer-Gesellschaft, Information and Communication Technology Alliance
Schloß Birlinghoven, D-53754 Sankt Augustin, Germany
Tel: +49 2241 14 0, Fax: +49 2241 14 2889
http://www.fraunhofer.de/

Consiglio Nazionale delle Ricerche, IEI-CNR
Area della Ricerca CNR di Pisa, Via G. Moruzzi 1, 56124 Pisa, Italy
Tel: +39 050 315 2878, Fax: +39 050 315 2810
http://www.iei.pi.cnr.it/

Centrum voor Wiskunde en Informatica
Kruislaan 413, NL-1098 SJ Amsterdam, The Netherlands
Tel: +31 20 592 9333, Fax: +31 20 592 4199
http://www.cwi.nl/

Institut National de Recherche en Informatique et en Automatique
B.P. 105, F-78153 Le Chesnay, France
Tel: +33 1 3963 5511, Fax: +33 1 3963 5330
http://www.inria.fr/

Central Laboratory of the Research Councils
Rutherford Appleton Laboratory
Chilton, Didcot, Oxfordshire OX11 0QX, United Kingdom
Tel: +44 1235 82 1900, Fax: +44 1235 44 5385
http://www.cclrc.ac.uk/

Swedish Institute of Computer Science
Box 1263, SE-164 29 Kista, Sweden
Tel: +46 8 633 1500, Fax: +46 8 751 72 30
http://www.sics.se/

Technical Research Centre of Finland
VTT Information Technology, P.O. Box 1200, FIN-02044 VTT, Finland
Tel: +358 9 456 6041, Fax: +358 9 456 6027
http://www.vtt.fi/

Foundation for Research and Technology – Hellas
Institute of Computer Science
P.O. Box 1385, GR-71110 Heraklion, Crete, Greece
Tel: +30 81 39 16 00, Fax: +30 81 39 16 01
http://www.ics.forth.gr/FORTH

Czech Research Consortium for Informatics and Mathematics
FI MU, Botanicka 68a, CZ-602 00 Brno, Czech Republic
Tel: +420 2 688 4669, Fax: +420 2 688 4903
http://www.utia.cas.cz/CRCIM/home.html

Swiss Association for Research in Information Technology
Dept. Informatik, ETH-Zentrum, CH-8092 Zürich, Switzerland
Tel: +41 1 632 72 41, Fax: +41 1 632 11 72
http://www.sarit.ch/

Slovak Research Consortium for Informatics and Mathematics
Comenius University, Dept. of Computer Science, Mlynska Dolina M, SK-84248 Bratislava, Slovakia
Tel: +421 7 726635, Fax: +421 7 727041
http://www.srcim.sk/

Magyar Tudományos Akadémia
Számítástechnikai és Automatizálási Kutató Intézete
P.O. Box 63, H-1518 Budapest, Hungary
Tel: +36 1 279 6000, Fax: +36 1 466 7503
http://www.sztaki.hu/

ERCIM – The European Research Consortium for Informatics and Mathematics is an organisation dedicated to the advancement of European research and development in information technology and applied mathematics. Its national member institutions aim to foster collaborative work within the European research community and to increase co-operation with European industry.

Trinity College
Department of Computer Science, Dublin 2, Ireland
Tel: +353 1 608 1765, Fax: +353 1 677 2204
http://www.cs.tcd.ie/ERCIM

Austrian Association for Research in IT
c/o Österreichische Computer Gesellschaft
Wollzeile 1-3, A-1010 Wien, Austria
Tel: +43 1 512 02 35 0, Fax: +43 1 512 02 35 9
http://www.aarit.at/

Norwegian University of Science and Technology
Faculty of Information Technology, Mathematics and Electrical Engineering, N-7491 Trondheim, Norway
Tel: +47 73 59 80 35, Fax: +47 73 59 36 28
http://www.ntnu.no/

Order Form

If you wish to subscribe to ERCIM News free of charge, or if you know of a colleague who would like to receive regular copies of ERCIM News, please fill in this form and we will add you/them to the mailing list.

Send, fax or e-mail this form to:
ERCIM NEWS
Domaine de Voluceau, Rocquencourt
BP 105, F-78153 Le Chesnay Cedex
Fax: +33 1 3963 5052
E-mail: [email protected]

Data from this form will be held on a computer database. By giving your email address, you allow ERCIM to send you email

Name:

Organisation/Company:

Address:

Post Code:

City:

Country:

E-mail:

You can also subscribe to ERCIM News and order back copies by filling out the form at the ERCIM website at http://www.ercim.org/publication/Ercim_News/