
Computer-Linked Autofabricated 3D Models For Teaching Structural Biology (sketches_0157)

Alexandre Gillet*, Suzanne Weghorst†, William Winn†, Daniel Stoffler*, Michel Sanner*, David Goodsell* and Arthur Olson*

*The Scripps Research Institute. {gillet, stoffler, sanner, goodsell, olson}@scripps.edu
†Human Interface Technology Laboratory. {weghorst, billwinn}@u.washington.edu

1. Introduction.
We present an application that applies two cutting-edge research technologies, 3D printing and augmented reality, to improve the learning of structural biology. Understanding this complex subject is essential in our society, both to foster progress and to support critical decision-making in biotechnology and bio-nanotechnology. It is a challenging subject, requiring the comprehension of the spatial structure and interactions of complex molecules comprising thousands of atoms.

Our driving hypothesis is that adding augmented tangible elements to the perceptual experience of students will enhance and accelerate their understanding of structural molecular biology.

Computer-generated physical models allow direct experience of the complex shapes and relationships of biological molecules (Bailey 1998). Coupling these models with computational input and output provides a natural interface between the learner and the wealth of data coming from the structural biology community. Physical molecular models, while vastly more informative and intuitive than 2D drawings or textual descriptions, are fixed in form and limited in the number of properties they can represent. We use computer-based spatial tracking and rendering methods ("augmented" or "mixed" reality technologies) to enhance the semantic content of our models and to show dynamic properties.

2. System.
We have developed a software framework that enables the design, fabrication, and augmented display of the models to be performed within the same environment. The physical models can be specified from a wide range of molecular computational models, including molecular surfaces, extruded volumes, backbone ribbons, and atomic ball-and-stick representations. Our development is based on the extensible Python Molecular Viewing environment (PMV) (Sanner 1999), a modular approach to molecular modeling. PMV is built within the interpreted language Python.

Our AR interface combines real-world presence with virtual object presence. The user manipulates a physical model, which is tracked by a video camera and displayed on the computer screen or in a lightweight head-mounted display. A virtual model (e.g., another 3D rendering of the same model, textual labels, or a 3D animation; an electrostatic field is shown on the virtual model in Figure 1) is superimposed over the video image and spatially registered with the physical model as the user explores the structure. The result is a quite compelling sense of virtual object realism. Our approach is based on the widely used ARToolKit (Billinghurst 1999). By using several markers, the AR overlay can be maintained and appropriately occluded while the model is arbitrarily manipulated.
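The fabrication side of this pipeline ultimately reduces to handing the printer a triangulated mesh derived from one of the molecular representations above. The following is a minimal, self-contained sketch of that step, not PMV code: the write_stl helper and the example mesh arrays are hypothetical, and it simply writes a triangulated surface as an ASCII STL file, a common input format for fused-deposition printers such as the one used here.

```python
import numpy as np

def write_stl(path, vertices, faces, name="molecular_surface"):
    """Write a triangulated surface as an ASCII STL file.

    vertices : (N, 3) float array of surface points (e.g., from a
               solvent-excluded surface calculation, scaled to mm).
    faces    : (M, 3) int array of triangle indices into `vertices`.
    """
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    with open(path, "w") as out:
        out.write(f"solid {name}\n")
        for i, j, k in f:
            a, b, c = v[i], v[j], v[k]
            # Facet normal from the cross product of two triangle edges.
            n = np.cross(b - a, c - a)
            norm = np.linalg.norm(n)
            n = n / norm if norm > 0 else np.zeros(3)
            out.write(f"  facet normal {n[0]:.6e} {n[1]:.6e} {n[2]:.6e}\n")
            out.write("    outer loop\n")
            for p in (a, b, c):
                out.write(f"      vertex {p[0]:.6e} {p[1]:.6e} {p[2]:.6e}\n")
            out.write("    endloop\n")
            out.write("  endfacet\n")
        out.write(f"endsolid {name}\n")

# Example: a single triangle stands in for a real molecular surface mesh.
if __name__ == "__main__":
    verts = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
    tris = [(0, 1, 2)]
    write_stl("surface.stl", verts, tris)
```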

Figure 1: Model of superoxide dismutase built with the Stratasys Prodigy Plus machine (left); overlay showing a volume-rendered electrostatic field and animated electrostatic field vectors (right)
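The registration that keeps the overlay in Figure 1 locked to the physical model can be sketched independently of any particular tracking library. ARToolKit-style fiducial tracking yields, for each detected marker, the rigid transform from the marker's coordinate frame to the camera's; the snippet below is illustrative only (placeholder camera intrinsics, hypothetical function name, not the ARToolKit API) and shows how virtual-model vertices defined in the marker frame could be transformed into camera space and projected onto the video image with a pinhole model.

```python
import numpy as np

# Assumed pinhole intrinsics (placeholder values): focal lengths and
# principal point, in pixels, for the tracking video camera.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project_overlay(vertices_marker, R, t):
    """Map virtual-model vertices (defined in the fiducial-marker frame)
    to 2D image coordinates, so the overlay stays registered with the
    physical model as the user moves it.

    vertices_marker : (N, 3) points in the marker's coordinate frame (mm).
    R               : (3, 3) marker-to-camera rotation from the tracker.
    t               : (3,)   marker-to-camera translation (mm).
    """
    pts = np.asarray(vertices_marker, dtype=float)
    cam = pts @ R.T + t             # rigid transform into camera space
    uvw = cam @ K.T                 # pinhole projection
    return uvw[:, :2] / uvw[:, 2:]  # perspective divide -> pixel coords

# Example: a marker held 400 mm in front of the camera, no rotation.
if __name__ == "__main__":
    R = np.eye(3)
    t = np.array([0.0, 0.0, 400.0])
    virtual_pts = np.array([[0.0, 0.0, 0.0],    # marker origin
                            [40.0, 0.0, 0.0],   # 40 mm along marker x
                            [0.0, 40.0, 0.0]])  # 40 mm along marker y
    print(project_overlay(virtual_pts, R, t))
```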

3. Evaluation
To evaluate the functionality of the tool, the implemented prototype was user-tested during a weeklong technology assessment with an Advanced Placement Biology high school class. The students were allowed to manipulate and explore tangible models of proteins while answering questions composed by their teachers. They were then allowed to manipulate similar models with attached markers while observing a registered 3D animation as well as virtual properties. Pre- and post-exposure concept mapping demonstrated significant learning of the key concepts. We also completed a thorough usability test of the system with novices and experts in the content area, which has allowed us to iron out some interface issues.

4. Future Work
We will develop a spatially tracked "data probe" designed to enable interaction with both physical and virtual models. Our system currently relies on fiducial tracking markers; we will develop new algorithms and code for markerless spatial tracking of our models, which will be added to the system. We plan to extend the use and assessment of our augmented tangible model technologies to a wide range of grade levels and settings (including K-12, undergraduate, graduate, and science exhibits).

5. References
Bailey, M., Schulten, K. and Johnson, J. (1998). "The use of solid physical models for the study of macromolecular assembly." Curr Opin Struct Biol 8: 202-208.

Billinghurst, M. and Kato, H. (1999). "Collaborative mixed reality." In Proceedings of the International Symposium on Mixed Reality (ISMR '99), Mixed Reality: Merging Real and Virtual Worlds: 261-284.

Sanner, M. F. (1999). "Python: a programming language for software integration and development." J Mol Graph Model 17(1): 57-61.

Figure 2: A user holds a tangible model of the 30S ribosome subunit; the computer screen displays an AR-added 50S ribosome subunit, which assembles with the 30S.