UPTEC F 18058 Examensarbete 30 hp November 2018 3D visualisation of breast reconstruction using Microsoft HoloLens Amanda Norberg Elliot Rask



UPTEC F 18058

Examensarbete 30 hp, November 2018

3D visualisation of breast reconstruction using Microsoft HoloLens

Amanda Norberg
Elliot Rask


Teknisk-naturvetenskaplig fakultet, UTH-enheten
Besöksadress: Ångströmlaboratoriet, Lägerhyddsvägen 1, Hus 4, Plan 0
Postadress: Box 536, 751 21 Uppsala
Telefon: 018 – 471 30 03
Telefax: 018 – 471 30 00
Hemsida: http://www.teknat.uu.se/student

Abstract

3D visualisation of breast reconstruction using Microsoft HoloLens

Amanda Norberg Elliot Rask

The purpose of the project is to create a Mixed Reality (MR) application for the 3D visualisation of the result of a breast reconstruction surgery. The application is to be used before surgery to facilitate communication between patient and surgeon about the expected result. For this purpose the Microsoft HoloLens is used, a pair of Mixed Reality (MR) glasses developed and manufactured by Microsoft that is a self-contained, holographic rendering computer. For the development of the MR application on the HoloLens, MixedRealityToolkit-Unity is used, a Unity-based toolkit. The goal of the application is that the user can scan the torso of a patient, render a hologram of the torso and attach to it a prefabricated breast which, where possible, follows the patient's specifications.

To prepare a prefabricated breast, a 3D model of the breast is first created in the 3D modelling software Blender. It then gets its texture from a picture taken with the HoloLens camera. The picture is cropped to better fit the model and uploaded as a 2D texture, which is then attached to the prefabricated breast once it has been imported into Unity.

To scan objects, the Surface Observer feature of the HoloLens operating system is used. The resulting mesh from the observer is cropped using a virtual cube that can be scaled, moved and rotated by the user. The cropped mesh is then smoothed using the Humphrey's Classes smoothing algorithm. To fuse the smoothed mesh with the prefabricated breast model, the Unity components Colliders and Transforms are used. On a collision, the breast's transform parent is set to the mesh's transform, making the two objects' transforms depend on each other.

The MR application has been developed and evaluated. The evaluation results show that the goal has been achieved. The project demonstrates that the Microsoft HoloLens is well suited for developing medical applications such as visualisations of breast reconstruction surgery. The approach could possibly be extended to other procedures, for example showing on a patient's body how the scar will look after heart surgery or a cesarean section.

ISSN: 1401-5757, UPTEC F18 058
Examinator: Tomas Nyberg
Ämnesgranskare: Ping Wu
Handledare: Christian Alex


Contents

1 Introduction
  1.1 Background
  1.2 Purpose and goals
  1.3 Overview of the project
  1.4 Tasks
  1.5 Outline

2 Mixed Reality and Microsoft HoloLens
  2.1 Mixed Reality
  2.2 Microsoft HoloLens
    2.2.1 Hardware
    2.2.2 HoloLens RS4 Preview
    2.2.3 Development tools

3 Theory
  3.1 Mesh
  3.2 Mesh smoothing algorithms
    3.2.1 Laplacian Smoothing Algorithm
    3.2.2 Humphrey's Classes Smoothing Algorithm

4 Method and implementation
  4.1 Unity
    4.1.1 GameObject
    4.1.2 Component
    4.1.3 Gestures
    4.1.4 User Interface
    4.1.5 Mesh Filter Mesh Renderer
    4.1.6 Colliders
    4.1.7 Spatial mapping
    4.1.8 Image processing
  4.2 Creating breast prefabricate
  4.3 Navigating the application
  4.4 Scanning a targeted object
  4.5 Smoothing of the obtained object mesh
  4.6 Photograph area and render it as texture of object
  4.7 Merging scanned object and breast prefabricate

5 Results and discussion
  5.1 Result
    5.1.1 Interacting with the application
    5.1.2 Scenes
  5.2 Discussion
    5.2.1 Limitations of the concept
    5.2.2 Imitating the human skin with textures
    5.2.3 Breast prefabrication
    5.2.4 The surface observer
    5.2.5 The current spatial observer

6 Conclusion and future work
  6.1 Future


Abbreviations

3D – Three dimensional
API – Application Programming Interface
AR – Augmented Reality
CPU – Central Processing Unit
HC algorithm – Humphrey's Classes algorithm
MR – Mixed Reality
MRT – Mixed Reality Toolkit
UI – User Interface
UX – User experience
VR – Virtual Reality


1 Introduction

"Our way of visualizing the body in medicine has historically been as a two-dimensional abstraction. Now through mixed reality, we can view it in three dimensions and that is pretty transformative [1]."

- Dr. Simon Kos, Chief Medical Officer, Microsoft

As technology advances, so does the vast range of its applications within medicine. New technology and software can facilitate diagnostics and education, and even guide surgeons through surgeries with the aid of holograms projected on the skin of patients [2].

Mixed Reality aims to be a seamless integration of the virtual and real worlds. It is a computer-augmented environment where "real data are sensed and used to modify users' interactions with computer mediated worlds beyond conventional dedicated visual displays" [3].

Holograms are 3D objects made of light and sound. A Mixed Reality device such as Microsoft's HoloLens displays holograms while the background is still the real world. The holograms can be created, altered and integrated into an application for an MR device. Central to devices that offer MR is an understanding of the environment in which the user operates. The device uses sensors to perceive environmental input such as surfaces, boundaries, sound and location [4].

With the use of MR, holograms of different parts of the human body, such as organs, can be viewed and interacted with by physicians in 3D. The object can be scaled, rotated and enhanced at the fingertips of the user. It can also be placed, or "anchored", in the physical space so that the user can walk around it, or closer to and further away from it. These kinds of interactions with a program are a great advancement in the field of Human Computer Interaction. They give the end user a feeling of simplicity, since intuitive hand motions can control and interact with the application.

Figure 1.1: Venn diagram visualisation of the interplay between humans, environment and computers, where Mixed Reality is in the intersection of all three [4].


This thesis aims to investigate the possibility of, and implement, a proof-of-concept MR application for viewing the holographic aesthetic result of a surgery directly on the patient's body when viewed through an MR device. More specifically, the aim is to show the possible result of a breast reconstruction, before the actual reconstruction surgery, for a patient who has undergone a mastectomy. This is done by creating a hologram of a new breast as a 3D object. The 3D object will have the same texture as the skin on the chest that will be expanded into a new breast.

1.1 Background

A mastectomy is the surgical procedure in which a breast is removed. In a total mastectomy the full breast, with tissue and nipple, is removed as shown in Figure 1.2 [5].

Figure 1.2: Illustration of the removal of a breast in a total mastectomy [6].

Some patients have their breast reconstruction at the same time as their mastectomy. However, the recommended practice for breast reconstruction after a mastectomy is to wait some months until the patient has finished possible additional treatments such as chemotherapy [7]. When reconstructing the breast, the expansion can be done in several ways: with an implant, with tissue from the patient's own body, or with a combination of the two [7].

It can be difficult for the patient to make a sufficiently informed decision on whether to undergo breast reconstruction. A recent, although very small, study shows that more than half of the women surveyed made decisions about breast reconstructive surgery that did not align with their goals or preferences [8].

"As breast cancer providers, we need to talk about the pros and cons of surgery to help women make treatment choices. Shared decision-making between the surgeon and patient would be particularly useful for this decision. We need to connect patients with decision aids to help them really think through what is most important to them [9]."

- Clara Lee, M.D., Ohio State University Comprehensive Cancer Center


The HoloLens thus appears well suited as a tool for 3D visualisation of breast reconstruction. This leads to the purpose and goals of the project, which are addressed in the following section.

1.2 Purpose and goals

The purpose of this project is to develop an application in Unity for the Microsoft HoloLens. The application aims to scan a patient's torso using the HoloLens and render a hologram from that data, which is then merged with a prefabricated breast. The prefabricated breast should also be able to get its texture from a picture of the chest area taken with the HoloLens.

1.3 Overview of the project

The idea is that the many advanced sensors in an MR device can be used to obtain a scan of the human torso that is sufficiently accurate for the purpose of the application. The Microsoft HoloLens offers an API called Spatial Mapping that uses the device's sensors to scan the surroundings and render them into a mesh. The mesh can then be processed in a program and used by the application as the basis for attaching a hologram of a breast.

Development of the application for the HoloLens will be done in three steps:
1) scan a specified area of the human body and store it in a 3D format using the HoloLens;
2) render an object (e.g., a breast) to be placed on the body at a specified position; the object is scalable and resizable;
3) place the rendered object on the body at a specified position in the AR environment.

1.4 Tasks

The main part of the project will consist of writing and altering scripts in the programming language C#. The tasks are distributed in blocks to easily divide the work and allow a step-by-step approach.

Literature study Since the HoloLens and MR are very new technologies, documentation and resources such as articles are limited. The literature study will therefore be short, as the focus is to get started with writing code.

Limit spatial mapping area The first task of the project is to limit the spatial mapping area, since spatial mapping is the method used to scan objects.
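As a language-neutral illustration of what limiting the mapped area amounts to, the sketch below crops a vertex-and-triangle mesh with an axis-aligned box, keeping only the triangles whose corners all fall inside it. This is a hypothetical Python sketch, not the project's C# implementation; in the application the box corresponds to the user-positioned virtual cube described in the abstract.

```python
def crop_mesh(vertices, triangles, box_min, box_max):
    """Keep only triangles whose three vertices all lie inside the box."""
    inside = [all(box_min[c] <= v[c] <= box_max[c] for c in range(3))
              for v in vertices]
    # Re-index the surviving vertices so the triangle list stays valid.
    remap, new_vertices = {}, []
    for i, v in enumerate(vertices):
        if inside[i]:
            remap[i] = len(new_vertices)
            new_vertices.append(v)
    new_triangles = []
    for t in range(0, len(triangles), 3):
        corner_ids = triangles[t:t + 3]
        if all(inside[i] for i in corner_ids):
            new_triangles.extend(remap[i] for i in corner_ids)
    return new_vertices, new_triangles

# Two triangles sharing an edge; the box excludes the top vertex (0, 2, 0),
# so only the first triangle survives the crop:
vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 2.0, 0.0)]
triangles = [0, 1, 2,   0, 2, 3]
print(crop_mesh(vertices, triangles, (-0.1, -0.1, -0.1), (1.1, 1.1, 0.1)))
```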


Scan torso and render mesh Once the spatial mapping has been limited, that technique can be used to scan a torso and render a 3D hologram of it.

Smooth obtained mesh The obtained mesh is expected to be rough. Mesh processing with smoothing algorithms acting on the mesh vertices will be used to smooth the model of the torso.

Create 3D breast model The 3D breast model will be created in the open source 3D modelling software Blender. The model will be shaped from half of a UV sphere and the surface will be UV unwrapped. The model will then be imported into Unity for use in the application.

Map photo as texture on 3D breast model A photo of the area on the chest where the breast will be operated will be taken in-application by the HoloLens. The HoloLens API PhotoCapture allows access to the web camera. The photo will be UV mapped onto the prefabricated 3D breast model at run time.

Merge breast model with mesh of torso Collisions of holographic objects will be used to merge the 3D mesh of the torso with the prefabricated 3D breast model.

Display result in a final view A final view, where the user can clearly see the result, had low priority but added a nice touch to the application.

User interface The user interface needs to be changed and adapted as the project grows. As functionality is the priority, further development of the UI will be performed if there is time.

Manage scenes To keep the code structured, scene management has to be stable as well as dynamic when more content is added.

1.5 Outline

The report is organised in the following manner. Mixed Reality and the Microsoft HoloLens are presented in Chapter 2. In Chapter 3, theory needed for the project, such as APIs, Unity components and algorithms, is addressed in detail. Chapter 4 covers method and implementation, where basic information about the development software is explained as well as the creation of the application. Chapters 5 and 6 consist of results and discussion, and conclusion and suggestions for future work, respectively.

2 Mixed Reality and Microsoft HoloLens

2.1 Mixed Reality

Mixed Reality uses immersive technology to blend the real and virtual worlds, where real and digital objects interact and coexist in real time. It encompasses a wide spectrum containing both Augmented Reality, which merges virtual objects into the real world, and Augmented Virtuality, which merges real objects into the virtual world.

MR is usually visualised using wearable glasses such as Microsoft HoloLens.

Figure 2.1: Location of devices on the MR spectrum [4].

Figure 2.1 visualises the spectrum reaching from physical reality to digital reality.

• Towards the left (near physical reality): users remain present in their physical environment and are never made to believe they have left that environment [4].

• In the middle (fully mixed reality): these experiences perfectly blend the real world and the digital world [4].

• Towards the right (near digital reality): users experience a completely digital environment and are oblivious to what occurs in the physical environment around them [4].


Figure 2.1 shows how MR bridges the gap between the two realities. Microsoft's HoloLens is close to the physical-reality end of the MR spectrum. This implies that when using the device, the user experiences themselves to be mostly in the "real world", with only some digitally enhanced elements.

2.2 Microsoft HoloLens

Microsoft HoloLens is a head-mounted, self-contained, holographic rendering computer in the form of a pair of glasses that takes the user into MR.

2.2.1 Hardware

The HoloLens contains computer hardware and several sensors, listed below. It is able to create spatial sound and track the user's gaze, gestures and voice while creating 3D objects, holograms, in real space. The sensors are displayed in Figure 2.2, which shows all sensor hardware contained in the HoloLens. The actual glass consists of see-through holographic lenses that use optical projection to render the holograms.

• 1 IMU

• 4 environment understanding cameras

• 1 depth camera

• 1 2MP photo / HD video camera

• MR capture

• 4 microphones

• 1 ambient light sensor


Figure 2.2: The sensors of the HoloLens

2.2.2 HoloLens RS4 Preview

HoloLens RS4 Preview is an update to the HoloLens operating system released for researchers and developers. It adds the possibility to enable Research Mode, which gives access to the data streams from the HoloLens sensors. An example application can be seen in Figure 2.3, where all eight sensor streams available in Research Mode are displayed.

RS4 Preview also brings improvements to Spatial Mapping; more details of the scanned surroundings can be represented with fewer triangles in the mesh grid.


Figure 2.3: All eight sensor streams accessible through Research Mode, displayed in an example application.

Research Mode is strictly for development, and applications using it are not allowed to be uploaded to the Windows Store [10].

2.2.3 Development tools

The recommended development tool for building a HoloLens application is Unity. Unity is a game development platform and offers an interface for setting up a Mixed Reality application in a 3D view.

The Mixed Reality Toolkit for Unity is a collection of scripts and components that help developers speed up their development of HoloLens and Windows Mixed Reality headset applications [11]. The toolkit contains APIs to easily get started with spatial mapping, basic user input and UX controls.

User input with the HoloLens is handled using gaze combined with gesture or voice input.

3 Theory

3.1 Mesh

A mesh consists of triangles representing a surface in 3D space and is used to visualise objects in MR. The triangles themselves consist of vertices, which mark points in the physical space. The Mesh class contains two arrays: vertices and triangles. Each entry in the vertex array consists of the x, y and z coordinates of one vertex. In the triangle array, each triangle is represented by three consecutive indices that point to the vertices making it up [12].
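As a minimal illustration of this two-array representation (a Python sketch, not the Unity Mesh class itself), a triangle's corner coordinates can be looked up from three consecutive indices:

```python
def triangle_corners(vertices, triangles, t):
    """Return the three corner coordinates of triangle number t."""
    i, j, k = triangles[3 * t], triangles[3 * t + 1], triangles[3 * t + 2]
    return vertices[i], vertices[j], vertices[k]

# A unit square in the xy-plane built from two triangles:
vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
triangles = [0, 1, 2,   0, 2, 3]  # two groups of three vertex indices
print(triangle_corners(vertices, triangles, 1))
# ((0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0))
```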

3.2 Mesh smoothing algorithms

When the environment is scanned, the vertices of the mesh are frequently updated, which easily leads to an uneven surface. Due to the relatively low resolution of the allowed number of triangles per cubic metre, the cropped target object mesh can easily look distorted.

Smoothing algorithms are used to process meshes with uneven surfaces. The purpose is to even out the mesh without losing the original base shape. The algorithms recalculate the position of each vertex of the mesh based on the positions of the surrounding vertices.

3.2.1 Laplacian Smoothing Algorithm

In Laplacian smoothing, the coordinates of the vertices adjacent to the target vertex are used to weigh a new position of the target vertex, p_i, placing it at the average of all adjacent vertices, as seen in Equation (1) and visualised in Figure 3.1.

In the simultaneous version of the Laplacian smoothing algorithm, all new vertex positions are calculated based on the original adjacent vertices. This is in contrast to the sequential version, where changes to vertices are propagated through the mesh as they are made. The simultaneous version is more computationally expensive, as it requires the updated vertices to be stored alongside the original vertices before the mesh is updated all at once. However, this method yields a better result in regard to the shape and smoothness of the resulting mesh [13].

$$
p_i := \begin{cases}
\dfrac{1}{|\mathrm{adj}(i)|} \sum\limits_{j \in \mathrm{adj}(i)} q_j & i \in V_{\mathrm{var}} \\
q_i & i \in V_{\mathrm{fix}}
\end{cases} \qquad (1)
$$

Figure 3.1: Updated target vertex position using Laplacian smoothing [13].
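The simultaneous variant of Equation (1) can be sketched in Python (illustrative only; the function name and data layout are assumptions, not the thesis's C# code). New positions for all free vertices are computed from the original positions q before anything is written back, while fixed vertices keep q_i.

```python
def laplacian_smooth(q, adjacency, fixed=frozenset()):
    """One pass of simultaneous Laplacian smoothing, Equation (1).

    q: original vertex positions as (x, y, z) tuples.
    adjacency: adjacency[i] lists the indices of the vertices adjacent to i.
    fixed: indices in V_fix that must not move.
    """
    p = []
    for i, qi in enumerate(q):
        if i in fixed or not adjacency[i]:
            p.append(qi)  # q_i for i in V_fix
        else:
            n = len(adjacency[i])
            # Average of the ORIGINAL neighbour positions (simultaneous version).
            p.append(tuple(sum(q[j][c] for j in adjacency[i]) / n
                           for c in range(3)))
    return p

# A spike at (1, 3, 0) between two fixed endpoints is pulled to their average:
q = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (1.0, 3.0, 0.0)]
adjacency = {0: [2], 1: [2], 2: [0, 1]}
print(laplacian_smooth(q, adjacency, fixed={0, 1})[2])  # (1.0, 0.0, 0.0)
```

The example also shows the shrinking behaviour discussed below: the spike collapses onto the line between its neighbours.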

The problem with the Laplacian smoothing algorithm is that, by its nature, it shrinks the mesh as well as smooths it. This leads to issues when the aim is for the mesh to replicate an object.

3.2.2 Humphrey’s Classes Smoothing Algorithm

The Humphrey's Classes algorithm is an extension of the Laplacian smoothing algorithm. The algorithm uses the updated mesh obtained by running the original mesh through the Laplacian smoothing algorithm. It then pushes the vertices of the smoothed mesh, p_i, back towards the original vertex positions, as seen in Equations (2) and (3), making the mesh maintain its original volume while being smoothed. The resulting position vectors are visualised in Figure 3.2 [13].

$$
b_i := p_i - \bigl(\alpha\, o_i + (1 - \alpha)\, q_i\bigr) \qquad (2)
$$

$$
d_i := \beta\, b_i + \frac{1 - \beta}{|\mathrm{adj}(i)|} \sum_{j \in \mathrm{adj}(i)} b_j \qquad (3)
$$

The scalar weight α decides the emphasis on the previous points, q_i, and the original points, o_i, respectively. The scalar weight β weighs the centre target vertex into the new position.

Figure 3.2: The new position of the target vertex i when pushing the vertices p back towards the previous points q with the HC algorithm [13].
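The interplay of Equations (1)-(3) can be sketched as follows. The final push-back, p_i := p_i - d_i, and the handling of fixed vertices follow Vollmer et al. [13] rather than being spelled out in the text above, and the Python code is an illustrative sketch, not the project's C# implementation.

```python
def hc_smooth(o, q, adjacency, alpha=0.0, beta=0.5, fixed=frozenset()):
    """One HC pass: Laplacian step (Eq. 1), correction vectors (Eqs. 2-3),
    then the push-back p_i - d_i described by Vollmer et al. [13].

    o: original positions, q: positions from the previous iteration.
    """
    n = len(q)
    # Equation (1): one simultaneous Laplacian pass gives p.
    p = [q[i] if i in fixed or not adjacency[i] else
         tuple(sum(q[j][c] for j in adjacency[i]) / len(adjacency[i])
               for c in range(3))
         for i in range(n)]
    # Equation (2): b_i = p_i - (alpha * o_i + (1 - alpha) * q_i).
    b = [tuple(p[i][c] - (alpha * o[i][c] + (1 - alpha) * q[i][c])
               for c in range(3)) for i in range(n)]
    # Equation (3): d_i = beta * b_i + (1 - beta)/|adj(i)| * sum of b_j.
    d = []
    for i in range(n):
        adj = adjacency[i] or [i]  # guard against isolated vertices
        d.append(tuple(beta * b[i][c]
                       + (1 - beta) / len(adj) * sum(b[j][c] for j in adj)
                       for c in range(3)))
    # Push the smoothed vertices back towards the original shape.
    return [q[i] if i in fixed else
            tuple(p[i][c] - d[i][c] for c in range(3)) for i in range(n)]

# The spike from the Laplacian example is only pulled halfway down (beta = 0.5),
# so some of the original volume is preserved:
o = q = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (1.0, 3.0, 0.0)]
adjacency = {0: [2], 1: [2], 2: [0, 1]}
print(hc_smooth(o, q, adjacency, fixed={0, 1})[2])  # (1.0, 1.5, 0.0)
```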

4 Method and implementation

4.1 Unity

In Unity, the view that contains the environment and surrounding details is called a Scene. Scenes can be seen as levels in a game, and multiple scenes can be included in an application to allow for different environments. The setup of a Scene is shown in Figure 4.1, with a light source, a main camera and three directional arrows representing the 3D coordinate system.


Figure 4.1: An empty 3D Scene containing a main camera and a directional light.

4.1.1 GameObject

GameObjects are the most fundamental objects in Unity and the base for any object in a Scene (Figure 4.1). A GameObject is nothing but a container for components and cannot be created in the Unity editor without the Transform component, which defines the GameObject's position, rotation and scale in space [14].

4.1.2 Component

A component is the base class for everything attached to a GameObject. A custom component is not created directly but by writing a script and attaching that script to a GameObject.

Examples of components:

• Camera

• Light

• Transform

• Scripts in general

The Transform component must be part of every GameObject because it defines the GameObject's position, scale and rotation in the game world and enables the concept of parenting. Parenting is where one GameObject is set as the parent and one or more others as children. It is set by calling SetParent on the Transform component of the child. Changing the parent of a child makes the child's transform relative to the parent while keeping its world position, rotation and scale the same. Moving the parent also moves the child, but moving the child does not affect the parent [15].
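This parenting behaviour can be illustrated with a position-only sketch. Real Unity transforms also compose rotation and scale, and the class below is a hypothetical stand-in for, not a reimplementation of, the Unity API.

```python
class Transform:
    """Position-only stand-in for Unity's Transform (no rotation/scale)."""

    def __init__(self, position=(0.0, 0.0, 0.0)):
        self.local_position = position
        self.parent = None

    def world_position(self):
        # A child's world position is its local position offset by the
        # parent's world position, applied recursively up the hierarchy.
        if self.parent is None:
            return self.local_position
        px, py, pz = self.parent.world_position()
        x, y, z = self.local_position
        return (px + x, py + y, pz + z)

    def set_parent(self, parent):
        # Re-express the current world position relative to the new parent,
        # so the world position is unchanged, mirroring Unity's SetParent.
        wx, wy, wz = self.world_position()
        px, py, pz = parent.world_position()
        self.parent = parent
        self.local_position = (wx - px, wy - py, wz - pz)

torso = Transform((1.0, 0.0, 0.0))
breast = Transform((3.0, 0.5, 0.0))
breast.set_parent(torso)                 # world position stays (3.0, 0.5, 0.0)
torso.local_position = (2.0, 0.0, 0.0)   # moving the parent ...
print(breast.world_position())           # ... moves the child: (4.0, 0.5, 0.0)
```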

4.1.3 Gestures

A user can interact with the MR world using tap gestures with their hands. Several moves are available, such as a single tap to interact with buttons and a long press to move an object around, where the object follows the hand's movement. Using two hands, the user can perform more complex 3D manipulations, such as scaling and rotating, by pinching and rotating the hands.

The Mixed Reality Toolkit provides a prefabricate for handling basic gestures, with both one and two hands [11]. The scripts track the hand movement and change the transform of the target currently being gazed at to match the movements of the hands.

A transform controls the position, scale and rotation of a GameObject. Manipulating the transform of a GameObject does not involve the physics engine; applying transforms is much like teleporting the object through space.

4.1.4 User Interface

The user interface is how the user interacts with the program. In an MR application, the basic interaction components found on websites, such as buttons and sliders, cannot be applied directly to the screen. This is because the gaze would move a button when the user tries to interact with it. Instead, the buttons must be placed and anchored in the physical space of the room.

Interactions can be created through voice commands or through gestures using one or both hands. The elements to interact with can be of types such as buttons and sliders. A canvas component can be used to build a basic window interface, but it will float in the room at the location where it was placed. With the KeywordRecognizer class supplied by the Mixed Reality Toolkit, all that is required to use voice recognition is an array of keywords with corresponding actions.

An app bar is a bar that can be bound to an object so that it floats at the bottom of the object. The app bar follows the user's movement, which removes some occlusion issues that may otherwise occur. An app bar contains one to five buttons that the user can click to interact with the object.

4.1.5 Mesh Filter Mesh Renderer

The mesh filter holds the form of an object on the screen. The mesh renderer component visualises the mesh filter component and its submeshes with their attached materials. Several materials can be applied to a renderer. If there are more materials than submeshes, all the remaining materials will be applied to the first submesh in the mesh filter.

4.1.6 Colliders

Colliders are invisible components required to detect physical collisions between GameObjects. The more detailed the collider, the more processing overhead is required. When colliders interact with each other, events are fired and handled by the assigned scripts.

Colliders behave differently depending on whether, and which kind of, rigid body is assigned to them. A static collider has no rigid body, only a collider. GameObjects with rigid bodies will collide with it but not move it. The physics engine makes assumptions about static colliders that optimise performance; should their size or position change, the physics engine has to do more computations to restart the processes needed. This makes static colliders perfect for static surfaces such as floors, walls and furniture. A kinematic rigid body is a body that can be transformed but does not respond to collisions and forces. A non-kinematic object is fully simulated by the physics engine and reacts to collisions and to forces applied from scripts.

Primitive colliders Primitive colliders have primitive forms such as a cube, capsule or sphere. The collider tries to fit the object as well as possible, but depending on how complex the shape is, the collider may not be able to match the target shape. Many primitive colliders can be added to one GameObject and can, with correct sizing and positioning, create an accurate model of the object with low processing overhead.

Mesh colliders Mesh colliders, in contrast to primitive colliders, map the shape of a complex object to the extent of their ability. This makes collisions for the shape behave like the real object, but it requires more processing overhead. A mesh collider is usually unable to collide with another mesh collider, but can do so if the collider is convex. A convex mesh collider is generated as a hull around the body and is restricted to a maximum of 255 mesh triangles.

4.1.7 Spatial mapping

For spatial awareness in MR, the HoloLens uses a technique called Spatial Mapping. Spatial Mapping is used by the operating system to map out the user's current room, enabling realistic interactions with surfaces, which can be used to place holograms with collisions. The low-level Spatial Mapping API is made up of two central components from the Mixed Reality Toolkit [11]: the Surface Observer and the Surface Data.


Surface Observer The sensor streams provide data that can be accessed using a Surface Observer. A Surface Observer is a class that observes a volume given its origin and extents. It detects the addition, removal and updating of surfaces within the volume. The API is used to request mesh data for nearby surfaces when the Surface Observer detects a change in the environment. This mesh data can then be displayed or saved for later use.

Surface Data Surface Data is a struct that stores the object mesh, collider, anchor and targeted number of triangles used to render a mesh. The resulting grid built up by the API is limited to between 500 and 2000 triangles per cubic meter.

4.1.8 Image processing

For the final 3D breast model to have a more realistic result, photos need to be integrated into the creation of the model.

PhotoCapture PhotoCapture is the API that allows access to the HoloLens' web camera to take photos. The command TakePhotoAsync asynchronously takes a photo with the web camera and saves it directly to memory on the device. When a photo is taken and saved to memory, a PhotoCaptureFrame is returned which contains image data and spatial data matrices that hold the spatial information on where the image was taken [16].

The image data in the PhotoCaptureFrame has to be uploaded into a Texture2D object in Unity for any post-processing of the image. This is most commonly done with the UploadImageDataToTexture command, which is executed on the main thread of the program. Since the command is resource intensive, it is important to consider its effect on the program's performance [16].

Textures Textures form the surface of the mesh triangles: a texture is a bitmap image that is mapped onto the surface of the mesh, providing detail to the object. The information on the position of the texture on the mesh is set in the modelling software where the original 3D object is created; in the case of this thesis Blender v2.79b is used [17].

An image such as a photograph can be used as a texture, but for the image to be correctly mapped onto the 3D model, UV unwrapping is required. U and V denote the coordinate axes of the 2D texture, to differentiate them from the x, y and z axes that describe the coordinates of the 3D object onto which the texture is mapped. In UV unwrapping of a 3D object, some chosen edges of the object are marked as seams and the object is cut along those seams. This allows the object to be laid flat on a surface. The concept is to take a 3D object and make it 2D so that a 2D texture can be mapped onto it. Figure 4.2 shows how a cube is UV unwrapped onto a flat surface.
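As a concrete illustration of the UV convention, the snippet below is a hypothetical nearest-neighbour texel lookup, not code from the application: given UV coordinates in [0, 1], it addresses a texel independently of the bitmap's resolution, with v = 0 at the bottom of the image as is conventional.

```python
def sample_texture(texture, u, v):
    """Nearest-neighbour texel lookup for UV coordinates in [0, 1].

    `texture` is a list of pixel rows, row 0 being the TOP of the image;
    v = 0 addresses the bottom row, following the usual UV convention.
    """
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)   # clamp so u = 1.0 stays inside the image
    y = min(int(v * h), h - 1)
    return texture[h - 1 - y][x]  # flip v to index the row list top-down

# A 2x2 "image": top row a, b; bottom row c, d.
tex = [["a", "b"],
       ["c", "d"]]
print(sample_texture(tex, 0.0, 0.0))  # bottom-left texel: "c"
```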


Figure 4.2: 3D model of a cube UV-unwrapped onto a flat surface for application of texture.

The texture is rendered on the object by a material. Materials use graphics programs called shaders that store the information about how, and to what extent, light should reflect off the surface. This can create the illusion of a shiny or coarse surface depending on the settings of the shader [17].

4.2 Creating breast prefabricate

The 3D model of the breast was created in the 3D modelling software Blender. The model was shaped from half a sphere with 16 x 32 surface divisions for the UV grid. It was shaped with the aim to resemble a common breast shape, with the tip of the breast model being the vertex node for the sections of the grid divisions. Three seams were placed on the bottom half of the breast model for the UV unwrapping of the model. The seams can be seen as red lines in Figure 4.3.

(a) Front view (b) Right view (c) Back view

Figure 4.3: Mesh representation of the 3D breast model created in the modelling software Blender. The red lines represent the seams used for UV unwrapping the model.

In Figure 4.4 the manually UV unwrapped model of the breast can be seen. The mesh sections close to each seam were stretched so as not to create a visible edge in the image that will be mapped onto the surface.


Figure 4.4: UV unwrapped mesh of the 3D breast model with three seams.

The finished model was exported to Unity in the FBX file format, which allowed export of the UV unwrapped surface as well as the 3D model itself.

4.3 Navigating the application

Two kinds of scenes were always running in parallel: one singleton scene, which carries all the information needed in every scene, such as shared objects, and multiple scenes handling the different stages of the process.

The singleton scene contains the input manager, some basic light, one camera and a StateManager. StateManager is a singleton script that keeps references to the scanned object, the completed breast and the resulting combined object, and also handles the navigation to the next scene and what to do when navigating.

The other scenes were divided into the steps that the application was supposed to follow:

• Scanning a targeted object [Section 4.4]

• Smoothing the obtained mesh [Section 4.5]

• Mapping a photo to the scanned object and breast textures [Section 4.6]

• Merging the breast and the scanned object [Section 4.7]

All of the scenes are extensions of the BRSceneManager. It contains functions to handle what will happen if or when the scene is dismissed, how to go to the next scene, and the setup of the KeywordRecognizer.

The setup of the KeywordRecognizer consists of adding the phrase "Next scene" so that it is always active in the background of the application, attached to the singleton scene, and of moving the boilerplate of instantiating the recogniser out of the individual child scenes.


4.4 Scanning a targeted object

Starting the spatial mapping granted access to a large number of vertices spread out to fit the room. The targeted object was somewhere in the room amongst all the vertices. Lowering the extents of the surface observer to the size of a cube constraining the targeted area was the first approach, but the extents did not shrink below 1 m³.

Instead, the cube was used as a constraint where its bounds filter out the vertices that are not inside. This was done by checking whether a vertex centre was inside the cube and, if so, including that vertex index in a new triangle array. Using the new triangle array, the mesh of the scanned object was constrained inside the cube and the scanned object was rendered inside it. A problem with this method was that the outside vertices were still present, and removing them would have required extensive recalculation of the index array. This was instead solved by moving all the outside vertices to the centre of the cube, reducing the collider extent to the targeted mesh's size.
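The cropping logic can be sketched as follows. This is an illustrative Python version of the idea, not the thesis's C# code, and it assumes an axis-aligned cube for simplicity; the application's cube can be rotated, which would require transforming each point into the cube's local space first.

```python
def crop_mesh_to_cube(vertices, triangles, lo, hi):
    """Keep only triangles whose vertices all lie inside the box [lo, hi].

    The vertex array is left at its original length so the triangle
    indices stay valid; vertices outside the box are moved to the box
    centre, which also shrinks the collider extent to the kept mesh.
    """
    def inside(p):
        return all(lo[i] <= p[i] <= hi[i] for i in range(3))

    kept = []  # new flat triangle index array, three indices per triangle
    for t in range(0, len(triangles), 3):
        tri = triangles[t:t + 3]
        if all(inside(vertices[i]) for i in tri):
            kept.extend(tri)

    centre = tuple((lo[i] + hi[i]) / 2 for i in range(3))
    new_vertices = [v if inside(v) else centre for v in vertices]
    return new_vertices, kept
```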

4.5 Smoothing of the obtained object mesh

The resulting mesh obtained by the spatial perception scan was coarse and required further processing to function as a model of the torso. Smoothing algorithms were required to even out the mesh without losing the original base shape. Functions implementing the smoothing algorithms were called after the collection of the object's mesh was complete.

The C# code [18] for the Laplace and HC smoothing algorithms was adapted from MarkGX's code on mesh smoothing. The triangle array of the object's mesh, as well as the corresponding vertex array, was read into the function hcFilter. When previously cropped, the mesh maintained the original vertex array that holds all the vertices from the scan. However, the triangle array only holds the indices of the triangles remaining after the crop. Because of this, the code was adjusted so that when propagating through the vertices, only vertices that also matched an index in the triangle array were smoothed. This made the algorithm more computationally efficient. The script finds the adjacent neighbours for each vertex, and then recalculates the position of the vertex based on the algorithm. First the vertices were run through the Laplace smoothing algorithm, then through the HC algorithm. A change was made in the mesh smoothing source code to optimise the algorithm by only finding adjacent neighbours and indices once, when running the Laplace smoothing algorithm, and then storing the indices for later use in the HC algorithm.
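MarkGX's C# MeshSmoother is not reproduced here, but the core of a Laplace smoothing pass over a Unity-style vertex/triangle layout can be sketched in Python as follows. Note that, as in the adjusted code above, vertices that never appear in the (cropped) triangle array have no neighbours and are left untouched.

```python
def laplace_smooth(vertices, triangles, iterations=1):
    """Laplace smoothing: each vertex moves to the average position of
    its adjacent neighbours, found via the flat triangle index array."""
    # Build vertex adjacency from the triangle array (3 indices per triangle).
    neighbours = {i: set() for i in range(len(vertices))}
    for t in range(0, len(triangles), 3):
        a, b, c = triangles[t:t + 3]
        neighbours[a] |= {b, c}
        neighbours[b] |= {a, c}
        neighbours[c] |= {a, b}

    for _ in range(iterations):
        smoothed = list(vertices)
        for i, adj in neighbours.items():
            if adj:  # cropped-away vertices have no neighbours: keep them
                pts = [vertices[j] for j in adj]
                smoothed[i] = tuple(sum(axis) / len(adj) for axis in zip(*pts))
        vertices = smoothed
    return vertices
```

The HC step [13] then pushes each smoothed vertex back towards its original position, counteracting the shrinkage that plain Laplace smoothing causes.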

4.6 Photograph area and render it as texture of object

The HoloLens API PhotoCapture gives access to the web camera in the device. Since the call to the asynchronous method TakePhotoAsync returns a photo of the full current view of the camera, it was necessary to crop the image to only contain the part of the photo framed by a photo frame in the application's window.

For this scene three requirements had to be met. In the user interface of the application, the user must be able to:

• Discern which area in the view port will become the photo mapped onto the 3D breast model.

• Give the command to take the photo.

• View the result and decide whether to keep the texture or take a new photo.

The photo scene of the application was set up so that immediately upon entering the scene, the StartPhotoModeAsync method was initiated. Consequently, the application is constantly in photo mode and listening for the command to take a photo.

When entering the photo scene, the 3D breast model is in view. The model was imported into the Unity scene when setting up the application. An app bar is attached to the object with the options "Take Photo" and "Done". When choosing to take a photo, the PhotoSceneManager displays the photo frame in the view port. A user interface frame, as shown in Figure 4.5, was attached to the main camera in the Unity scene so that the frame follows the movements of the user's head.

Figure 4.5: 230x230 pixel frame with instructive caption visible in the HoloLens view port in photo mode. The frame follows the gaze of the user.

The frame was set to 230x230 pixels, which was a suitable frame size for the required texture size. The script PhotoCaptureObj handles all functions required for taking and cropping a photo. When a photo is captured to memory, the image is uploaded as a Texture2D object. The texture object is then cropped in the CropImage function, which returns a Texture2D object of 230x230 pixels. The function traverses the data arrays containing the pixels, deleting the pixels outside of the bounds. The returned texture object is then set as the texture of the material on the 3D model's renderer.
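The CropImage logic amounts to copying a window out of the camera's row-major pixel array. A minimal Python sketch, assuming a centred crop window (the real code operates on Unity pixel arrays, and the window placement here is an assumption):

```python
def crop_center(pixels, width, height, crop_w, crop_h):
    """Return a centred crop_w x crop_h window from a row-major
    flat pixel array of size width x height."""
    x0 = (width - crop_w) // 2
    y0 = (height - crop_h) // 2
    cropped = []
    for y in range(y0, y0 + crop_h):
        start = y * width + x0
        cropped.extend(pixels[start:start + crop_w])  # one row of the window
    return cropped
```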

Finally, the user can view the resulting image applied to the surface of the 3D breast model, and then choose whether to move to the next scene or take a new photo, replacing the current texture with a new one.

4.7 Merging scanned object and breast prefabricate

Manipulating objects and having them collide required the TwohandManipulatable script to be able to manipulate an object's speed. The TwohandManipulatable uses the transform to move objects, which means collisions do not register. This was fixed by using a non-kinematic collider and, instead of moving the object by simply setting its position in space, manipulating its velocity. The manipulation is described in Equation (4), where C is an arbitrary constant that determines how smoothly the object moves through space. When the object is released from manipulation the velocity is set to zero. Using velocity manipulation on a non-kinematic body enables collisions between the objects, and the scanned object can be set as parent on collision, which makes them seem like one solid object.

v = (x_new − x_current) * C    (4)
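Per frame, Equation (4) amounts to the following; a minimal sketch, where the function name and the value of C are illustrative rather than taken from the application:

```python
def manipulation_velocity(current, target, c=5.0):
    """Equation (4): v = (x_new - x_current) * C, computed per axis.

    Applied to a non-kinematic rigidbody every frame instead of setting
    its transform directly, so the physics engine still detects
    collisions; on release the velocity is set back to zero.
    """
    return tuple((t - x) * c for x, t in zip(current, target))

# Object at the origin being dragged towards (1, 2, 0):
print(manipulation_velocity((0.0, 0.0, 0.0), (1.0, 2.0, 0.0), c=2.0))
# (2.0, 4.0, 0.0)
```

A larger C makes the object snap more stiffly to the hand; a smaller C makes it trail behind smoothly.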

The collider used on the breast prefabricate is a primitive collider in the form of a cube. A primitive collider was chosen because a mesh collider has an upper bound of 255 triangles in the mesh filter and the breast consists of more than that. The collision was good enough, since the backside of the breast is completely flat and therefore resembles a cube's surface. Whether the collision behaves properly in front of the breast is irrelevant, since it is always attached with its backside against the other object.

5 Results and discussion

5.1 Result

The result of the thesis was a Microsoft HoloLens application where the suggested outcome of a breast reconstruction surgery is created and displayed as a hologram, based on a real-time scan of the patient's torso. It is a proof-of-concept application to show the possibilities of using Mixed Reality devices such as the HoloLens to help patients and physicians communicate possible outcomes of a reconstructive surgery before the actual procedure.

5.1.1 Interacting with the application

The application is navigated by the user with the HoloLens' different options for input and interaction. To press a button hologram, the user must look at the button and then air tap with their finger.


Navigation with the hands is also used when grabbing, scaling, rotating and moving holograms. In scene 1 the user is required to move, rotate and scale a hologram of a cube that is used as the real-world boundary for the patient's torso. The cube is moved by looking at it and "grabbing" it with the fingertips of one hand. It can then be moved in three dimensions around the room. To release it, the user releases the grip of their fingers. For scaling and rotating, the user makes the same gripping movement with the fingers but uses both hands. The hologram can then be manipulated with intuitive hand movements.

When taking a photo as required in scene 2, a voice command is used as input instead. The user looks at where they want the photo taken and simply says "Take Photo". Figure 5.5 shows the view in which the "Take Photo" voice command is used.

5.1.2 Scenes

To guide the user, the final application was divided into four steps, or scenes. In each scene a part of the final hologram is created before the user is allowed to move on to the next scene. All scenes are in a sequence and need to be completed in one run of the application for the resulting hologram to be complete.

Scene 1: Scan body A cube with an app bar appears in front of the user at the start of the application. First, the user has to scale, rotate and position the holographic cube to fit around the patient's torso. Figure 5.1 shows how the 3D hologram of the cube is placed around the torso before the scan. The cube acts as the real-world 3D bounds that limit what area should be scanned. When done, the "Start" button on the app bar is pressed, starting the spatial observer. Starting the spatial observer will also start rendering mesh around the room, as visible in Figure 5.2. The constraints of the cube kick in once the "Update" button is pressed on the cube's app bar.


Figure 5.1: 3D hologram of a cube is placed around the torso, acting as real-world 3D bounds to limit the scan volume.

Figure 5.2: Spatial scanning is initiated, scanning the whole room to create a mesh of the surroundings.


Figure 5.3: The rendering of the mesh is updated. All mesh outside the bounding cube is removed, leaving only the 3D mesh representation of the torso.

When updating, the user can see the pieces of the mesh start to disappear one by one. When all have disappeared, the screen freezes for a couple of seconds while the smoothing is computed, before a smoothed mesh appears in the place of the previously constrained mesh of the torso.

(a) Front view (b) Right view

Figure 5.4: Mesh representation of the smoothed torso scan.

If the torso looks as desired, pressing the "Done" button will start the next scene; if not, pressing "Redo" will start the process over.

Scene 2: Texture mapping on the breast The "Create breast" scene is where the breast hologram that will be attached to the torso is created. The shape of the breast cannot be chosen but is imported as a prefabricated 3D model. In this scene a photograph is taken of the region on the chest where the reconstruction will be done. The photograph is then mapped onto the 3D breast model so as to imitate the skin being stretched over an implant.

When entering scene 2, a 3D model of a breast can be seen as a hologram. Below the model is an app bar that allows for navigation within the scene. The options on the app bar are "Take Photo" and "Done". To create the "skin" on the breast model, the user presses the "Take Photo" button to enter photo mode.

The user targets an area on the chest enclosed by the photo frame in the view of the HoloLens and says "Take Photo" to take a photo of the enclosed area. The photo is then immediately mapped onto the 3D breast model as "skin".

Figure 5.5: The frame displaying the crop used on the image to render the targeted texture.

After taking a photo the user is returned to the scene where the breast model can be viewed. If the user is not pleased with the result, they can press "Take Photo" on the app bar below the model again to re-enter the camera view. If the user is satisfied with the result, they press "Done" on the app bar to move on to the next scene, scene 3.

Scene 3: Attach breast to body Starting this scene, the smoothed, scanned mesh appears at its previous position again. The voice command "Create model" generates a copy of the breast created in scene 2. The user can position and scale the breast as preferred and then drag it on top of the torso, which will attach the breast to it. The user can then scale, rotate and inspect the torso with the attached breast to make sure the breast is attached correctly. If not, the user can simply target the breast and move it as they please. When done, the user can say "Next scene" to move to the final scene.

Figure 5.6: Mesh and breast model

Figure 5.7: Mesh and breast model over torso

Scene 4: View final hologram This scene is for inspecting the final hologram.

The scanned hologram's rotation axis has its origin at the world's origin, no matter where the scanning occurred, making the rotation and translation behave awkwardly and not as they should.


5.2 Discussion

Mixed Reality opens a range of possibilities in medicine. It builds a bridge between the exactness of computers and data and the skill of humans. This thesis has focused on developing an application that aims to ease the process for a patient undergoing a major surgery. The final application is a proof of concept and cannot be used as is. But as a proof-of-concept application it has shown the possibility and capacity of the HoloLens Mixed Reality device to be used as a communication device between a physician and a patient concerning the visual result of a body-altering surgery.

5.2.1 Limitations of the concept

There are psychological implications of making a promise that cannot be kept. The danger of showing a possible aesthetic outcome of a surgery to a patient is straightforward: the actual result may not look like the prediction. To sidestep this, the application is focused on plastic surgery, where the purpose of the surgery is the purely aesthetic result. In a situation where the purpose of the surgery is other than aesthetic, it is very likely that there could be unforeseen complications during surgery. Complications could lead to, for example, a larger scar than predicted. But complications can of course occur in plastic surgeries as well, and even if there are no complications it is difficult to predict how the human body will react to a surgical procedure.

5.2.2 Imitating the human skin with textures

For a patient to look at the hologram and say that what they are looking at is their upper body, the application should be able to attach a texture to the scanned torso in the same way a texture is mapped onto the 3D breast model. The most straightforward way to achieve this is to take a photo of a small area of the patient's skin and duplicate it as the texture over the whole torso. However, the result may be perceived as strange, since real skin over a larger area like a torso varies in structure and colour, which the texture will not. A solution could be a partial scan where each section of the whole target object, such as the torso, is scanned separately, having its texture captured and mapped, and all the parts are then assembled into a whole object.

5.2.3 Breast prefabrication

Not all breasts look the same. In this application there is only one breast prefabricate to work with. Creating many different prefabrications is tedious work and should be automated by scanning real breasts and gathering data. A database could be created with patient values and their breasts, as well as volunteers' breasts, together with a UI where the surgeons can browse through and find the best breast model to use as a base for the patient's new breast. If the model is not perfect, there should be an option to make small adjustments right away by simply manipulating points in the model.

There is also the possibility of sharing the view between multiple HoloLenses. Sharing the application with multiple devices lets the patient observe what the surgeon sees, enabling them to give pointers on how they would like their breasts to look.

5.2.4 The surface observer

The surface observer is made to observe the surroundings of the HoloLens in order to interact realistically with the Mixed Reality. Simultaneously scanning and running applications requires limitations in computation. A maximum of 2000 mesh triangles per cubic meter may be more than required to place holograms perfectly aligned with the physical world, but it is not enough to perform detailed object scanning. In the result, you can clearly see that the scanned object is a torso, but to be able to use the application as a medical tool, the details will have to be clearer than they are right now. Using the RS4 preview to access the sensor data stream directly would open the possibility to manage what data to use and how to use it, hopefully leading to a more detailed object scan.

5.2.5 The current spatial observer

When the user starts the spatial observer, the spatial mesh appears in a large volume in the room. The minimum size of the spatial observer volume is larger than what would be preferable when scanning an object. This results in superfluous mesh data being collected. Rewriting the collection of mesh data by the surface observer could make it possible to filter out, in real time, the unnecessary mesh points that are outside of the focused area. This would make it easier for the user to understand what is going on when the observer is started.

Currently the mesh received from the surface observer is put in a queue, updating the scanned vertex points first in, first out. Prioritising the scan to primarily update what the user looks at would improve the user's experience, as that is what is expected when "scanning". Furthermore, the smoothing of the mesh freezes the entire application during run time. Freeing the thread every frame and pausing the computation would let the user look around whilst the mesh is smoothed.

Scanning objects using the HoloLens creates many opportunities to develop applications where the user can easily copy real-life objects into holographic data, which can be stored in the cloud and presented anywhere a HoloLens is available.


6 Conclusion and future work

The goal of the thesis was to develop a Mixed Reality application for the Microsoft HoloLens for 3D visualisation of a breast reconstruction surgery. The resulting application uses the spatial mapping sensors of the HoloLens to scan the torso of the patient into a 3D mesh. It maps an image of the breast area, captured in the application, onto a prefabricated 3D model of a breast. This is done to imitate the stretch of the skin over an implant. In the final stage of the application the user attaches the breast model to the mesh of the torso and the result is displayed as a hologram. The existing techniques in Unity and the Mixed Reality Toolkit are good enough to build a proof-of-concept application that scans an object, renders a hologram of it and attaches another, prefabricated holographic object onto it.

6.1 Future

MR is a valuable tool for the future. Given that the technology of MR is only going to keep developing, and rapidly so, there is great promise in being able to create an application that can realistically show the desired outcome of a surgery. This is not limited to breast reconstructive surgeries but could be applied to showing on a patient's body how the scar will look after a heart surgery, or a cesarean section.

Another direction in which this application could be developed is as a guide for surgeons during the reconstruction surgery. The surgeon would wear an MR device, and a hologram of the desired target breast shape could be displayed on the patient's body at the exact place on the torso where the surgery will take place. This would allow the surgeon to match the breast they are reconstructing to a holographic target shape.


References

[1] G. Spencer. (2017) Mixed reality and medicine: Surgery with no surprises.

[2] H. J. Kamps. (2017) Touch surgery brings surgery training to augmented reality. [Online]. Available: https://techcrunch.com/2017/01/06/touch-surgery-ar/?guccounter=1

[3] P. Milgram and F. Kishino, “A taxonomy of mixed reality visual displays,” IEICE Transactions on Information Systems, vol. E77-D, no. 12, 1994.

[4] B. Bray and M. Zeller. (2018) What is mixed reality? [Online]. Available:https://docs.microsoft.com/en-us/windows/mixed-reality/mixed-reality

[5] Breastcancer.org. (2018) What is mastectomy? [Online]. Available: https://www.breastcancer.org/treatment/surgery/mastectomy/what_is

[6] A. B. C. Specialists. (2018) Mastectomy plus reconstruction. [Online]. Available: http://www.arizona-breast-cancer-specialists.com/treatment-of-early-stage-breast-cancer.html

[7] Breastcancer.org. (2018) Mastectomy plus reconstruction. [Online]. Available: https://www.breastcancer.org/treatment/surgery/mastectomy/plus_reconstruction

[8] C. N. Lee, A. M. Deal, R. Huh et al., “Quality of patient decisions about breast reconstruction after mastectomy,” JAMA Surgery, vol. 152, no. 8, 2017.

[9] Breastcancer.org. (2018) More than half of women don’t get enough information about reconstruction from surgeons. [Online]. Available: https://www.breastcancer.org/research-news/many-dont-get-enough-info-about-reconstruction

[10] Microsoft. (2018) HoloLens research mode. [Online]. Available: https://docs.microsoft.com/en-us/windows/mixed-reality/research-mode

[11] Microsoft. (2018) MixedRealityToolkit-Unity. [Online]. Available: https://github.com/Microsoft/MixedRealityToolkit-Unity

[12] U. Technologies. (2018) Anatomy of a mesh. [Online]. Available: https://docs.unity3d.com/Manual/AnatomyofaMesh.html

[13] J. Vollmer, R. Mencl and H. Müller, “Improved Laplacian smoothing of noisy surface meshes,” Computer Graphics Forum, vol. 18, no. 3, 1999.

[14] U. Technologies. (2018) Gameobjects. [Online]. Available: https://docs.unity3d.com/Manual/GameObjects.html

[15] ——. (2018) Transforms. [Online]. Available: https://docs.unity3d.com/Manual/Transforms.html


[16] ——. (2018) Hololens photo capture. [Online]. Available: https://docs.unity3d.com/Manual/windowsholographic-photocapture.html

[17] ——. (2018) Textures. [Online]. Available: https://docs.unity3d.com/Manual/Textures.html

[18] MarkGX. (2011) Meshsmoother. [Online]. Available: http://wiki.unity3d.com/index.php?title=MeshSmoother
