
Final Project: Thermal and Visual Light Image Blending Using Low Cost Infrared Camera Technology

Matthew Magnusson

CS 6475: Computational Photography
Due Date: December 2, 2015
Dr. Irfan Essa

1 Introduction

Figure 1: The electromagnetic spectrum (permission granted by Victor Blacus under CC BY-ShareAlike 3.0)

This course in computational photography has focused on the visible light portion of the electromagnetic spectrum (approximately 390 to 700 nanometers). However, visible light is only a tiny portion of the overall electromagnetic spectrum. Longer-wave electromagnetic radiation immediately beyond visible light is broadly classified as infrared (IR) and spans a broad range of wavelengths from 700 nanometers to 1 millimeter. While the behavior of IR varies depending on the band of wavelength, most thermal radiation (energy transferred by the emission of electromagnetic waves from objects near room temperature) is classified as long-wavelength infrared (LWIR) and occurs between eight and 15 micrometers.

While humans are not capable of visually perceiving LWIR, some biological organisms have developed a heightened sense of perception that allows them to "see" LWIR. This includes some species of snakes, bats, and beetles. Of particular note is the black fire beetle (genus Melanophila), which can detect small levels of heat transmitted from forest fires at significant distances (thirty-five miles or more) (Yong, 2012). Using specialized imaging equipment, LWIR can be captured and visualized. This project investigates the unique considerations associated with LWIR in computational photography and applies them in a "blending" context of combining visible light and LWIR images.

1.1 Goal of the Project

There were several goals pursued in this project. The first was to gain a greater understanding of the unique considerations and dynamics associated with LWIR computational photography. A second motivating goal was to gain experience programming scripts in MATLAB® using computer vision (CV) based libraries. A third goal was to gain hands-on experience using new low-cost thermal imaging technology available from digital camera manufacturer FLIR to develop an IR light visualization.

While LWIR-based computational photography shares many similarities with more conventional visible light-based computational photography, it also has some unique considerations. A primary difference is that it is based on the emission of electromagnetic waves from objects within a scene, in contrast to the reflectance of visible light waves within a scene. This allows for the detection of non-illuminated objects, including nighttime applications. The information collected by LWIR can be used to calculate temperature measurements within a scene. This type of computational photography has a broad range of scientific, commercial, and artistic/creative applications.

I had not used MATLAB® prior to this course and wanted to understand its features and capabilities relative to image processing. This software is frequently used in academic research and has many well-developed libraries in computer vision and machine learning. For this project, I purchased a relatively low-cost thermal camera called the "FLIR ONE for Android" that attaches to an Android device. I am interested in applications that visualize thermal radiation and used this project as an opportunity to get acquainted with the technology, which I had not used prior to this assignment.

My longer-term goal is to build on the skills developed in this project to develop software that manages the integration of ultraviolet, visible, and infrared spectrum images. This would allow systems (including drone-based ones) to capture, manage, integrate, and interpret signals from a broader range of the electromagnetic spectrum. The benefit is that more information could be collected from the scene of interest and applied in scientific or commercial domains.

1.2 Final Results

Figure 2: Input/output results of the project. (a) Visible light image, 640x480 JPEG format; (b) RAW 14-bit sensor signal, 320x240; (c) example resulting blended thermal/visible light image.

1.3 Best Way to See the Project

The results of this project are image-based, so the best way to view different examples is at http://imgur.com/a/ZfuGK. All images at that link are also included in this final report on the project.


2 Technology

2.1 FLIR ONE

Figure 3: FLIR ONE for Android

The FLIR ONE for Android is the second generation of FLIR's cell phone-based thermal imaging camera. The first generation was only compatible with the Apple iPhone 5. In addition to working with Android phones, this second generation also features a version for the Apple iPhone 6. The camera features an attractive price point at $249 and attaches to the mobile Android device via a micro-USB connector. Figure 3 shows the camera attached to a smartphone. The device is capable of capturing a scene temperature range between -4 and 248 degrees F and can detect changes as small as 0.18 degrees F. The camera can present images in a variety of color palettes and features FLIR's MSX blending technology, which allows a simultaneously captured visible-light digital image (640x480 resolution) to be merged with a thermal image (320x240, digitally scaled). This results in enhanced resolution by adding visible-spectrum detail to the resulting blended image. The camera also has two different image capture modes: still and video.

The FLIR ONE stores the digital images taken in a "fat" JPEG file format. This allows the camera to produce a standard "blended" thermal image that is accessible by any JPEG-compliant software. However, it also embeds the visible-light JPEG file and the RAW 14-bit IR sensor data as binary files within the JPEG file. The proprietary binary data (and other sensor data) thus take advantage of the standards-compliant EXIF structure of JPEG files. This file format was key to being able to work with the output from the FLIR ONE camera in this project. Figure 4 shows the MSX image produced by the FLIR camera. This is the default output, not the output that I produced. Figure 5 shows the different default thermal image color palettes that FLIR provides for blended images.

2.2 LEPTON Camera

Figure 6: LEPTON 3 LWIR camera

The FLIR ONE uses the LEPTON 3 long-wave infrared (LWIR) camera module. This camera was specifically designed to be a relatively low-cost camera that can be integrated into mobile phone products or used in Internet of Things (IoT) devices and other consumer electronics. The camera captures IR radiation in the nominal response wavelength band of eight to 14 microns and outputs a uniform thermal image. The camera has a height of 160 active pixels and a width of 120 active pixels. Stated applications for the product include mobile phones, gesture recognition, building automation, thermal imaging, and night vision (FLIR, 2015).

Figure 4: FLIR "fat" JPEG of car image (Iron color palette)

2.3 ExifTool

ExifTool is open-source software that reads, writes, and edits meta information in a wide variety of file formats.¹ The tool was developed and is actively maintained by Phil Harvey. It consists of a Perl library with a command-line interface application. The version of ExifTool used to process image files in this project was 10.07. The software was used to read and extract the metadata stored in the FLIR JPEG file format.

¹ExifTool is available for download at http://www.sno.phy.queensu.ca/~phil/exiftool/


Figure 5: FLIR default color palettes


3 Methodology

This section provides an overview of the processing steps undertaken to produce the resulting image. The original input image was a "fat" JPEG file format developed by FLIR. This image file consists of a visible image that is a blend of a visible and infrared representation of the scene. However, the file is "fat" because within the metadata of the JPEG file, the original digital visible light image is stored as well as a 2-dimensional array of 16-bit integers containing the raw IR sensor readings. Section 2: Technology provides a more in-depth discussion of the FLIR ONE camera technology and the associated file format. Figure 7 provides a visual representation of the high-level steps in the processing pipeline.

3.1 Image Processing Steps

Figure 7: Flowchart of image processing steps: download FLIR "fat" JPEG file → extract embedded JPG and RAW files using ExifTool → read input files into MATLAB → swap byte order in RAW file → normalize RAW file → apply color palette algorithm to create heat map → resize heat map file → blend or cut heat map with embedded JPG.

1. The first step is to extract the digital image taken by the FLIR ONE camera. FLIR provides a mobile app to manage the images taken by the camera, and it allows many different methods for sharing them. For this project, I saved the image directly to the Google Drive account associated with my phone, but I could have also used other mechanisms, including email or a Bluetooth connection.

2. The next step was to use ExifTool to extract the embedded meta information associated with the produced image. See Section 2: Technology for additional discussion of ExifTool, and refer to the Appendix for a detailed output of all meta-tags extracted from this file. The code below shows the contents of the batch file that was used to extract images and data from all FLIR-formatted JPEG files stored in a directory.

for %%f in (flir\*) do (
    REM Extract all meta-tags to a text file
    exiftool %%f > out\%%f.meta.txt
    REM Extract the binary files based on meta-tag.
    REM The -b flag requests binary output; -embeddedimage and
    REM -rawthermalimage are the actual tag names with any spaces
    REM removed (tag names are case insensitive).
    exiftool -b -embeddedimage %%f > out\%%f.embedded.jpg
    exiftool -b -rawthermalimage %%f > out2\%%f.raw.png
)

3. The next step was to read the resulting embedded JPG file (the "normal" digital camera image) and the raw binary file of IR readings into MATLAB®. At this point any programming platform, including Python, could be used to further process the images. However, as stated in Section 1: Introduction, I was interested in using MATLAB® for my final assignment. The code used to process and output the images is in the Appendix.

4. The next step was to swap the byte order from "Little-endian" to "Big-endian" in order for the processed signal to be interpreted correctly. This was accomplished using the MATLAB® swapbytes function. Figure 8 shows visualizations before and after swapping the byte order. Interestingly, the visualization before swapping has a topographical appearance, but it does not reflect the actual measurement readings. The image generated after swapping the byte order provides a very faint but accurate representation of the sensor data.

Figure 8: RAW IR binary visualization. (a) Before: little-endian 16-bit integer RAW format; (b) after: big-endian 16-bit integer RAW format.
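The byte-order correction was done with MATLAB's swapbytes; an equivalent sketch in NumPy (Python being the alternative platform noted in step 3; the sample values are illustrative) might look like:

```python
import numpy as np

# The 14-bit IR readings are stored as 16-bit integers. If they are
# read with the wrong endianness, each value's two bytes must be
# swapped before the readings make sense.
misread = np.array([0x2509, 0x1204], dtype=np.uint16)  # bytes reversed
corrected = misread.byteswap()  # 0x2509 -> 0x0925, 0x1204 -> 0x0412
print(corrected)  # [2341 1042]
```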

5. The next step was to scale the IR readings to the range 0 to 1. This is necessary to build a "heat map" or intensity index. The resulting image can stand on its own as a grayscale intensity heat map, as shown in figure 9, or can serve as the basis for color-based heat maps (color palettes). Figure 10 provides a three-dimensional plot of the IR readings of the car. The car had recently been driven, and this surface plot is interesting because it shows the areas where heat energy was accumulating on the vehicle.

Figure 9: Normalized IR intensity

Figure 10: Surface plot of relative IR intensity in image
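The min-max scaling in this step can be sketched in NumPy as follows (the function name is my own; the sample readings are illustrative):

```python
import numpy as np

def normalize(raw):
    """Scale 16-bit sensor readings to the range [0, 1] (min-max)."""
    raw = raw.astype(np.float64)
    return (raw - raw.min()) / (raw.max() - raw.min())

# The result can be written directly as a grayscale intensity image.
print(normalize(np.array([9000, 9500, 10000])).tolist())  # [0.0, 0.5, 1.0]
```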

6. The next step is to apply a color palette to the image intensity map if color thermal images are desired. In this case, I developed a simple RGB-based heat map using the following algorithm: a direct mapping of intensity to the red channel (to represent high temperatures as red); a direct inverse mapping of intensity to the blue channel (to represent cold temperatures as blue); and a triangular peak at 0.5 for the green channel (so that temperatures in the middle have a higher green intensity). The overall effect of this algorithm is a reasonable color-based heat gradient, as shown in figure 11. It should be noted that this is an area with significant opportunity for additional processing and creativity, as any intensity-gradient-based color-mapping scheme could be developed and implemented. The resulting heat map produces color output similar to the "arctic" color palette output from FLIR.

Figure 11: Normalized IR intensity
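The palette algorithm described in this step can be sketched in NumPy as a translation of the MATLAB code in the Appendix (the function name is my own):

```python
import numpy as np

def heatmap(n):
    """Map a normalized intensity array n in [0, 1] to float RGB.

    red   = n             (hot  -> red)
    blue  = 1 - n         (cold -> blue)
    green = 1 - |n - 0.5| (triangular peak at mid intensities)
    """
    r = n
    g = 1.0 - np.abs(n - 0.5)
    b = 1.0 - n
    return np.stack([r, g, b], axis=-1)

print(heatmap(np.array([0.0, 0.5, 1.0])).tolist())
# [[0.0, 0.5, 1.0], [0.5, 1.0, 0.5], [1.0, 0.5, 0.0]]
```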

7. The heat map image is only half the size of the visible light JPEG image. However, the field of view of the thermal image and that of the light image are not the same, so simply doubling the thermal image linearly does not match the scene in the visible light image. The image instead needs to be scaled by a factor greater than one but less than two. For this project, I used trial and error with visual matching to size the image. A more sophisticated approach would have been to use feature matching and let an algorithm directly scale the thermal image to the light image.
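One way the trial-and-error sizing could be automated is a brute-force search over candidate scale factors, picking the one whose resized heat map best matches the light image; a sketch under that assumption (helper names my own, nearest-neighbor resize and mean squared difference for simplicity):

```python
import numpy as np

def nn_resize(img, scale):
    # Nearest-neighbor resize of a 2-D array by a scale factor.
    rows = (np.arange(int(img.shape[0] * scale)) / scale).astype(int)
    cols = (np.arange(int(img.shape[1] * scale)) / scale).astype(int)
    return img[np.ix_(rows, cols)]

def best_scale(thermal, light, candidates):
    # Pick the candidate scale whose resized thermal map has the
    # lowest mean squared difference against the light image's
    # top-left crop of the same size.
    best, best_err = None, np.inf
    for s in candidates:
        t = nn_resize(thermal, s).astype(float)
        h, w = t.shape
        if h > light.shape[0] or w > light.shape[1]:
            continue  # resized map no longer fits the light image
        err = np.mean((t - light[:h, :w].astype(float)) ** 2)
        if err < best_err:
            best, best_err = s, err
    return best
```

In practice one would compare gradient or edge maps rather than raw intensities, since thermal and visible pixel values are not directly comparable.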

8. The final step is to superimpose the thermal image on the light image. I took a basic approach of manually positioning the thermal image on the underlying light image and then applying transparency through trial-and-error weighted averaging. This is another area with significant opportunity for applying additional techniques learned in our computational photography class, including feathering, cutting, and blending.
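The superimposition step amounts to an alpha blend over the region where the heat map is placed; a minimal sketch (function and parameter names my own; x/y follow the row/column convention of the Appendix code, and alpha is the thermal weight):

```python
import numpy as np

def blend(light, heat, x, y, alpha=0.65):
    """Alpha-blend a heat map onto the light image at row x, column y."""
    out = light.astype(np.float64).copy()
    h, w = heat.shape[:2]
    region = out[x:x + h, y:y + w]
    out[x:x + h, y:y + w] = alpha * heat + (1.0 - alpha) * region
    return out.astype(np.uint8)
```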

3.2 Who Performed the Work

For the �nal project, I worked individually. I took all pictures, performed all of the analysis,and

was sole author on the �nal project report.


4 Discussion

The goal of this assignment was to take a visible light image and merge it with a raw infrared sensor 2-D array to create a novel "blended" image using the MATLAB® programming language. "Blending" is a common application in the IR photography field; however, much of it is performed using "closed," manufacturer-specific image manipulation tools. I was able to accomplish the goal of producing a reasonable "blended" image using MATLAB® (in combination with open-source software).

I think the final blended result demonstrates a compelling benefit of IR imaging. By looking at the blended image, one can tell that the car was recently driven, as both the engine compartment and the tires are emitting higher levels of IR intensity. Looking only at the visible-light photograph of the vehicle, one would not have been able to draw that conclusion. This is a simple example of how additional information can be derived from a scene by examining a broader range of the electromagnetic spectrum than the visible light spectrum alone. This has applications across a wide variety of domains, including machine learning and artificial intelligence.

4.1 Resulting Images

The following figures show the three "input" images and the resulting output image. To clarify, figure 12 shows the actual image produced by the FLIR camera. Embedded within this image were the visible light image (figure 13) and the raw binary IR data (figure 14). In this project, I extracted these two embedded data files from the FLIR image file to produce a new blended image (figure 15) independent of any FLIR image processing steps.

Figure 12: FLIR \Input" image


Figure 13: Embedded visible light image

Figure 14: Embedded RAW IR sensor data


Figure 15: Final blended image produced


4.2 What Worked

Each segment of the image processing pipeline was accomplished, and the final resulting image successfully demonstrates the blending of a visible light image with superimposed IR information. The MATLAB® platform proved to be a very effective tool for this assignment. In fact, I find it better for exploratory work (such as the type undertaken in this project) than the SciPy Python stack. The MATLAB® programming IDE has many features that speed up interactive development, which allowed me to be more productive as I explored the problem. The algorithms developed in MATLAB® can then be used in MATLAB® or transcribed to any other programming language. While I did have some issues with the FLIR ONE camera on one cellphone, I was able to get it to work successfully, and the camera consistently provided high-quality images when used with an Android tablet.

4.3 What Didn't Work

I was disappointed that calculating temperature data from the IR sensor data was not one of the results of this project. While one can tell that FLIR is attempting to enter the mass sensor market through its LEPTON camera and FLIR ONE products, the company still appears to be heavily oriented toward working with engineers in commercial and industrial applications of its technology. The result is that it is not adept at providing a consistent product support experience. I had mixed results with FLIR's technical support in resolving issues: overall they were responsive, but some of their responses were not helpful.

A major challenge was that I was unable to get FLIR's free FLIR Tools software to install and run on three different Windows machine instances (Windows 7, a Windows 7 virtual machine, and a Windows 10 virtual machine). This was a problem because that software is capable of processing the proprietary "fat" JPEG format. I was able to make some progress with a web-based tool FLIR provided, but it was extremely buggy and frequently crashed or would not work at all. Having access to information obtained from FLIR software would have been very helpful in conducting this project: I would have gained insight into the digital file structure and would have been able to compare the outputs of my software algorithms with those that FLIR calculated from the same image. Instead, I had to review numerous web-based forum posts and other resources to reconstruct information on how to correctly process the file format data. A significant amount of information was scattered across heavily technical FLIR documentation that I was unable to obtain directly from FLIR but found through website postings.

Another problem I encountered is that I ordered the camera in August, but it was then placed on backorder and did not actually arrive until near the end of November. While I was thankful it came in before this project was due, it did not give me much time to learn all of its features or gain hands-on experience prior to commencing work on this project.


4.4 What Would I Do Differently

There is not much that I would have done differently for this project. Given its time constraints, coupled with my discussion above of what worked and what did not, I am not sure there is much I could have done differently. The project was ambitious in that I simultaneously undertook learning a new programming language and environment (MATLAB®) while applying it in an unfamiliar imaging medium (IR). In addition, I was using a brand-new technology (FLIR ONE for Android), which introduced its own elements of risk. Overall, I don't imagine the project could have gone significantly more smoothly, and I was thankful that I was able to pull all of these elements together to construct a reasonable novel image.

If I had more time, I would have applied additional techniques that we learned in this course to the preparation and blending of the images, including image filtering, thresholding, and edge extraction. These are techniques that FLIR appears to be using in its MSX technology, and I would be interested in building on the techniques apparent in its approach to create images that are even more effective at communicating IR information.

4.5 Areas of Future Research

I plan to continue working on this project, as I can see it being a foundation for work I pursue while earning a Ph.D. in Computer Science (with a concentration in computer vision). I came close to, but did not accomplish, an accurate conversion of raw IR readings to temperature measurements following algorithms provided by FLIR and online forums. However, even with the correct equations, I learned from this project that calculated temperatures can be blatantly wrong for a variety of reasons. For example, FLIR provides an online example of temperature readings of an erupting volcano; many of the lava measurements from that example were below human body temperature (not a very likely scenario). Therefore, one area of future research would be to apply machine learning algorithms to the imagery to help automatically correct temperature readings that are out of scale.
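For reference, the raw-to-temperature conversion circulated on online forums uses the Planck calibration constants embedded in the file's metadata (listed in the Appendix). A sketch of that widely cited approximation, which I could not fully validate and which ignores emissivity and atmospheric corrections, is:

```python
import math

def raw_to_kelvin(raw, R1, R2, B, F, O):
    """Approximate temperature (Kelvin) from a raw sensor count
    using the Planck calibration constants stored in FLIR metadata.
    Ignores emissivity and atmospheric corrections."""
    return B / math.log(R1 / (R2 * (raw + O)) + F)

# Planck constants and raw value median from the car image's metadata
# in the Appendix; the result is about 280 K (roughly 7 C), plausible
# for an outdoor December scene.
t = raw_to_kelvin(9508, R1=17002.672, R2=0.012071962, B=1436, F=1, O=-1095)
print(t - 273.15)
```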

Another area of future research would be to develop feature detection and matching algorithms between the visible light image and the infrared image. The FLIR ONE product uses its MSX technology to help with this, but I found examples of images produced by the MSX technology, as seen in figure 16, that were still out of alignment. Therefore, it appears that further improvements in alignment algorithms are needed. In this assignment, I relied on trial-and-error resizing to match up the images. While that worked here, it is not a scalable or user-friendly approach.

FLIR provides Android and iPhone software development kits (SDKs) for programming mobile applications. My original proposal had considered developing an Android application, and I would still like to pursue that: my plan would be to determine a consumer-facing application of this technology where an app would be useful. Unfortunately, due to time constraints, I was not able to include any mobile app development in the scope of this project.


Figure 16: Example of FLIR image where MSX technology produced a misalignment of images

4.6 Conclusion

Overall, image processing using IR imagery is very similar to that of conventional photography. This makes sense, as both capture photon energy on a grid-like sensor system; the primary difference is the bandwidth that the sensors within the grid are capable of detecting. Throughout this project, I found that the core skills built over the semester were applicable to working with the IR images. One notable difference is that (in theory) temperature can be calculated from the RAW IR sensor measurements; in practice, this turns out to be challenging due to the number of input values necessary to calculate an accurate temperature. I found this to be a very rewarding assignment, and I was pleased with how I was able to employ the skills developed throughout the course and apply them to a new problem domain within computational photography. In conclusion, this was a great capstone project for the course, as it required me to build off all of the skills acquired in the course while applying them in a new context.


5 References

Instrumart.com (2015). FLIR Webinars. Retrieved from https://www.instrumart.com/pages/488/ir-webinars

FLIR. (2015, November 30). FLIR ONE for Android. Retrieved from http://www.flir.com/flirone/content/?id=69420#specs

FLIR. (2015, May 4). FLIR LEPTON 3 Long Wave Infrared (LWIR) Datasheet (Version 3.0.1).

Yong, E. (2012, May 27). Fire-chasing beetles sense infrared radiation from fires hundreds of kilometres away. Discover. Retrieved from http://blogs.discovermagazine.com/notrocketscience/2012/05/27/fire-chasing-beetles-sense-infrared-radiation-from-fires-hundreds-of-kilometres-away/

6 Appendix

6.1 ExifTool Output for Meta-tag Information for Car Image

ExifTool Version Number : 10.05

File Name : flir_20151201T094912.jpg

Directory : flir

File Size : 436 kB

File Modification Date/Time : 2015:12:01 10:31:33-05:00

File Access Date/Time : 2015:12:01 11:31:35-05:00

File Creation Date/Time : 2015:12:01 11:31:35-05:00

File Permissions : rw-rw-rw-

File Type : JPEG

File Type Extension : jpg

MIME Type : image/jpeg

JFIF Version : 1.01

Exif Byte Order : Little-endian (Intel, II)

Make : FLIR Systems AB

Camera Model Name : *

Orientation : Horizontal (normal)

X Resolution : 72

Y Resolution : 72

Resolution Unit : inches

Software : *

Modify Date : 2015:12:01 09:49:12

Y Cb Cr Positioning : Centered


Exif Version : 0220

Create Date : 2015:12:01 09:49:12

Components Configuration : -, Cr, Cb, Y

Subject Distance : 1 m

Focal Length : 3.2 mm

Image Temperature Max : 213

Image Temperature Min : 213

Flashpix Version : 0100

Color Space : sRGB

Exif Image Width : 640

Exif Image Height : 480

Digital Zoom Ratio : 1

Image Unique ID : DD8F87CC4C2418909CD08388C0D381C0

GPS Version ID : 2.2.0.0

GPS Latitude Ref : North

GPS Longitude Ref : West

GPS Altitude Ref : Above Sea Level

Compression : JPEG (old-style)

Thumbnail Offset : 1980

Thumbnail Length : 2400

Creator Software :

Embedded Image Width : 640

Embedded Image Height : 480

Embedded Image Type : JPG

Embedded Image : (Binary data 74779 bytes, use -b option to extract)

Emissivity : 0.95

Object Distance : 1.00 m

Reflected Apparent Temperature : 20.0 C

Atmospheric Temperature : 20.0 C

IR Window Temperature : 20.0 C

IR Window Transmission : 1.00

Relative Humidity : 50.0 %

Planck R1 : 17002.672

Planck B : 1436

Planck F : 1

Atmospheric Trans Alpha 1 : 0.006569

Atmospheric Trans Alpha 2 : 0.012620

Atmospheric Trans Beta 1 : -0.002276

Atmospheric Trans Beta 2 : -0.006670

Atmospheric Trans X : 1.900000


Camera Temperature Range Max : 120.0 C

Camera Temperature Range Min : -40.0 C

Camera Model : *

Camera Part Number : *

Camera Serial Number : *

Camera Software : 1.0.0

Lens Model : FOL2

Lens Part Number :

Lens Serial Number :

Field Of View : 35.4 deg

Filter Model :

Filter Part Number :

Filter Serial Number :

Planck O : -1095

Planck R2 : 0.012071962

Raw Value Median : 9508

Raw Value Range : 2120

Date/Time Original : 2015:12:01 09:49:12.706-05:00

Focus Step Count : 0

Focus Distance : 2.0 m

Palette Colors : 224

Above Color : 170 128 128

Below Color : 50 128 128

Overflow Color : 67 216 98

Underflow Color : 41 110 240

Isotherm 1 Color : 100 128 128

Isotherm 2 Color : 100 110 240

Palette Method : 0

Palette Stretch : 0

Palette File Name : iron.pal

Palette Name : Iron

Palette : (Binary data 672 bytes, use -b option to extract)

Raw Thermal Image Width : 320

Raw Thermal Image Height : 240

Raw Thermal Image Type : PNG

Raw Thermal Image : (Binary data 70781 bytes, use -b option to extract)

Real 2 IR : 1.23202729225159

Offset X : -7

Offset Y : +21

PiP X1 : 0


PiP X2 : 319

PiP Y1 : 0

PiP Y2 : 239

GPS Map Datum : WGS84

Image Width : 640

Image Height : 480

Encoding Process : Baseline DCT, Huffman coding

Bits Per Sample : 8

Color Components : 3

Y Cb Cr Sub Sampling : YCbCr4:2:0 (2 2)

GPS Latitude : 43 deg 10' 14.16" N

Image Size : 640x480

Megapixels : 0.307

Peak Spectral Sensitivity : 10.0 um

Thumbnail Image : (Binary data 2400 bytes, use -b option to extract)

Focal Length : 3.2 mm

6.2 MATLAB® Processing Code

%Initial Settings

file_name = 'flir_20151201T094912.jpg';

out_path = 'E:\school\gatech\CS_6475_Computational_Photography\Assignments\tex\final\files';

%Input binary files

car_fat_jpg = importdata(strcat('original/',file_name));

imwrite(car_fat_jpg, strcat(out_path,'\car_fat.jpg'));

light_car = importdata(strcat('out/embedded/',file_name,'.embedded.jpg'));

imwrite(light_car, strcat(out_path,'\car_light.jpg'));

raw_car = importdata(strcat('out/raw/',file_name,'.raw.png'));

imwrite(raw_car, strcat(out_path,'\car_raw_original.png'));

%Swap byte order on image

raw_car = swapbytes(raw_car);

imwrite(raw_car, strcat(out_path,'\car_raw_swapped.png'));

%Normalize 16 bit integers to double between 0 and 1

norm_car = double(raw_car - min(raw_car(:)))/double(max(raw_car(:))-min(raw_car(:)));

imwrite(norm_car, strcat(out_path,'\norm_car.png'));

%Color palette algorithm for heat map
%Red: direct map; Green: triangular peak at 0.5; Blue: inverse map
r = norm_car;
g = 1.0 - abs(norm_car - .5);
b = abs(norm_car - 1);
heat_car = cat(3,uint8(r*255),uint8(g*255),uint8(b*255));

heat_car = imresize(heat_car,1.6);

%Settings for positioning heatmap on visible light image

[u,v,c] = size(heat_car);

x = 70; %starting row position based on 0 position at top

y = 55; %starting column position based on 0 position at left

%Weight setting on alpha

heat_alpha = 0.65;

%Final blended image based on position and alpha

blended = light_car;

blended(x:x+u-1,y:y+v-1,:) = (heat_alpha * heat_car + (1-heat_alpha) *light_car(x:x+u-1,y:y+v-1,:))/2 ;

imwrite(blended, strcat(out_path,'\blend1.png'));
