

Using a Multi-functional Sensor Network Platform for Large-Scale Applications to Ground, Air, and Water Tasks

Elizabeth Basha, Carrick Detweiler, Marek Doniec, Iuliu Vasilescu, and Daniela Rus
Distributed Robotics Laboratory

Computer Science and Artificial Intelligence Laboratory
{e_basha,carrick,doniec,iuliuv,rus}@mit.edu

ABSTRACT

We present a modular sensor network platform capable of supporting a wide range of applications. The challenge of designing such a system increases when the applications cover a broad range of environments, communication methods, sensor types, and time scales as well as supporting distributed in-situ algorithms. We developed a platform to support such a broad spectrum of scenarios, instantiating our system for situations on the ground, in the water, and in the air. Our system has operated in the field for over 150 days with month-long continuous deployments, measuring positions, temperatures, pressures, and rainfall, while computing cattle behaviors, event locations, and future river level. We use this experimental experience to discuss the lessons learned in designing and using a modular and multi-functional system.

General Terms

multi-application, sensor network, platform, system architecture, deployment

Keywords

multi-application, sensor network, platform, system architecture, deployment

1. INTRODUCTION

We wish to develop a multi-functional sensing platform to enable a large and heterogeneous range of applications in the air, on the ground, and in the water. These applications have different communication requirements, sensing needs, time scales, environments,


and computational algorithms, complicating the design problem. Many existing systems cannot address this problem because they target largely homogeneous system requirements, leveraging the same sensor and communication systems while focusing on low power and a small physical footprint. Missing from this area, and necessary for solving our problem, is a platform providing very heterogeneous communication and sensor support with a focus on heavy computation and data storage across disparate environments. We have developed a sensing platform capable of supplying these needs for a wide variety of applications. We have field-tested instantiations of the system for ground, aerial, and water applications.

The design requirements for our flexible and modular sensor node platform include:

• Easy addition and use of many sensor types

• Easy addition and use of many communication methods

• Easy reconfiguration and programming of the system, both within a specific application and for switching to a different application

• Shared user interface for easy addition of new projects, easy system debugging, and easy system access

Our system meets these requirements, providing high computational capabilities for a variety of communication and sensing needs. We communicate through long-range and short-range wireless radio channels for ground and aerial needs as well as supporting acoustic and optical communication in water. For sensing, we connect to a range of sensors, including ones several kilometers away, and allow for two different sensor access methods, continuous polling for better accuracy or on-demand for lower power, as well as integrated logging of a variable number of sensors without recompilation. Finally, our system provides an easy-to-use, reconfigurable user interface and easy reassignment of nodes through SD card programming.

We instantiated this system in three different application areas (shown in Figure 1): (1) virtual fencing for


Figure 1: Pictures of the three applications: (a) sensor mounted on cow’s head, (b) AquaNode, and (c) rainfall sensor for river flood prediction.

cattle herds, (2) underwater monitoring of coral reefs, and (3) river flood prediction. Each of these applications has different environmental needs (ranging from mobile to underwater to covering large geographic areas) and different usage models (ranging from regular, fast operation patterns to variable, very slow patterns).

In successfully implementing these applications, we developed a platform with over 150 days of experimental operation, typical active power usage of 155 mW, support for 6 sensor types and 4 communication types, and solutions to 3 different algorithmic problems. We also learned useful lessons about designing multi-functional systems, UART multiplexing, communication abstractions, and power management design, among others.

This paper is organized as follows. Section 2 discusses related systems developed for multi-application support. Section 3 describes the system architecture and the operating system. Section 4 reports our experimental results characterizing the operation of this platform in terms of communication, power, operational behavior, and deployments. Section 5 describes lessons we learned through the design and implementation of our sensor platform.

2. RELATED WORK

We build on several years of important work in designing and fielding sensor network systems on a variety of platforms including the Mote [1, 8, 14, 15, 17], Fleck [21], Cricket [18], Meraki [4, 13], and others [6, 9, 10]. Our own experience with these systems has led us to the design decisions described in this paper. In this section we describe a small subset of available research sensor network platforms as well as a selection of applications in which they are used.

The Berkeley Mote was one of the first widely used wireless sensor network platforms [8]. There are a number of descendants of the original Berkeley Mote. The Intel Mote includes a 12 MHz ARM7 CPU with more RAM and FLASH than was available in the original Mote [15]. The Telos Mote is designed to be extremely low power [17].

Most Motes run TinyOS, an operating system developed for use on the original Berkeley Mote [12]. Its small footprint and low processor usage come from its heritage of operating on processors with relatively limited capabilities. Our operating system takes a similar approach to TinyOS in that the core is based on a non-preemptive multi-tasking scheduler. We chose to develop our own system so that we could optimize the OS for our board while maintaining flexibility.

There are also a number of other sensor network platforms, some of which run TinyOS and others that run their own custom operating systems. The MIT Cricket is a sensor network node which adds the capability to obtain ranges between sensors [18]. The CSIRO Fleck has solar charging capabilities and a longer-range radio [21]. The Epic platform is an open-source Mote platform designed to be modular and hierarchical [6].

Another class of sensor network nodes includes faster processors (in the multi-hundred MHz range) and more memory. These can handle demanding processing tasks, such as image processing, but at the cost of using significantly more power. The Intel Mote 2 uses an Intel XScale core which can be clocked at up to 416 MHz [1, 14]. The Meraki sensor network [13] is a commercial spinoff of the MIT RoofNet project [4]. The Meraki sensors run Linux and are able to form ad-hoc 802.11 mesh networks. Cell phones have also been used as the hardware to form a sensor network [10].

Many of these systems have been deployed for many different applications. The Trio Testbed was a large-scale deployment of nearly 600 Motes [7]. The LUSTER network was developed to measure light intensity under foliage [19]. Motes have been used to perform detailed studies of a Redwood tree [20]. The CarTel network is a sensor network deployed in cars to create


a mobile sensor network which can also collect information on traffic and road use [9]. Flecks have been used to monitor cattle [21]. PermaSense is a sensor network aimed at measuring high-altitude environmental data [3]. Wireless sensor nodes have been developed to monitor fatigue in bridges [11].

While all of these platforms provide good options for sensor network research, none supports a full range of communication and sensing options while also supporting complicated algorithms. As these are vital requirements for our projects, we decided to design a new platform and operating system.

3. SYSTEM ARCHITECTURE AND OPERATING SYSTEM

Our goal has been to provide a multi-functional sensor network platform that enables a large and heterogeneous range of applications in the air, on the ground, and in the water. The communication needs across this range of applications vary greatly, ranging from acoustic and optical communication in water to short-range communication on the ground and long-range communication in air. Similarly, the sensing needs range from collecting data from nearby sensors to collecting data from sensors several kilometers away. The software infrastructure of the system should be reusable across applications, and the task specification and user interface to the system should be simple and intuitive. Furthermore, we wish for the system to be self-sustaining with respect to power, relying on in-situ recharging and intelligent power management to tune its operations. The specific requirements for a modular sensor network system are:

• Low-level support for all reasonable sensor types (resistive, interrupt, serial, etc.) along with easy high-level addition of new sensors using these types

• Basic wireless communication support for all projects and easy addition of any serial-based communication device

• High-level reconfiguration of the system and the ability to do so in the field without a direct cable connection to a device

• Long-term (over a year) storage of data

• Reconfigurable graphical user interface providing inline debug support along with visual status of the system

In response to these requirements, we have designed a hardware platform, supporting operating system, and software infrastructure for applications. Figure 2 shows the block diagram of the hardware architecture. Figure 3 shows the block diagram of the operating system. We have five interfaces: processing, communication, sensor, power system, and logging. Each provides low-level hardware access libraries and high-level abstractions.

In defining our system, we focused on the following areas: processing, communication, sensing, power management, data storage, configuration, and user interface. We outline our design decisions, hardware architecture, and operating system in the following subsections.

Figure 2: Block diagram of the sensor platform hardware architecture.

3.1 Processing

Of greatest importance for our system is computational power, as we want to enable extensive in-situ processing on the sensor nodes to avoid transferring significant amounts of data to a central server for applications such as modeling and prediction of weather phenomena. Because of this, we focused on choosing a processor and designing an operating system that allowed a range of abstractions from very low-level sensor operations up to complex, distributed, floating-point algorithms.

Hardware

The key hardware requirement is a processor with a relatively large amount of on-chip RAM (40K), flash (512K), input-output pins, and other features. We selected the LPC2148 ARM7 processor [16], which satisfies those requirements. Additionally, this processor supports floating-point operations through software libraries, provides vectored interrupts, and includes an on-chip real-time clock with good precision and alarm capability.

Software

The key software requirement is abstraction for plug-and-play support of different communication, sensing, and processing modules. Our base software has to support all the various operations necessary for a variety of projects: measurements, communication, user interface updates, failure checking, logging, algorithms, and other activities. Additionally, these operations can require different timing paradigms: occurring at regular time intervals independent of a real-time clock, occurring at specific times dependent on a real-time clock (for example, on the tens: 00:10, 00:20, etc.), or occurring only in response to other events with no time dependence. The event priorities also can vary, as some events have to run immediately while others can wait until the system is awake and free. Finally, to enable easy reconfiguration, we also need the ability to add, delete, and modify operations within the user interface.

Figure 3: Block diagram of the sensor platform operating system. Users of the system need only use the high-level interfaces. Devices such as a Bluetooth radio can be added to the communications interface without having to modify the core code; instead, the API for the low-level communications system can be used to add a new device.

Supporting these requirements entails a very flexible system, especially as we want to keep the requirements loose enough to accommodate future research projects. To achieve this, we designed a non-preemptive multi-tasking scheduler-based system utilizing the real-time clock and millisecond timers. Events consist of an id, name, interval, and need-to-run flag. The interval either recurs regularly every x seconds (or milliseconds) or fires only once, y seconds from now. If an event needs to run, the system ensures that it is online in that time window to run the event (further discussed in Section 3.4). Finally, events can be added, deleted, or modified by id or name both within the code and in the user interface, creating a very powerful way of modifying system behavior online as well as easing reconfiguration and system debugging.
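The event model described above can be sketched in C. This is a minimal illustration, not the authors' actual code: the names (`ev_add`, `sched_tick`) and the fixed-size event table are our own assumptions; the real system also tracks names for the user interface and the need-to-run flag used by power management.

```c
/* Sketch of a non-preemptive event scheduler: recurring events fire
 * every interval_ms; one-shots fire once after delay_ms. */
#include <stdint.h>
#include <string.h>

#define MAX_EVENTS 16

typedef void (*ev_fn)(void);

typedef struct {
    int      id;           /* unique id for lookup and removal     */
    char     name[16];     /* human-readable name for the UI       */
    uint32_t interval_ms;  /* recurrence period; 0 => one-shot     */
    uint32_t next_due_ms;  /* next time this event should fire     */
    int      need_to_run;  /* must wake the system for this event  */
    ev_fn    fn;
} event_t;

static event_t events[MAX_EVENTS];
static int n_events = 0;

/* Register a recurring (interval_ms > 0) or one-shot (interval_ms == 0,
 * fires once at now + delay_ms) event. Returns its id, or -1 if full. */
int ev_add(const char *name, uint32_t interval_ms, uint32_t delay_ms,
           int need_to_run, ev_fn fn, uint32_t now_ms)
{
    if (n_events >= MAX_EVENTS) return -1;
    event_t *e = &events[n_events];
    e->id = n_events;
    strncpy(e->name, name, sizeof e->name - 1);
    e->name[sizeof e->name - 1] = '\0';
    e->interval_ms = interval_ms;
    e->next_due_ms = now_ms + (interval_ms ? interval_ms : delay_ms);
    e->need_to_run = need_to_run;
    e->fn = fn;
    return n_events++;
}

/* Run every event due at time now_ms; recurring events are rescheduled,
 * one-shots are disabled. Returns the number of events run. */
int sched_tick(uint32_t now_ms)
{
    int ran = 0;
    for (int i = 0; i < n_events; i++) {
        event_t *e = &events[i];
        if (!e->fn || now_ms < e->next_due_ms) continue;
        e->fn();
        ran++;
        if (e->interval_ms)
            e->next_due_ms += e->interval_ms;
        else
            e->fn = NULL;  /* one-shot: never runs again */
    }
    return ran;
}

/* Example callback for demonstration. */
static int demo_runs = 0;
static void demo_cb(void) { demo_runs++; }
```

A main loop would call `sched_tick` with the current millisecond timer value; the need-to-run flags then feed the sleep decision discussed in Section 3.4.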

3.2 Communication

We wish to provide heterogeneous communication support of any serial-based device to reach a range of distances through a variety of mediums (air, water). Our system needs to communicate with many different endpoints (other nodes, people, different sensors) without exposing any of the complexities of switching between these to the user.

Hardware

Seamless transitions between communication devices require sufficient protocols and their hardware support. Here, the LPC2148 provides UARTs, SPI, and I2C. Unfortunately, it only provides 2 UARTs, which becomes very limiting in scenarios such as a node communicating via RS232 to a computer and via radio to another node while determining its position from the RS232-enabled GPS. Because of the simultaneity requirement in this realistic model, we cannot use a bus-based communication method such as SPI or USB, but need more than 2 UARTs. To avoid limiting ourselves to 2 UARTs and successfully manage the variety of UART devices, we include a low-power FPGA which bidirectionally buffers up to four additional serial ports, allowing simultaneous communication with up to 6 serial devices. As every network will need some basic form of radio communication, we automatically add a radio. In choosing a radio, the key requirements were an easy usage model and sufficient range for mobile networks as well as large geographical networks. We chose the 900 MHz Aerocomm AC4790 radio due to its claimed 20-mile communication range and initial testing compared to the Zigbee radios available at the time. Each board also supports RS232 and USB to allow computer connections.

Software

We need several abstraction layers to avoid exposing communication switching complexities to the end user while still enabling easy addition of other devices. We start with four low-level layers interfacing with the microprocessor: the FPGA access code, the UARTs, the SPI interface, and the I2C interface. Because of the physical limitations, the UART code is the most complex, needing to provide seamless switching between the virtual UART devices the user thinks exist and the actual physical devices. It does so through meticulous record-keeping and buffering to ensure connections occur error-free and without data loss.
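The buffering that makes the virtual UARTs safe can be sketched as a per-port receive ring buffer: bytes arriving on any physical or FPGA-buffered port are stashed under their virtual port number, so nothing is lost while the application is attending to a different device. The names and sizes below are illustrative assumptions, not the authors' implementation.

```c
/* Per-virtual-UART receive buffering: each of the six virtual serial
 * ports gets its own ring buffer. */
#include <stdint.h>

#define N_VUARTS 6
#define RB_SIZE  128   /* power of two for cheap wrap-around */

typedef struct {
    uint8_t  buf[RB_SIZE];
    uint16_t head, tail;  /* head: next write, tail: next read */
} ringbuf_t;

static ringbuf_t vuart_rx[N_VUARTS];

/* Called from the physical UART/FPGA receive path: stash a byte for a
 * virtual port. Returns 0 on success, -1 if that port's buffer is full. */
int vuart_rx_push(int port, uint8_t byte)
{
    ringbuf_t *rb = &vuart_rx[port];
    uint16_t next = (rb->head + 1) & (RB_SIZE - 1);
    if (next == rb->tail) return -1;  /* full: refuse to overwrite */
    rb->buf[rb->head] = byte;
    rb->head = next;
    return 0;
}

/* Called from application code: read one byte from a virtual port.
 * Returns the byte (0-255) or -1 if the buffer is empty. */
int vuart_read(int port)
{
    ringbuf_t *rb = &vuart_rx[port];
    if (rb->tail == rb->head) return -1;  /* empty */
    uint8_t byte = rb->buf[rb->tail];
    rb->tail = (rb->tail + 1) & (RB_SIZE - 1);
    return byte;
}
```

The head/tail discipline (producer writes head, consumer writes tail) is what allows the receive path and the application to share a buffer without locking on a single-core system.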

Just above this layer, we provide an AC4790 layer to interface with the Aerocomm built-in communication protocols. This layer contains low-level radio access and basic message packet structures on top of the UART interface. We also incorporate configuration and control


of the AC4790 registers, abstracting the EEPROM and other steps necessary to set up the radios.

On top of these low-level accesses, we provide a communication system that further abstracts away which communication device is used and which message is sent. Any module can create a message, adding it to a message queue that is processed within the scheduler at a time window definable by the user. A module receiving a message will only handle those it recognizes and for which it has a handler capable of processing that type of message. This enables different projects to react differently to the same message as well as create project-specific messages that do not interfere with another project’s modules. Each message also invisibly handles which communication device sends it, abstracting away whether the message belongs to the user interface, radio, or other device through a routing layer. The routing layer initially begins with an internal configuration, but redefines its tables based on received messages and the user interface. Adding a new communication method becomes very easy due to these layers, as it only requires defining it within the proper low-level layer, which automatically makes it available to the higher layers and the user.
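The recognize-and-handle behavior above amounts to type-based dispatch: modules register handlers for the message types they care about, and unrecognized types fall through harmlessly. This is a minimal sketch under our own naming assumptions, not the authors' message format.

```c
/* Type-based message dispatch: modules register handlers for the
 * message types they recognize; unknown types are simply ignored. */
#include <stdint.h>
#include <stddef.h>

#define MAX_HANDLERS 8

typedef struct {
    uint16_t    type;     /* project-specific message type            */
    uint8_t     src;      /* which device delivered it (radio, UI...) */
    const void *payload;
    size_t      len;
} msg_t;

typedef void (*msg_handler)(const msg_t *m);

static struct { uint16_t type; msg_handler fn; } handlers[MAX_HANDLERS];
static int n_handlers = 0;

/* A module calls this at startup for each message type it handles. */
int msg_register(uint16_t type, msg_handler fn)
{
    if (n_handlers >= MAX_HANDLERS) return -1;
    handlers[n_handlers].type = type;
    handlers[n_handlers].fn   = fn;
    n_handlers++;
    return 0;
}

/* Deliver a message to every module registered for its type.
 * Returns how many handlers ran (0 means nobody recognized it). */
int msg_dispatch(const msg_t *m)
{
    int ran = 0;
    for (int i = 0; i < n_handlers; i++) {
        if (handlers[i].type == m->type) {
            handlers[i].fn(m);
            ran++;
        }
    }
    return ran;
}

/* Example module: counts "ping" messages (type 1, our invention). */
static int pings = 0;
static void ping_handler(const msg_t *m) { (void)m; pings++; }
```

Because unknown types return 0 handlers rather than erroring, one project's messages pass silently through another project's modules, matching the isolation described above.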

3.3 Sensing

We want a platform capable of accessing almost any sensor type (resistive, interrupt, serial, SPI, etc.) and providing this access at a high enough level to add a new sensor with ease.

Hardware

We achieve this through exposing as many input-output (IO) pins as possible and enabling connections to expansion boards. Our expansion board connection provides IO pins, UART ports, ADC pins, I2C bus access, and SPI bus access. The FPGA allows routing of these pins in any configuration from the processor to the expansion ports. This allows for a wide variety of sensor types. Additionally, we provide several base sensors: temperature, compass, accelerometers, and GPS. These basic sensors help provide system information regarding location, position, and internal temperature, useful data for almost every project.

Software

We provide ease of access and addition through several software layers. We start with the basic hardware, developing ADC code and GPS access code in addition to existing I2C and UART layers. Should a sensor need a specialized user interface, it can access these layers and have direct control. For the majority of sensors, however, the sensor layer abstracts out the different access types each sensor uses, providing a standard interface for these sensors. Within this layer, we access sensors using two different approaches: one where all sensors are polled regularly so that any request returns the latest polled value, and another where a sensor is read only upon request. The first allows utilization of sensors with longer update windows and refresh requirements such as the GPS and compass, while the second minimizes power usage in cases where the sensed value is immediately available. Adding a sensor requires only placing it in the list of available sensors within the sensing layer; it is then available and supported through the existing sensor functions to all higher layers.
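The two access modes and the add-a-sensor-to-the-list pattern can be sketched as a sensor table with a mode flag. The table entries, mode names, and fake hardware reads below are our own illustrative assumptions.

```c
/* Two sensor-access modes: polled sensors are refreshed on a schedule
 * and reads return the cached value; on-demand sensors are sampled
 * only when asked. */
#include <stdint.h>

typedef enum { SENSE_POLLED, SENSE_ON_DEMAND } sense_mode_t;

typedef struct {
    const char  *name;
    sense_mode_t mode;
    int        (*sample)(void);  /* low-level read (ADC, I2C, ...) */
    int          cached;         /* last value, for polled sensors */
} sensor_t;

/* Fake low-level reads standing in for real hardware. */
static int fake_temp_raw = 21;
static int read_temp(void) { return fake_temp_raw; }
static int fake_adc_raw = 512;
static int read_adc(void)  { return fake_adc_raw; }

/* Adding a sensor means adding one row to this table. */
static sensor_t sensors[] = {
    { "temperature", SENSE_POLLED,    read_temp, 0 },
    { "rain_adc",    SENSE_ON_DEMAND, read_adc,  0 },
};
#define N_SENSORS (int)(sizeof sensors / sizeof sensors[0])

/* Scheduler hook: refresh all polled sensors. */
void sensors_poll_all(void)
{
    for (int i = 0; i < N_SENSORS; i++)
        if (sensors[i].mode == SENSE_POLLED)
            sensors[i].cached = sensors[i].sample();
}

/* Uniform read: cached value for polled sensors, fresh sample for
 * on-demand sensors. */
int sensor_read(int i)
{
    if (sensors[i].mode == SENSE_POLLED)
        return sensors[i].cached;
    return sensors[i].sample();
}
```

Higher layers call only `sensor_read`, so the polling schedule and the power trade-off stay invisible to them, as described above.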

3.4 Power Management

We want real-time monitoring of power usage and power availability, which we can also control through autonomous, fine-grained regulation of all system components.

Hardware

To measure and control the power, we have a charge circuit allowing solar and DC charging of lithium-polymer batteries. This circuit also provides measurement of the charge current so we know the amount of power entering the system. For understanding the amount exiting the system, we add a battery circuit to each lithium-polymer battery. Within the circuit, we measure current (both charge and discharge), voltage, and temperature.

Software

We then use these measurements to understand and regulate the power profile of our system. The battery circuit provides enough information to determine the remaining battery capacity. Combining that with the battery voltage and the charge current, we can determine when the battery is depleted and what the proper action to take is. This occurs automatically through a scheduled event with a default depletion handler, easily replaceable by a project-specific handler through software hooks. If we have no charge current, we can put the system to sleep for hours until solar charging should occur, waking up only when we know it is daytime. We also autonomously monitor the activities on the scheduler and decide when to put the system to sleep. This decision is based on a number of variables defining the minimum sleep window, the amount of time before events that the system is awake, and the amount of time after events before the system can sleep. For events that have to run at a specific time, we define a need-to-run flag, which limits the sleep cycles. Events with less priority queue until the system awakes and then all run at that moment. Because there is always at least one need-to-run event, we ensure that all events run eventually and the system does not sleep forever.
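The sleep decision described above can be sketched as a single function: given the time of the next need-to-run event and the configured margins, decide how long to sleep. The variable names and the exact rule are our own illustration of the behavior, not the authors' code.

```c
/* Sleep-window decision: only need-to-run events constrain sleep;
 * lower-priority events queue and run when the system next wakes. */
#include <stdint.h>

typedef struct {
    uint32_t min_sleep_ms;   /* don't bother sleeping for less   */
    uint32_t wake_early_ms;  /* be awake this long before events */
} power_cfg_t;

/* Returns how long to sleep in ms, or 0 to stay awake. */
uint32_t sleep_window(const power_cfg_t *cfg,
                      uint32_t now_ms, uint32_t next_need_to_run_ms)
{
    /* An event is imminent (or overdue): stay awake for it. */
    if (next_need_to_run_ms <= now_ms + cfg->wake_early_ms)
        return 0;
    uint32_t win = next_need_to_run_ms - now_ms - cfg->wake_early_ms;
    /* Sleeping for a very short window costs more than it saves. */
    return (win >= cfg->min_sleep_ms) ? win : 0;
}
```

Because there is always at least one need-to-run event in the scheduler, `next_need_to_run_ms` is always defined and the sleep window is always bounded, which is the guarantee the text relies on.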

3.5 Data Storage

Our goal for data storage is large capacity for long-term operation along with fast access to short-term measurements.

Hardware

The LPC2148 provides a good amount of on-chip storage, allowing buffers for many important data values. Beyond this, we add a 32KB FRAM and a mini-SD card slot. The FRAM allows for fast persistent storage (across power cycles) with a nearly infinite number of write cycles, for projects where data measurements occur frequently and the system accesses them frequently. For long-term storage of other data, including logs of communication and operation, the SD card provides a solution.

Software

We developed code to enable fast and easy access to the FRAM and support streaming data to and from it. The SD card requires a file system, so we implemented a FAT file system to ensure readability of the SD cards on regular computer systems as well. On top of the FAT system, we added a logging system that enables creation of many concurrent log files. The system automatically synchronizes the files through the scheduler and maintains a date-based directory structure.

At the start of every day the log files rotate, creating a new directory and seamlessly rolling all active log files to that directory. This allows for more manageable data storage, as files do not grow without limit. The entire system allows for shared logging of relevant data such as communication and power, while supporting each project’s individual data needs, from single-sensor storage to large numbers of concurrent files for a variety of sensors and information. The data stored in the FRAM as well as the files stored on the SD card can be downloaded manually (by removing the SD card), via a local serial cable, or remotely using the radio.
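The daily rotation reduces to a date comparison run from the scheduler: when the date changes, build a new date-named directory and roll the open logs into it. The path format `logs/YYYY-MM-DD` below is an assumption for illustration, not the authors' actual layout.

```c
/* Daily log rotation: detect a date change and compute the new
 * date-based directory for the active log files. */
#include <stdio.h>
#include <string.h>

typedef struct { int year, month, day; } date_t;

static date_t current_dir_date = {0, 0, 0};

/* Format the directory for a given date, e.g. "logs/2009-08-14". */
void log_dir_name(date_t d, char *out, size_t n)
{
    snprintf(out, n, "logs/%04d-%02d-%02d", d.year, d.month, d.day);
}

/* Called from the scheduler: returns 1 (and writes the new directory
 * name into `dir`) if the date changed and logs should roll, else 0. */
int log_maybe_rotate(date_t today, char *dir, size_t n)
{
    if (today.year  == current_dir_date.year &&
        today.month == current_dir_date.month &&
        today.day   == current_dir_date.day)
        return 0;                    /* same day: nothing to do */
    current_dir_date = today;
    log_dir_name(today, dir, n);
    /* A real system would now close each active log file and
     * reopen it under `dir` on the FAT file system. */
    return 1;
}
```

Driving this from the scheduler (rather than checking on every write) keeps the per-write cost of logging constant.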

3.6 Configuration

A truly multi-application platform needs to support easy reconfiguration of each node while in the field, even if a cable connection is not possible.

Hardware

We utilize the FRAM and SD card mentioned in Section 3.5.

Software

We wrote our own bootloader program to load new program code into the system. The bootloader reads the program file from the SD card, allowing very easy reconfiguration. The program can be updated by swapping SD cards or by uploading a file via a serial or radio link. Additionally, the bootloader has a failsafe system. If the board does not properly boot, a backup program will be automatically loaded. This allows the user to program boards in hard-to-reach locations without fear of “bricking” the system with a bad program. Since the program can be updated by changing SD cards, it is extremely easy to reconfigure any node for use in a different application.
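One common way to implement the failsafe behavior described above is a persisted boot-attempt counter: the bootloader increments it before jumping to the new image, the application clears it once healthy, and repeated resets without a clear trigger the backup. The paper does not say how its failsafe detects a bad boot; the counter scheme below is our own illustrative assumption.

```c
/* Failsafe boot decision via a boot-attempt counter persisted across
 * resets (e.g. in FRAM). This is an illustration of one possible
 * mechanism, not the authors' bootloader. */

#define MAX_BOOT_ATTEMPTS 3

typedef enum { IMAGE_PRIMARY, IMAGE_BACKUP } image_t;

/* Called first thing at boot with the persisted counter. The counter
 * is incremented before jumping to the primary image; if it ever
 * reaches the limit, the primary has crashed repeatedly and the
 * backup is chosen instead. */
image_t choose_image(int *boot_attempts)
{
    if (*boot_attempts >= MAX_BOOT_ATTEMPTS)
        return IMAGE_BACKUP;  /* primary keeps failing: fall back */
    (*boot_attempts)++;
    return IMAGE_PRIMARY;
}

/* Called by the application once it is up and healthy. */
void boot_succeeded(int *boot_attempts)
{
    *boot_attempts = 0;
}
```

The key property is that a bad image can never be selected forever: without a `boot_succeeded` call, the counter monotonically climbs to the backup threshold.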

Once the system starts operation, the FRAM provides fine-grained configuration through variables permanently stored in it. These variables contain a record of all configuration parameters for the base system and all projects. Between these two configuration options, we can easily modify any node to a different operation within a project, a different code version, or a different project.

3.7 User Interface

Our system should provide a variety of external access mechanisms, capable of configuring and controlling the system regardless of how easily a node can be reached and its distance from the user.

Hardware

We use the UARTs to connect to a computer and communicate with a user interface (shown in Figure 4). The user interface can also be accessed remotely via the radio when a physical connection is unavailable. For projects where serial connections are unavailable, we connect to a small LCD display board. This board allows basic configuration of the system using the accelerometers as the method of moving a cursor for input.

Figure 4: A picture of one view of the user interface.

Software

Our user interface utilizes the same communication


layer as all other forms of communication. This simplifies development of the user interface on the system side. On the computer side, we designed a Java-based user interface consisting of panels, which can easily be added, swapped, and updated for different projects. A panel defines not only the graphical view but also the message structure used within the communication layer. Each layer then has an associated panel, which, for the lower layers, provides debugging and configuration of variables stored in the FRAM. At the highest layers, we access the application data and global system behavior. In these layers, we also provide more graphical views of data, such as mapping locations of nodes and time-series views of measured data.

Upon startup, the user interface connects to a node via the serial port (using RS232, Bluetooth, or USB) and loads an XML file. Through these XML files, we initialize the screen view for a specific project and determine the update rate for these panels. Transparent to the user, the interface not only connects to a node through the serial port, but to all other nodes in the system through that node’s radio. This allows easy access, configuration, and debugging of the entire system from one computer terminal. Our bootloader system also allows reprogramming of the board through the user interface, independent of how the node connects to the user interface, although radio reprogramming occurs very slowly.

In addition to panels, we have a console display allowing text-based control of the system. Here we provide a help menu consisting of program handles to access very low-level control, update configuration variables, modify the scheduler, read the log files, and perform a variety of other operations. This interface supports more basic operations for which a user interface panel is unnecessary.

Figure 5: Picture of the base board.

4. EXPERIMENTS AND RESULTS

We have instantiated the sensor network system and

deployed it in the field in the context of three applications: monitoring and controlling cattle, monitoring coral reefs, and monitoring and predicting river floods. These deployments thoroughly tested and characterized

the system. In this section, we analyze the communication statistics, power usage, operational behavior, and deployment results.

4.1 Communications

We logged data regarding the number of bytes and

the type of messages sent. These data describe the behaviors of the different communication methods, highlighting the need for a variety of options and the trade-offs between them.

Table 1 summarizes our results, showing the data rates, ranges, and success percentages for the various communication methods used. In this table, the real rate shows the effect of communication overhead, which we compute assuming a 100% transmission success rate. As the table shows, some types, such as serial, see negligible effects, while others, such as the acoustic modem, have significant overhead due to the complexity of the communication method. For the 900 MHz radios, we use broadcast mode, which automatically repeats each message 4 times, something reflected in the real rate. The acoustic modem sees a decrease due to guard times introduced to prevent interference from reflections, as well as protocol overhead.
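The real-rate column can be reproduced, to first order, from the repeat count and per-packet framing. The sketch below uses illustrative payload and overhead sizes (32 bytes each, our assumption, not measured values) to show how 4 broadcast repeats plus framing reduce the 57600 b/s link to its observed 7200 b/s:

```python
def effective_rate(physical_bps, repeats=1, payload=32, overhead=32):
    """Throughput after automatic repeats and per-packet framing.
    Assumes a 100% transmission success rate, as in Table 1."""
    return physical_bps / repeats * payload / (payload + overhead)

# 900 MHz radio: 4 broadcast repeats; with framing roughly equal to
# the payload (illustrative numbers), 57600 b/s yields 7200 b/s.
print(effective_rate(57600, repeats=4))  # 7200.0
```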

Also shown in the table is the wide spectrum of communication ranges at which our system can operate, from 1 meter to 60 kilometers. Maximum range denotes the maximum distance at which we have seen the device function, not the manufacturer-specified range (Aerocomm states 32 kilometers, but we have only achieved 3). The difference between maximum and typical range varies significantly. Those that see the least variation (144 MHz, optical, and acoustic) are the communication systems we designed internally, where the application required some form of communication covering the distance or operating in that medium. These communication methods have less opportunity for variety in their installation (not as many antenna options, specific installation requirements, etc.) as well as increased complexity in determining maximum ranges (optical and acoustic require testing in large bodies of water, and 144 MHz requires infrastructure and line-of-sight over more than 60 km). Those with the greatest variation are communication methods providing a multiplicity of installation options, with a variety of antenna choices as well as easier testing parameters. In installing these communication methods, we intentionally do not maximize range but instead balance the trade-off between range and success rate, a choice that becomes understandable by examining the success rates.

Our final column of the table demonstrates the communication success rates, based on our experiments. The 100% success rates seen by Bluetooth and serial indicate the reasoning behind using those for configuration and reprogramming. The high success rates also allow us to use serial interfaces (RS232 and RS485) for remote sensors, up to 300 m away, when reliability is critical. The lowest rates reflect the difficulty of the communication medium in the case of the acoustic modems and the limitations of the Aerocomm module in the case of the 900 MHz radios.

Device          Physical Layer Rate   Real Rate    Maximum Range   Typical Range   Success Rate (at Typical)
900 MHz Radio   57600 b/s             7200 b/s     3 km            100 m           25-50%
144 MHz Radio   1200 b/s              818 b/s      60 km           50 km           90%
Bluetooth       1 Mbit/s              92100 b/s    50 m            5 m             100%
Serial Cable    115200 b/s            92100 b/s    300 m           1 m             100%
Optical Modem   1 Mbit/s              800 Kbit/s   4 m             3 m             90%
Acoustic Modem  300 b/s               22 b/s       500 m           400 m           56%

Table 1: Summary of communication results.

The primary trade-off seen in these data is between distance and data rate: clearly, as the data rate decreases, the typical distance increases. By supporting such a large range of communication devices, we can analyze the application needs, choose a method, and swap communication options as necessary to optimize the trade-off for a specific application.
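This selection step can be sketched as a lookup over the Table 1 numbers: pick the fastest link whose typical range covers the required distance. This is a simplification of the real decision, which also weighs success rate, power, and medium:

```python
# Illustrative subset of the Table 1 trade-off:
# (name, real rate in b/s, typical range in m).
LINKS = [
    ("serial",    92100, 1),
    ("bluetooth", 92100, 5),
    ("900mhz",     7200, 100),
    ("144mhz",      818, 50000),
]

def pick_link(distance_m):
    """Fastest link whose typical range covers distance_m, else None."""
    usable = [l for l in LINKS if l[2] >= distance_m]
    return max(usable, key=lambda l: l[1])[0] if usable else None

print(pick_link(80))     # 900mhz
print(pick_link(20000))  # 144mhz
```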

4.2 Power Usage and Charging

We characterized the power usage of our system through

our field experiments and directed lab testing to understand what uses the most power and what trade-offs we incur through our design choices. Table 2 outlines the power usage of the different system components, as well as the components added by each application, summarizing the total current requirement of each. For a base system with the 900 MHz radio, GPS, and our base sensors, we use a minimum of 77 mA during active operation and a maximum of 182 mA instantaneous current. With power management, the base system reduces this usage to 2 mA. Transmitting messages via the radio dominates this base system, with GPS measurements and average CPU usage each drawing approximately half that current.

The average usage of this base system, ignoring communication, is 83 mA. Adding the extension boards for each of our applications increases this average usage by 1% for river flooding, 18% for coral reef monitoring, and 35% for virtual fencing. For virtual fencing, the only application where current usage is dominated by something other than communication, shocking increases instantaneous usage by 516%. When we include 1 Hz communication on the 900 MHz radio, the average current of the base system is 156 mA. Adding the 144 MHz radio, needed by the river flood application, increases

Component                      Current (mA)
Base Board Sleep Mode          2
CPU Low Usage                  16
CPU Max Usage                  59
Base Sensors                   6
GPS no fix                     44
GPS fix                        35-40
900 MHz Radio Receive Only     20
900 MHz Radio Transmit 1 Hz    73
Cow Extension Board Standby    7
Cow Extension Board On         29
Cow Shocking                   200-400
Underwater Extension Standby   <1
Underwater Extension Active    15
Acoustic Receive               110
Acoustic Transmit              200
Optical Receive                15
Optical Transmit               100-500
River Extension Board          <1
144 MHz Radio Transmit         5000

Table 2: The power usage of the various subsystems.


Location        Average Charge Current (mA)
Massachusetts   11.28
Honduras        92.12
New Mexico      29.67

Table 3: Summary of the average charge current per day in three different geographic locations.

the instantaneous current usage by 3206%. Although at first glance the power usage of our base board may seem high, clearly the application needs dominate our power usage.
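The current-budget arithmetic above follows directly from the Table 2 values, as this short check shows (the quoted 3206% presumably uses a slightly different base, so we only expect agreement to within rounding):

```python
# Reproducing the current budget from Table 2 (all values in mA).
base_avg_no_comm = 83    # measured base average, communication off
radio_tx_1hz     = 73    # 900 MHz transmit at 1 Hz
radio_144_tx     = 5000  # 144 MHz transmit burst

base_with_radio = base_avg_no_comm + radio_tx_1hz
print(base_with_radio)  # 156 mA, matching the reported average

# Instantaneous increase when the 144 MHz radio keys up:
print(round(radio_144_tx / base_with_radio * 100))  # 3205, near the quoted ~3206%
```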

With such high power usage, we need to focus on power management and, therefore, gather charge-current and discharge-current measurements. Figure 6 shows the average daily charge current for three different geographic locations: Massachusetts, Honduras, and New Mexico. In these experiments, the Massachusetts and Honduras locations used 2 watt solar cells while the New Mexico location used 1.5 watt solar cells. Table 3 summarizes this information, providing the average daily values seen by each system. Each location has significant variability in daily average charge current and instantaneous charge current available, which highlights the arguments behind our power management design decisions.

Figures 7 and 8 show the overall power usage for the virtual fence and coral reef systems, respectively. While we have power usage data for the river system, we only measure when the system is awake; as the system is asleep at least 50% of the time, a time series of these data misrepresents the true system behavior.

[Figure: Solar Charging in Different Locations; charge current (mA) vs. day for Honduras, Dover, and New Mexico.]

Figure 6: Charge current from Honduras during February, Massachusetts in late October, and New Mexico in January. Days are offset for visibility.

4.3 Deployment Time and Details

The applications stress-tested the system in several

Figure 7: The current usage over a day for the virtual fencing system. Current usage is low during the period when the solar cell is contributing energy to the system.

ways, utilizing a suite of ground, aerial, and water sensors, and several modalities of communication through air (short range and long range) and water (short range and long range). Cattle monitoring stressed the sensing functionality, with high-data-rate position requirements and rapid data access, while the flood prediction application required very computationally intensive in-situ algorithms. Furthermore, the virtual fencing and flood prediction systems had continuous in-situ deployments of more than a month, which tested the power management algorithms used in the system.

Although each system has been deployed many times, we focus on the most recent experiments.

Comparison of Applications

Overall, all applications share the same base hardware

(see Figure 5) and base software, providing 80 to 90% of the software for any given project. Table 4 outlines key features of our system and how each application uses them. This clearly demonstrates the diversity of applications our system can support.

Virtual Fencing

Virtual fencing aims to help ranchers control their cattle through a mobile sensor network. This network, with nodes attached to the heads of the cattle, monitors the location of each animal and determines if the animal has reached the edge of the "fence," a virtual boundary defined by the rancher. If the animal heads out of bounds, the system provides a stimulus in the form of a shock and/or sound to direct the animal back within the virtual fence.

On the hardware side, to help control the cattle, the system requires an expansion board providing the shock


Figure 9: The core software usage and extensions for: (a) coral reef monitoring, (b) river flooding, and (c) virtual fencing.

and sound capabilities, as shown in Figure 10(a). In this application, we use all the base software, providing 87% of the software, adding only complex, application-specific algorithms, as described in Figure 9(a). These algorithms define a virtual fence within which we keep the cattle through directional cues determined by head orientation.
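A minimal sketch of such a boundary check appears below, assuming a circular fence, a known position, and a heading in standard math convention (degrees counterclockwise from +x). The function and its cue logic are illustrative, not the deployed algorithm:

```python
import math

def fence_cue(pos, heading_deg, center, radius_m):
    """Return None while inside the fence; otherwise a 'left'/'right'
    cue that turns the animal back toward the fence center."""
    dx, dy = center[0] - pos[0], center[1] - pos[1]
    if math.hypot(dx, dy) <= radius_m:
        return None  # inside the virtual fence: no stimulus
    # Shortest turn toward the fence center, in CCW-positive degrees.
    bearing = math.degrees(math.atan2(dy, dx)) % 360
    diff = (bearing - heading_deg) % 360
    return "left" if diff < 180 else "right"

print(fence_cue((12, 0), 0, (10, 0), 5))   # None: inside the fence
print(fence_cue((0, 0), 90, (10, 0), 5))   # right: turn back toward center
```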

At the time of writing, the virtual fence system has operated for over 2 months at a ranch in New Mexico affiliated with the United States Department of Agriculture (USDA). Here we deployed 5 nodes on cattle roaming over an area of 5 square kilometers, with a sixth node as a stationary reference and a seventh as a mobile base station.

Coral Reefs: AquaNodes

Coral reef monitoring provides a valuable tool for biologists researching issues related to the flora and fauna of the reef habitat. These scientists spend many long hours making measurements and installing data loggers. Instead, a sensor network can provide a more intelligent monitoring system, regularly transmitting data to the scientists and informing them of the system status before too much time elapses. Performing this monitoring with a sensor network involves long-term installation of fixed-location sensors underwater. These nodes measure quantities such as water temperature, water pressure, and contaminants at regular intervals.

The AquaNodes require extending the base hardware and software systems, as seen in Figures 10(b) and 9(b). On the hardware side, this involves adding an expansion board with a 24-bit A/D converter to provide high-precision measurements of external temperature and pressure. Since the radio does not work underwater, we also add acoustic and optical modems. The optical system is used by divers or a robot to download the full logs, while the acoustic modem is used for periodic status updates. To aid deployment, the system has a Bluetooth radio and an LCD screen operated through the accelerometer and Hall effect sensors, enabling configuration and testing while in the water.

The base software provides 83% of the software needed by the application, including the 900 MHz radio and GPS, which help with initial configuration on land. Additionally, because GPS does not work underwater, we use ranges obtained by the acoustic modems to create a GPS-like system underwater [5]. This allows the sensor nodes to self-localize with respect to each other so that the precise locations of the collected data can be determined.
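The idea behind range-based localization can be illustrated with a noise-free 2-D trilateration sketch; the actual system of [5] handles noisy ranges and moving nodes, so treat this only as the underlying geometry:

```python
def trilaterate(anchors, ranges):
    """2-D position from ranges to three anchors at known positions.
    Linearizes the circle equations (exact for noise-free data)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Subtracting the first circle equation from the other two yields
    # a linear 2x2 system, solved here by Cramer's rule.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A node at (3, 4) measured from anchors at (0,0), (10,0), (0,10):
pos = trilaterate([(0, 0), (10, 0), (0, 10)], [5.0, 65**0.5, 45**0.5])
print([round(v, 3) for v in pos])  # [3.0, 4.0]
```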

We have deployed the AquaNodes in rivers, lakes, and on the coral reefs near Moorea, in Tahiti. During our last trip to Tahiti in August 2008, we placed 8 nodes in the reef for a few hours on most days over the course of two weeks. We were unable to leave the system unattended due to concern over theft.


Feature                             Virtual Fencing   Coral Reefs       River Floods
Mobile/Fixed                        Mobile            Fixed             Fixed
Coverage Area                       1 km              10s of km         100s of km
Operational Lifetime                Months            Months            Years
Event Time Scales                   Seconds           10s of seconds    Minutes
Number of Events                    6-12              10-12             12-15
Interval or Time Scheduled Events   Interval          Interval          Time
Sensor Operation                    Polled            Polled            Read on Access
Number of Log Files                 4-6               4-6               8-10
Ease of System Access               Easy              Difficult         Moderate

Table 4: Comparison of Applications on System

Figure 8: Power usage of an AquaNode. Power usage is initially low while the node is out of the water; once it is deployed, the acoustic modem is turned on, increasing the power usage. Spikes are acoustic transmissions. The AquaNode runs on a 3-cell battery instead of the one-cell battery used in the other systems.

Flood Prediction

River flood prediction intends to warn communities

of incoming river floods, gaining time for them to protect their property and evacuate [2]. Commonly, such a system requires significant centralized computation and large amounts of historical data. A sensor network replaces that with local sensing and self-calibrated models, allowing much easier installation and operation in any river basin in the world. The system measures rainfall, air temperature, and water pressure to predict the river level 24 hours in advance, providing sufficient warning to save the lives of those downstream.
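As a deliberately simplified stand-in for the self-calibrated models of [2] (the real system is considerably more involved), one can fit a single rainfall-to-level-rise coefficient by least squares and use it for prediction:

```python
def calibrate(history):
    """history: [(rain_mm, level_rise_m), ...] pairs from past events.
    Least-squares fit of rise = k * rain (a toy model, not the real one)."""
    num = sum(r * d for r, d in history)
    den = sum(r * r for r, d in history)
    return num / den

def predict(level_now_m, rain_mm, k):
    """Predicted river level after the forecast window."""
    return level_now_m + k * rain_mm

k = calibrate([(10, 0.5), (20, 1.0), (0, 0.0)])  # k = 0.05 m per mm
print(predict(2.0, 12, k))  # 2.6
```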

This application modifies the base hardware as shown in Figure 10(c). All the sensors are external to the system, so an extension board is needed to connect to them; each also exercises the system's variety of access options, as temperature is resistive, rainfall is interrupt-driven, and pressure is an RS485-based measurement. While local sensing clusters communicate via the AC4790 radios, to cover the large geographic area of the basin, some nodes communicate via 144 MHz radios. These radios require a modem, which we place on an expansion board used only in those few nodes.
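The two sensor-access styles mentioned earlier (continuous polling for accuracy, on-demand reads for low power) can be captured behind one interface, as in this sketch; the class and method names are illustrative, not the actual sensor layer:

```python
class Sensor:
    def read(self):
        raise NotImplementedError

class PolledSensor(Sensor):
    """Sampled continuously; read() returns the latest cached value."""
    def __init__(self):
        self.latest = None
    def on_sample(self, value):
        self.latest = value
    def read(self):
        return self.latest

class OnDemandSensor(Sensor):
    """Powered up only when read, trading latency for energy."""
    def __init__(self, acquire):
        self.acquire = acquire
    def read(self):
        return self.acquire()

rain = PolledSensor()
rain.on_sample(3.2)                        # e.g. an interrupt-driven tip
pressure = OnDemandSensor(lambda: 101.3)   # e.g. an RS485 query
print(rain.read(), pressure.read())        # 3.2 101.3
```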

On the software side, Figure 9(c) shows the necessary additions. Because we do not use the GPS or many of the internal sensors but do use the same sensor layer, the base code supplies 90% of the system code, with the additions being the distributed modeling and prediction algorithms.

The river flood system has been deployed in two different


locations: Dover, Massachusetts, and Honduras. At the Dover site, we placed 5 sensors within a square kilometer for 5 weeks during fall 2008. In Honduras, we deployed 6 nodes covering 100 square kilometers for 2 weeks. One node provided the office interface, 2 nodes provided 144 MHz communication, and 4 nodes provided sensing (one of the 144 MHz nodes also measured pressure).

4.4 Operational Behavior

Our system provides many different options in terms

of sensor, communication, computation, and power management behaviors. Figure 11 illustrates some of these potential operational behaviors, showing the percentage of time each activity occurs over a 24-hour period for each existing application. This figure combines measured values from the deployments with estimates and extrapolations where full-day testing has yet to occur, as in the case of the AquaNodes. We can see the very different usage models of each application, yet, as described in Section 4.3, each operates on the same sensor network platform with minimal changes.

5. LESSONS LEARNED

Our goal has been to develop an easily reconfigurable

sensor network architecture. We have learned several lessons from using such a system as compared to using a sensor network system with specialized hardware and software:

• The startup costs of designing our multi-functional system have been high, on both the hardware and software sides. With such a variety of needs, we found it difficult to initially design each part with the necessary flexibility, often requiring development first for one project style and later modification for the others. However, by sharing the same base hardware and software, debugging is very fast; usually another project successfully breaks a new addition within minutes. From this, we suggest that multi-application platforms use the same base code, separating out only the code specific to each application, while ensuring both many layers of abstraction and access to functions on all abstraction layers. It is nearly impossible to foresee all usage models of any aspect of the software, so access to the various layers ensures the base code remains the base code instead of fragmenting into different software projects.

• There are never enough UARTs. We have the capability to connect 6 different serial devices and can still think of more to add. From this, we conclude that some form of external serial multiplexer is necessary, whether a simple serial multiplexer or a more complicated FPGA as we use. This allows for simultaneous use of several communication methods and sensors, a situation that arose in all three of our applications, but for different components.

• The communication abstraction infrastructure makes adding and using a new communication device fast and simple. By creating such a flexible serial routing structure and utilizing an abstraction encompassing all the different bus protocols, we have no problems routing messages through different radios, different expansion boards, and different devices. Given that each application uses a different messaging structure and system with different operational behaviors, this abstraction becomes a necessity and enables future application modalities.

• The sensing abstraction enables us to treat local and remote sensors identically. This has aided in-field expansion of project goals and unique modifications for small side projects.

• The power management enables continuous operation for the applications we instantiated. Power remains a concern, however, no matter how smart you make the management, how low-power you require the components to be, or how big you make the solar panels. For each application, we designed the power system, but discovered that leaf cover and mobility affected solar charging. Therefore, intelligent power management becomes a necessity and requires hardware support in the form of circuitry for measuring charging and discharging.

• Our system enabled easy prototyping of each application, but at higher cost and more complexity than would occur in an application-specific design. Should we want a long-term production system, it would be better from cost, hardware, and software standpoints to design separate systems for each application.
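The communication-abstraction lesson above can be sketched as a small routing layer in which every transport exposes the same send() interface, so upper layers never touch a specific bus; all names here are illustrative, not the actual codebase:

```python
class Transport:
    """Any bus or radio that can deliver a payload."""
    def __init__(self, name):
        self.name, self.sent = name, []
    def send(self, payload):
        self.sent.append(payload)

class Router:
    """Maps destinations to transports; callers stay bus-agnostic."""
    def __init__(self):
        self.routes = {}
    def register(self, dest, transport):
        self.routes[dest] = transport
    def send(self, dest, payload):
        self.routes[dest].send(payload)

router = Router()
router.register("base-station", Transport("900mhz"))
router.register("aquanode-3", Transport("acoustic"))
router.send("aquanode-3", b"status?")
print(router.routes["aquanode-3"].sent)  # [b'status?']
```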

6. CONCLUSIONS

We designed a sensor network system capable of supporting a wide variety of applications. This system allows easy addition of sensors and communication types, reconfiguration of nodes, data storage and access, and user operation. Our scheduler-based operating system provides multi-level access to the many abstraction layers in our software, which aids new application development on the platform.

We characterized our system through three application areas: virtual fencing of cattle, coral reef monitoring, and river flood prediction. Each of these applications was deployed in the field, providing a wide variety of data on the base system in addition to application-specific data. Moving forward, we look to instantiate many new applications on the system, including robot control, garden monitoring, and landslide detection.

7. ACKNOWLEDGMENTS

We would like to acknowledge the following groups for

their financial support: Microsoft Research, the Public Service Center, and Xerox.

We would like to thank the following people and organizations for their assistance at various points in our projects: Alex Bahr, Andrea Llenos, Brian Julian, and the Fundacion San Alonso Rodriguez.

8. REFERENCES

[1] R. Adler, M. Flanigan, J. Huang, R. Kling,

N. Kushalnagar, L. Nachman, C. Wan, and M. Yarvis. Intel Mote 2: an advanced platform for demanding sensor network applications. In Proceedings of the 3rd International Conference on Embedded Networked Sensor Systems (SenSys), page 298, San Diego, California, USA, 2005. ACM.

[2] E. Basha, S. Ravela, and D. Rus. Model-based monitoring for early warning flood detection. In Proceedings of the 6th ACM Conference on Embedded Networked Sensor Systems (SenSys), Raleigh, NC, November 5-7, 2008.

[3] J. Beutel, S. Gruber, A. Hasler, R. Lim, A. Meier, C. Plessl, I. Talzi, L. Thiele, C. Tschudin, M. Woehrle, and M. Yuecel. Demo abstract: Operating a sensor network at 3500 m above sea level. In Proceedings of the 2009 International Conference on Information Processing in Sensor Networks (IPSN), pages 405-406. IEEE Computer Society, 2009.

[4] J. Bicket, D. Aguayo, S. Biswas, and R. Morris. Architecture and evaluation of an unplanned 802.11b mesh network. In Proceedings of the 11th Annual International Conference on Mobile Computing and Networking (MobiCom), pages 31-42, Cologne, Germany, 2005. ACM.

[5] C. Detweiler, J. Leonard, D. Rus, and S. Teller. Passive mobile robot localization within a fixed beacon field. In Proceedings of the International Workshop on the Algorithmic Foundations of Robotics, 2006.

[6] P. Dutta and D. Culler. Epic: an open mote platform for application-driven design. In Information Processing in Sensor Networks (IPSN '08), pages 547-548, 2008.

[7] P. Dutta, J. Hui, J. Jeong, S. Kim, C. Sharp, J. Taneja, G. Tolle, K. Whitehouse, and D. Culler. Trio: enabling sustainable and scalable outdoor wireless sensor network deployments. In Proceedings of the 5th International Conference on Information Processing in Sensor Networks (IPSN), pages 407-415, Nashville, Tennessee, USA, 2006. ACM.

[8] J. Hill and D. Culler. A wireless embedded sensorarchitecture for system-level optimization. 2001.

[9] B. Hull, V. Bychkovsky, Y. Zhang, K. Chen, M. Goraczko, A. Miu, E. Shih, H. Balakrishnan, and S. Madden. CarTel: a distributed mobile sensor computing system. In Proceedings of the 4th International Conference on Embedded Networked Sensor Systems (SenSys), pages 125-138, Boulder, Colorado, USA, 2006. ACM.

[10] A. Kansal, M. Goraczko, and F. Zhao. Building a sensor network of mobile phones. In Proceedings of the 6th International Conference on Information Processing in Sensor Networks (IPSN), pages 547-548, Cambridge, Massachusetts, USA, 2007. ACM.

[11] A. Ledeczi, T. Hay, P. Volgyesi, D. Hay, A. Nadas, and S. Jayaraman. Wireless acoustic emission sensor network for structural monitoring. IEEE Sensors Journal, 9(11):1370-1377, 2009.

[12] P. Levis, S. Madden, J. Polastre, R. Szewczyk, K. Whitehouse, A. Woo, D. Gay, J. Hill, M. Welsh, E. Brewer, and D. Culler. TinyOS: An Operating System for Sensor Networks, pages 115-148. 2005.

[13] Meraki. Wireless routers & WiFi networks: Indoor and outdoor wireless networks by Meraki. http://meraki.com/, 2009.

[14] L. Nachman, J. Huang, J. Shahabdeen, R. Adler, and R. Kling. IMote2: serious computation at the edge. In Wireless Communications and Mobile Computing Conference (IWCMC '08), pages 1118-1123, 2008.

[15] L. Nachman, R. Kling, R. Adler, J. Huang, and V. Hummel. The Intel Mote platform: a Bluetooth-based sensor network for industrial monitoring. In Information Processing in Sensor Networks (IPSN 2005), pages 437-442, 2005.

[16] Philips. LPC214x User Manual, 2nd edition, July 2006.

[17] J. Polastre, R. Szewczyk, and D. Culler. Telos: enabling ultra-low power wireless research. In Information Processing in Sensor Networks (IPSN 2005), pages 364-369, 2005.

[18] N. B. Priyantha, A. Chakraborty, and H. Balakrishnan. The Cricket location-support system. In Proceedings of the 6th Annual International Conference on Mobile Computing and Networking (MobiCom), pages 32-43, 2000.

[19] L. Selavo, A. Wood, Q. Cao, T. Sookoor, H. Liu, A. Srinivasan, Y. Wu, W. Kang, J. Stankovic, D. Young, and J. Porter. LUSTER: wireless sensor network for environmental research. In Proceedings of the 5th International Conference on Embedded Networked Sensor Systems (SenSys), pages 103-116, Sydney, Australia, 2007. ACM.

[20] G. Tolle, J. Polastre, R. Szewczyk, D. Culler, N. Turner, K. Tu, S. Burgess, T. Dawson, P. Buonadonna, D. Gay, and W. Hong. A macroscope in the redwoods. In Proceedings of the 3rd International Conference on Embedded Networked Sensor Systems (SenSys), pages 51-63, San Diego, California, USA, 2005. ACM.

[21] T. Wark, P. Corke, P. Sikka, L. Klingbeil, Y. Guo, C. Crossman, P. Valencia, D. Swain, and G. Bishop-Hurley. Transforming agriculture through pervasive wireless sensor networks. IEEE Pervasive Computing, 6(2):50-57, 2007.

[Figure: hardware block diagram. Base board: LPC2148 60 MHz ARM7 processor, Atmega164P 8-bit 8 MHz processor, low-power FPGA, 512 KB FRAM, SD card storage, 3-axis magnetic compass, 3-axis accelerometer, internal pressure sensor, Hall effect sensors, battery fuel gauge, GPS, 1 watt Aerocomm 900 MHz radio, Bluetooth radio, RS232 level shifter, and 132x132 LCD display, connected over I2C, SPI, and serial buses. Extension board: 24-bit analog-to-digital converter, external pressure sensor, external temperature sensor, acoustic modem, and optical modem.]

Figure 10: The core hardware usage and extensions for: (a) coral reef monitoring, (b) river flooding, and (c) virtual fencing.


[Figure: percentage of total time spent communicating, processing, sensing, and power saving for River Flood Monitoring, AquaNode, and Virtual Fencing.]

Figure 11: Overview of system behaviors for different applications.