Embedded Systems: Security. Reference: Kocher et al., DAC 2004, pp. 753-760


Page 1

Embedded Systems: Security

Reference:

Kocher et al., DAC 2004, pp. 753-760

Page 2

This material addresses security, not safety or reliability

Standard security protocols treat cryptographic algorithms from a functional perspective.

Embedded systems, constrained by their particular environments and resources, move security concerns from a function-centric perspective to a hardware/software (system architecture) design issue.

Page 3

Embedded systems must be secure when accessed logically or physically by malicious entities (software attacks, physical attacks, side-channel attacks)

• Security processing is computationally demanding, while embedded system resources may be minimal; this can lead to undesirable tradeoffs between security and cost, and between security and performance

• Security demands also have a big impact on battery-driven systems, where resource constraints are severe

• Security mechanisms and standards can evolve rapidly; embedded architectures must allow for this

• Certain objectives, such as resisting denial-of-service attacks and protecting digital content, require that embedded system architects cooperate with security experts

Architectural & design methodology solutions needed

Page 4

Security requirements can be approached from a number of perspectives. Example: for a cell phone, perspectives include:

--manufacturer of a component in the phone

--cell phone manufacturer: secrecy of proprietary firmware in the cell phone

--cellular service provider

--content provider: copy protection of content—e.g., end user may be untrusted entity

--end user: security of personal data stored and communicated

Page 5

Basic security requirements (end user perspective):

--user identification: restrict access to selected set of authorized users

--secure network access: only to authorized devices

--availability: avoid degradation of service and denial of service

--secure storage—external/internal devices, erasures as needed

--content security (digital rights management)

--tamper resistance—even when malicious parties can physically or logically probe devices

Page 6

Basic security mechanisms—cryptographic algorithms

--symmetric algorithms—sender and receiver use the same secret key; confidentiality during transmission; without the secret key, encryption/decryption is very difficult—ex: AES

--secure hash functions—often used to construct message authentication functions—ex: MD5, SHA

--asymmetric (public-key) algorithms—sender and receiver have separate keys—sender uses the receiver's public key, receiver uses its own private key; also used for digital signatures—ex: RSA

Public key ciphers are computationally intensive, thus combinations of techniques may be used, e.g., public key for authentication, AES for sending bulk data

Page 7

Security typically relies on one or more of the above algorithms, along with security protocols:

--Secure communication protocols, e.g. VPN

--Digital certificates and related authentication mechanisms, e.g., digital signatures, biometric technologies

--private secure frameworks to protect application content

--secure storage and secure execution—e.g., dedicated hardware, authentication of software and firmware, use of encrypted code

Page 8

Attacks and countermeasures:

“Trinity of trouble”—complexity, extensibility, connectivity

Complexity: software complexity implies we cannot “prove” most software safe—it is too long and complex; popular languages such as C and C++ do not protect against even simple kinds of attacks such as buffer overflow

Extensibility: systems are designed to be extensible through software updates, dynamically loadable device drivers and modules—these extensions provide opportunities for new software vulnerabilities to be added

Connectivity: connection to internet allows small failures to propagate and become massive failures; attackers can launch attacks without having physical access; poor software practices can spread vulnerabilities

Page 9

Example: hardware virus

Attack the OS kernel, which has access to all memory space, e.g., can read or write the BIOS; in older systems the BIOS was likely in ROM or EPROM; in newer systems it may be in flash ROM, which can be rewritten using software

Flash ROM often has extra space, which can be used to store backdoor access; rebooting, “restoring system” will not remove the problem

Such a virus can input false data or order the OS to ignore certain critical events

Page 10

Securing against software attacks:

e.g., buffer overflows, inconsistent error handling

Prevention:

--Include security concerns THROUGHOUT the design process

--Know and understand common pitfalls, including language vulnerabilities

--Design for security

--Use thorough, ongoing risk analysis and testing

--Understand that a security problem is more likely to arise in a standard part of the system (e.g., an API) than in a part of the system focusing on security

Page 11

“Best practices” in software development life cycle:

[Figure: security best practices mapped onto the software development life cycle. Lifecycle artifacts: requirements and use cases, design, test plans, code, test results, field feedback. Practices applied along the way: security requirements, abuse cases, risk analysis, external review, risk-based security tests, static analysis, penetration testing, analysis of security breaks.]

Page 12

Must apply software security best practices at all levels:

--requirements: overt security such as cryptographic protocols and also emergent characteristics

--design and architecture level—need coherent system, unified security architecture, use of security principles such as principle of least privilege

--code—use static analysis tools to scan for common source code vulnerabilities

--need constant risk analysis

--need ongoing monitoring—attacks will happen and must be caught and system fixed

Page 13

Physical and side-channel attacks—e.g., on smart cards

Invasive: e.g., microprobing, reverse engineering—require access and thus are difficult to mount and repeat

Non-invasive attacks: e.g., timing analysis, power analysis, fault injection, electromagnetic analysis—comparatively cheap and scalable

Page 14

Side channel attack: in cryptography, a side channel attack is any attack based on information gained from the physical implementation of a cryptosystem, rather than brute force or theoretical weaknesses in the algorithms (compare cryptanalysis). ...

---en.wikipedia.org/wiki/Side_channel_attack

Page 15

Physical attacks: require depackaging and layout reconstruction. Difficult and expensive, but can be carried out once and then guide subsequent noninvasive attacks.

Timing analysis: can use statistical analysis to recover key values, e.g., can actually infer bit values of the key one at a time. This attack is immune to simple fixes such as quantizing the time taken or randomizing delays; making all computations take exactly the same amount of time would work, but this is almost impossible to achieve (similar to matching gate delays, e.g.).

Page 16

Successful protective techniques do exist, e.g., “message blinding” may work

In cryptography, blinding is a technique by which an agent can provide a service to (i.e., compute a function for) a client in an encoded form without knowing either the real input or the real output. Blinding techniques also have applications in preventing side-channel attacks on encryption devices.

More precisely, Alice has an input x and Oscar has a function f. Alice would like Oscar to compute y = f(x) for her without revealing either x or y to him. The reason for her wanting this might be that she doesn't know the function f or that she does not have the resources to compute it. Alice "blinds" the message by encoding it into some other input E(x); the encoding E must be a bijection on the input space of f, ideally a random permutation. Oscar gives her f(E(x)), to which she applies a decoding D to obtain D(f(E(x))) = y. Of course, not all functions admit of blind computation.

The most common application of blinding is the blind signature. In a blind signature protocol, the signer digitally signs a message without being able to learn its content.

http://en.wikipedia.org/wiki/Blinding_%28cryptography%29

Page 17

Power analysis

Simple power analysis: infer cryptographic key by power analysis of functions used in cryptographic computations (finite field multiplication and exponentiation, e.g.)

Differential power analysis: use statistics to determine key values

Fault induction: inclusion of a fault in a computation can allow the recovery of a key in RSA, e.g.

Electromagnetic analysis: use radiation emitted by the device to infer sensitive information; e.g., radiation from a video display can be used to reconstruct screen contents

Page 18

Secure information processing—architectural design space

Macroarchitecture: ASICs, general-purpose/FPGA, general-purpose/HW accelerator, general-purpose/coprocessor, application-specific/accelerator, secure general processor…

Base processor parameters: word size, #registers, #pipeline stages, #instructions per cycle, cache architecture

Security processing features: choice of custom instructions, choice of HW accelerators

Attack-resistant features: secure memory space, concurrent fault detection

Page 19

ASICs: hardware only—effective if enough processors of the same type are required, otherwise prohibitively expensive (e.g., Intel processors with AES functions built in)

Software-only: cryptographic protocols may be too computationally intensive (“processing gap” and/or “battery gap”)

Combination: hardware with acceleration—many possibilities

Page 20

Attack-resistant architectures:

e.g., owner of embedded processor may be “attacker” in cases of digital rights management—owner wants to make copies of a film, e.g.

Page 21

Design methodology

Formal or non-formal security specifications may be too cumbersome for system design budgets and time-to-market constraints

Much more research is needed here to develop reliable, practical tools

Must be usable by designers who may not be security experts