Representing Characters in a Computer System

Representation of Data in Computer Systems


Page 1: Representing Characters in a Computer System

Representing Characters in a Computer System

Representation of Data in Computer Systems

Page 2: Representing Characters in a Computer System

Activity 1 (5 mins)

Convert the following binary numbers to hexadecimal.

0010 0111

1001 1000
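The conversions above can be checked with a short Python sketch: parse each binary string as a base-2 number, then format the result as hexadecimal.

```python
# Convert the Activity 1 binary numbers to hexadecimal.
# int(bits, 2) parses a base-2 string; "02X" formats the value as hex.
for bits in ["00100111", "10011000"]:
    value = int(bits, 2)
    print(f"{bits} = 0x{value:02X}")  # 00100111 = 0x27, 10011000 = 0x98
```

Notice that each group of four bits maps directly to one hexadecimal digit, which is why the grouping in the question is helpful.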

Page 3: Representing Characters in a Computer System

Representing Characters

Page 4: Representing Characters in a Computer System

Introduction

As we have seen before, computers can only deal with 0s and 1s (binary).

All data that a computer needs to work with (numbers, sound, images, etc.) must be converted into binary before the computer can process it.

It is exactly the same for text, or a single unit of text known as a character.

Each time you press a key on a keyboard, the computer generates a code for that character. The CPU processes this code, and the result might be the letter appearing on the screen or being printed on paper.
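This key-to-code mapping can be seen directly in Python, whose `ord` and `chr` built-ins convert between a character and its numeric code (for English letters the values match ASCII).

```python
# A keypress becomes a numeric code; the code can be turned back
# into a character, or shown as the bit pattern the computer stores.
code = ord("G")             # character -> numeric code
print(code)                 # 71
print(chr(code))            # numeric code -> character: G
print(format(code, "08b"))  # the 8-bit binary pattern: 01000111
```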

Learning Objectives: Characters:

(a) Explain the use of binary codes to represent characters

(b) Explain the term character set

(c) Describe with examples (for example ASCII and Unicode) the relationship between the number of bits per character in a character set and the number of characters that can be represented

Page 5: Representing Characters in a Computer System

Introduction

So that all computer systems behave in the same way, it is important that there is an agreed set of codes for characters.

In 1963, the American Standards Association agreed on a set of codes to represent the main characters in the English language.

This is known as ASCII: the American Standard Code for Information Interchange.

Page 6: Representing Characters in a Computer System

ASCII Character Set

The English language requires the number of codes shown below:

Letters of the alphabet (lower case): 26
Letters of the alphabet (upper case): 26
Numeric symbols: 10
Punctuation, symbols and 'space': 33
Non-printable control codes (including DEL): 33

95 printable
33 non-printable
128 in total

Page 7: Representing Characters in a Computer System

ASCII Character Set

As we know, one byte can store 256 different values:
Binary: 00000000 – 11111111
Denary: 0 – 255

The original ASCII system defines 128 different codes (0–127). Since 2^7 = 128, every ASCII character fits in 7 bits.
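The relationship between bits and codes is simply a power of two, and each extra bit doubles the number of distinct codes. A quick sketch:

```python
# The number of distinct codes an n-bit pattern can hold is 2 ** n.
for n in (7, 8, 16):
    print(f"{n} bits -> {2 ** n} codes")
# 7 bits -> 128 codes (original ASCII)
# 8 bits -> 256 codes (extended ASCII)
# 16 bits -> 65536 codes
```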

Page 8: Representing Characters in a Computer System

ASCII Character Set

As 8-bit machines became standard, the ASCII character set was extended to use the extra bit, providing a further 128 characters.

So, conveniently, a single byte is used to represent all the characters needed for the English language.

Page 9: Representing Characters in a Computer System

Page 10: Representing Characters in a Computer System

Activity 2 (5 mins): Write ASCII in ASCII code (binary)

Page 11: Representing Characters in a Computer System

Activity 2 (5 mins)

Write ASCII in ASCII code (binary)

A  65  1000001
S  83  1010011
C  67  1000011
I  73  1001001
I  73  1001001
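The answer above can be reproduced in Python by looking up each letter's code with `ord` and formatting it as a 7-bit binary string:

```python
# Print each letter of "ASCII" with its decimal code and
# 7-bit binary pattern, matching the answer table above.
for ch in "ASCII":
    code = ord(ch)
    print(ch, code, format(code, "07b"))  # e.g. A 65 1000001
```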

Page 12: Representing Characters in a Computer System

The problem with ASCII

So, we have seen how the extended ASCII character set can hold up to 256 characters.

What is the problem with this?

The issue is that some languages (such as Chinese and Japanese) use thousands of different characters – far too many to fit into a single byte.

Page 13: Representing Characters in a Computer System

UNICODE

As computers developed and 16-bit machines were introduced, a new character set was created to accommodate the many other languages of the world.

This new character set is known as Unicode.

Page 14: Representing Characters in a Computer System

UNICODE

Unicode gives every character in the world's major writing systems its own numeric code (a code point). Depending on the encoding used, a character may take up to 32 bits; the widely used UTF-16 encoding stores each character as one or two 16-bit units.

Within Unicode, the original 128 ASCII characters keep the same code values; further characters have simply been added on.
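This backwards compatibility can be checked in Python, where `ord` returns a character's Unicode code point (the accented and CJK characters below are just illustrative examples):

```python
# ASCII characters keep their old values in Unicode;
# non-ASCII characters simply get larger code points.
print(ord("A"))   # 65    - same value as in ASCII
print(ord("é"))   # 233   - beyond 7-bit ASCII
print(ord("中"))  # 20013 - a CJK character, far beyond one byte
```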
