
    The Computer Chronicles

Alton C. Crews Middle School, Spring Issue 2010


NEWS FLASH! A New Generation of Computers is about to be Announced
by Roderick Hames

In the beginning... A generation refers to the state of improvement in the development of a product. The term is also used for the different advancements in computer technology. With each new generation, the circuitry has gotten smaller and more advanced than the previous generation. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being developed that affect the way we live, work, and play.

The First Generation: 1946-1958 (The Vacuum Tube Years)

The first generation computers were huge, slow, expensive, and often undependable. In 1946, two Americans, Presper Eckert and John Mauchly, built the ENIAC, an electronic computer that used vacuum tubes instead of the mechanical switches of the Mark I. The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just as light bulbs do. The ENIAC led to other vacuum tube computers like the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (UNIVersal Automatic Computer).

The vacuum tube was an extremely important step in the advancement of computers. Vacuum tubes were invented around the same time Thomas Edison invented the light bulb, and they worked in a very similar way. Their purpose was to act as an amplifier and a switch. Without any moving parts, vacuum tubes could take very weak signals and make them stronger (amplify them). Vacuum tubes could also stop and start the flow of electricity instantly (switch). These two properties made the ENIAC computer possible.


The ENIAC gave off so much heat that it had to be cooled by gigantic air conditioners. However, even with these huge coolers, vacuum tubes still overheated regularly. It was time for something new.

    The Second Generation: 1959-1964 (The Era of the Transistor)

The transistor computer did not last as long as the vacuum tube computer, but it was no less important in the advancement of computer technology. In 1947, three scientists, John Bardeen, William Shockley, and Walter Brattain, working at AT&T's Bell Labs, invented what would replace the vacuum tube forever. This invention was the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals.

There were obvious differences between the transistor and the vacuum tube. The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube. One transistor replaced the equivalent of 40 vacuum tubes. These transistors were made of solid material, including silicon, an abundant element (second only to oxygen) found in beach sand and glass. Therefore they were very cheap to produce. Transistors were found to conduct electricity faster and better than vacuum tubes. They were also much smaller and gave off virtually no heat compared to vacuum tubes. Their use marked a new beginning for the computer. Without this invention, space travel in the 1960s would not have been possible. However, a new invention would advance our ability to use computers even further.

The Third Generation: 1965-1970 (Integrated Circuits: Miniaturizing the Computer)

Transistors were a tremendous breakthrough in advancing the computer. However, no one could predict that thousands, and now even millions, of transistors (circuits) could be compacted into such a small space. The integrated circuit, sometimes called the semiconductor chip, packs a huge number of transistors onto a single wafer of silicon. Robert Noyce of Fairchild Corporation and Jack Kilby of Texas Instruments independently discovered the amazing attributes of integrated circuits. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.

Since the invention of integrated circuits, the number of transistors that can be placed on a single chip has doubled every two years, shrinking both the size and cost of computers even further and further enhancing their power. Most electronic devices today use some form of integrated circuits placed on printed circuit boards (thin pieces of bakelite or fiberglass that have electrical connections etched onto them), sometimes called a motherboard.
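The arithmetic behind that doubling is easy to check. Here is a quick sketch (in Python, not part of the original article) that takes the Intel 4004's roughly 2,300 transistors in 1971 as a baseline and doubles the count every two years:

# Project transistor counts under a strict two-year doubling rule.
def transistors(year, base_year=1971, base_count=2300):
    return base_count * 2 ** ((year - base_year) // 2)

for year in (1971, 1981, 1991, 2001):
    print(year, transistors(year))   # 1981 -> 73,600; 2001 -> 75,366,400

By 2001 the rule already predicts tens of millions of transistors per chip, which matches the millions of transistors described in the next section.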

These third generation computers could carry out instructions in billionths of a second. The size of these machines dropped to the size of small file cabinets. Yet the single biggest advancement in the computer era was yet to be discovered.


The Fourth Generation: 1971-Today (The Microprocessor)

This generation can be characterized by both the jump to monolithic integrated circuits (millions of transistors put onto one integrated circuit chip) and the invention of the microprocessor (a single chip that can do all the processing of a full-scale computer). By putting millions of transistors onto one single chip, computers could do more calculation and reach faster speeds. Because electricity travels about a foot in a billionth of a second, the smaller the distance, the greater the speed of computers.
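To see what that one-foot-per-nanosecond figure means for miniaturization, here is a small back-of-the-envelope calculation (a sketch in Python; the distances are illustrative, not from the article):

# At about one foot per nanosecond, signal delay scales with distance.
FEET_PER_NANOSECOND = 1.0

for feet, label in [(1.0, "one foot of wiring"), (0.01, "a chip-sized trace")]:
    print(label, ":", feet / FEET_PER_NANOSECOND, "ns")

Cutting the path a signal travels by a factor of 100 cuts the delay by the same factor, which is why smaller circuits mean faster computers.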

However, what really triggered the tremendous growth of computers and their significant impact on our lives was the invention of the microprocessor. Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was made to be used in calculators, not computers. It led, however, to the invention of personal computers, or microcomputers.

It wasn't until the 1970s that people began buying computers for personal use. One of the earliest personal computers was the Altair 8800 computer kit. In 1975 you could purchase this kit and put it together to make your own personal computer. In 1977 the Apple II was sold to the public, and in 1981 IBM entered the PC (personal computer) market.

Today we have all heard of Intel and its Pentium processors, and now we know how it all got started. The computers of the next generation will have millions upon millions of transistors on one chip and will perform over a billion calculations in a single second. There is no end in sight for the computer movement.

Questions

Directions: Answer each of the questions after reading the article above. Write in complete sentences. You must think and be creative with your answers.

1. In each of the four generations, what was the cause of the increase in speed, power, or memory?

2. Why did the ENIAC and other computers like it give off so much heat? (Be very specific.)

3. What characteristics made the transistor better than the vacuum tube?

4. How was space travel made possible through the invention of transistors?

5. What did the microprocessor allow computers to do, and what was the microprocessor's original purpose?

6. When was the first computer offered to the public, and what was its name?

7. What were Robert Noyce and Jack Kilby known for?


8. Who started Intel?

9. What are monolithic integrated circuits?

10. How do you think society will be different if scientists are able to create a chip that will perform a trillion operations in a single second?

Processors of old and new

[Images: one of the first ICs, a 386 processor, a Pentium processor, and the new processors]

This site was created by Roderick Hames for the primary purpose of teaching and demonstrating computer & business skills.

Any distribution or copying without the express or written consent of Alton C. Crews Middle School or its creator is strictly prohibited.

***Any questions, comments or suggestions concerning this page or this Web site should be forwarded to Roderick Hames, Computer Science / Business Education Teacher.

Copyright 2009, Alton C. Crews Middle School: CS Dept - Articles


    Dated: Aug. 13, 2004


    By Najmi


The history of computer development is often discussed in terms of the different generations of computing devices. A generation refers to the state of improvement in the product development process. The term is also used for the different advancements in computer technology. With each new generation, the circuitry has gotten smaller and more advanced than the previous generation. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being developed that affect the way we live, work, and play.

Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the current devices we use today.

    First Generation - 1940-1956: Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been relegated to auxiliary storage.

The tracks on a magnetic drum are assigned to channels located around the circumference of the drum, forming adjacent circular bands that wind around the drum. A single drum can have up to 200 tracks. As the drum rotates at a speed of up to 3,000 rpm, the device's read/write heads deposit magnetized spots on the drum during the write operation and sense these spots during a read operation. This action is similar to that of a magnetic tape or disk drive.
    http://www.techiwarehouse.com/cat/47/Computerhttp://www.techiwarehouse.com/cms/engine.php?page_id=51c38188http://www.techiwarehouse.com/engine/53eac295/Charles-Babbage---Father-of-Computinghttp://www.techiwarehouse.com/engine/53eac295/Charles-Babbage---Father-of-Computinghttp://www.techiwarehouse.com/cat/47/Computerhttp://www.techiwarehouse.com/cms/engine.php?page_id=51c38188http://www.techiwarehouse.com/engine/53eac295/Charles-Babbage---Father-of-Computinghttp://www.techiwarehouse.com/engine/53eac295/Charles-Babbage---Father-of-Computing

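The rotation speed quoted above sets how long the heads must wait for data to come around, as this small calculation (a sketch in Python, not from the original article) shows:

# At 3,000 rpm, one revolution takes 60,000 ms / 3,000 = 20 ms,
# so a random read waits half a revolution on average.
rpm = 3000
ms_per_revolution = 60_000 / rpm
print("one revolution:", ms_per_revolution, "ms")    # 20.0 ms
print("average wait:", ms_per_revolution / 2, "ms")  # 10.0 ms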

These computers were very expensive to operate, and in addition to using a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Programmers, therefore, use either high-level programming languages or assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers.

Programs written in high-level programming languages are translated into assembly language or machine language by a compiler. Assembly language programs are translated into machine language by a program called an assembler (an assembly language compiler).
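The relationship between the kinds of languages can be illustrated with a toy assembler (a sketch in Python; the instruction set and opcode numbers here are hypothetical, not any real CPU's):

# Assembly uses names; machine language is the same program as numbers.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}
VARIABLES = {"price": 0x10, "tax": 0x11, "total": 0x12}

def assemble(program):
    """Translate (mnemonic, operand) pairs into numeric machine code."""
    code = []
    for mnemonic, operand in program:
        code.append(OPCODES[mnemonic])
        if operand is not None:
            code.append(VARIABLES[operand])
    return code

source = [("LOAD", "price"), ("ADD", "tax"), ("STORE", "total"), ("HALT", None)]
print(assemble(source))   # [1, 16, 2, 17, 3, 18, 255]

A real assembler performs the same kind of table lookup, and a different CPU family would simply use a different table, which is why programs must be reassembled or recompiled for each machine.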

Every CPU has its own unique machine language. Programs must therefore be rewritten or recompiled to run on different types of computers. Input was based on punched cards and paper tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer; the first unit was delivered to the U.S. Census Bureau in 1951.

ENIAC, an acronym for Electronic Numerical Integrator And Computer, was the world's first operational electronic digital computer, developed by Army Ordnance to compute World War II ballistic firing tables. The ENIAC, weighing 30 tons, using 200 kilowatts of electric power, and consisting of 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors, was completed in 1945. In addition to ballistics, the ENIAC's fields of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. The ENIAC soon became obsolete as the need arose for faster computing speeds.

Second Generation - 1956-1963: Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers.


Fourth Generation - 1971-Present: Microprocessors

Instruction Set: The set of instructions that the microprocessor can execute.

Bandwidth: The number of bits processed in a single instruction.

Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.

In both cases, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50 MHz is more powerful than a 16-bit microprocessor that runs at 25 MHz.
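Treating the two numbers as a rough throughput formula makes the comparison concrete (a simplified sketch; real processors differ in many other ways):

# Rough throughput = bits per instruction * instructions per second.
def throughput_bits_per_sec(bandwidth_bits, clock_mhz):
    return bandwidth_bits * clock_mhz * 1_000_000

a = throughput_bits_per_sec(32, 50)   # 32-bit CPU at 50 MHz
b = throughput_bits_per_sec(16, 25)   # 16-bit CPU at 25 MHz
print(a, b, a / b)                    # 1600000000 400000000 4.0

By this simple measure, the 32-bit, 50 MHz part moves four times as many bits per second.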

What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer (from the central processing unit and memory to input/output controls) on a single chip.

CPU is an abbreviation of central processing unit, pronounced as separate letters. The CPU is the brains of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.

On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor.

Two typical components of a CPU, sketched in the example after this list, are:

The arithmetic logic unit (ALU), which performs arithmetic and logical operations.

The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary.
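A minimal sketch of how those two parts cooperate (in Python; the three-instruction machine here is hypothetical):

def alu(op, a, b):
    # Arithmetic logic unit: the arithmetic and logical operations.
    return {"ADD": a + b, "SUB": a - b, "AND": a & b}[op]

def run(program):
    # Control unit: fetch each instruction in order, decode it, and
    # execute it, calling on the ALU to do the actual arithmetic.
    accumulator = 0
    for op, value in program:
        accumulator = alu(op, accumulator, value)
    return accumulator

print(run([("ADD", 7), ("ADD", 5), ("SUB", 2)]))   # 10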

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse, and handheld devices.


Fifth Generation - Present and Beyond: Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.

Artificial intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at the Dartmouth Conference. Artificial intelligence includes:

Games Playing: programming computers to play games such as chess and checkers.

Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms).

Natural Language: programming computers to understand natural human languages.

Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains.

Robotics: programming computers to see and hear and react to other sensory stimuli.

Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of games playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.

In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.


Natural-language processing offers the greatest potential rewards because it would allow people to interact with computers without needing any specialized knowledge. You could simply walk up to a computer and talk to it. Unfortunately, programming computers to understand natural languages has proved to be more difficult than originally thought. Some rudimentary translation systems that translate from one human language to another exist, but they are not nearly as good as human translators.

There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited; you must speak slowly and distinctly.

In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.

Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.

There are several programming languages that are known as AI languages because they are used almost exclusively for AI applications. The two most common are LISP and Prolog.


Voice Recognition

Voice recognition is the field of computer science that deals with designing computer systems that can recognize spoken words. Note that voice recognition implies only that the computer can take dictation, not that it understands what is being said. Comprehending human languages falls under a different field of computer science called natural language processing. A number of voice recognition systems are available on the market. The most powerful can recognize thousands of words. However, they generally require an extended training session during which the computer system becomes accustomed to a particular voice and accent. Such systems are said to be speaker dependent.

Many systems also require that the speaker speak slowly and distinctly and separate each word with a short pause. These systems are called discrete speech systems.


Recently, great strides have been made in continuous speech systems: voice recognition systems that allow you to speak naturally. There are now several continuous-speech systems available for personal computers.

Because of their limitations and high cost, voice recognition systems have traditionally been used only in a few specialized situations. For example, such systems are useful in instances when the user is unable to use a keyboard to enter data because his or her hands are occupied or disabled. Instead of typing commands, the user can simply speak into a headset. Increasingly, however, as costs decrease and performance improves, speech recognition systems are entering the mainstream and are being used as an alternative to keyboards.

The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Parallel processing is the simultaneous use of more than one CPU to execute a program. Ideally, parallel processing makes a program run faster because there are more engines (CPUs) running it. In practice, it is often difficult to divide a program in such a way that separate CPUs can execute different portions without interfering with each other.

Most computers have just one CPU, but some models have several. There are even computers with thousands of CPUs. With single-CPU computers, it is possible to perform parallel processing by connecting the computers in a network. However, this type of parallel processing requires very sophisticated software called distributed processing software.

Note that parallel processing differs from multitasking, in which a single CPU executes several programs at once.

Parallel processing is also called parallel computing.
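The distinction shows up in miniature in Python's standard library: with a process pool, the same function really does run on several CPUs at once, rather than one CPU switching between tasks (a sketch; the workload is a stand-in for any CPU-heavy portion of a program):

from multiprocessing import Pool

def sum_of_squares(limit):
    # Stand-in for a CPU-heavy portion of a program.
    return sum(i * i for i in range(limit))

if __name__ == "__main__":
    chunks = [2_000_000] * 4
    with Pool(processes=4) as pool:                  # four engines (CPUs)
        results = pool.map(sum_of_squares, chunks)   # portions run in parallel
    print(sum(results))

Note how the program had to be divided into independent chunks first; that division is exactly the hard part mentioned above.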

Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. First proposed in the 1970s, quantum computing relies on quantum physics, taking advantage of certain quantum properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, serving as the computer's processor and memory. By interacting with each other while isolated from the external environment, qubits can perform certain calculations exponentially faster than conventional computers.

Qubits do not rely on the traditional binary nature of computing. While traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only do calculations on one set of numbers at once, quantum computers encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. Such a state might represent a 1 or a 0, a combination of the two, a value somewhere between 1 and 0, or a superposition of many different numbers at once.


A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and it also has some ability to produce interference between the various numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.
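The superposition idea can be sketched numerically (in Python with numpy; this simulates the mathematics of one qubit, not real quantum hardware):

import numpy as np

qubit = np.array([1.0, 0.0])                         # definitely 0
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposed = hadamard @ qubit                        # equal mix of 0 and 1
print(superposed)                 # [0.70710678 0.70710678]
print(np.abs(superposed) ** 2)    # [0.5 0.5] -- either result on measurement

After the operation, both values are present in the single state vector, which is the sense in which one quantum processing unit works on many numbers at once.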

Quantum computing is not well suited for tasks such as word processing and email, but it is ideal for tasks such as cryptography and modeling and indexing very large databases.

Nanotechnology is a field of science whose goal is to control individual atoms and molecules to create computer chips and other devices that are thousands of times smaller than current technologies permit. Current manufacturing processes use lithography to imprint circuits on semiconductor materials. While lithography has improved dramatically over the last two decades, to the point where some manufacturing plants can produce circuits smaller than one micron (1,000 nanometers), it still deals with aggregates of millions of atoms. It is widely believed that lithography is quickly approaching its physical limits. To continue reducing the size of semiconductors, new technologies that juggle individual atoms will be necessary. This is the realm of nanotechnology.

Although research in this field dates back to Richard P. Feynman's classic talk in 1959, the term nanotechnology was first coined by K. Eric Drexler in 1986 in the book Engines of Creation.

In the popular press, the term nanotechnology is sometimes used to refer to any sub-micron process, including lithography. Because of this, many scientists are beginning to use the term molecular nanotechnology when talking about true nanotechnology at the molecular level.

The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Here natural language means a human language. For example, English, French, and Chinese are natural languages. Computer languages, such as FORTRAN and C, are not.

Probably the single most challenging problem in computer science is to develop computers that can understand natural languages. So far, the complete solution to this problem has proved elusive, although a great deal of progress has been made. Fourth-generation languages are the programming languages closest to natural languages.

The Internet has sunk its claws deep into people's hearts and minds. Users now want to stay connected to the Net all the time so that no correspondence is delayed and they don't miss any opportunity to click on a prize. One tends to forget the amount of manpower being consumed while playing lotto online or downloading a new flash game. The gum on the screen is so sticky that nobody gets time to shift his or her eyes from the monitor.


This is a destructive trait for employees in mission-critical environments. Thanks to the proliferation of the Internet in the corporate sector, how employees utilize their online time at the workplace is now a critical debate.

Increasingly, companies are finding that workers tend to get sidetracked indulging in personal entertainment or catching up with friends via instant messengers, resulting in hours of wasted company time. Employees who waste precious company time in mindless Net pursuits are termed CyberSlackers. Over the past three years, several organizations such as The New York Times, Rolls Royce, and Xerox have fired workers for abusing company Net accounts. Two popular reasons why more workers are finding themselves out of a job are downloading porn and sending out obscene emails to friends and colleagues. In the age of the connected, a company can therefore never expect to meet targets if checks are not applied and policies are not outlined for how employees spend their online time. What factors should organizations keep in mind while drafting the ground rules for their employees?

Is the time spent in various online non-productive activities an appreciable thing?

Is the cost of Internet usage worth the company's monthly expense?

Is the time spent on the Net beneficial for the company, or are the employees wasting a major chunk of the account in individual pursuits?

There are three main reasons why companies should be concerned about their employees' surf habits:

Loss of productivity

Legal liability

Waste of bandwidth

Loss of Productivity

If employees are using the Internet for non-work related purposes, this results in reduced productivity and ultimately a loss in profits. On average, workers browse the Internet more at the office than in the odd few hours at home, thanks to the proxy settings on their PCs, which allow full-time connectivity.

The US Treasury Department recently monitored the Internal Revenue Service's (IRS) workforce's Internet use.

They found that activities such as personal email, chat, online shopping, and personal finance and stocks accounted for 51% of employees' time spent online. The top non-work Web activity favored by IRS officials was surfing financial websites. Chat and email ran a close second, followed by miscellaneous activities including visiting adult sites, search requests, and looking at or downloading streaming media.

Time is an asset, and misuse of that asset is just as wrong as the misuse of any other asset that the company holds. As with any project, meeting a deadline is always a core issue.


Internet addiction makes meeting crucial deadlines an impossible task, and one never realizes how much precious time is wasted whiling away on the Internet checking horoscopes and news trivia.

Legal Liability

Employees are betraying their company's trust and abusing their online time when they download material that is illegal or inappropriate. An employer can face legal risks at the hands of careless employees. Companies are also at legal risk for copyright violation when employees download protected mp3 files or pirated software.

Employees can also sue their employers if a co-worker has downloaded pornographic or racist materials. Clearly, it has become essential for companies to be aware of what their employees are downloading from the Internet, and for them to take steps to avoid liability by introducing employee Internet management strategies. The following habits should not be entertained in a regular office:

Uploading illegal materials to a public Web site, or gaining illegal access to a network or server by hacking or cracking passwords.

Sending out computer viruses or launching denial-of-service attacks on the Internet.

Sending illegal material such as child pornography to co-workers.

Emailing hate letters or slanderous letters over the Internet.

Posting unfounded corporate rumors on stock market bulletin boards.

Sending emails that may offend co-workers and are covered under sexual harassment laws.

Otherwise engaging in similar online behavior.

Employees who are treated with dignity and respect, who take pride in their organization and its ethics, tend to respect the assets of that organization. Just as stealing pens, stationery, and hardware is unethical, acting indifferent on the Net during office hours is a bad techie trait.

Waste of Bandwidth

As broadband applications over the Internet continue to become increasingly popular, corporate networks are becoming bottlenecked; streaming media, mp3 files, video and audio files, and large graphic files are increasing network crashes. For many companies, network quality of service (QoS) may be their most important business asset. If QoS is dragging, so is the company's ability to keep pace with the competition. Today's Internet lets employees buy products, chat with friends, visit their kids at daycare, listen to RealAudio feeds, and play interactive games. As a result, bandwidth consumption increases.

    Preventive Measures


While monitoring and filtering software can be effective for managing current Internet abusers in the workplace, a more effective means of Employee Internet Management is via preventive measures:

Draft a company Internet access policy

Provide company-wide education about proper Net use

Internet Access Policy

Most companies already require employees to sign a basic contract indicating what acceptable Internet use is, the fact that employees may be monitored without indication, and that unacceptable Internet abuse is grounds for termination.

Company-Wide Education About Proper Net Usage

Surprisingly, very few companies offer seminars or educational materials for employees to learn about the ramifications of Internet abuse. By educating employees on how abusing the Internet has a negative impact on both the individual and the company, many of the problems outlined in this article may be alleviated.

Boss Plays Big Brother

It is certainly not ethical to poke into someone's inbox. Given the importance of user privacy in the cyber age, it is important not to overlook how bosses can step over the line by closely monitoring their employees' surf trails. If your organization is drafting a policy paper for workers' Net use, make sure it is well balanced in light of the standards set by international privacy advocates. The Electronic Frontier Foundation maintains an archive of information pertinent to workplace privacy.

Work First

Wasting company Internet time, intentionally or otherwise, is wrong. It cuts into our ability to do the job, to be productive and competitive. And in today's marketplace, being competitive is the key to survival. The Internet is an excellent research tool, and its use should benefit the company's cause. As workers, our online time should produce new growth strategies for the company rather than mindless IMing.



History of Generations of Computers


The computers that you see and use today didn't come from any one inventor at one go. Rather, it took centuries of rigorous research work to reach the present stage. And scientists are still working hard to make them better and better. But that is a different story.

First, let us see when the very idea of computing with a machine or device, as against conventional manual calculation, was given a shape.

Though experiments were going on even earlier, it dates back to the 17th century when the first such successful device came into being. Edmund Gunter, an English mathematician, is credited with its development in 1620. Yet it was too primitive to be recognized even as the forefather of computers. The first mechanical digital calculating machine was built in 1642 by the French scientist-philosopher Blaise Pascal. And since then the ideas and inventions of many mathematicians, scientists, and engineers paved the way for the development of the modern computer in the following years.

But the world had to wait yet another couple of centuries to reach the next milestone in developing a computer. It was then that the English mathematician and inventor Charles Babbage did the wonder with his work during the 1830s. In fact, he was the first to work on a machine that could use and store the values of large mathematical tables. The most important feature of this machine was its recording of electric impulses, coded in the very simple binary system, with the help of only two kinds of symbols.

This was quite a big leap closer to the basics on which computers today work. However, there was yet a long way to go. And, compared to present-day computers, Babbage's machine could be regarded as more of a high-speed counting device, for it could work on numbers alone!

The Boolean algebra developed in the 19th century removed the numbers-alone limitation for these counting devices. This technique of mathematics, invented by George Boole, helped correlate binary digits with our language. For instance, the value 0 is related to false statements and 1 to true ones.
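The correlation is simple enough to state in code (a sketch in Python; Boole's algebra itself predates any programming language): treating 1 as true and 0 as false lets ordinary logic be computed with arithmetic on digits.

def AND(a, b): return a * b       # true only when both are 1
def OR(a, b):  return max(a, b)   # true when either is 1
def NOT(a):    return 1 - a       # flips true and false

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), NOT(a))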


British mathematician Alan Turing made further progress with the help of his theory of a computing model. Meanwhile, the technological advancements of the 1930s did much to further the development of computing devices.

But the direct forefathers of present-day computer systems evolved around the 1940s. The Harvard Mark I computer, designed by Howard Aiken, was the world's first digital computer to make use of electro-mechanical devices. It was developed jointly by International Business Machines (IBM) and Harvard University in 1944.

But the real breakthrough was the concept of the stored-program computer. This came when the Hungarian-American mathematician John von Neumann introduced the Electronic Discrete Variable Automatic Computer (EDVAC). The idea that instructions as well as data should be stored in the computer's memory for better results made this device totally different from its counting-device forerunners. And since then computers have become increasingly faster and more powerful.

Still, compared with the present day's personal computers, they had the simplest form of design. It was based on a single CPU performing various operations, like addition, multiplication, and so on, and these operations would be performed following an ordered list of instructions, called a program, to produce the desired result.

This form of design was followed, with a little change, even in the advanced versions of computers developed later. The changed version saw a division of the CPU into memory and arithmetic logic unit (ALU) parts, along with separate input and output sections.

In fact, the first four generations of computers followed this basic form of design. It was basically the type of hardware used that made the difference from generation to generation. For instance, the first generation was based on vacuum tube technology. This was upgraded with the arrival of transistors and printed circuit board technology in the second generation. It was further upgraded by the arrival of integrated circuit chip technology, where little chips replaced a large number of components. Thus the size of the computer was greatly reduced in the third generation, while it became more powerful. But the real marvel came during the 1970s, with the introduction of very large scale integration (VLSI) technology in the fourth generation. Aided by this technology, a tiny microprocessor can store millions of pieces of data.

And based on this technology, IBM introduced its famous personal computers. Since then IBM itself, and other makers including Apple, Sinclair, and so forth, have kept on developing more and more advanced versions of personal computers, along with bigger and more powerful ones like mainframes and supercomputers for more complicated work. Meanwhile, tinier versions like laptops and even palmtops have come up with more advanced technologies over the past couple of decades. But the advancement of technology alone cannot take full credit for the amazing advancement of computers over the past few decades.


Software, the inbuilt logic to run the computer the way you like, kept being developed at an equal pace. The arrival of famous software manufacturers like Microsoft, Oracle, and Sun has helped pace the development. The result of all this painstaking research is a device that is easy to use and operate and that solves complex problems at lightning speed: the computer.

    Dated: Feb. 21, 2006

By Donald W. Hyatt

    Inserting a New Table Entry

For the examples we have been using in this tutorial, we are using an account called games, in which there is a table called scores for keeping track of high scores. The table was initialized from a file, but now we are going to add a new player in interactive mode. We will use the MySQL command INSERT INTO to select the table and operation, and then the command SET to specify the value of any variables that we wish to initialize. In order to add a new player called "Richard", we will use the following syntax:

mysql> INSERT INTO scores SET Name="Richard";
Query OK, 1 row affected (0.00 sec)

    Let's see the current values in the table scores.

mysql> SELECT * FROM scores;
+---------+------+
| Name    | Num  |
+---------+------+
| Phyllis |  987 |
| Randy   | 1285 |
| Don     |  919 |
| Mark    |    0 |
| Mary    |  567 |
| Bob     |   23 |
| Pete    |  456 |
| Sally   |  333 |
| Richard | NULL |
+---------+------+
9 rows in set (0.00 sec)

It is important to note that if a variable is a PRIMARY KEY or is specified in the initial table creation as being NOT NULL, a value must be supplied at the time the entry is inserted.


Notice that Richard does not have a score at this time, so his score is not 0 but NULL instead.

Updating Information

Since Richard does not have a score at this time, let's take a look at the syntax to change the information in a table. We will use the command UPDATE to identify the type of action and the table being used, then the operation SET to assign a value to a variable, as well as WHERE to establish the criteria for updating the record. The syntax for that command would be:

mysql> UPDATE scores SET Num=0 WHERE Name="Richard";
Query OK, 1 row affected (0.00 sec)

    Rows matched: 1 Changed: 1 Warnings: 0

mysql> SELECT * FROM scores WHERE Num=0;
+---------+------+
| Name    | Num  |
+---------+------+
| Mark    |    0 |
| Richard |    0 |
+---------+------+
2 rows in set (0.00 sec)

Now Richard's score is also zero. Of course, we could have created Richard's entry and assigned the initial score of zero during the insert operation by doing the following command instead:

mysql> INSERT INTO scores SET Name="Richard", Num=0;

We can even change one of the user's names. Let's suppose that Mary actually should be called Marianne. We can change that entry for the name in the following way:

mysql> UPDATE scores SET Name="Marianne" WHERE Name="Mary";
Query OK, 1 row affected (0.00 sec)

    Rows matched: 1 Changed: 1 Warnings: 0

mysql> SELECT * FROM scores;
+----------+------+
| Name     | Num  |
+----------+------+
| Phyllis  |  987 |
| Randy    | 1285 |
| Don      |  919 |
| Mark     |    0 |
| Marianne |  567 |
| Bob      |   23 |
| Pete     |  456 |
| Sally    |  333 |
| Richard  |    0 |
+----------+------+
9 rows in set (0.00 sec)

Now let's try a slightly more sophisticated update operation. Suppose we wish to give 100 bonus points to the score of anyone whose name begins with an "R", such as "Randy" and "Richard". We could update each row separately by replacing their scores with the appropriate values, but the following approach is a bit better. We will use the operator LIKE, which permits us to match part of a value, such as the leading "R" in both names, with the wildcard character "%" matching the rest. We will then allow MySQL to do the arithmetic by adding 100 points to the old value of Num for any rows that match. The syntax for that command is:



mysql> UPDATE scores SET Num=Num+100 WHERE Name LIKE "R%";
Query OK, 2 rows affected (0.00 sec)
Rows matched: 2 Changed: 2 Warnings: 0

mysql> SELECT * FROM scores;
+----------+------+
| Name     | Num  |
+----------+------+
| Phyllis  |  987 |
| Randy    | 1385 |
| Don      |  919 |
| Mark     |    0 |
| Marianne |  567 |
| Bob      |   23 |
| Pete     |  456 |
| Sally    |  333 |
| Richard  |  100 |
+----------+------+
9 rows in set (0.01 sec)

    Now both scores have been changed.

Deleting a Table Entry

Now that we can add entries to the table, it will be important to learn how to delete them, too. The command for removing something from a table is DELETE FROM to specify the action and table, and then WHERE to indicate the criteria for deletion. If we desire to delete Mark from the table, the command would be:

mysql> DELETE FROM scores WHERE Name="Mark";
Query OK, 1 row affected (0.00 sec)

    Let's see the current values in the table scores.

mysql> SELECT * FROM scores;
+----------+------+
| Name     | Num  |
+----------+------+
| Phyllis  |  987 |
| Randy    | 1385 |
| Don      |  919 |
| Marianne |  567 |
| Bob      |   23 |
| Pete     |  456 |
| Sally    |  333 |
| Richard  |  100 |
+----------+------+
8 rows in set (0.00 sec)

If we add another user back to the table, MySQL apparently puts the new row in the empty slot left when Mark was deleted, rather than at the end.

mysql> INSERT INTO scores SET Name="Marty", Num=0;
Query OK, 1 row affected (0.00 sec)

mysql> SELECT * FROM scores;
+----------+------+
| Name     | Num  |
+----------+------+
| Phyllis  |  987 |
| Randy    | 1385 |
| Don      |  919 |
| Marty    |    0 |
| Marianne |  567 |
| Bob      |   23 |
| Pete     |  456 |
| Sally    |  333 |
| Richard  |  100 |
+----------+------+
9 rows in set (0.00 sec)

Modifying Table Attributes

Occasionally, it becomes necessary to change the attributes of one of the variables or columns in a table. This is a frequent situation for a variable that might be declared VARCHAR(20), let's say, and then the user wants to add something that might be 25 characters in length. Rather than destroying the entire table and starting from scratch, this modification can be done using the MySQL command ALTER TABLE combined with the MODIFY command.

Before we modify a column or a field entry, let's take a look at how the fields are currently defined using the SHOW command:

mysql> SHOW FIELDS FROM scores;
+-------+-------------+------+-----+---------+-------+
| Field | Type        | Null | Key | Default | Extra |
+-------+-------------+------+-----+---------+-------+
| Name  | varchar(20) | YES  |     | NULL    |       |
| Num   | int(5)      | YES  |     | NULL    |       |
+-------+-------------+------+-----+---------+-------+
2 rows in set (0.00 sec)

Now to change the Name variable from 20 to 25 characters, the command would be:

mysql> ALTER TABLE scores MODIFY Name VARCHAR(25);
Query OK, 9 rows affected (0.02 sec)
Records: 9 Duplicates: 0 Warnings: 0

Let's see how the values have changed:

mysql> SHOW FIELDS FROM scores;
+-------+-------------+------+-----+---------+-------+
| Field | Type        | Null | Key | Default | Extra |
+-------+-------------+------+-----+---------+-------+
| Name  | varchar(25) | YES  |     | NULL    |       |
| Num   | int(5)      | YES  |     | NULL    |       |
+-------+-------------+------+-----+---------+-------+
2 rows in set (0.00 sec)

There are very sophisticated queries and updates that can be done with MySQL. It is possible to add new columns to existing tables and even merge two databases into one large table. Please check out the documentation at the MySQL website for more information.
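For instance, adding a new column is a one-line ALTER TABLE. The sketch below drives it from Python with the mysql-connector-python package rather than the mysql> prompt (the host, user, password, and new column are hypothetical placeholders, not values from this tutorial):

import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="games", password="secret",   # placeholders
    database="games",
)
cur = conn.cursor()

# Add a date column to the scores table used throughout this tutorial ...
cur.execute("ALTER TABLE scores ADD COLUMN Played DATE")

# ... and confirm the new definition, as SHOW FIELDS did above.
cur.execute("SHOW FIELDS FROM scores")
for field in cur.fetchall():
    print(field)

conn.commit()
conn.close()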


    MySQL: www.mysql.com
