Technical - UNIX

In layman's terms, Information Technology is the study or use of systems (especially computers and telecommunications) for storing, retrieving, and sending information in the context of a business or other enterprise.

The systems development life cycle (SDLC), also referred to as the application development life-cycle, is a term used in systems engineering, information systems and software engineering to describe a process for planning, creating, testing, and deploying an information system.

The SDLC consists of Requirements, Design, Implementation, Verification and Maintenance.

Data Intelligence or Business Intelligence is a set of techniques and tools for the transformation of raw data into meaningful and useful information for business analysis purposes. Data intelligence may sometimes be mistakenly referred to as business intelligence. Although there are some similarities between these two terms, there are also some key differences. Data intelligence focuses on data used for future endeavors, such as investments. Business intelligence, on the other hand, is the process of understanding a business process and the data associated with that process. Business intelligence involves organizing, rather than just gathering, data to make it useful and applicable to the business's practices.

In very simple terms, data mining is the practice of examining large pre-existing databases in order to generate new information.

A data warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision making process.

Data mining is the process of finding patterns in a given data set. These patterns can often provide meaningful and insightful data to whoever is interested in that data. Data mining is used today in a wide variety of contexts – in fraud detection, as an aid in marketing campaigns, and even supermarkets use it to study their consumers. Data warehousing can be said to be the process of centralizing or aggregating data from multiple sources into one common repository.

Data analytics (DA) is the science of examining raw data with the purpose of drawing conclusions about that information. Data analytics is used in many industries to allow companies and organizations to make better business decisions, and in the sciences to verify or disprove existing models or theories.

Unix (all-caps UNIX for the trademark) is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, developed in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie, and others. UNIX was one of the first operating systems to be written in a high-level programming language, namely C. This meant that it could be installed on virtually any computer for which a C compiler existed. This natural portability combined with its low price made it a popular choice among universities. (It was inexpensive because antitrust regulations prohibited Bell Labs from marketing it as a full-scale product.)

Low-level languages are designed to operate and handle the entire hardware and instruction set architecture of a computer directly. Low-level languages are considered to be closer to computers. In other words, their prime function is to operate, manage and manipulate the computing hardware and components. Programs and applications written in a low-level language are directly executable on the computing hardware without any interpretation or translation. Machine language and assembly language are popular examples of low-level languages.

High-level languages are designed to be used by the human operator or the programmer. They are referred to as "closer to humans." In other words, their programming style and context are easier to learn and implement, and the code can focus on the specific program to be created. A high-level language largely frees the programmer from addressing hardware constraints when developing a program. However, every program written in a high-level language must be translated into machine language before it can be executed by the computer. BASIC, C/C++ and Java are popular examples of high-level languages.
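To make the distinction concrete, here is a minimal sketch in C (a high-level language in the sense above). The assembly shown in the closing comment is only an illustration of what an optimizing x86-64 compiler might emit, not the output of any particular build.

```c
#include <stdio.h>

/* A tiny function written in a high-level language (C). The compiler
 * must translate it into machine instructions before the hardware can
 * execute it. */
int add(int a, int b)
{
    return a + b;
}

int main(void)
{
    printf("%d\n", add(2, 3));   /* prints 5 */
    return 0;
}

/* For illustration only: with optimizations enabled, a typical x86-64
 * compiler reduces add() to roughly two low-level instructions:
 *
 *     lea eax, [rdi + rsi]   ; eax = a + b
 *     ret                    ; return to the caller
 *
 * That assembly, and the machine code it encodes, are the "low-level"
 * forms described above. */
```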

A trademark, trade mark, or trade-mark is a recognizable sign, design or expression which identifies products or services of a particular source from those of others.


Difference between Unix and Windows

Windows and Unix

As far as operating systems go, to some it would seem as if UNIX has a clear advantage over Windows. UNIX offers greater flexibility than Windows operating systems; furthermore, it is more stable and does not crash as much as Windows. To some, UNIX is just as easy to use as Windows, offering a GUI as well as a command line. But there are users out there who believe UNIX is only for computer gurus, claiming that the fragmentation of the UNIX GUI is its greatest competitive weakness.

One thing that has been established, though, is that UNIX is quite a bit more reliable than Windows, and less administration and maintenance is needed to keep a UNIX system running. This is a huge cost saver for any organization. Rather than employing many individuals to maintain a Windows-based system, one part-time employee would be needed for the upkeep of a typically sized UNIX system. One key difference between UNIX and Windows is the implementation of multiple users on one computer. When a user logs onto a UNIX system, a shell process is started to service their commands. By keeping track of users and their processes, a UNIX operating system is able to prevent processes from interfering with each other. This is extremely beneficial when all the processes run on the server, which demands a greater use of resources - especially with numerous users and sizeable applications.

Another main difference between UNIX and Windows is the process hierarchy which UNIX possesses. When a new process is created by a UNIX application, it becomes a child of the process that created it. This hierarchy is very important, so there are system calls for influencing child processes. Windows processes on the other hand do not share a hierarchical relationship. Receiving the process handle and ID of the process it created, the creating process of a Windows system can maintain or simulate a hierarchical relationship if it is needed. The Windows operating system ordinarily treats all processes as belonging to the same generation.
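As a brief illustration (a minimal sketch, not taken from the text), the following C program uses the standard fork() and wait() system calls to create a child process, showing the parent/child relationship and one of the system calls a parent can use to observe its children.

```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();          /* create a child process */

    if (pid < 0) {
        perror("fork");
        exit(EXIT_FAILURE);
    } else if (pid == 0) {
        /* Child: its parent is the process that called fork(). */
        printf("child  pid=%d  parent=%d\n", getpid(), getppid());
        exit(EXIT_SUCCESS);
    } else {
        /* Parent: wait() is one of the system calls for observing and
         * influencing child processes. */
        int status;
        wait(&status);
        printf("parent pid=%d  reaped child=%d\n", getpid(), pid);
    }
    return 0;
}
```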

UNIX uses daemons, Windows has service processes. Daemons are processes that are started when UNIX boots up that provide services to other applications. Daemons typically do not interact with users. A Windows service is the equivalent to a UNIX daemon. When a Windows system is booted, a service may be started. This is a long running application that does not interact with users, so they do not have a user interface. Services continue running during a logon session and they are controlled by the Windows Service Control Manager.
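As a rough sketch of how a traditional UNIX daemon detaches itself from the controlling terminal (details vary between systems, and real daemons do considerably more, e.g. logging and signal handling), the outline below uses the common fork-and-setsid idiom.

```c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>

/* Minimal daemonize sketch: fork so the parent can exit, start a new
 * session so there is no controlling terminal, then detach from the
 * filesystem and the standard I/O streams. */
static void daemonize(void)
{
    pid_t pid = fork();
    if (pid < 0)
        exit(EXIT_FAILURE);
    if (pid > 0)
        exit(EXIT_SUCCESS);      /* parent exits; child continues */

    if (setsid() < 0)            /* become session leader, drop the tty */
        exit(EXIT_FAILURE);

    umask(0);
    if (chdir("/") < 0)          /* avoid pinning any mounted filesystem */
        exit(EXIT_FAILURE);

    /* Redirect the standard descriptors to /dev/null: daemons do not
     * interact with users. */
    int fd = open("/dev/null", O_RDWR);
    if (fd >= 0) {
        dup2(fd, STDIN_FILENO);
        dup2(fd, STDOUT_FILENO);
        dup2(fd, STDERR_FILENO);
        if (fd > STDERR_FILENO)
            close(fd);
    }
}

int main(void)
{
    daemonize();
    for (;;)
        sleep(60);               /* the long-running service work would go here */
    return 0;
}
```

On Windows, the analogous long-running code would instead be registered with the Service Control Manager rather than detaching itself this way.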

UNIX has a novel approach to designing software. Since UNIX is open-sourced, it attracts some very intelligent programmers who develop many applications free of charge. With this in mind, many designers choose to resolve software problems by creating simpler tools that interconnect rather than creating large application programs. In contrast, Windows applications are all proprietary and costly. With UNIX, each generation extends, rather than replaces, the previous one; unlike Windows, it is rarely necessary to upgrade, because old and new Unix releases remain compatible. The main reason for this is the way UNIX is built, on a solid theoretical foundation. There are many advantages to this; for instance, a book written 20 years ago that discusses programming UNIX can still be used today. Imagine trying to figure out how to run Windows XP with a Windows 3.1 manual - it can't be done.

One argument to be made about UNIX is its lack of standardization. Some feel there are too many choices to be made regarding which GUI to use, or which combination of UNIX hardware and software to support. UNIX operating systems make great high-performance servers, but for end-users, every application on each arrangement of UNIX platform requires a different setup, and each application has a different user interface. Microsoft has "the" Windows operating system; there simply isn't one standardized UNIX operating system, or for that matter, a single standardized UNIX GUI. One could argue that this is a downfall for UNIX, but on the other hand, these variations add flavor and versatility to a solid, reliable operating system.

In summary, the best way to choose between UNIX and Windows is to determine organizational needs. If an organization uses mostly Microsoft products, such as Access, Front Page, or VBScripts, it's probably better to stick with Windows. But, if reliability, universal compatibility, and stability are a concern, UNIX would probably be the way to go.


Here is more input:

Simply stated, the main difference is that Windows uses a GUI (Graphical User Interface) and UNIX does not. In Windows one uses the click of a mouse to execute a command, whereas in UNIX one must type in a command. (There are GUIs that can be used in a UNIX environment, though very few UNIX users will stoop so low as to use one.) Before there was a Windows environment, DOS (Disk Operating System) was used on PCs. DOS was based on, and was similar to, the UNIX system, but was only a poor subset of it.

Differences between UNIX and Windows: UNIX is safe, preventing one program from accessing memory or storage space allocated to another, and it enforces protection, requiring users to have permission to perform certain functions, e.g. accessing a directory, file, or disk drive. UNIX is also more secure than Windows on a network, because Windows is more vulnerable. For example, a port left open on a Windows machine can easily be used by a hacker to introduce a virus into your environment.
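As a small, hedged illustration of that permission model (the path used here is just an example), the C snippet below uses the standard access() call to ask whether the current user is allowed to read a file before attempting to open it.

```c
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Example path only: an ordinary user typically lacks read
     * permission on this file, so access() reports failure. */
    const char *path = "/etc/shadow";

    if (access(path, R_OK) == 0)
        printf("current user may read %s\n", path);
    else
        perror(path);   /* e.g. "Permission denied" for a non-root user */
    return 0;
}
```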

UNIX is much better at handling multiple tasks for a single user or for multiple users than Windows. For each user, UNIX in general, and especially Sun's Solaris, provides many more utilities for manipulating files and data than Windows does. For a corporate environment, Unix (especially Solaris) provides much more control for the administrator than Windows does. Solaris, for example, enables the administrator to mirror or stripe data across several disks to minimize risk or optimize performance without third-party products. In general, for a programmer or for an administrator, Unix provides more power and flexibility than Windows. For the less sophisticated user, Windows can often more easily be installed and configured on cheaper hardware to run a desired third-party product. In short -- Unix is better, Windows is easier for less sophisticated users.

Hardware: This refers to the hardware layer of any computer system.

Kernel: This sits on top of the hardware and interacts with it. It is the core of the operating system and acts as an interface between user activities and the hardware, providing the base functionality of the OS. The major functions of the kernel include process management, memory management, thread management, scheduling, I/O management and power management.

Shell: This is an interface between the user and the kernel. It interprets the commands from a user and executes the resulting requests. After processing the commands, the kernel returns the results to the shell.

There are various types of command-line shells in Unix: Bourne Shell, C Shell, Korn Shell, Bourne Again Shell (bash).

User: The user communicates with the shell through commands. The shell, being a command interpreter, translates them into a form the kernel understands. The kernel then processes the request and sends any response back to the shell. The shell finally displays a prompt back to the user.
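To make the user → shell → kernel loop concrete, here is a deliberately minimal command-interpreter sketch in C: it reads a line, splits it into words, asks the kernel (via fork() and execvp()) to run the program, waits for it to finish, and prints the prompt again. A real shell such as bash adds quoting, pipes, redirection, job control and much more; the "mysh$" prompt is purely illustrative.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/wait.h>
#include <unistd.h>

#define MAX_ARGS 64

int main(void)
{
    char line[1024];
    char *argv[MAX_ARGS];

    for (;;) {
        printf("mysh$ ");                       /* the prompt */
        if (fgets(line, sizeof line, stdin) == NULL)
            break;                              /* EOF: exit the shell */

        /* Split the command line into whitespace-separated words. */
        int argc = 0;
        for (char *tok = strtok(line, " \t\n");
             tok != NULL && argc < MAX_ARGS - 1;
             tok = strtok(NULL, " \t\n"))
            argv[argc++] = tok;
        argv[argc] = NULL;
        if (argc == 0)
            continue;                           /* empty line */

        pid_t pid = fork();                     /* ask the kernel for a child */
        if (pid == 0) {
            execvp(argv[0], argv);              /* replace the child with the command */
            perror(argv[0]);                    /* only reached if exec failed */
            _exit(127);
        } else if (pid > 0) {
            int status;
            waitpid(pid, &status, 0);           /* shell waits, then re-prompts */
        } else {
            perror("fork");
        }
    }
    return 0;
}
```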

Bourne shell (sh)


This is the original Unix shell, written by Steve Bourne of Bell Labs. It is available on all UNIX systems. This shell does not have the interactive facilities provided by modern shells such as the C shell and Korn shell. You are advised to use another shell which has these features.

C shell (csh)

This shell was written at the University of California, Berkeley. It provides a C-like language with which to write shell scripts - hence its name.

TC shell (tcsh)

This shell is available in the public domain. It provides all the features of the C shell together with emacs-style editing of the command line.

Korn shell (ksh)

This shell was written by David Korn of Bell Labs. It is now provided as the standard shell on Unix systems. It provides all the features of the C and TC shells together with a shell programming language similar to that of the original Bourne shell. It is the most efficient shell. Consider using this as your standard interactive shell.

Bourne Again SHell (bash)

This is a public domain shell written by the Free Software Foundation under their GNU initiative. Ultimately it is intended to be a full implementation of the IEEE POSIX Shell and Tools specification. This shell is widely used within the academic community. bash provides all the interactive features of the C shell (csh) and the Korn shell (ksh). Its programming language is compatible with the Bourne shell (sh). If you use the Bourne shell (sh) for shell programming, consider using bash as your complete shell environment.

Structure

All UNIX files are integrated in a single directory structure. The file system is arranged in a structure like an inverted tree. The top of this tree is the root and is written as a slash '/'.

Directory name   Typical contents
/bin             Commands and programs used by all the users of the system
/boot            Files required by the boot loader
/dev             CD/DVD-ROM, floppy drives, USB devices, etc.
/etc             System configuration files
/home            User data files

Difference between Linux and UNIX

The first version of UNIX was created in 1969 by Kenneth Thompson and Dennis Ritchie, system engineers at AT&T's Bell Labs. When it comes to operating systems, UNIX is the mother of operating systems. Members of the rich Unix family are:

SVR4 (AT&T)
BSD 4.4 (University of California)
HP-UX (Hewlett-Packard)
Solaris (Sun Microsystems)

Mostly, Linux is considered to be a copy of UNIX. Let's hear the actual story now. Linux was actually a late addition to the family. It was written by Linus Torvalds back in 1991 and it was meant for IBM PC-compatible computers. As a matter of fact, in the world of operating systems, Linux has come up as a great operating system and it has been welcomed with huge popularity. Commercial enterprise servers run on Linux. Another cherry on the cake: laptop and PC companies are also providing GNU/Linux as a pre-installed OS on their systems, so that individual users can also get a bite of it.

A long-standing source of confusion is whether Linux is a kernel or an operating system; keep reading for the answer. What Linus Torvalds wrote was the Linux kernel, and it had a lot of features similar to the UNIX one. The reason it is confused with an OS is that the commercially available distributions that bundle a graphical interface, compilers and other utilities along with the Linux kernel are referred to as Linux operating systems.


Linux is, as they say, a UNIX-like kernel, because it has some common features, but there are still areas where they are not the same. The difference between UNIX and Linux can be understood by going through the following points.

Oracle:-

The Oracle Corporation is an American multinational computer technology corporation headquartered in Redwood City, California, United States. The company specializes in developing and marketing computer hardware systems and enterprise software products - particularly its own brands of database management systems. As of 2011, Oracle is the second-largest software maker by revenue, after Microsoft. The company also builds tools for database development and systems of middle-tier software, enterprise resource planning (ERP) software, customer relationship management (CRM) software and supply chain management (SCM) software.

Larry Ellison, a co-founder of Oracle, had served as Oracle's CEO throughout its history. On September 18, 2014, it was announced that he would be stepping down (with Mark Hurd and Safra Catz to become co-CEOs).

Oracle Database:-

Oracle Database (commonly referred to as Oracle RDBMS or simply as Oracle) is an object-relational database management system produced and marketed by Oracle Corporation.

A relational database is a digital database whose organization is based on the relational model of data, as proposed by E.F. Codd in 1970. This model organizes data into one or more tables (or "relations") of rows and columns, with a unique key for each row. Generally, each entity type described in a database has its own table, the rows representing instances of that entity and the columns representing the attribute values describing each instance. Because each row in a table has its own unique key, rows in other tables that are related to it can be linked to it by storing the original row's unique key as an attribute of the secondary row (where it is known as a "foreign key"). Codd showed that data relationships of arbitrary complexity can be represented using this simple set of concepts.

What is the difference between sql and pl/sql?

SQL is a data oriented language for selecting and manipulating sets of data. PL/SQL is a procedural language to create applications.

PL/SQL can be the application language just like Java or PHP can. PL/SQL might be the language we use to build, format and display those screens, web pages and reports. SQL may be the source of data for our screens, web pages and reports.

SQL is executed one statement at a time. PL/SQL is executed as a block of code. SQL tells the database what to do (declarative), not how to do it. In contrast, PL/SQL tells the database how to do things (procedural). SQL is used to code queries, DML and DDL statements. PL/SQL is used to code program blocks, triggers, functions, procedures and packages. We can embed SQL in a PL/SQL program, but we cannot embed PL/SQL within a SQL statement.

SQL (pronounced "ess-que-el") stands for Structured Query Language. SQL is used to communicate with a database. According to ANSI (American National Standards Institute), it is the standard language for relational database management systems. SQL statements are used to perform tasks such as update data on a database, or retrieve data from a database. Some common relational database management systems that use SQL are: Oracle, Sybase, Microsoft SQL Server, Access, Ingres, etc.

Creativity is the capability or act of conceiving something original or unusual. Innovation is the implementation of something new. Invention is the creation of something that has never been made before and is recognized as the product of some unique insight.


There are six phases in every software development life cycle model:

1. Requirement gathering and analysis
2. Design
3. Implementation or coding
4. Testing
5. Deployment
6. Maintenance

Basics of software testing

There are two basics of software testing: blackbox testing and whitebox testing.

Blackbox Testing

Black box testing is a testing technique that ignores the internal mechanism of the system and focuses on the output generated against any input and execution of the system. It is also called functional testing.

Whitebox Testing

White box testing is a testing technique that takes into account the internal mechanism of a system. It is also called structural testing and glass box testing.

Black box testing is often used for validation and white box testing is often used for verification.
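As a small illustrative sketch (the function and tests are hypothetical, not from the text), the C code below shows the difference in mindset: the black-box checks are derived only from the specification of inputs and expected outputs, while the white-box checks are chosen by looking at the implementation so that each internal branch is exercised.

```c
#include <assert.h>
#include <stdio.h>

/* Unit under test: clamp a value into the range [lo, hi]. */
static int clamp(int value, int lo, int hi)
{
    if (value < lo)
        return lo;       /* branch 1 */
    if (value > hi)
        return hi;       /* branch 2 */
    return value;        /* branch 3 */
}

int main(void)
{
    /* Black-box style: based on the specification only
     * ("the result always lies within [lo, hi]"). */
    assert(clamp(-5, 0, 10) >= 0 && clamp(-5, 0, 10) <= 10);
    assert(clamp(50, 0, 10) >= 0 && clamp(50, 0, 10) <= 10);

    /* White-box style: one case per internal branch, chosen by
     * inspecting the implementation above. */
    assert(clamp(-1, 0, 10) == 0);    /* exercises branch 1 */
    assert(clamp(99, 0, 10) == 10);   /* exercises branch 2 */
    assert(clamp(5, 0, 10) == 5);     /* exercises branch 3 */

    puts("all tests passed");
    return 0;
}
```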

Types of testing

There are many types of testing, such as:

Unit Testing
Integration Testing
Functional Testing
System Testing
Stress Testing
Performance Testing
Usability Testing
Acceptance Testing
Regression Testing
Beta Testing