CCM4902 – Postgraduate Project
Linux on the desktop:
A study into why it has failed to succeed in
capturing desktop market share
Adam Lalani
M00549948
Supervisor: Santhosh Menon
27 September 2016
"A thesis submitted in partial fulfilment of the requirements for the
degree of Master of Science in Computer Network Management."
Table of Contents
Abstract ................................................................................................................................................... 4
List of Figures ......................................................................................................................................... 5
List of Tables .......................................................................................................................................... 5
Introduction ............................................................................................................................................. 6
Background ......................................................................................................................................... 6
Problem Statement .............................................................................................................................. 9
Research Objectives .......................................................................................................................... 12
Approach ........................................................................................................................................... 13
Literature Review .................................................................................................................................. 14
Timeline ............................................................................................................................................ 14
Process (Method for collection) – sources, keywords ...................................................................... 15
Review of Topics .............................................................................................................................. 18
Conclusions ....................................................................................................................................... 22
Output – a simple definition, a conceptual model (Dimensions) ...................................................... 27
Literature Gap ................................................................................................................................... 27
An Experimental Comparison of Linux and Windows ......................................................................... 30
Experimental Procedure .................................................................................................................... 31
Stage 1 - Installation ......................................................................................................................... 32
Stage 2 – Start Up / Shutdown .......................................................................................................... 32
Stage 3 – I/O Intensive Operations ................................................................................................... 33
Stage 4 – Processor Intensive Operations ......................................................................................... 33
Stage 5 – Power Management ........................................................................................................... 34
Presentation of Results ...................................................................................................................... 34
Discussion of Results ........................................................................................................................ 36
Interviews with IT Professionals .......................................................................................................... 38
Interview Procedure .......................................................................................................................... 42
Discussion of Results ........................................................................................................................ 43
Conclusion ............................................................................................................................................ 53
Appendix A - The history of Unix and Unix-like operating systems ................................................... 57
Appendix B - Interviews ....................................................................................................................... 76
Interview 1 – Robert Fitzjohn ........................................................................................................... 76
Interview 2 – Prasad KM .................................................................................................................. 86
Interview 3 – Sanjay Banerjee .......................................................................................................... 99
Interview 4 – Renjith Janardhanan.................................................................................................. 110
Interview 5 – Glen Coutinho ........................................................................................................... 126
References ........................................................................................................................................... 136
Abstract
The Linux kernel has been wildly successful since its creation in 1991 by Linus Torvalds. Propelled
forward by the diffusion of the Internet and portable devices, Linux is now used in over 1.4 billion
devices – powering inter alia smartphones, tablets, the social media juggernaut Facebook, nuclear
submarines and the International Space Station. Despite this success, it is used on just 1.74% of
desktop PCs.
Two lines of inquiry were followed to ascertain the reason(s) for Linux’s lack of success on the desktop
– firstly, an experimental comparison between Linux Fedora 24 and Windows 10 was undertaken, in
order to demonstrate that the lack of market share was not a result of deficiencies in the operating
system itself, and secondly, qualitative interviews were conducted with five IT industry professionals with
a combined 96 years of experience – responsible between them for the purchasing, configuration,
support and usage of tens of thousands of PCs during their careers.
The experimental comparison showed that the performance and functionality of Linux are similar enough
to Windows to be discounted as a factor in its lack of adoption on desktop PCs, whilst the qualitative
interviews established that the fundamental reason for the lack of success was the absence of a ‘killer
app’. Windows has Microsoft Office, but no such ‘killer app’ exists on the Linux platform.
Furthermore, the Linux kernel came too late to become widely prevalent during the desktop PC
explosion that began in the early 1990s, whereas its availability at the time of the rise of the Internet era
and the portability revolution allowed it to dominate those market spaces.
In the case of the desktop, it was the right kernel at the wrong time.
List of Figures
Figure 1 – Number of Scholarly and Peer-Review Papers on Summon, timeline based ...................... 14
Figure 2 – Keywords Established From Content Analysis ................................................................... 15
Figure 3 – Classification of final set of publications for literature review ........................................... 18
Figure 4 - Output – a simple definition, a conceptual model (Dimensions) ......................................... 27
List of Tables
Table 1 – Desktop Operating System Market Share as at February 2016 (netmarketshare.com) ........ 10
Table 2 – Mobile/Tablet Operating System Market Share as at February 2016 (netmarketshare.com) 11
Table 3 – Final set of publications for literature review ....................................................................... 17
Table 4 – Content overview of papers used for literature review relating to Linux’s architecture ....... 19
Table 5 – Content overview of papers used for literature review comparing Linux to Windows/other
operating systems .................................................................................................................................. 20
Table 6 – Content overview of papers used for literature review concerned with the adoption of
Linux/other open source software ......................................................................................................... 21
Table 7 – Results of the 5 experimental stages ..................................................................................... 35
Table 8 - Qualitative Interviews - Definitions and Measurements ....................................................... 42
Chapter 1
Introduction
Background
“…I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for
386(486) AT clones…”
(Linus Torvalds, Usenet posting 25 August 1991) (Peng et al, 2014)
On 25 August 1991, a then-unknown 21-year-old student at the University of Helsinki in Finland,
named Linus Torvalds, posted on a Usenet forum that he was working on creating and releasing an
operating system kernel based on Minix, a Unix-like operating system (Malone and Laubacher, 1999).
His stated intention was that it would be created for hobbyist purposes, and would not be intended for
professional use.
Minix had been created by Andrew S. Tanenbaum (Tanenbaum, 1987) to better teach the students he
taught at Vrije Universiteit Amsterdam in the Netherlands about operating systems. The idea to
create his own operating system came about when Tanenbaum was teaching his students how to use
AT&T’s Unix version 6. As Tanenbaum stated “The bean counters at AT&T didn’t like this: having
every computer science student in the world learn about its product was a horrible idea. Version 7 came
out with a license that said, “thou shalt not teach,” so you couldn’t teach version 7 Unix anymore”
(Severance, 2014). So he created his own operating system that was similar enough in its principles to
Unix version 7 that he was able to teach unfettered by AT&T’s draconian licensing constraints.
Minix had become the academic researcher’s platform of choice due to its readily available source code,
which could be examined and changed easily if deemed necessary: it was written in C, it had
a system call interface that worked exactly like that of Unix version 7, and, whilst it was a fully-fledged
operating system, it was lightweight enough for one person to quickly absorb and comprehend (Mull
and Maginnis, 1991).
Torvalds was himself an avid Minix user. In 1991, he purchased a new Intel 80386-based
PC, but he soon realised that Minix could not take advantage of the enhanced protected mode (also
known as protected virtual address mode) of the newly released processor, so he took it upon himself
to write his own operating system kernel that could (Dettmer, 1999). His original kernel
was just a basic task-switching kernel – all it could do was display a message from each of two running
processes. Minix was used to compile the kernel and provide a file system. Torvalds managed to post a
semi-complete version of his operating system source code onto an FTP site in November 1991
(Wiegand, 1993).
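The behaviour of that first task-switching kernel can be illustrated in miniature. The sketch below is purely illustrative (and in Python, whereas Torvalds worked in C and 386 assembly): two toy ‘processes’ each produce a message repeatedly, while a round-robin scheduler alternates between them, much as the original kernel alternated between its two running processes.

```python
def task(name, n):
    """A toy 'process' that emits its name n times, yielding control after each."""
    for _ in range(n):
        yield name

def scheduler(tasks):
    """Round-robin over the tasks until all are exhausted - a cooperative
    task switcher in the spirit of the original two-process demonstration."""
    out = []
    while tasks:
        still_running = []
        for t in tasks:
            try:
                out.append(next(t))      # let the task run one step
                still_running.append(t)  # keep it if it has not finished
            except StopIteration:
                pass                     # task finished: drop it
        tasks = still_running
    return "".join(out)

print(scheduler([task("A", 3), task("B", 3)]))  # ABABAB
```

Each `yield` stands in for a context switch; the interleaved output shows the two processes sharing the processor.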
The following year, Torvalds combined his work with another ongoing open-source project, GNU
– a recursive acronym for ‘GNU’s Not Unix’ (Casadesus-Masanell and Ghemawat, 2006). GNU was the
brainchild of Richard Stallman (Hars and Ou, 2001), who worked at MIT (Massachusetts Institute of
Technology), and was under development in order to create an entirely free Unix-like operating system.
By 1992, the GNU project had yet to complete its own kernel (Stallman, 1998), but had completed
many other components required for an operating system, which included compilers, a command shell,
libraries, a windowing system and text editors. Torvalds combined his kernel with the readily and freely
available GNU programs to create a fully-fledged operating system (Bokhari, 1995).
Linux was made available under the GNU General Public License. This license allows the freedom to
any end user to have access to and be able to modify the software source code (as long as it is made
clear the source code has been modified), or distribute (and if so desired - charge for) copies of the
software. Additionally, the software can be used in new programs – modified or unmodified – and
if that is the case, the recipient of the software is granted the same freedoms as the distributor (The GNU
General Public License v3.0 – GNU Project – Free Software Foundation, 2007).
The computing landscape in the early 1990s was somewhat different to what it is today. In a May 1990
article in IEEE’s Computer magazine entitled ‘Recent Developments in Operating Systems’ (Boykin
and LoVerso, 1990) it was noted that generally operating systems of the time fell into one of two
categories – the first being referred to as mere “loaders” of programs (such as MS-DOS and DR’s
CP/M) with limited support for additional peripherals, and the second being of a more complex variety
that could offer access to manifold devices on a concurrent basis (examples include AT&T’s Unix and
Data General’s AOS/VS). However, mainly due to the rise to prominence of Ethernet networking,
commoditised CPUs and other significant hardware improvements, future operating systems would
have to address newly evolving requirements to specifically power graphical user interface based
workstations that were interconnected using local area networks (LANs).
As Torvalds began work on his kernel, other operating systems were also appearing
that could also harness the power of Intel’s 80386 processor, such as IBM’s OS/2 and Microsoft’s
Windows NT - additionally, at this point in time Unix had just become the first major ‘machine
independent’ operating system, enabling it to run on different hardware platforms. (Wilkes, 1992). All
of this evolution was being driven by the aforementioned recently evolving resource-hungry usage
scenarios like networking and graphical/multimedia applications (Cheung and Loong, 1995).
Just a few weeks before Torvalds’s Usenet post, the World Wide Web was first made available to the
public on the Internet (Carbone, 2011). Undoubtedly, the advent of the Internet era would have also
contributed to the necessity for both hardware and operating system improvements. This line of
argument can be strengthened by Curwen and Whalley (2014), who wrote that changes in technology
generally move forward via a series of generations or part generations, and that these changes are
achieved either through better hardware, software, or a combination of the two. Indeed, as has already
been demonstrated, Torvalds wrote his kernel to harness the power of his newly purchased hardware
in a way that Minix could not. Furthermore, West and Dedrick (2001) assert that the rise of Linux’s
prominence is a direct result of the Internet.
Problem Statement
Almost 25 years after that initial Usenet post, the kernel created by Torvalds, which later became known
as Linux, has gone on to become the most widely used operating system kernel in the world (The
Linux Foundation, no date). Linux finds itself being used for such diverse applications as the running
of nuclear submarines (Claiborne Jr, 2001), the International Space Station (Ortega, 1999), over 1.4
billion portable devices (Vincent, 2015), as well as powering and underpinning the social media
juggernaut Facebook (Zeichick, 2008) inter alia.
Whilst all of this has shown that the Linux kernel is versatile and has many usage cases, there is one
cross section of the computing landscape that Linux has, as of the time of writing, not managed to
successfully permeate – the desktop computing space. For the purposes of this paper, the term desktop
computing is defined as traditional desktop or laptop PCs that utilise the x86 instruction set, and
therefore will exclude servers, mobile devices - such as tablets or smartphones, and games consoles.
Operating system market share data for February 2016 is presented in Table 1 for desktop operating
systems, and Table 2 for mobile operating systems. This data was provided by netmarketshare.com, a
website that collects data from the web browsers of individual unique devices that visit one of over
40,000 websites in their content network, as well as from over 430 referral sources including search
engines, enabling them to provide statistics on different web browsers being used, as well as the
operating system(s) used by those browsers (Can you explain the Net Market Share methodology for
collecting data?, 2016).
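The principle behind such browser-based measurement can be sketched as follows. This is an illustrative Python fragment only: the User-Agent patterns and the `classify` and `market_share` helpers are the author’s hypothetical simplification, not Net Market Share’s actual methodology.

```python
import re
from collections import Counter

# Each visiting browser sends a User-Agent header that reveals its host OS.
# The patterns below are illustrative, covering a handful of desktop systems.
OS_PATTERNS = [
    ("Windows 10", re.compile(r"Windows NT 10\.0")),
    ("Windows 7", re.compile(r"Windows NT 6\.1")),
    ("Mac OS X", re.compile(r"Mac OS X")),
    ("Linux", re.compile(r"Linux")),
]

def classify(user_agent):
    """Map one User-Agent string to an operating system name."""
    for name, pattern in OS_PATTERNS:
        if pattern.search(user_agent):
            return name
    return "Other"

def market_share(user_agents):
    """Aggregate a sample of User-Agent strings into percentage shares."""
    counts = Counter(classify(ua) for ua in user_agents)
    total = sum(counts.values())
    return {os_name: round(100 * n / total, 2) for os_name, n in counts.items()}
```

Aggregated over millions of unique visitors across a content network, such a tally yields figures of the kind presented in Table 1 and Table 2.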
Operating System Total Market Share
Windows 7 52.41%
Windows 10 12.31%
Windows XP 11.34%
Windows 8.1 10.13%
Mac OS X 10.11 3.57%
Windows 8 2.56%
Mac OS X 10.10 2.27%
Linux 1.74%
Windows Vista 1.68%
Mac OS X 10.9 0.86%
Mac OS X 10.6 0.35%
Mac OS X 10.8 0.29%
Mac OS X 10.7 0.29%
Windows NT 0.10%
Mac OS X 10.5 0.06%
Mac OS X 10.4 0.02%
Windows 2000 0.01%
Windows 98 0.01%
Mac OS X (no version reported) 0.00%
Table 1 – Desktop Operating System Market Share as at February 2016 (netmarketshare.com)
Operating System Total Market Share
Android 59.65%
iOS 32.28%
Windows Phone 2.57%
Java ME 2.40%
Symbian 1.57%
Blackberry 1.45%
Samsung 0.05%
Kindle 0.02%
Bada 0.01%
Windows Mobile 0.00%
LG 0.00%
Table 2 – Mobile/Tablet Operating System Market Share as at February 2016
(netmarketshare.com)
In addition to the data presented in Table 1 and Table 2, w3techs.com (Usage statistics and market share
of Unix for websites, 2016) states that 36.2% of the top 10 million websites (based on rankings collated
by Alexa, a company belonging to Amazon.com), are powered using the Linux kernel.
Therefore, using those data sources as evidence, it is clear that Linux has failed to capture desktop
market share whilst it has been a proven success on mobile devices and mission-critical web servers on
the Internet. The intention of this paper is to perform exploratory research in order to establish the
reasons for Linux’s failure to penetrate the desktop computing space. It will be demonstrated that Linux
is comparable in features and performance to the other popular desktop operating systems that it is
ranked against in Table 1, so it stands to reason that there must be other reasons for this disparity in
market share versus other market segments, which this paper will attempt to uncover.
The working hypothesis is that Linux has failed to achieve a sizeable portion of the desktop operating
system market because of a multitude of reasons, stated below:
It is not preinstalled on new PCs that are sold
There are too many Linux distributions available, which has led to fragmentation
Different package managers are used by different distributions
Multiple desktop GUI environment choices
A perceived lack of user friendliness and a steep learning curve
Deficiencies in hardware support, especially for graphics adapters
Paucity of available software/native versions of popular applications
Research Objectives
The research objectives of this paper will be:
Looking at the history of Linux from the evolutionary perspective of Unix and other Unix-like
operating systems (refer to Appendix A)
Experimentation with various competing operating systems to better understand the difficulties
that might be faced to get a user up and running
Establishing the causes for Linux on the desktop’s failure through qualitative interviews
Understanding the reasons for Linux’s success on other non-desktop hardware platforms
Attempting to discover if it is possible to reverse the trend, and how it might be reversed
Approach
In order to prove or disprove the working hypothesis, and to establish the reasons for Linux’s success on
other non-desktop platforms, Linux will be compared to other operating systems through
experimentation with installation and configuration, via the creation of a desktop base image
for each operating system. The working hypothesis will be interrogated further through qualitative
interviews with a number of IT professionals known to the researcher. Once proved or disproved, finally
an answer will be sought to understand if there is a possibility to reverse the trend, and if so, how it
might be done.
Additionally, in Appendix A, the history of Unix and Unix-like operating systems is presented to
demonstrate how Linux has evolved into what it is at the time of writing.
Chapter 2
Literature Review
“…The time will come when diligent research over long periods will bring to light things which now
lie hidden…”
(Seneca, Natural Questions) (Ellis, 1998)
Timeline
As was discussed in the introduction, Torvalds’s first version of the Linux kernel was released online
in November 1991, so the literature review timeline runs from 1991 to the present. Initially, a very
loose preliminary search was performed using Summon – Middlesex University’s database of
publications – for the keyword ‘Linux’. As demonstrated in Figure 1, the most recent 10 years or so
provides quite a significant body of research on Linux in general to be delved into.
Figure 1 – Number of Scholarly and Peer-Review Papers on Summon, timeline based
[Bar chart: number of papers per year, 1991–2016; y-axis 0–6,000]
Process (Method for collection) – sources, keywords
Two major databases were used for the literature review on the research topic. The first was Summon,
a database capable of searching many library resources at once, which is provided by Middlesex
University to its students. The second database that was used to uncover relevant articles was Google
Scholar. It was felt that these two databases would yield sufficient data for the literature review phase
of this paper.
Only top-quality journal publications and articles, conference proceedings, and peer-reviewed
magazines were utilised. So that a collection of pertinent keywords could be established for a better-
focused and defined search, a content analysis was carried out against the websites of prominent
organisations that have a strong involvement in the contemporary world of Linux and its ongoing
propagation as a successful operating system. The organisations that were analysed were the Linux
Foundation, IBM, Dell, Fedora, Red Hat, Ubuntu, and DistroWatch. The initial findings from the
keyword analysis are shown in Figure 2.
Figure 2 – Keywords Established From Content Analysis
[Bar chart: keyword frequency (y-axis 0–25). Keywords, in descending order of frequency: Open Source, Costs/Free, Cloud, Servers, X Windows, Ubuntu/Canonical, Community/Communities, Mainframe, Performance, Security, Operating System, Enterprise/Corporate, Desktop, Windows, Distribution, CPU/Processor, Red Hat, Suse, Kernel, KDE, Free Software, Workstation, Terminal/Command Line, GNOME, XFree86, Licensing, Apache, Hardware Architecture, GNU, POSIX, Repository, Unix]
To further filter the keywords in order to bring forward results from the database searches that were
pertinent to this paper’s line of enquiry, all keywords that yielded fewer than five results were disregarded,
which left a shortlist of thirteen keywords. Surprisingly, ‘kernel’ came some way down the list. Some
of the remaining keywords on the shortlist such as ‘Mainframe’, ‘Cloud’ and ‘Ubuntu/Canonical’ were
removed as they were too far removed from the subject matter that this paper is concerned with. A
shortlist of ten keywords would be used to search the selected databases (in conjunction with the
associated word ‘Linux’):
Open source
Cost / Free
Server
X Windows
Community / Communities
Performance
Security
Operating System
Enterprise / Corporate
Desktop
Search results would only be considered if they fell under the remit of computer science, and yet that
still yielded a combined total of 74,275 publications on Summon. As this was such a broad number of
papers, that in all likelihood would be mostly irrelevant, it was therefore decided to further reduce the
number of keywords to only include costs/free, operating system, enterprise/corporate and desktop.
Whilst a significant reduction had been made, a total of 27,117 papers remained. Therefore,
combinations of the keywords were used to achieve a more manageable number of papers to be
reviewed. Eventually, a finalised set of 45 papers ascertained to be relevant to
this research paper was collated. These are summarised in Table 3 and Figure 3.
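The filtering procedure described above – a frequency threshold, topical pruning, and pairing each survivor with ‘Linux’ – can be expressed compactly in the sketch below. The keyword counts shown are invented placeholders for illustration (only the set of keywords and the thresholds come from the text).

```python
# Hypothetical frequencies from the content analysis - illustrative numbers only.
keyword_counts = {
    "Open Source": 23, "Cost / Free": 19, "Cloud": 14, "Server": 13,
    "X Windows": 11, "Ubuntu/Canonical": 10, "Community / Communities": 9,
    "Mainframe": 8, "Performance": 8, "Security": 7, "Operating System": 7,
    "Enterprise / Corporate": 6, "Desktop": 5, "Kernel": 4, "KDE": 3,
}

# Step 1: discard keywords that yielded fewer than five results.
shortlist = [k for k, n in keyword_counts.items() if n >= 5]

# Step 2: drop terms judged too far removed from the research question.
off_topic = {"Mainframe", "Cloud", "Ubuntu/Canonical"}
final_keywords = [k for k in shortlist if k not in off_topic]

# Step 3: pair each remaining keyword with 'Linux' to form the search queries.
queries = [f"Linux {k}" for k in final_keywords]
```

With the placeholder counts above, step 1 leaves the thirteen-keyword shortlist and step 2 the final ten keywords, mirroring the procedure used for the database searches.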
Type of Publication / Publication Title / Number of articles
Journal
 1  Elsevier – Journal of Computers and Security ................................................. 1
 2  ACM – SIGOPS Operating Systems Review ................................................... 2
 3  Elsevier – Journal of Information Economics and Policy ............................... 1
 4  Journal of Management Science ...................................................................... 1
 5  Elsevier – Journal of Systems and Software ................................................... 2
 6  MIS Quarterly .................................................................................................. 1
 7  Library Hi Tech ............................................................................................... 2
 8  Computing in Science and Engineering .......................................................... 1
 9  International Digital Library – Perspectives .................................................... 1
 10 Journal of Academic Librarianship ................................................................. 2
 11 Elsevier – E-Commerce, Internet and Telecommunications Security ............. 1
 12 Springer – Knowledge, Technology and Policy .............................................. 1
 13 ACM – Transactions on Security .................................................................... 1
 14 Journal of Corporate Accounting and Finance ................................................ 1
Magazine (Peer-Reviewed)
 1  IEEE Security and Privacy ............................................................................... 1
 2  ACM Queue ..................................................................................................... 1
 3  Elsevier – Network Security ............................................................................ 1
 4  IEEE Software .................................................................................................. 8
 5  SSM IT Professional ........................................................................................ 1
 6  The CPA Journal .............................................................................................. 1
 7  Library Journal ................................................................................................. 1
 8  IEEE Computer ................................................................................................ 2
 9  Elsevier – Computer & Security Report .......................................................... 1
Conference Proceedings
 1  IEEE Proceedings ............................................................................................. 8
 2  Proceedings of the Workshop on Standard Making – A Critical Research
    Frontier for Information Systems .................................................................... 1
Other
 1  Forrester Research ............................................................................................ 1
Table 3 – Final set of publications for literature review
Figure 3 – Classification of final set of publications for literature review
Review of Topics
The final set of publications that were used for the literature review were collated and briefly
summarised to better analyse their topics, content, direction, lines of inquiry / research, and how they
would fit with the research to be performed in this paper. The papers have been broken down into three
broad areas – the first covers Linux architecture (Table 4), second are papers that involve comparison
to Windows and other operating systems (Table 5) and finally publications that are focused on adoption
of Linux/other open source software or operating systems (Table 6).
[Pie chart: Journal, 18; Magazine, 17; Conference Proceedings, 9; Other, 1]
Linux Architecture
Paper Description
Lu, et al. (2014) In depth study into Linux file system evolution and its features – demonstrating the Ext4 file system is ruggedised enough for use.
Harji, et al. (2011) Demonstrates that different kernel versions of Linux have major performance variations between them.
Dukan, et al. (2014) An analysis of performance versus power consumption between Intel/AMD and ARM based processors using Linux – concluding that type of processor architecture is becoming irrelevant – an indication into the future direction of computing.
Xiao and Chen (2015)
Comprehensive study into potential logging overhead issues in Linux when adaptive auditing is not used.
Thiruvathukal (2004)
Further evidence of distribution fragmentation in this paper, as well as a look at some of Linux’s perceived weaknesses – such as hardware support and binary package dependencies.
Radcliffe (2009) Paper that comparatively examines how access to hardware is controlled using Linux, FreeBSD and Windows.
Shankar and Kurth (2004)
An evaluation into security implications for open-source code, such as is used in the Linux kernel.
Harji, et al. (2013) Discussion into the complexity and problems encountered during Linux kernel upgrades.
Table 4 – Content overview of papers used for literature review relating to Linux’s architecture
Comparison to Windows / Other Operating Systems
Paper Description
Bean, et al. (2004) A paper concerned with establishing that open source operating systems and software in general are primed to perform on a par with their proprietary counterparts.
Chaudri and Patja (2004)
Shows that whenever possible, Microsoft has sought to perpetuate their operating system monopoly through the use of litigation.
Macedonia (2001) Focussed on Linux’s inability to compete with Windows as the PC gamer’s platform of choice and the reasons why.
Massey (2005) This article discusses a 2005 open source software conference, where claims were made that 2005 would be the year that Linux would break through on the desktop.
Goth (2005) Talks about how Linux and other open source software has matured to rival commercial software, and that how to move to open source is more important than whether or not.
Sanders (1998) Highlights how Microsoft assimilates functionality of emerging software to aggressively dominate the software industry to the detriment of others.
Hilley (2002) Establishes that as early as 2002, governments and government agencies across the world begin Linux adoption programs, much to Microsoft’s chagrin.
Dougherty and Schadt (2010)
A case study that demonstrates that widely used Windows applications have Linux based alternatives, whilst cautioning that some software may never have an alternative.
Coyle (2008) Shows where Linux lags behind versus its contemporaries, and that there are hundreds of distributions to choose from.
Kshetri (2007) Argues that software piracy (principally of Windows) takes away potential Linux market share on the desktop.
Dedeke (2009) Proposes the idea that Linux is not necessarily better than Windows from a vulnerability perspective.
Tsegaye and Foss (2004)
A comparative study into both Windows and Linux device driver implementation, praising Windows’ better ability to work on a plug and play basis versus Linux.
Salah, et al. (2013) A review and analysis in to security concerns when deploying commoditised operating systems.
Casadesus-Masanell and Ghemawat (2006)
Provides a close look at what motivates contributors to Linux and other open source development projects, and how Linux’s availability causes competitors such as Microsoft to reduce their pricing to remain competitive.
West and Dedrick (2001)
Study into the rise of Linux – primarily focused upon the motivations of suppliers and buyers of complementary assets, as well as how Microsoft reacted to this changing of the landscape.
Stange (2015) Highlights that in an IT environment it is common to find a mixture of different operating systems being used.
Table 5 – Content overview of papers used for literature review comparing Linux to
Windows/other operating systems
Adoption – Reasons for, Costs, Drawbacks, Risks, Benefits
Paper Description
Giera and Brown (2004)
A comprehensive research into the costs, drawbacks and risks associated with migrating to open source software – specifically the differences versus commercial software.
Young (1999) An article that argues that the claim that Linux systems potentially have a lower cost of ownership across the lifecycle may be naïve despite the fact the operating system is free of charge.
Lewis (1999) Asserts that open source software does not become mainstream unless commercialised.
Leibovitch (1999) An early case study into an all-Linux enterprise – weighing up Linux’s strengths against the barriers to its acceptance. Despite being an older paper, the same arguments appear to hold true against contemporary literature, making it a valuable primary source.
Ven, et al. (2008) Examination of advantages and disadvantages of open source software adoption – specific to Linux.
Gwebu and Wang (2010)
An exploratory study of the user perceptions of open source software adopters – to see if there are different mind-sets involved in those who decide to adopt.
Auger (2004) Discussed how older hardware can be repurposed by using Linux, by stripping away unnecessary features and overhead, thus leading to cost savings.
Maddox and Putnam (1999)
A paper highlighting both positives and negatives to Linux adoption, mainly from a cost centric view.
McLaren (2000) Paper that discussed the notion that Linux is essentially an unsupported operating system.
Delozier (2008) Another paper that discussed Linux fragmentation – many distributions and desktop environments – however does positively propose software alternatives to commercial applications on other platforms.
Chau and Tam (1997)
An exploratory study into factors that impact the adoption of open source software and systems.
Kirby (2000) More evidence of Linux distribution fragmentation, with discussion of Linux’s support for various hardware platforms and its suitability for extending the useful life of old hardware – argues that the cost of software is not the sole utility of Linux.
Mustonen (2002) Research undertaken into the economic logic of Linux and other open source applications.
West and Dedrick (2001)
Conference paper that establishes the reasons for the rise of Linux, and presents research into the adoption motivation of various organisations between 1995 and 1999.
Anand (2015) Another paper that discusses fragmentation in Linux desktop GUIs and distributions, but also establishes positive reasons for using a Linux distribution.
Dedrick and West (2004)
Exploratory study into the various factors influencing open source platform adoption, and the processes used to evaluate and then implement such technologies.
Ajila and Wu (2007)
Empirical study into factors that cause an effect on open source software development economics, as well as understanding the steps involved in open source software adoption.
Kshetri (2004) Comparison of macro and micro influences in decision to adopt Linux in developing nations – asserts a lack of interoperable software is an issue requiring attention.
Dedrick and West (2003)
Looks at the consequences of adopting standards, from technological, environmental and organisational standpoints.
Bokhari (1995) Establishes that a high level of system administrator competency is required to support Linux in a networked environment.
Decrem (2004) Article that looks at obstacles to Linux’s broader adoption on the desktop – several of which are established.
Table 6 – Content overview of papers used for literature review concerned with the adoption of
Linux/other open source software
Conclusions
As pointed out by Stange (2015), it has become commonplace to find an amalgam of different operating
systems within an IT operation. In the past, supporting Unix-like operating systems (such as Linux) in
a networked environment necessitated a high level of system administrator skill and proficiency in
order to maintain both system and network stability (Bokhari, 1995).
Despite this, towards the latter half of the 1990s, organisations began to sense that there was value in
exploring the possible adoption of open source operating system software primarily to avoid the
constrictions that were imposed by the use of proprietary software (Chau and Tam, 1997). Perhaps
sensing this, at around the same time, Microsoft had begun to embark upon an aggressive strategy of
incorporating any well-received new features introduced by other software companies into their own
Windows operating system with the overall effect (and probable motivation) of removing most of their
competitors from the market (Sanders, 1998).
Around this time, cost implications began to enter the discussion. Some held the position that in order
to succeed, Linux would have to become commercialised and become a chargeable product (Lewis, 1999).
Others argued that whilst Linux is free of charge, its actual total cost of ownership makes it more
expensive than Windows – mainly because more must be spent on software maintenance and support than
would be spent on Windows – fitting quite logically into Bokhari’s ‘administrator skills proficiency’
requirement previously mentioned (Young, 1999). This argument is further elaborated upon by McLaren
(2000), who states that whilst being free is Linux’s biggest selling point, it is an operating system
that is essentially unsupported.
However, arguing against Young in the same edition of the same publication, Kirby stated that a concern
with cost(s) alone detracts from the utility of Linux, especially as it can be used to increase the
longevity of hardware beyond traditional vendor-supported lifecycles, and would therefore offer an
advantage against its commercially available operating system rivals (Kirby, 2000). The same benefit
was also highlighted by Auger (2004).
Around the same time (in the late 1990s), the earliest case studies of corporate Linux use began to
appear. One such case study focused on a Canadian start-up called Starnix, which adopted Linux for its
primary technical strengths of scalability, flexibility and reliability (Leibovitch, 1999). Again the
topic of support was highlighted as an impediment to the widespread adoption of Linux, although in the
case of Starnix it was not an issue, because the Unix background of its team provided the complementary
skill set required to support their set-up.
At a similar juncture, papers began to appear that charted, studied and analysed the rise of Linux (the
two papers by West and Dedrick, 2001). It has already been touched upon in the introduction
that Linux’s rise to prominence is as a direct result of the Internet. The two aforementioned papers
discuss that often, new platforms (be it hardware or software) only become acceptable to IT
departments, ordinarily resistant to change, when these platforms are used to introduce new usage case
scenarios – and specifically in the case of Linux, its most common early usage cases were Internet
centric – being used for web services, firewalls, security and other such similar services.
Once more, those papers are in agreement with the sentiments already written that relate to Linux
requiring support staff of technical sophistication, the cost saving benefits through the usage of pre-
existing hardware and the need for industry giants such as IBM and HP to throw their weight behind
the commercialisation of the operating system in order to be better positioned against Microsoft, which
by 2001 had begun to happen. Dedrick and West also discuss the notion of complementary assets – i.e.
in order for Linux to gain traction, those industry giants must provide a complementary basket of both
hardware and software, which would theoretically encourage more widespread adoption of both, hand in
hand (also discussed by Decrem, 2004). They also warned against the concept of “forking” – essentially
the failure to adopt a common standard – becoming an impediment to Linux adoption.
This concept of fragmentation (or, as Dedrick and West put it, “forking”) is in all likelihood one of
the central contributors to the complication and confusion surrounding Linux adoption. Many papers have
highlighted the myriad of available Linux distributions, and this abundance of flavours makes
organisations more loath to adopt Linux (Kirby, 2000; Anand, 2015; Delozier, 2008; Coyle, 2008;
Thiruvathukal, 2004; Decrem, 2004).
Subsequent research was carried out in 2003 and 2004 aimed at investigating the reasons that might
influence the adoption of open source software (Dedrick and West, 2003) (Dedrick and West, 2004).
Their research ascertained that the choice of server software did not affect how the general employee
populace viewed their computing experience – one interview respondent said “(the users) don’t know,
(and) don’t care” – meaning that so long as the underlying platform is not obvious, it has little effect
on the end user. Once more the need for complementary Linux skills was highlighted as an obstacle to
adoption. The most prominent issue was the potential inability to run third party applications on Linux
(also corroborated by Kshetri, 2004 and Decrem, 2004). However, several advantages were cited, namely
the reduction of software costs and the ability to repurpose otherwise obsolete hardware – all positives
already discussed. Such adoption decisions are, however, said to be made on an infrequent basis,
probably due to the aforementioned resistance to change.
Some additional adoption factors were uncovered in the same two papers (Dedrick and West, 2003)
(Dedrick and West, 2004): the topics of ‘slack’ and ‘innovation’. With innovation, the inference is
that following a path of innovation leads to the earlier adoption of new technology, and that such
early adoption is a direct result of the strategy laid out by the business, and of how IT is aligned
to it. So, if IT is of central strategic importance to a business, this will lead to earlier adoption
of technology such as Linux.
Looking more closely at the concept of slack, for an organisation that has additional IT department
human resources capacity but limited financial spending power, it begins to make more sense to use the
human resource slack to save money by using a free operating system, because this additional human
capacity allows the time and effort for experimentation with new technology (named by Dedrick and
West as “trialability”) and to learn and therefore fill in the skillset gap, making it no longer an obstacle
to possible deployment.
During the investigation into the relevant literature, it also became apparent that a number of papers
were concerned with comparing Linux to its contemporaries – in the main, with Microsoft’s Windows
operating system. One of several strengths of Windows on the desktop is its dominance in computer
gaming. Linux has overall failed to dent the computer games market – primarily due to its inability to
support Microsoft’s DirectX graphics API, along with audio driver issues (Macedonia, 2001). At the time
of writing there is still no native DirectX support on Linux.
It was not only businesses and organisations that were investigating potential migrations to the Linux
platform – governments worldwide began feasibility studies with the serious intent of migrating away
from proprietary platforms. These included both the German and United States governments, despite
Microsoft’s best attempts to propagate the notion that open source operating systems were inherently
insecure compared to its own (Hilley, 2002).
This movement gained further momentum as, Bean et al. (2004) point out, major computer industry players
like Hewlett Packard and IBM were heavily marketing their Linux-based hardware – with IBM even using
its own employees as field testers for Linux (on the desktop) to ascertain its impact on worker
productivity. Not taking this lightly, Microsoft began an aggressive campaign of litigation in order to
maintain the status quo of its monopoly (Chaudri and Patja, 2004).
However, there were still several advantages to using Windows over Linux. One of those advantages
related to hardware support (Tsegaye and Foss, 2004). They stated that ideally the design of device
drivers should reduce the necessity for end user interaction in order to allow the full functionality of the
device in question. Windows handles this rather better than Linux, especially when it comes to plug and
play operability. For an end user on a desktop, this kind of ease of use is, to say the least, rather
important.
The year 2005 was talked about as the year that Linux would finally break through into mainstream
desktop use (Massey, 2005). The ecosystem of Linux had matured to a point that it could now be
considered as being on par with its commercial rivals, with the question of whether one should move to
open source software evolving into how one would make the leap (Goth, 2005).
These developments, and the changing attitudes towards Linux and open source software in general,
led to Microsoft reducing the pricing of its software because of the availability of Linux, in order to
remain competitive – as Linux could just be downloaded free of charge (Casadesus-Masanell and
Ghemawat, 2006). Casadesus-Masanell and Ghemawat also proposed the idea that software piracy of
Windows has a detrimental impact on the installed base of Linux (a sentiment also echoed by Kshetri,
2007).
Following on, further comparisons were made between Linux and Windows in terms of security
vulnerabilities. Dedeke (2009) wrote that whilst Linux has an overall perception of being more secure,
and therefore less vulnerable, than Windows, his analysis of both Red Hat Linux and Windows between
1997 and 2005 indicated that Red Hat had more reported vulnerabilities during that time span, and that
it was a fallacy to assume Windows was inherently insecure compared to Linux. Salah et al. (2013),
meanwhile, warn that overall most operating systems have security flaws.
Dougherty and Schadt (2010) referred to the availability of applications on Linux (such as OpenOffice,
Rhythmbox and Firefox) whose utility was equivalent to that of similar applications available on
Windows (such as Microsoft Office, iTunes and Internet Explorer). They further elaborated that whilst
there were like-for-like applications for many usage case scenarios, making the choice to use Linux
did exclude the ability to use certain applications that may never be ported over to or made for
Linux, and this consideration should not be taken lightly.
Architecture-wise, Linux also has some hurdles to overcome. One major concern centres on kernel
upgrades: knowing when to upgrade kernel versions (and to which version) is a serious concern
(Harji et al, 2011) (Harji et al, 2013). There are significant performance variances between different
kernel versions, and without referring either to benchmarks found online or to on-the-job testing, it
is difficult to know at what point to upgrade, or not upgrade, the kernel. Further, in the researcher’s
own experience, a kernel upgrade can break graphics driver dependencies, rendering the GUI portion of
Linux unusable, as the graphics drivers are compiled against whatever version of the kernel was
available at the time, and must be recompiled against the newer kernel.
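The driver mismatch described above can be checked for mechanically. The following is a minimal sketch, assuming the conventional Linux /lib/modules layout (the helper name is illustrative, not tooling used in this research): it tests whether a module directory exists for the kernel currently running, which is the condition that out-of-tree graphics drivers depend on.

```python
import os
import platform

def modules_built_for(release, modules_root="/lib/modules"):
    """Return True if a kernel-module directory exists for the given release.

    Out-of-tree drivers (e.g. proprietary graphics drivers) are built against
    a specific kernel release; if no module tree exists for the running
    kernel, those drivers will need to be recompiled.
    """
    return os.path.isdir(os.path.join(modules_root, release))

if __name__ == "__main__":
    running = platform.release()  # e.g. "4.6.3-300.fc24"
    if modules_built_for(running):
        print(f"Module tree present for running kernel {running}")
    else:
        print(f"No module tree for {running}: out-of-tree drivers need a rebuild")
```

On DKMS-managed systems the rebuild itself is typically automated, but the underlying version check is the same comparison sketched here.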
Figure 4 - Output – a simple definition, a conceptual model (Dimensions)
Literature Gap
What has been ascertained from the review of the literature is that from 2009 onwards it has become
much harder to find new research into Linux adoption on the desktop. This could be due to changing
patterns of computing: as Dukan et al (2014) note, traditional PC-based desktop computing is becoming
less relevant in an era of portability, driven by the low power consumption processors used in
mobile/tablet devices, low power sensor networks and the lightweight Linux-kernel-based operating
systems that power them. Alternatively, the reasons established up to 2009 may simply still hold. It
is also possible that Linux (driven on by competing juggernauts such as HP, Dell and IBM) has focused
on its core success areas, such as server-centric application use.
As no further research has been located covering this subsequent gap, this paper intends to cover the
period. Furthermore, the earlier studies are focused primarily on server side technology, in which
Linux has gained widespread acceptance at the time of writing (see the introduction for statistics).
A clear pattern emerged during the ‘adoption’ portion of the literature review: the decision to adopt
is generally influenced by weighing the shortage of skills against savings on software costs, plus the
ability to reuse older hardware (Maddox and Putnam, 1999) (Ven, et al, 2008) (Ajila and Wu, 2007)
(Giera and Brown, 2004) (Decrem, 2004).
As has been demonstrated earlier, whilst Linux does have many comparable applications versus its
Windows nemesis, there is still a lack of applications overall. Issues with device drivers and hardware
support also appear to be relevant when considering which of the two to choose.
When it comes to security, there is some conjecture as to which is the more secure operating system –
with both Windows and Linux, much depends on the attack surface of any given specific system, making
it difficult to argue either way, although the data provided by Dedeke (2009) leans towards Linux being
the less secure operating system.
It is therefore apt to revisit the research topic, given Linux’s widespread adoption on other
platforms, such as servers and portable devices. It stands to reason that a lack of skills (for support
or otherwise) has not hindered its advance on those platforms, so there must be other substantive
reasons behind Linux’s small share of the desktop market when compared with its successful penetration
of the other platforms already mentioned.
Chapter 3
An Experimental Comparison of Linux and Windows
“…When we design and architect a server, we don't design it for Windows or Linux, we design
it for both. We don't really care, as long as we're selling the one the customer wants. If a server
goes down the production line, it doesn't really know what OS it has on it…”
(Michael Dell, interview with PC Magazine, 3 February 2004 (Miller, 2004))
One of the key contentions of this paper is that Linux, from a functionality and performance perspective,
is comparable to Windows, and therefore should be discounted as a reason for its lack of adoption.
Whilst it is apparent that the two do not share the same lineage, it is assumed that the performance of
Linux is not a reason behind its lack of desktop market share. According to research performed by
Dedrick and West (2003 and 2004), one respondent said “(users) don’t know, (and) don’t care (about the
operating system in use)” as long as they are able to adequately perform the tasks they want to perform.
In order to prove or disprove the aforementioned assumption, an experiment was undertaken between
14 and 17 July 2016 to compare Windows 10 Professional and Linux Fedora Workstation 24. Fedora
was specifically chosen as this is the Linux distribution of choice used by Linus Torvalds (Torvalds,
2014). Various measurement metrics were defined, and are elaborated upon in the experimental
procedure section that follows. Originally, FreeBSD had also been considered as an operating system
candidate for the experiment, but as a GUI has to be separately installed, it was decided to withdraw
FreeBSD due to time constraints, as its withdrawal would not have a material impact on the research
focus of this paper.
Experimental Procedure
A Lenovo X201 laptop (manufactured in 2010) was chosen. As identified in the literature review
section, Auger (2004), Kshetri (2004), Decrem (2004) and Kirby (2000) had written that the longevity
of older hardware can be extended if the hardware was repurposed by having Linux installed as its
operating system. Therefore, the experiments would also be a logical extension to the existing body of
research work. The hardware used was as follows:
Processor – Intel Core i5 M540 2.53GHz
8GB RAM (DDR3-1066MHz)
SanDisk Ultra Plus 256GB SSD Hard Drive
9 cell battery
12.1” WXGA LED Display
On-board Intel HD Graphics Adapter
External LG GP30NB30 Slim Portable DVD Drive
SanDisk Cruzer Blade 8GB USB Drive
Sony Xperia Z3+ Smartphone (for timer measurements)
The experiment was broken down into five broad areas:
Installation
Start-up / Shutdown
I/O intensive operations
Processor intensive operations
Power management
Stage 1 - Installation
Prior to each installation, the SanDisk Ultra Plus 256GB SSD Hard Drive used for the experiment had
all partitions deleted, so that it would present itself to the operating system installer as a new empty
drive. During the installation, default automatic drive partitioning was selected on both Windows 10
and Fedora 24. All default installation options were chosen, and one unique user entitled “unitest” was
created, without a password.
The number of unique interactions – such as pointing device clicks, or keyboard entries used to define
a username, were noted, as well as the number of reboots required to arrive at a working desktop, and
the time taken to complete the installation.
Both operating systems were installed using an external LG DVD drive as the laptop did not have an
on-board optical drive. Once the installation had been completed, both operating systems were updated
to the most current patch levels available from their respective providers. The time to install updates
was not measured, as the media used for Windows 10 was issued in late 2015, whereas the Fedora 24
media was downloaded on the first day of the experiment (14 July 2016), and would not therefore
present data that would be comparable.
Stage 2 – Start Up / Shutdown
In order to test both start up and shutdown performance, several timing measurements were recorded:
The time taken to start up the laptop from the powered off state to the user desktop
The time taken to completely shut down the laptop from the user desktop to the powered off
state
The time taken to hibernate the laptop from the user desktop to sleep mode
The time taken to wake the laptop from sleep mode back to the user desktop
Stage 3 – I/O Intensive Operations
Tests were undertaken to measure I/O performance of the two operating systems using two types of
files:
Small files – 106 files of varying file types and sizes - total 326MB
Large file – 1 Matroska Multimedia Container file (.MKV) containing a 1080p Blu Ray rip of
a film – total 3.88GB
In both cases, the small files, and the large file were subjected to three file move operations, and the
time taken to do so on each operating system was recorded:
Hard drive to hard drive
Hard drive to USB drive
USB drive to hard drive
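The timed move operations above can be sketched programmatically. The following is a minimal illustration (the file name and the 1MB stand-in size are illustrative, not the experiment’s own data): it creates a small file and times a single move with a high-resolution timer, analogous in principle to the stopwatch measurements used in the experiment.

```python
import os
import shutil
import tempfile
import time

def timed_move(src, dst):
    """Move src to dst and return the elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    shutil.move(src, dst)
    return time.perf_counter() - start

# Create a 1 MB stand-in file in a temporary directory.
src_dir, dst_dir = tempfile.mkdtemp(), tempfile.mkdtemp()
src = os.path.join(src_dir, "sample.bin")
dst = os.path.join(dst_dir, "sample.bin")
with open(src, "wb") as f:
    f.write(b"\0" * (1024 * 1024))

elapsed = timed_move(src, dst)
print(f"Move took {elapsed * 1000:.1f} ms")
```

In the experiment itself, the destination would be a mount point on the USB drive rather than a second temporary directory, and timings were taken with an external timer rather than in-process.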
Stage 4 – Processor Intensive Operations
Three sets of processor intensive tests were carried out. In the first test, both operating systems had the
64bit version of Handbrake Open Source Video Transcoder installed and the Matroska Multimedia
Container file from the large file experiment in stage 3 was converted from MKV format to MPEG-4
format (using the default Normal setting) and the duration taken to convert was recorded.
Secondly, Geekbench processor benchmarking software was installed (only the 32-bit version, as a
license must be purchased to use the 64-bit version), and Geekbench benchmark scores were calculated
by the software and noted down.
Finally, WinRAR (for Windows) and RAR (for Linux) – (both 64 bit versions) were used to compress
the Matroska Multimedia Container file using the highest level of compression possible (setting entitled
Best).
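As an illustration of this style of CPU-bound compression timing, the sketch below uses Python’s zlib at its highest compression level as a small stand-in for the RAR runs (the data and sizes here are illustrative only; the experiment itself used RAR 5.4 x64 on the 3.88GB MKV file).

```python
import os
import time
import zlib

# ~1 MB of mixed data: half incompressible (random), half highly compressible,
# standing in for real media content.
payload = os.urandom(512 * 1024) + b"A" * (512 * 1024)

start = time.perf_counter()
compressed = zlib.compress(payload, level=9)  # level 9 is zlib's "best" setting
elapsed = time.perf_counter() - start

print(f"Compressed {len(payload)} -> {len(compressed)} bytes "
      f"in {elapsed * 1000:.1f} ms")
```

Maximum-compression settings such as this trade processor time for output size, which is why the stage 4 results are a useful proxy for raw CPU throughput on each operating system.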
Stage 5 – Power Management
To measure the effectiveness of the power management of both operating systems, the MPEG-4 video
file created during stage 4, was played on a consecutive loop using VLC Media Player, from a fully
charged battery state, until the 9 cell battery was completely discharged and the operating system
initiated a shutdown, and reached that state.
Presentation of Results
                                            Windows 10 Professional          Linux Fedora 24 Workstation
                                            x64, Build 10586.494,            x64, Kernel 4.6.3-300.fc24
                                            Version 1511

Stage 1 – Installation
Installation time                           24 minutes, 41 seconds           20 minutes, 39 seconds
Number of clicks/interactions               16                               15
Reboots required                            3                                1

Stage 2 – Start Up / Shutdown
Start Up Time                               18.35 seconds                    21.10 seconds
Shut Down Time                              9.59 seconds                     6.22 seconds
Hibernate Time                              3.66 seconds                     2.09 seconds
Wake from Sleep Time                        2.17 seconds                     2.09 seconds

Stage 3 – I/O Intensive Operations
Small files – Hard drive to hard drive      7.0 seconds                      3.1 seconds
Small files – Hard drive to USB drive       56.8 seconds                     32.9 seconds
Small files – USB drive to hard drive       56.5 seconds                     16.4 seconds
Large file – Hard drive to hard drive       28.50 seconds                    26.00 seconds
Large file – Hard drive to USB drive        10 minutes 2.3 seconds           8 minutes 52.1 seconds
Large file – USB drive to hard drive        3 minutes 13.1 seconds           2 minutes 44.7 seconds

Stage 4 – Processor Intensive Operations
Geekbench 32 bit single core benchmark      1966                             2028
Geekbench 32 bit multi core benchmark       4043                             4068
WinRAR/RAR 5.4 x64 compress MKV file
on maximum compression setting              8 minutes 56 seconds             9 minutes 50 seconds
Handbrake conversion of MKV file using
normal setting                              1 hour 41 minutes 0 seconds      1 hour 48 minutes 19 seconds

Stage 5 – Power Management
Playback time of MPEG-4 file until battery
discharged from full                        4 hours 40 minutes 45 seconds    3 hours 49 minutes 58 seconds

Table 7 – Results of the 5 experimental stages
Discussion of Results
The installation of Fedora completed just over 4 minutes faster than Windows. It may have also been
possible to install Fedora in less time, as the install media booted first to a live desktop, and then
provided the option to install the operating system. During the initial boot from the optical media, an
option to enter the installation program directly was presented, but the keypress to initiate it did
not register, and the live desktop was booted instead. A probable reason for Fedora’s quicker install
is that the distribution is stripped of unnecessary software (although the default installation did
include the LibreOffice productivity suite and several other potentially useful applications).
Aside from the time taken to start up, Fedora was quicker to shut down, hibernate and wake from sleep.
One reason for Fedora’s slower start-up time is that, although the unitest account was configured
without a password, Windows 10 by default boots directly to the desktop without further interaction,
whereas Fedora requires the username to be clicked/selected on the login page before the GNOME desktop
starts up. However, this does not fully account for, or explain, the (almost) 3-second disparity
between the two.
Fedora performed significantly better than Windows did on all six of the I/O intensive tests carried out.
Fedora uses the EXT4 file system versus Microsoft’s NTFS. The performance results are corroborated
by research undertaken by Safee and Voknesh (no date) who stated that generally file operations of a
sequential nature perform more poorly on Windows compared to Linux.
During stage 4 (processor intensive operations), Windows performed much better than Fedora in both
tests undertaken by the researcher (RAR compression and Handbrake conversion – both using x64
binaries). Interestingly, Geekbench (albeit benchmarked on 32 bit operations due to licensing
restrictions) performed better on Fedora than on Windows. It is entirely possible that Fedora has been
optimised to perform better on benchmarking software – not an entirely unheard-of phenomenon (Cai et
al, 1998) – or perhaps it performs better on 32-bit processor operations. If the latter is the case,
it is of limited practical benefit: at the time of writing, most new desktops ship with 64-bit
operating systems and applications, and should therefore be optimised for the same. Whatever the cause
of the Geekbench results, the real world tests measured during the experiment show that Windows was
far better in this regard.
The final phase, designed to test power management, yielded a startling disparity. Windows, whilst
playing the same video file, using the same media player (VLC), lasted just over 50 minutes longer than
Fedora. In both cases, neither operating system used any third-party drivers to optimise power settings
or consumption, therefore out of the box Windows was demonstrated to be better than Fedora in this
regard.
As stated at the start of this section, part of the working assumption is that the performance of Linux
should not be a reason for its lack of desktop market share. Based on the results, it can be argued
that Fedora performed better than Windows in some cases (I/O, start up/shutdown) and that it is
deficient versus Windows in others (processor intensive operations and power management). Taking a
balanced approach between the two, the research indicates that the operating systems were overall
comparable (albeit depending on the usage case scenario), thus proving to a satisfactory extent that
Linux is similar to Windows from a performance angle, as per one of the tenets of the working
hypothesis.
Chapter 4
Interviews with IT Professionals
“… A research method is a strategy of enquiry which moves from the underlying philosophical
assumption to the research design and data collection…”
(Myers and Avison, 2002)
In the previous chapter, it was established through experimentation that Linux is comparable to
Windows overall, from a functionality and performance perspective, and therefore (lack of)
functionality and/or performance can be discounted as a reason for its lack of adoption. Further research
was therefore required, in order to ascertain and establish what the reasons are for the lack of Linux’s
penetration of the desktop operating system market. Therefore, qualitative interviews were undertaken
with five IT professionals who have a cumulative 96 years of experience working in high pressure
business environments, tasked between them with evaluating, purchasing, maintaining and monitoring
thousands of desktops and servers over the course of their careers.
In the problem statement earlier in this paper, the working hypothesis postulates several reasons why
Linux has not gained traction in the desktop space. Those tenets of the working hypothesis are restated
below for the benefit of the reader:
It (Linux) is not preinstalled on new PCs that are sold
There are too many Linux distributions available, which has led to fragmentation
Different package managers are used by different distributions
Multiple desktop GUI environment choices
A perceived lack of user friendliness and a steep learning curve
Deficiencies in hardware support, especially for graphics adapters
Paucity of available software/native versions of popular applications
In order to prove or disprove the above statements, the aforementioned information technology
specialists were selected and interviewed because of their exposure over many years to a variety of
operating systems, because they have worked, and are working, in diverse industries and geographical
areas, and because they were likely to understand technical complexities and challenges that a layman
may not. It was believed that such candidates would understand the reasons for Linux’s failure on the
desktop much better than a layman would.
For the purposes of this qualitative research, the interviews would be based upon the principle of
‘phenomenology’ (Husserl, 1970). Phenomenology is a method which encourages a respondent to
provide information that is based upon his/her subjective perception of a particular situation. Questions
are asked with the specific intent that the respondent will provide descriptive responses to the questions
posed to them, devoid of the motivations (or assumptions) of the interviewer – thus allowing for insights
into the behaviour, motivations and actions of the interview subject that are not influenced by the
researcher.
Several potential interview candidates who were approached requested that the questions to be posed be
provided in advance. The researcher made every effort not to provide the questions, so as to prevent
pre-preparation, with the specific intention that the research question would not be revealed – as
advance sight of the questions could allow a subject to extrapolate the motivations, assumptions and
actions of the interviewer, rendering the ‘phenomenology’ method of interviewing null and void.
It was intended that the interviews would provide data that either supports or does not support the
tenets of the working hypothesis restated above, and it was believed that qualitative interviews would
provide better insight into the research question than the quantitative methods that could alternatively
have been undertaken.
The overall question framework for the qualitative interviews was created based upon the working
hypothesis and other points of interest raised during the literature review phase of this paper. In total,
up to 29 open-ended questions (refer to Table 8) were to be put to each interviewee, in order to gather
as much data as possible. However, the set of questions was seen more as a guiding framework and
was not (and indeed could not be) rigidly followed, as certain answers elicited during an interview
could (and in fact did) inform questions asked later within the question structure.
The overall structure of the questions started with a more generalised line of enquiry, such as
establishing the respondent’s career history and the general changes to operating systems that they had
observed over many years. The reasoning behind this was to engage the subject in conversation and
encourage them to open up, whilst gradually narrowing the scope of enquiry with each subsequent
question towards specific areas of interest.
Questions and Rationale

1. How old are you?
2. Male or female?
   Rationale: Ascertain demographic information about the interviewee.

3. How long have you worked in IT?
4. Can you describe your career from its start to now?
5. Please discuss the technological changes that have occurred during your working career thus far.
   Rationale: Gain a general overview of the information technology specialist’s employment background and history, and of how information technology has changed during the course of their career.

6. Going through your career, can you discuss the operating systems you have used in a personal capacity, those your employer(s) have used, and how that may have changed over the years?
7. What about on portable devices? Please discuss your experience over the years with those devices, and how they have changed from an operating system standpoint.
8. If the answer to 7 does not elicit from the interviewee that Android uses Linux or iOS uses BSD, inform the interviewee and ask their opinion on that.
   Rationale: Learn about the background and opinions of the IT specialist’s experience with various operating systems, including portable devices, both at work and personally. Also ascertain whether they are aware that these devices primarily run BSD Unix or Linux.

9. Desktop PC sales are supposedly on the wane; discuss. Are they relevant anymore, and why?
10. During your career, have you ever installed an operating system on a desktop or laptop, and if so, what was it? If not Linux based, why not?
    Rationale: Dive deeper into desktop-related topics. Could the lack of penetration be down to the desktop being less relevant in an era of mobile devices? Also understand the operating system installations undertaken by the respondent.

11. What has been your exposure to Unix and Linux?
12. What do you think (or know) about Linux in general?
    Rationale: Start narrowing the questioning down to Linux specifically, initially from an open-ended standpoint.

13. Which Linux distributions are you aware of?
14. Are you aware there are currently 815 unique distributions? What do you think about that?
15. Discuss your experiences with the Linux distributions you are aware of.
16. If the person has in-depth experience, ask about preferred GUIs.
17. If the person has in-depth experience, talk about package management for different distributions.
    Rationale: Focus on distribution-related topics (as well as their respective GUIs and package managers if possible).

18. Is Linux easy to use? Why?
19. In some circles Linux is viewed as difficult to use and as needing substantial training time and effort to be invested. What do you think about that statement?
    Rationale: Opinions on Linux’s ease of use.

20. Why do you think Linux is rarely preinstalled on a new desktop or laptop?
    Rationale: Try to understand the interviewee’s view on why Linux is not preinstalled by manufacturers.

21. In your opinion and experience, discuss hardware support with Linux.
22. Do you think Linux performs well on obsolete hardware? If yes, do you use it on obsolete hardware?
23. If not used on obsolete hardware, why not? If because of using Windows, try to elicit whether a pirated version is in use.
24. If 23 does not answer that, ask whether they think software piracy has an effect on the user base of Linux on the desktop.
    Rationale: Is the support of hardware (such as graphics adapters) an impediment to adoption in the mind of the respondent? Do they believe Linux works well on old hardware? Ascertain whether software piracy is a reason for the lack of adoption.

25. Do you think it is possible to do everything on a Linux desktop that one can do on a Windows desktop? Why?
26. If a lack of applications is not cited, ask how the respondent feels about the availability of applications on Linux versus other platforms.
27. With the prevalence of cloud and web apps, would this no longer be an issue (if they believe a lack of apps is an issue)?
    Rationale: Test the respondent’s understanding of applications on the Linux desktop, and see whether this is viewed as a reason for the lack of adoption.

28. Do users care what operating system runs on their desktop or laptop? Why?
29. If you could set up a network of workstations from scratch with a limited budget, would you consider Linux? Why?
    Rationale: See what the general opinion is based on their perceptions of their users, and whether Linux would be used to save costs on software licensing, or whether there is simply a general bias against using it.

Table 8 - Qualitative Interviews - Definitions and Measurements
Interview Procedure
The interviews were conducted between 26 July 2016 and 1 August 2016 and were recorded, in case
future inspection was required to verify the research undertaken. Additionally, transcripts of the
interviews were written up; they form Appendix B of this paper.
The respondents were advised that their names and the names of any employers (both past and present)
would not be published, to encourage openness and to build trust between the interviewer and the
interviewee, unless they expressly requested publication. Furthermore, those interviewed were advised
beforehand that the interview would concern their knowledge of operating systems in general, rather
than Linux specifically, in order to elicit as much data as possible (even if some was not relevant to
the purpose of this paper) and to avoid pre-preparation on their part.
Interestingly, all of those interviewed waived their right to anonymity and were happy for their real
names, as well as names of organisations (if provided) to be published. The respondents were also
advised that they would be offered a copy of the completed thesis paper, once submitted to the
university, to ensure that the process on the part of the interviewer was transparent and that their answers
were published exactly as they had answered them – and also because once the end of the questions had
been reached, the interview subjects all wanted to know the result(s) of the research.
Discussion of Results
First of all, only one respondent was able to answer question 16, about the different GUI options
available on Linux, authoritatively. Secondly, question 17, regarding the different package managers,
was asked in only one interview: it was felt that it would disrupt the flow of the conversation, and the
respondents appeared unlikely to be able to answer it. The researcher felt that omitting it would not
detract from the interviews, as questions were still asked about distributions and software in general.
As a result of the incomplete answers to questions 16 and 17, two parts of the working hypothesis
could be neither proved nor disproved, and they were dropped for the balance of this paper.
Those two parts to be dropped were:
Different package managers are used by different distributions
Multiple desktop GUI environment choices
Overall, each of the five respondents noted a similar path of progression with their experiences with
operating systems. This path generally followed a DOS -> Windows 3.1 (or 3.11) -> Windows 95 and
so on and so forth pattern.
“…we were using this traditional operating system called MS-DOS 3.0 and then the evolution of the
graphical applications with Windows 3.1 was an amazing thing in front of us. And then say going to
the development of this OS by Microsoft of Windows 95, 98, and the other things. It made a
revolutionary change…”
“…Operating systems – mainly Win systems all the way from Win something and then Win NT and 95
and above…”
“…So I started off personally using DOS, it used to be DR-DOS, then MS-DOS, so then you had
Windows 95, as its own operating system. Then I’ve used Windows NT, XP, Windows 2000…”
All of the respondents have had exposure to Linux, to varying degrees. In two cases, the first
exposure came from free CD media provided on the covers of computer magazines, which at the
time (the mid to late 1990s), when Internet penetration was less widespread, appeared to be a vital
enabler in the spread of knowledge and information to IT literate individuals (or those who wanted to
become IT literate).
“..We used to get CDs with magazines and the CDs used to contain a lot of software. So it was through
that that I came to know about Red Hat and Suse…”
“…they were popular (computer magazines) at the time before the internet and they used to come with
a CD stuck to the front with some software for you try … And that was way to find out new things and
try out new things and on one edition there was a full version of Linux to install…”
Most of the respondents agreed that the desktop as a platform was gradually becoming less relevant in
an age of portability, though it was agreed that there would still be a place for such hardware in
certain cases where large screens or other high end hardware are required for very specific tasks. The
respondents overwhelmingly pointed towards the paradigm shift towards mobility – an area which all
respondents were aware is dominated by Android (Linux kernel) and iOS (BSD Unix and Mach
kernel).
“…the desktop environment is slowly getting phased out and it is getting into a different type of working
environment…”
“..some of the users require large amounts of storage, where expansion is required, additional
expansion – like those who have high end graphics requirements…the desktop will not phase out from
the market, or for the end user completely … The difference is the demand will not be the same as
before…”
“…in another 5 years the complete, complete, computing platform will be changed with this portable
equipment…”
“…we are using more portable devices like laptops, tablets, phablets and since the UI is more web
based the need for desktop PCs as such is not really there. Especially desktop PCs are not really needed
when we do not have a need for severe client resources like the old systems used to…”
“…there still is a place for that (the desktop) – one, the computing power and two, the form factor -
sometimes you do need to sit at a desk, have a full sized keyboard, a full sized monitor and have a mouse
for input, the ability to use those peripherals to do your job…”
“…I think desktops are becoming less relevant now…so as we going into this mobile era especially,
portable devices, laptops, there’s a bit of a market but we can see things declining there. Desktops are
losing their market share for sure. I mean people want mobility. There may be specific functions, maybe
something high end workstations where you are doing some sort of engineering or drawing – things
like that, which require a lot more resources and necessitate desktops. But I think people are shifting
more towards just getting their work done…”
During the literature review, there was some evidence of software piracy being responsible for Linux’s
reduced desktop market share (Casadesus-Masanell and Ghemawat, 2006, and Kshetri, 2007).
However, this was contradicted by the interviews undertaken, in which the notion was dismissed as a
major contributing factor. It should be pointed out that the literature referred to predates this research
by almost 10 years, and may have been more relevant previously.
“…I don’t think it’s down to piracy. I think it’s down to what people are already familiar with and what
they have…”
“…under those circumstances it shouldn’t really be too much of a piracy issue for Linux…”
“…nowadays nobody is really using a pirated operating system…It’s my opinion, nobody will be
looking for any pirated software…”
A point of interest is that it was a generally accepted opinion that an end user does not care what
operating system is running on their device – whether it is a desktop, laptop, or a portable device.
This corroborates earlier research identified during the literature review (Dedrick and West, 2003, and
Dedrick and West, 2004). It was clearly established that an end user has a set of generally repetitive
tasks to undertake (or perhaps overall repetitive patterns of use) – be it for work or for leisure – and
they expect to be able to accomplish those tasks irrespective of the underlying operating system.
“…No, they just want to be familiar, they just want to get their job done, they just want to be able to do
it…”
“…Say, for performing the task – how much, how quickly can they do it, how easily can they do it. That
is a factor which the user will consider when choosing the OS (to use)…”
“…From a personal use (perspective) I don’t think so. From a business use I think whatever makes
them more comfortable as long as they can deliver…”
“…unless if I have specified it to them (the users), they would not know what is the operating system
(in use)...nobody is getting into the operating system core capabilities. Their experience on functionality
is based on core application level experience not on the operating system…”
Referring to the previous paragraph, one tenet of the working hypothesis was that the lack of
available software/native versions of popular applications is a contributing factor when answering
the research question. What became apparent from the discussions is that one key suite of
applications is missing from Linux distributions: Microsoft Office. Whilst alternatives are available,
it would appear, reading between the lines, that this makes no difference to the perception of users.
“…they expect to find a piece of software just there available, such as Word, Excel, their Outlook…”
“…maybe that is because having used Windows systems for so long, but I feel much more comfortable
working with Windows Excel than Google Excel (Sheets)…”
“…one of the challenges is that most of the applications, say around 75% of the applications are
available or programmed for Windows…Applications availability is very poor under Linux, the Linux
platform…”
“…So they start off talking about ok what are the common applications we use, so say Word, Excel…
Can I use Word and Excel? Some of the features are not as exactly the same as apples for apples and I
think that’s the issue…I think the question they would ask if about applications – Can I do this? Can I
do that? Can I use Word and Excel? For me it’s a bit about the compatibility of the other applications
– Windows has that edge over the others...”
“…It depends upon the compatibility. Some of the applications, the compatibility…”
“…there are a lot of functions that are missing from that, that are only available in the Windows version
right…But from a user perspective, I think the applications are quite limited. So you have your own set
of applications, I think Linux has it, but OS X has its own version of a word processor, or a spreadsheet,
things like that – but functionality wise it’s not up to the mark as some of the Windows Office suite
applications are…”
During the interviews, the respondents were in overall agreement that the reason Linux is not often
preinstalled on new PCs is primarily familiarity (or, in Linux’s case, the lack of it) on the part of the
end user. This feeds into the operating system progression established in most of the interviews,
where the respondents themselves followed the path discussed earlier in this section. It can therefore
be argued that the same is true for the general user populace, who are exposed to Windows from an
early age, usually at school, and so naturally gravitate towards what they already know. In addition,
Microsoft’s deals with OEM manufacturers to preload its operating system are also a contributing
factor, but not the substantive reason.
“…There’s few people who are familiar with it, so the level or knowledge in your typical family,
whereas they’d know Windows already…”
“…People are taught Windows at school…”
“…I would put it under something called an oligopoly, which is something practiced by Microsoft. So,
once they have captured the market, 90 plus, 95% plus of the market, then they can pretty much dictate
or collude with various manufactures to ensure that their systems get on board…”
“…well the consumer market has not accepted Linux, mass consumers have not adapted to the Linux
environment. Every user has adopted the Windows environment. If anyone buys a laptop, anyone would
go for only a Windows operating system. Even with consumers, any business that is selling in the market
they would rather sell the Windows environment than a Linux preinstalled piece of hardware, unless of
course it’s a mobile…”
“…I don’t know if there’s some sort of OEM contract in place or something like that, but one guess I
would have to make would come down to user preference – what’s the most popular OS that they are
used to…for the masses if you look at it everyone’s most familiar with or aware of is Windows – which
I think is what sells. So someone’s going out there to buy a laptop and they come with Linux installed I
don’t think they have such a big market share…”
When those interviewed were questioned about their knowledge of Linux distributions, the answers
tended to centre upon Red Hat, Suse and Ubuntu – this tallies well with the keywords established from
the content analysis undertaken during the literature review section (refer to Figure 2). None of the
respondents was aware of the vast number of distributions available, so the fragmentation of
distributions can be disregarded as an influencing factor: those interviewed could not be influenced by
something they were not aware of.
“…If someone told me that there was 200 I’d have thought well possibly, but 800 sounds like quite a
lot…”
“…Because it is an open platform, anybody will be able to use their ideas and develop their own OS.
This is actually adding more value and power to this particular OS because the contribution from
multiple people and they have the liberty to take their own ideas into this OS – and that’s the reason
why so many versions have been developed…”
“…Wow. It’s almost like, it feels like a fragmented market…”
“…but I am surprised to see that 800 variants or different flavours (exist)…”
“…I knew there were a lot, but I didn’t expect it to be that many. Definitely over 100 but that’s amazing.
I think it’s a good and bad thing…but in terms of a regular user I think they would find it difficult if
there isn’t a common standard across these distributions. To me it’s a good and bad thing. Each person
has a flavour for what they want, or want to try. So they have many options, but in terms of
standardisation and people having to keep track of different commands and different ways to do things,
that could be a downside to it…”
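The lack of a “common standard across these distributions” that the final respondent describes can be illustrated concretely: each major distribution family ships its own package manager, so even finding the right install tool differs from distribution to distribution. The following POSIX shell sketch (an illustration by way of example, not part of the study’s instruments) probes for whichever tool is present:

```shell
#!/bin/sh
# Illustrative sketch only: each major distribution family ships a different
# package manager, so a script must first discover which tool exists here.
#   apt-get : Debian, Ubuntu       dnf/yum : Fedora, Red Hat
#   zypper  : SUSE/openSUSE        pacman  : Arch
found=""
for pm in apt-get dnf yum zypper pacman; do
  if [ -z "$found" ] && command -v "$pm" >/dev/null 2>&1; then
    found="$pm"   # remember the first tool present on this system
  fi
done
if [ -n "$found" ]; then
  echo "Package manager found: $found"
else
  echo "No recognised package manager on this system"
fi
```

The install command lines themselves differ in the same way (`apt-get install`, `dnf install`, `zypper install`, `pacman -S`), which is precisely the standardisation problem for a “regular user” that the respondent identifies.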
From a hardware support standpoint, those interviewed generally believed that hardware support in
Linux was adequate. The general consensus was that Linux runs well on hardware of differing levels
of computational power and/or age. However, particular reference was made twice to graphics adapter
support, which was part of the working hypothesis. This tenet is therefore considered proved, although
it is not the major contributing factor to the lack of market share, merely part of the reason.
“…Linux does have its deficiencies on the desktop - I’d say mainly down to graphics…”
“…compatibility is one of the challenges we face both with Linux and Solaris. Some of the devices are
not recognised, and the drivers are not available and the functionality is restricted…so that way there
are huge challenges when it is coming to this OS…”
“…Yes, the hardware was problematic. The drivers especially. You had to look for these compatible
drivers. It wasn’t plug and play, so everything at that time I had to try and download several drivers to
find one that would work. It was problematic… I think mostly it was printers, network cards, I think
graphic cards…”
In the main, it was also established that there is a substantial learning curve when adopting Linux, as
well as user-friendliness concerns. However, as several of those interviewed pointed out, this is most
likely a result of many years of user exposure to Windows. This learning curve was also cited by one
respondent in relation to Apple’s OS X operating system, where well known user commands such as
the right-click are absent – something that Windows users have been used to since the early 1990s,
with their exposure to Windows 3.1/3.11 onwards. Whilst on the face of it such matters may seem
trivial, they are not when a user just wants to perform his or her particular patterns of use. Therefore,
this part of the working hypothesis is also considered proved.
“…So, the problem is the dominance of Windows has been there for so long that it becomes so familiar
when using the system. Just things like right click which on the Mac is a little different and people find
that difficult, so why I said I don’t think they will be widespread adoption is that people are so familiar
with the shortcuts and how to navigate through, I think that has an influence on their decision…”
“…people like us who are brought up on Windows - we know Windows inside out, and then move to
another operating system have to learn everything again…”
“…On top of that, once the users are thoroughly trained, then they, there is reluctance on their part,
on their side to want to learn or migrate to something else…”
“…what I would say is that the application that is extensively used is press the button, wait for the
operating system to load. During that time, they must be looking around, looking at the phone, having
some coffee or something, they don’t care how it comes up…other than that I don’t think that anyone
is really noting what is an operating system. Back then they didn’t notice what was the operating system
and now also they are not knowing that…”
“…I would say that’s fairly accurate. Especially to a person, coming from my background…setting up
the Squid proxy, it did take some time to pick it up so there is some training, even though it was self-
learning. But if you are planning to deploy this, you know say in an office place you would need some
training to get used to it...Over the years, just because they are so use to one OS it could be down to
that…”
Chapter 5
Conclusion
“…The desktop hasn't really taken over the world like Linux has in many other areas, but just looking at my
own use, my desktop looks so much better than I ever could have imagined.…”
(Linus Torvalds, speaking at the Embedded Linux Conference, 2016) (Bhartiya, 2016)
The research question for this paper is “Why has Linux, despite its popularity on many platforms, failed
to be successful on the desktop?” To the satisfaction of the researcher, the two-pronged research has
answered that question: it is almost completely due to the lack of popular desktop applications. On the
most popular desktop operating system platform (Microsoft Windows), Microsoft’s Office suite is
what one could term “the killer app”.
The idea of a platform either succeeding or failing based on the notion of a killer app was also raised
by West and Mace (2010), when they discussed the runaway success of the iPhone. In that particular
case the killer app was the Safari web browser because it could readily access and take advantage of
the estimated 1 trillion web pages available at no cost to users with desktop browsers, in an era when
mobile operators still operated a ‘walled garden’ of services – offering their own selective content whilst
charging their customers an additional subscription cost to access that content.
Linux’s lack of a killer app on the desktop, and its overall lack of third party applications, is considered
by the researcher to be the primary reason for its failure to succeed in the desktop market, based upon
the findings of the research undertaken. This issue was discussed in the literature review section –
citing papers from the early 2000s (Dedrick and West, 2003, Dedrick and West, 2004, Kshetri, 2004
and Decrem, 2004) – and, as evidenced by the data collected during the qualitative research, clearly
nothing has changed in the intervening years.
The lack of third party applications has also been responsible for the failure of both Blackberry’s BB10
operating system (Reilly, 2016 and Spence, 2013) and Microsoft’s Windows Phone operating system
(Warren, 2015 and Thurrott, 2016) – so this contention is backed by compelling real world evidence.
Specifically in the case of Blackberry, the operating system kernel was not versatile enough to be
successful on other platforms, whereas with Windows Phone there is still an opportunity, due to
Microsoft CEO Satya Nadella’s Continuum (phone as a PC) strategy for Windows Phone:
“…three years from now, I hope that people will look and say, ‘Oh wow, that’s right, this is a phone
that can also be a PC’…” (Thurrott, 2016). Ubuntu is also working on a similar approach with its Unity
8 UI, which aims to converge desktop and portable devices (Wallen, 2016).
The other key reason for Linux’s desktop failure is that users in the general computing populace
have become used to Windows, and have evolved with Windows as it has evolved – a point that
became readily apparent during the interview research undertaken. Microsoft gained its foothold on
the desktop long before the Linux kernel matured into version 1.0 on 14 March 1994: Windows 3.1
was released in 1992 (Gibbs, 2014). Windows 3.1 is still found in the wild, for example running the
air traffic control system for Orly Airport in Paris, France (Waugh, 2015 and Whittaker, 2015).
During the course of the interviews conducted, none of the respondents felt that Linux was
technologically inferior to Windows or other desktop operating system environments. In most cases,
those interviewed went on to praise Linux’s design and use of computational and memory resources.
Those opinions are corroborated by the experimental research undertaken, which reached the
conclusion that overall (depending on the usage scenario), Windows 10 and Fedora 24 were generally
comparable performance-wise.
It is the opinion of the researcher that Linux has succeeded on other platforms because it was there at
the beginning of those particular breakthroughs or advances in technology. This idea was substantiated
by two papers uncovered during the literature review (West and Dedrick, 2001 and West and
Dedrick, 2001), which discussed how new platforms often become accepted when they are used to
support and underpin new usage scenarios. It was specifically pointed out that Linux’s most common
early usage cases were Internet centric – web services, firewalls, security and other similar services –
because Linux was there to be adapted to those types of usage at the start of the Internet era.
Similarly, when Google, as part of the Open Handset Alliance, began development in late 2007 of the
Android operating system, with the Linux kernel at its heart (Industry Leaders Announce Open Platform
for Mobile Devices, 2007) it was at the cusp of the portable computing era discussed by Dukan et al
(2014) which was also established as a point during the literature review. Most of the interview
respondents based on their own subjective experiences also discussed the very same matter when being
questioned (refer to Discussion of Results portion of the Interview section).
This convergence of computing and communications was prophesied in 1977 by Koji Kobayashi, then
president of NEC, when he spoke of a time when telecommunications and (presumably mobile)
computing would converge as a result of eventual improvements to the design and technology of
integrated circuits (Rumelt, 2011).
As Dukan et al (2014) explained, this era of portability has been driven by the low power consumption
processors used in mobile/tablet devices (dovetailing with Kobayashi), low power sensor networks,
and the lightweight operating systems based on the Linux kernel that power them – and this has now
been extended to wearable devices such as smartwatches, as well as other IoT (Internet of Things)
devices. In almost every case, these devices are running a Linux kernel.
Even Microsoft has been forced to recognise that Linux is a major force in the operating systems market.
Microsoft announced on 6 April 2016, as part of its Windows 10 Insider Preview Build 14316 (Aul,
2016), that users would be able to run Ubuntu’s Bash (Bourne Again Shell) natively on Windows. This
was enabled by Microsoft and Ubuntu working together to implement WSL (the Windows Subsystem
for Linux), allowing a user to run “…tens of thousands binary packages available in the Ubuntu archives
(using Bash on Ubuntu on Windows) …” (Vaughan-Nichols, 2016) – so that developers would continue
to use Windows. One interview respondent talked about Linux as a developer’s platform of choice
during the interviews.
In Appendix A, the theory of ‘Cumulative Selection’ (Dawkins, 1986) is discussed. Further credence
was lent to the theory’s applicability to technological amelioration during one of the interviews
undertaken – “…if you go and develop something, it makes sense to try and work off something which
already exists, rather than try and create it from scratch. So you know it’s more (if) you’re going to start
a new operating system and if something can give you a head start it would make sense to use that head
start, so in a way it makes sense to use the work others have done already if it’s helpful to you…”
Linux has been demonstrated to be a versatile, robust and adaptable operating system kernel. This
versatility and adaptability to almost any type of usage scenario has allowed its successful
propagation across a multitude of platforms. In the case of the desktop, in the opinion of the
researcher, Linux was three years too late: by the time kernel version 1.0 was released in 1994 and
Linux was in a position to compete, Windows 3.1 had already taken hold and the opportunity had
gone.
Finally, due to the lack of a ‘killer app’, there was no compelling reason for existing Windows users
to switch to Linux. In conclusion, Linux’s failure on the desktop cannot be reversed, but with the
declining relevance of the desktop this matters less and less – and it is now most likely to be other
operating systems that will, in the next 5 years, be searching for relevancy and trying to catch up with
Linux.
Appendix A - The history of Unix and Unix-like operating systems
What is past is prologue.
(William Shakespeare, Tempest 2.1.253)
In order to better understand the current challenges faced by Linux when trying to make a breakthrough
on the desktop, it is important to consider Linux first within a historical context. In this section it is
contended that Linux is the logical culmination of a phenomenon known as ‘Cumulative Selection’.
This is the concept that as a result of a sequence of non-random, cumulative steps, a complex end-
product is derived from beginnings that were comparatively simple. (Dawkins, 1986)
In 1440, the printing press was invented by Johannes Gutenberg. His invention was an aggregation of existing technologies, combining oil-based ink with the screw presses previously used to produce wine and olive oil (Shenkar, 2010). Had those technologies not already existed, Gutenberg would not have been in a position to converge them into his new device, which arguably turned out to be the most important invention in the history of mankind.
Further strengthening this train of thought, according to Curwen and Whalley (2014), technological
amelioration usually advances via a series of generations (or part generations). They also point out that
such amelioration is usually achieved through better hardware or software, or even the combining of
both together.
Using the aforementioned ideas of both cumulative selection and technological amelioration, this
section hopes to successfully demonstrate and explain that Linux is an amalgam of all the useful
incremental changes that occurred within the Unix world (and indeed Unix is an amalgam of what came
before it too), over a sustained period of time.
The beginnings of Unix can be traced back to 1962, when Joseph Carl Robnett Licklider, head of the Information Processing Techniques Office at ARPA (now DARPA, the Defense Advanced Research Projects Agency), proposed a new operating system that would allow for connected, multi-user, collaborative computing, which was to become Multics (Shapiro, 2004). Multics, which stood for ‘Multiplexed Information and Computing Service’, began
development in 1965 after a series of six papers on the proposed operating system were presented during
the proceedings of the annual Fall Joint Computer Conference (Corbató et al, 1972). Its development
was jointly undertaken by Project MAC at the Massachusetts Institute of Technology (MIT), General
Electric, and Bell Telephone Laboratories.
The design goals of Multics were manifold. It was primarily created as an operating system that could provide a platform for a large group of users, mainly at remote terminals, with a large allocation of machine-independent virtual memory. This removed a user’s reliance upon predetermining the transfer of information between different storage levels, thereby uncoupling the programs run by users from the types of storage devices utilised by the operating system (Corbató and Vyssotsky, 1965). This feature remains part of Linux (and other Unix and Unix-like operating systems), where every device is a file.
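This principle can be demonstrated with a short sketch (the author’s own illustration, not drawn from any of the cited sources): on a Linux system, device nodes under /dev are opened, read and written with the very same file operations used for ordinary files.

```python
# Illustration of the Unix 'everything is a file' principle: device nodes
# under /dev respond to the same open/read/write calls as regular files.

# Write to the null device exactly as one would write to an ordinary file;
# the kernel simply discards the bytes.
with open("/dev/null", "wb") as sink:
    sink.write(b"discarded")

# Read eight bytes from the kernel's random-number device the same way.
with open("/dev/urandom", "rb") as rng:
    data = rng.read(8)

print(len(data))  # 8 bytes obtained via ordinary file I/O
```

No device-specific API is required; the uniform file interface is what the Multics-derived design provides.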
Additionally, it was designed to allow one process to utilise another process, only needing to know the
other process’s name, rather than having to know what storage requirements that process might have or
further procedure calls that might later be instigated by that process. It was ascertained that the sharing
of a process’s data or its procedures in main memory would optimise the operating system, eradicating
unnecessary transfers of data to or from memory, whilst maintaining proper authorisation – ensuring
users were only allowed access to running processes, or data held in memory to which they were entitled
(Daley and Dennis, 1968).
By 1969, Bell Telephone Laboratories made the decision to withdraw their participation from the
Multics project (Organick, 1975), as the management at Bell had arrived at the opinion that Multics
would not fulfil its promise within an appropriate timeframe, nor would it do so within budgetary
constraints (Ritchie, 1996).
However, some employees at Bell Telephone Laboratories had already informally begun to look at
alternatives to Multics. Ken Thompson and Dennis Ritchie, both of whom had previously been involved
with the Multics project, proposed to their employer that they purchase an old DEC PDP-7 computer
so that they could work on creating an operating system to run on it that would be interactive and
capable of time sharing (Hauben, 1994).
During the conception of their new operating system they were not afraid to revisit Multics for some of
its more useful features – one such example was the Multics hierarchical file system, which Thompson
in his own words decided to steal “because it was a good idea” (Cooke et al, 1999). Ritchie gave much
of the credit to his colleague Thompson saying “…His work soon attracted me; I joined in the enterprise,
though most of the ideas, and most of the work for that matter, were his...” (Ritchie, 1984).
A new programming language, named C, was created by Ritchie for use with Thompson’s fledgling Unix operating system. Like Unix itself, the C programming language was designed to be easy to use, unconstrained, and flexible. These principles led to interest in their work spreading within Bell, culminating in Thompson and Ritchie’s successful bid to provide the software that would automate Bell’s internal systems (Raymond, 1999).
As the C language was designed to be portable, and therefore be able to run on different hardware
platforms, its use is still prevalent today, and remains a testament to the genius of its creator, who sadly
passed away on 12 October 2011 (Campbell-Kelly, 2011).
The first version of Unix was completed in 1970 and gained initial traction internally at Bell when three
typewriter operators working for AT&T (Bell’s parent company) started to use the system to help them
with the automation of filing patent applications (Toomey, 2011). Around the same time, the computer
arm of General Electric was taken over by Honeywell (Corbató et al, 1972).
In 1973, Unix was completely rewritten in C (Ritchie, 1984). The aim was portability, so that Unix could run on different hardware platforms. This made sense on almost every level: it was financially prudent, since a company such as Bell, with multiple locations and a vast array of hardware platforms at those locations, could utilise Unix no matter what equipment was available; and portability also meant the operating system would not be impeded by hardware obsolescence, as it could easily be ported over to newer equipment (Johnson and Ritchie, 1978).
In 1974, Ritchie and Thompson published a paper in the Communications of the ACM (Association for
Computing Machinery) explaining how Unix had an installed base of around 40 implementations within
Bell - used for patent filing (already discussed), gathering data on issues within the Bell switching
network, and the handling of telephony related service orders – in addition to their own installation
being used primarily for research purposes (Ritchie and Thompson, 1974).
Their paper led to keen outside interest in their operating system, especially in academic circles, but commercialisation presented an insurmountable obstacle to AT&T (Bell’s parent company): in 1956 AT&T had entered into an agreement with the United States government stipulating that, in exchange for a state-sanctioned monopoly over the United States’ long-distance telecommunications services, it would not engage in any commercial business activities outside the telecommunications sphere – making it impossible to monetise the Unix operating system (Toomey, 2011).
As AT&T could not charge more than just a nominal fee for Unix (Mowery and Simcoe, 2002), coupled
with the fact that it was portable, and had been demonstrated to work well in real world use at
AT&T/Bell, Thompson and Ritchie were able to provide Unix at almost no cost to various universities
and research institutions, which allowed the number of implementations to increase from 40 (at the time
of the 1974 ACM paper) to approximately 500 in 1977 – of which 125 were at universities (Bach, 1986).
At these universities, the development effort on Unix benefitted from both the input of students and
faculty alike, as they started to make incremental improvements to the Unix source code, as well as
creating new features that were fed back to Thompson and Ritchie at Bell (Brenton and Hunt, 2006). In
1978, Ritchie, along with his colleague Brian Kernighan, published what was to become a bestselling book, ‘The C Programming Language’ (Kernighan and Ritchie, 1978), which was concise and to the point as “…C is not a big language, and it is not well served by a big book…” (Kernighan and Ritchie, 1988). The book helped organisations such as universities to train new disciples in the ways of C (and therefore, by extension, Unix), who would then most likely go on to evangelise Unix when they graduated and began to work in various industries.
As a result of Unix’s diffusion in academia, in 1977 the additions and changes made to Unix at the University of California, Berkeley were compiled together by its Computer Systems Research Group (CSRG), modified to run on new hardware platforms, and released as the ‘Berkeley Software Distribution’ (BSD) (Schwarz and Takhteyev, 2011). The BSD bundle was released on tape, at a moderate cost of US$50, and buyers of the tapes were entitled to share and/or duplicate them as they wished (Salus, 1994).
BSD is still in use today, and indeed forms one of the primary parts of the kernel of Apple’s OS X operating system (as well as its iOS variants) – specifically the network stack, the BSD process model, Unix security, and the virtual file system. This technology came into Apple’s hands as a result of its acquisition of NeXT Computers and the NeXTSTEP operating system in 1997 (Lalani, 2016).
The CSRG were heavily involved in making major contributions and improvements to Unix, in part
due to significant financial backing from DARPA who leveraged the advancements (which they had
paid for) made at Berkeley to provide operating systems to their contractors to use (McKusick et al,
2015). Perhaps the most important of all the advancements made on BSD was the addition of TCP/IP
(used initially to power the ARPANet – which would become the Internet that is known today), which
became the primary basis for all subsequent TCP/IP implementations on Unix (Cameron et al, 2010).
For a while, though, the only way to use TCP/IP on Unix was via BSD, until AT&T eventually merged it into their own Unix implementation (Bretthauer, 2002).
Richard Stallman is regarded by some as the founding father of open source software. From the mid-
1970s until the early 1980s, he was employed as a programmer at MIT’s Artificial Intelligence Lab
where they had developed their own operating system called ITS (or Incompatible Timesharing
System). It was at this time that Stallman became an advocate of the free software movement, because
“…the entire operating system (referring to ITS) was software developed by people in our community,
and we’d share any of it with anybody. Anybody was welcome to come and take a look, and take away
a copy, and do whatever he wanted to do…” (Stallman, 2001). When the ITS project was shelved by
MIT, Stallman decided that he would attempt to produce his own Unix-compatible operating system, which would be distributed with all the requisite utility software needed to run it (Bretthauer, 2002).
So in early 1984, Stallman resigned from his role at MIT so that he could start work on his operating
system, which he decided to call GNU (a recursive acronym for GNU’s not Unix). By resigning, he
believed that MIT would therefore not be able to interfere with the dissemination of his free software.
Just over a year later, he was able to release a text editor called GNU Emacs (Stallman, 2002) – available
free of charge from his anonymous FTP site or alternatively, for those without access to the fledgling
Internet, on tape for US$150. It did not take long before Stallman was receiving almost ten orders per month (Bretthauer, 2002).
As the user base of GNU Emacs increased, Stallman began to receive feedback from the community
with fixes for bugs that they had uncovered, and in some cases he received source code that added
additional functionality to the software. Stallman was also happy to incorporate the contributions of others, so long as the distribution terms of that source code allowed him to do so – this meant, for example, that he was able to use the X Window System as part of GNU instead of creating a new one (Bretthauer, 2002).
As the GNU bundle became more widely used, a point was reached where Stallman had to ensure that his work was protected from being incorporated into non-free, proprietary bundles. He first established the Free Software Foundation in 1985 to administer GNU, and ploughed any revenue received back into the project by employing additional programmers to work on coding parts of GNU, such as the command shell and its C library (Bretthauer, 2002).
In 1989, the GNU General Public License was formally introduced. As discussed in the introduction to this paper, it grants any end user the freedom to access and modify the software’s source code (as long as it is made clear that the source code has been modified), and to distribute (and, if so desired, charge for) copies of the software. Additionally, the software can be used, modified or unmodified, in new programs, in which case the recipient of the software is granted the same freedoms as the distributor (The GNU General Public License v3.0 – GNU Project – Free Software Foundation, 2007).
In 1988, the Institute of Electrical and Electronics Engineers (the IEEE) created a framework called POSIX (Portable Operating System Interface) – the name apparently suggested by Richard Stallman (Josey, 2015). The intention was to establish a “Single Unix Specification”, a common structure encompassing the main competing variants of Unix and providing a “write once, adopt everywhere” approach to interoperability across all POSIX-compatible operating systems (POSIX – Austin Joint Working Group, 2016).
In the introduction to this paper, it was pointed out that Linus Torvalds was a user of Minix, which was created in 1987 by Andrew S. Tanenbaum (Tanenbaum, 1987) in order to teach his students at Vrije Universiteit Amsterdam in the Netherlands about operating systems, because changes to AT&T’s licensing with the release of Unix version 7 meant he was no longer able to teach using Unix.
By the early 1990s, Richard Stallman had almost been able to produce a complete operating system,
(having spent most of the 1980s creating the software required to do so), but with one glaring hole –
the lack of a kernel. The absence of a widely available, free-to-use kernel is exactly
what led to Linus Torvalds starting his own kernel project. Torvalds himself said “…If 386BSD had
been available when I started on Linux, Linux would probably never had happened…” (Torvalds,
1993).
It is clear that several parallel operating system development efforts, either centred around Unix or based upon it, were underway at the time Torvalds began his own. It was because none of them allowed him to take full advantage of his new Intel 80386-based PC that he took it upon himself to write an operating system that could. It took the GNU movement until 1994 for its own kernel, called Hurd, to finally boot for the first time (Le Mignot, 2005).
Around the same time in 1992, AT&T decided to take legal action against the University of California
Berkeley campus claiming that their BSD derivative of AT&T Unix was being distributed whilst
containing copyrighted code belonging to AT&T, and therefore BSD was guilty of copyright
infringement. The consequence of this was that development efforts on BSD (and 386BSD) slowed
down until a settlement was reached in 1994 (Cameron et al, 2010), and by then Linux had already
begun to take a foothold.
Torvalds released version 0.02 of the Linux kernel on 5 October 1991, announcing that he had “…successfully run bash, gcc, gnu-make, gnu-sed, compress, etc. under it…” (Welsh, 2003). In that same
post he also reached out to other programmers and developers in order to generate development interest
in his kernel, which eventually led to the release of version 1.0 on 14 March 1994, containing 176,250
lines of code (Hayward, 2012).
In the release notes for version 0.12 (released on 15 January 1992), Torvalds stated that as a result of
receiving several requests to make Linux’s copyright policy become aligned with GNU’s ‘copyleft’
licensing model he would do so effective as of 1 February 1992. Additionally, he also stated that this
kernel version “…was by now clearly more useable than Minix…” (linux-historic-scripts, 1992).
Tanenbaum, most likely feeling threatened by such comments, took to the Usenet group comp.os.minix on 29 January 1992 to state that, in his opinion as an authority on operating systems, Linux’s use of a monolithic kernel was “…a giant step back into the 1970s…(and) is a truly poor idea…” This undoubtedly infuriated Torvalds, who replied a day later accusing Tanenbaum of profiteering from Minix whilst he provided Linux without charge, talking of the limitations of Minix and of how being a university lecturer is “…a hell of a good excuse for some of the brain-damages of minix…” (LINUX is obsolete, 1992)
The flame war did not continue for much longer, as other members of the group calmed down the
tensions between Tanenbaum and Torvalds. In the end, it is fair to say that Torvalds did create the better
operating system kernel, thanks in part to the use of the GNU General Public License, and sharing his
development efforts with the wider community. Tanenbaum has ended up as a footnote in history,
Torvalds much more than that.
That same year, the first widespread standalone Linux distribution was released, called SLS (short for Softlanding Linux System). The SLS distribution was bundled with kernel version 0.99pl12, the X Window system, programming libraries, language processors and other shell/command-line utilities, but was notable for its exclusion of almost any applications (Conlon, 2012).
In keeping with the theme of cumulative selection, SLS would in turn soon afterwards form the basis
of the Slackware distribution, which is the longest lived of all Linux distributions and still in use today
(Smart, 2010).
Further evidence that Tanenbaum’s Usenet post had been misjudged began to emerge, as several hundred programmers, including many employed by IBM, worked on Linux in their spare time (IBM100 – Linux – The Era of Open Innovation). By the release of Linux kernel version 1.0 in 1994, the code contributed by Torvalds was just a small proportion of the total – the balance was contributed by the wider Linux community, although a core group of some 100 developers formed the basis of the development effort (Poole, 2005).
When version 1.0 was announced and subsequently released, it included a GUI (graphical user interface) provided by the XFree86 project (Key Open-Source Projects, 1999). Just like standard X Windows, it facilitates a client-server architecture between a computer’s hardware and its GUI desktop environment. The XFree86 project began in 1992 as an 80386-compatible version of the X Window System; given that Linux was itself an operating system kernel created to take advantage of the 80386 architecture, it made much sense to use it.
The first book relating to Linux, Linux Installation and Getting Started, was written and published in 1992 by Matt Welsh, with a second, updated edition appearing in 1995 (Welsh, 1995). Welsh later became, for a short while, a professor at Harvard University (Matt Welsh promoted to full professor; granted tenure, 2010) and today is an engineering manager at Google, leading the team at Google’s Chrome cloud division (mdw.la, no date).
Meanwhile, another Linux distribution sprang to life in August 1993, entitled Debian. It was created by
Ian Murdock with the intention of assembling a distribution that was maintained with the same open
principles and spirit of both GNU and Linux – encouraging every developer or user with an interest in doing so to contribute openly and freely to the project. Debian has proved to be immensely
successful, being counted amongst the most important non-commercial distributors of Linux (A Brief
History of Debian, 2015).
In late 1993, close to the end of the acrimonious legal action over Unix copyright initiated by AT&T
(who by that time had actually sold their interest in Unix to Novell) against BSD, an open source 80386
version of BSD was released based upon some aspects of BSD 4.3 as well as several other modules
provided by the Free Software Foundation, which was then called FreeBSD. In the summer of 1994, the dispute was settled on the understanding that the offending code was to be removed from BSD 4.3, rewritten, and re-released as BSD 4.4, in turn triggering a rewrite of the offending code in use in FreeBSD’s 1.0 release (About the FreeBSD Project, no date). The FreeBSD fork of BSD is still available at the time of writing, although it has failed to achieve the adoption rates that Linux has.
Further developments in the Linux sphere began to emerge the following year, starting with the publication of Linux Journal, a magazine devoted to all things Linux, whose first edition featured a one-on-one interview with Linus Torvalds (Young, 1994). Additionally, the same year (1994) saw the launch of two other well-known distributions – Red Hat (Munga et al, 2009) and Suse Linux (Company History | SUSE, 2016).
During the Usenet flame war between Tanenbaum and Torvalds several years prior, one of
Tanenbaum’s additional criticisms had been that Linux was not portable. In 1995, this argument was
diminished when the Linux kernel was ported to DEC’s Alpha 2000 AXP hardware along with several
drivers as part of a Linux Developer’s kit (Brothers, 1995). The same year saw the launch of the first
Linux Expo (Airoldi et al, 2008), demonstrating that Linux was gaining further mainstream attention.
Due in no small part to the increasing popularity of the Linux kernel, Torvalds, with the aid and assistance of Linux International, was granted the trademark to the Linux name in 1997 (O’Mahony, 2003). This had become a necessity because two years earlier, in the United States, a lawyer named William R. Della Croce, Jr. had registered the Linux trademark under his own name, and was using his ownership of the trademark to attempt to extort royalty payments from various companies that used the name Linux (Hughes, 1997) (Richardson, 1997).
Even NASA (the National Aeronautics and Space Administration) took to Linux for their Beowulf project, aimed at replacing their aging supercomputer infrastructure. The idea was that by creating a cluster of off-the-shelf PC hardware, the combined processing power of these machines would be equivalent to (or indeed greater than) that of a supercomputer. The project came to life when a budgeting mistake on a project at the Oak Ridge National Labs meant that no monetary provision had been made for computing resources. As 48 PCs had been replaced at Oak Ridge a short time earlier, the obsolete hardware was reused to create a cluster akin to a supercomputer (Sterling, 2002).
The above had only become possible with the release of version 2.0 of the Linux kernel on 9 June 1996 (Wu and Holt, 2004), which introduced SMP (symmetric multiprocessing) support, allowing several CPUs to work in parallel and making Linux a serious alternative to existing operating system competitors. This only got better with version 2.6 of the Linux kernel, which allowed for pre-emptive scheduling, so that any running process could be pre-empted as long as it did not hold a lock on the kernel (Love et al, 2005).
According to a report published by IDC discussing operating system growth for the 1998 calendar year, shipments of Linux-based operating systems grew by 212% over the previous year – roughly a threefold increase over 1997, due to several large computer industry juggernauts throwing their weight behind Linux (IDC Reports Notable Growth in Shipments of Client Operating Systems in 1998, 1999) – versus just a 4% growth rate for Unix (Shankland, 2002).
The precursor to the Google search engine was a web-crawler called BackRub, which was put into
production in 1996, running on a clutch of Sun Ultra workstations and Intel Pentium based PCs, using
Linux (Stanford BackRub, 1997). By 1998, BackRub had morphed into the Google search engine with
Linux continuing as its backbone (Our history in depth – Google Company, no date). Google continues
to the current day to be a staunch advocate of Linux, with many of its products (such as the Android
mobile operating system) being built around the Linux kernel.
In late 1996, an announcement was posted to the comp.os.linux.misc Usenet group calling for programmers to assist in the development of a new GUI (graphical user interface) for Unix and Unix-like operating systems, to be called KDE (Kool Desktop Environment), as “…a consistant (sic), nice looking free desktop-environment is missing… (and) that Linux/X11 would almost fit everybody needs if we could offer a real GUI…” (KDE – KDE Project Announced, 1996).
Due to the announcement of KDE and the ever-increasing popularity of Linux, an alternative desktop environment development effort called GNOME (GNU Network Object Model Environment) was instigated in 1997 by Miguel de Icaza at the National Autonomous University of Mexico, designed so that its GUI would work on any latter-day Unix implementation. It soon attracted the interest of many outside programmers, who began to contribute thousands of lines of additional code to the project (Pennington, 1999).
Whilst no actual data sources are offered, one journalist claimed that in 2007 KDE was used on 65% of Linux desktops, with 26% attributed to GNOME and the remaining 9% spread amongst alternative GUIs such as Xfce (Byfield, 2007). Xfce also began development in late 1996/early 1997 (Then, 2009), and was designed mainly to be fast and to have low system overhead (Xfce Desktop Environment, 2016).
Further down the line, both KDE and GNOME suffered setbacks due to poorly implemented design changes. For example, at the beginning of 2008, KDE 4.0 was released “…with almost as many new bugs as it does features…” (Paul, 2008), while in 2011, when GNOME 3 was launched, Linus Torvalds was quoted as saying “…the developers have apparently decided that it's "too complicated" to actually do real work on your desktop, and have decided to make it really annoying to do… Seriously. I have been asking other developers about gnome3 (sic), they all think it's crazy…” (Vaughan-Nichols, 2011).
To be fair to the developers of both KDE and GNOME, the kind of backlash that they suffered is not confined to the open-source world. Operating system market leader Microsoft faced extensive
criticism for controversial changes made to their GUI with the launch of both Windows Vista (Hiner,
2008) (Har-Even, 2009) (Clyman, 2007) and then again with Windows 8 (Leonhard, 2012) (Paterson,
2013) (Houghton, 2014) – in both cases leading to fairly prompt releases from Microsoft that addressed
many of the concerns raised by users.
Other GUIs appeared as a result of the dissatisfaction expressed with KDE and GNOME, such as Cinnamon (Projects, no date), a fork of GNOME, and Unity, which was released as part of another well-known distribution called Ubuntu and designed to be a GUI environment that could run optimally both on the desktop and on smaller devices (not dissimilar to Windows 8 in that regard) (Unity, no date).
Meanwhile, the SCO Group (SCO standing for Santa Cruz Operation), who had in 1995 acquired some rights to the Unix operating system from Novell (Novell having purchased Unix from AT&T in 1993) (Moritsugu, 2000), commenced a joint project with IBM named Project Monterey, to create a variant of Unix that could run on Intel’s Itanium-based CPUs, which IBM planned to use to power equipment they would build and sell (Rodriguez, 2005).
The Open Source Development Labs, a precursor of the Linux Foundation, was formed in 2000 as a not-for-profit group with the mission of furthering the ongoing growth of Linux, backed by various developers and companies involved in the open source software movement (The Linux Foundation, no date). This in turn helped to ensure that Linux’s market share continued to improve steadily, albeit mainly in the server space, with Linux said to have cornered around 24% of the server market in 2000 (Stanfield and Smith, 2006), further increasing to just under 30% in 2001 (both figures according to research undertaken by IDC) (Thurrott, 2002).
The NSA (National Security Agency) also became a contributor to Linux when, in 2001, it publicly released its SELinux software, which added a mandatory access control (MAC) layer to Linux. The standard Unix-like permissions system then in use often allowed services to run with privileges beyond what they actually needed, meaning that a flaw in such a service could be exploited to gain complete system access (Loscocco and Smalley, 2001).
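The distinction can be sketched as follows (the author’s own illustration, not drawn from the cited paper): classic Unix permissions are discretionary, meaning a file’s owner may grant or relax access at will, and this is precisely the gap that a mandatory access control layer such as SELinux is designed to close.

```python
# Sketch of the discretionary (DAC) Unix permission model that SELinux's
# mandatory access control supplements. Under DAC, the owner controls the
# permission bits and may change them freely; under MAC, a system-wide
# policy constrains access regardless of the owner's wishes.
import os
import stat
import tempfile

# Create a scratch file and restrict it: owner read/write, group read.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o640)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o640

# Nothing in DAC stops the owner loosening this at any time, which is the
# kind of over-broad privilege a MAC policy exists to forbid.
os.chmod(path, 0o666)
os.remove(path)
```

Under SELinux, even a process running as the file’s owner (or as root) would additionally be checked against the loaded policy before such access was granted.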
The following year, media reports emerged from a legal action between the United States government and Microsoft stating that Microsoft had attempted to dissuade PC manufacturers, including Dell, from promoting Linux. Emails exchanged between Joachim Kempin, the head of Microsoft’s OEM business, and Steve Ballmer, the CEO of Microsoft, were found to contain language such as “…(we should be) hitting the OEMs harder than in the past with anti-Linux…”, in addition to phrases such as “knife the baby” and “cut off the air supply” (Orlowski, 2002).
Microsoft’s anti-Linux sentiment gathered pace when, in 2003, it launched a campaign entitled ‘Get the Facts’, intended to show Microsoft’s products in a more favourable light than Linux, Unix and other similar operating systems (Evers, 2005). The page was available for four years until it was removed, along with its various case studies and white papers – said to have been produced by paid analysts commissioned to flatter Microsoft’s software (Foley, 2007).
Earlier in this section it was described how many contributors to the Linux kernel were employees of IBM working on the project in their own spare time, and how IBM partnered with SCO in the late 1990s to build a version of Unix for Itanium-based CPUs. Following IBM’s withdrawal from that project in 2001, the SCO Group sued IBM in 2003 for US$3 billion, alleging inter alia that IBM had incorporated proprietary Unix code belonging to SCO into Linux, specifically intending to improve the operating system to the detriment of Unix (Rodriguez, 2005).
This in turn led Red Hat to sue SCO, as SCO, in Red Hat’s view, had engaged in a campaign of deception with the “…goal of affecting Red Hat’s business…” Red Hat’s argument in launching its legal action was essentially that SCO’s assertions in the IBM case were akin to libel and would have a detrimental impact on Red Hat as one of the market leaders in enterprise-level Linux software (Rodriguez, 2005). The Red Hat vs SCO proceedings were stayed by the court until the case with IBM was resolved.
The case has continued to the present day. As recently as 1 March 2016, Judge David Nuffer finally
ruled in IBM’s favour and dismissed SCO’s case. However, this did not last long, as on 29 March 2016
SCO filed an appeal (Groklaw – SCO v IBM Timeline, no date), and therefore the case that has been in
play for some 13 years is set to continue for several more.
Microsoft’s aggressive campaign against Linux took a blow when in 2004 Novell put up an article on
their website called ‘Unbending The Truth’ – Novell clearly stated that Microsoft’s research was
cherry-picked: it ignored data that painted Linux in a positive light and only published what made Microsoft’s
operating systems look better (Evers, 2004). Further publications by Novell the following year
continued to respond to Microsoft’s campaign, stating that “…Linux is often a better choice than
Windows for satisfying the business needs of enterprises everywhere…” (Fact Finding: Things
Microsoft Doesn’t Want You To Know, 2005).
Throughout the mid-2000s, more and more new Linux distributions were released – such as Fedora in
2003 (Announcing Fedora Core 1, 2003), OpenSuse in 2004 (as a result of Suse being acquired by
Novell in 2003) (Company History | SUSE, 2016) and Linux Mint in 2006 (El Khamlichi, no date). At
the time of writing, DistroWatch, a website that aims to track all available Linux and BSD distributions
at any given time, lists the number of active distributions as 279, although they have identified 815
current unique distributions, with a further waiting list of 238 to be vetted (DistroWatch: Put the fun
back into computing. Use Linux, BSD, 2016).
Late in 2006, Microsoft and Novell entered into an agreement to work together on interoperability and
virtualisation technology, alongside which they agreed not to instigate litigation against each other,
which some industry analysts took as an admission from Novell that Linux infringed patents belonging
to Microsoft (Paul, 2007). A few years later, in 2010, Microsoft bought 882 patents belonging to Novell
for US$450 million (Kanaracus and Jackson, 2010).
By 2011, Microsoft had become the 17th largest code contributor to the Linux kernel (Paul, 2012). This
was not down to altruism on Microsoft’s part: they had inadvertently used some GPL-licensed code in
their Hyper-V virtualisation software – specifically, code that provided the ability to run Linux virtual
machines on Windows – so, to avoid any potential legal pitfalls, Microsoft released 20,000 lines of code
for inclusion in the Linux kernel (Vaughan-Nichols, 2011).
Google, as part of the Open Handset Alliance, announced on 5 November 2007 the commencement of
development of the Android operating system, which would use the Linux kernel at its heart and would
be optimised to allow for handset makers to be able to quickly bring Android based mobile handsets to
market (Industry Leaders Announce Open Platform for Mobile Devices, 2007). As at May 2016,
according to netmarketshare.com, Android has captured 70.85% of the mobile/tablet market space
(Operating system market share, 2016).
Version 3.0 of the Linux kernel was released in 2011, although Linus Torvalds confirmed that it was
not a real leap from the earlier kernel version, but was renumbered mainly because of the proximity to
the 20th anniversary of Linux’s initial announcement as “…it will get released close enough to the
20-year mark, which is excuse enough for me…” (Hachman, 2011).
At the 2013 annual conference hosted by the Linux Foundation, it was announced by IBM Vice
President Brad McCredie that “…the Linux market is now bigger than the Unix market…”, although
he did make it clear that interest in Unix had not evaporated, IBM were still very much behind a
multitude of operating systems, and that they continued to work on Unix based projects as “…IBM is
a big company. We have a lot of resources, and we can do more than one thing…” (Brodkin, 2013).
In some cases, Linux based distributions have made breakthroughs on the desktop. One such example
was a project instigated by the City of Munich in Bavaria, Germany, which by 2011 had successfully
migrated 6,800 desktops out of around 15,000 to LiMux, its own in-house distribution based upon
Ubuntu (Maier, 2011). The French Gendarmerie also completed a total migration of 37,000 PCs to
Ubuntu Linux (Finley, 2013), followed by the City of Turin, Italy announcing in 2014 that they would
also be installing Ubuntu Linux on 8,300 PCs, saving some EUR 2.5 million on software licensing alone
as a result (Guerrini, 2014).
At the time of writing, even though the Linux kernel is freely available and one could assemble one’s
own distribution to suit one’s needs (such as LiMux, utilised by the City of Munich), many
organisations have successfully been able to monetise and profit from Linux. In fairness, these
organisations – such as Dell, IBM, Hewlett Packard, Red Hat and Novell/Suse – engage a not
insubstantial amount of their own resources, both financial and human, to adapt Linux to various
use cases, and also provide support for the products they supply to their customers.
What should have become apparent during this section is that overwhelming evidence has been
presented to demonstrate that Linux is a logical culmination and amalgamation of all that came before
it. What made Linux stand out from its contemporaries is that it was the first open source Unix like
operating system that could be run on the 80386 processor architecture. Competitors, if they can even
be called that – such as Minix, the GNU Hurd kernel, or FreeBSD – were at least 12 months behind the
Linux kernel in this regard. Whilst some might argue this is a short space of time, in the computing
world 12 months can seem like eons.
Torvalds created his kernel to fulfil a need that he had – and that, indeed, others had, or the developer
community would not have flocked to Linux to lend their assistance, crowdsourcing thousands of
lines of code and providing suggestions to make it more robust and optimal as time went on. He took
the best of what had come before – the Unix-style monolithic kernel, and Stallman’s GNU utilities,
designed as open source alternatives to Unix applications – and improved upon them with each
progressive kernel release.
However, despite all of this, Linux has not broken through to the PC desktop in the way it has on other
platforms. It has hopefully been demonstrated that because Linux itself is just a kernel, it has to be
packaged into distributions with other software to become useful to an end user. From an early stage
numerous distributions began to appear. Further to this, multiple GUI choices also manifested
themselves such as KDE and GNOME. To this end Torvalds himself said “…I know people who
decided to give up on the Linux desktop even though they're technical people, just because they got so
fed up with Gnome and KDE…I'm very unhappy with what Gnome and KDE have done…” (Interview
with Linus Torvalds from Linux Format 163, 2012).
It is a key tenet of the working hypothesis outlined in the introduction that fragmentation, or lack of a
prevalent Linux distribution is one of the primary causes of its failure to capture desktop market share.
This author’s first interaction with Linux came when purchasing a bundled SUSE distribution in 1999,
complete with manuals and CD media. This was installed on a PC with a Pentium processor and 16MB
of RAM. Performance using the bundled GUI was markedly better than under Windows 95; however,
due to the lack of driver support for the internal modem that was in use, the author had to abandon the
operating system, as Internet access was not possible. One of the contentions stated in the working
hypothesis is that lack of hardware/driver support is another reason for Linux’s apparent lack of success
on the desktop, although this was disproved in Chapter 3.
Appendix B – Interviews

Interview 1 – Robert Fitzjohn
Robert is an IT infrastructure expert based in Oxford in the United Kingdom. He is 40 years old,
and has 17 years of IT industry experience.
Q. I’ve got several questions, so like I said why I don’t want to tell you or give you the questions in
advance is because I want that whatever is in your mind just comes out, and that you’re not able to pre
plan, I know you wouldn’t pre plan so much anyway, and it’s not like some shocking question or
revelation will come out. It’s just to get what’s exactly in your head at that particular point. So…
A. Ok, ok,
Q. You’ll see, trust me you are someone that’s very qualified to answer the questions as we go through
them. So we’ll just quickly start off if you don’t mind telling me your age just to get that out of the way.
A. 40.
Q. Ok, and how long have you worked in IT for?
A. Since 1999
Q. OK, so 17 years or so. So almost….
A. Something like that
Q. Without taking too long can you describe your career from its start to now. Like how did you start
off?
A. I started off in development, software development. (I) did that for a few years and then when I came
to Abu Dhabi I did more infrastructure work and now I’m back in England I’m doing infrastructure
work as well.
Q. Ok, and during the 17 odd years that you’ve been working in IT can you talk a bit about the
technological changes that you’ve seen occurring during your work career. Just a general just a one
paragraph type answer.
A. When I first started it was all the classic client server model. The Internet hadn’t hit yet, but as time
progressed the Internet became more essential - nobody really wanted desktop applications anymore.
They all had to become internet based etc etc and now we’re moving on, the Internet is more entrenched,
now we’re seeing everyone with a great big push partly due to marketing, everyone’s talking about the
cloud. You know, you don’t need to run anything yourself you just subcontract that out to someone else
and you just rent it per month
Q. Ok great and going through your career can you discuss the operating systems that you’ve used both
in a personal capacity (so on your own private machines) as well as the ones you may have used at
various employers and how that might have changed with time
A. I remember at school, well not my career, but at school there was a BBC (BBC Micro) in the
computer lab. There were IBM PCs with DOS. As time moved on, (when I) went to school, Windows
3.1 was making an appearance. So after Windows 3.1 came 95. Then (at) university (I) had to use
some VAX systems, Novell Netware, and as time moved on Windows became more prevalent.
Windows NT came, 2000, XP, Vista etc and at the same time back in the early 2000s, Linux appeared
as well but was still very niche, and now, well I don’t know if it’s everywhere but certainly Windows
it seems to be on a decline, with Linux certainly in some sectors taking over from Windows or other
Unix systems (that) used to sort of well not dominate but (were) the mainstay of that particular sector.
Q. Ok excellent, this is exactly what I was hoping to hear from you, So, talking a little about portable
devices just very briefly. Again, how you’ve just talked me through the evolution of what you’ve seen
from an operating systems standpoint on desktops or servers, on portable devices, or phones or tablets
or PDAs or whatever – how have you seen that change as well?
A. Ok, portable devices in the early days, well not early days but around 2000, I wasn’t that familiar
with them but I knew there were some devices, PDAs, you know certain manufacturers made them -
HP did, I don’t think Microsoft were into portable devices at the time though they did produce an
operating system for them at some later point.
For example, on phones, everyone had their own operating system. Nokia certainly had its own. Nokia
was very popular at the time in mobile phones and those devices have got more mainstream and
obviously the processors on the devices have got more powerful. We’ve seen other people come in –
Apple, Android, Google, Microsoft. So they’ve been, they have flourished as well in terms of portable
devices. But mainly it’s been with mobile phones other than early PDA type devices.
Q. So you know Microsoft Windows Phone is pretty much dead in the water. The main two are Android
and Apple’s iOS, obviously you’re aware that Android uses Linux’s kernel and the Apple one uses pieces
of BSD Unix. So you’re obviously aware of that, but what do you think about that?
A. Yes, yes. Well in a way it’s sort of, if you go and develop something, it makes sense to try and work
off something which already exists, rather than try and create it from scratch. So you know it’s more
(if) you’re going to start a new operating system and if something can give you a head start it would
make sense to use that head start, so in a way it makes sense to use the work others have done already
if it’s helpful to you.
Q. That’s excellent. So desktop pc sales are supposedly on the wane. Do you think that desktops are
relevant any more, and why?
A. Yes there still is a place for them, there still is a place for that – one, the computing power and two,
the form factor - sometimes you do need to sit at a desk, have a full sized keyboard a full sized monitor
and have a mouse for input, the ability to use those peripherals to do your job. They’ll still exist but
obviously not in the same numbers as the boom years from 1998/99 up to 2008; that boom period for
desktop devices, is over. But there’s still important space for them.
Q. During your career have you ever installed an operating system on a desktop or laptop and if so
what was it – so you yourself what have you personally installed on various x86 systems?
A. DOS, various versions of Windows and Linux.
Q. You’ve mentioned already that you’ve installed Linux on machines before, and I know you are quite
familiar with Linux and you’ve touched upon VAX which is obviously Unix based. Can you tell me a bit
about your experience with Linux and Unix – so what got you into that exposure?
A. I remember I bought a new, well me and a friend of mine, when I was at university, we saw an advert
in the paper for a computer and we went out and bought a computer second hand. And this was my first
computer I could do anything I wanted with it. So once I was buying these PC magazines, computer
magazines, which I’m not sure (if) they still exist now but they were popular at the time before the
internet and they used to come with a CD stuck to the front with some software for you try, shareware,
etc. There was always a full CD on the cover which came with the magazine. And that was way to find
out new things and try out new things and on one edition there was a full version of Linux to install so
I tried that, deleted all the data on my windows partition and that was the first time I installed it, and
that was the first time I started playing with Linux and understanding how it worked.
Q. When it comes to Linux distributions can you just reel off the ones you are aware of?
A. The ones I’m aware of? Or the ones that I’ve tried?
Q. The ones you haven’t necessarily tried, but know that they exist
A. Suse, Red Hat, Debian, Ubuntu, various other small players in the niche market - for example
embedded devices, for example, openWRT - it’s a distribution for routers. Embedded Linux there’s
various small ones. The main ones I’ve just mentioned.
Q. So if I tell you that they were currently 815 unique distributions of Linux according to Distrowatch,
what does that make you think when I tell you that?
A. Well possibly, I don’t know all of them. Sounds like a lot. It sounds like a lot. If someone told me
that there was 200 I’d have thought well possibly, but 800 sounds like quite a lot.
Q. The Linux distributions that you’ve used, can you just describe a bit your experiences with them –
were they negative, positive, in between?
A. Well they all sort of have their advantages and disadvantages – earlier on they used to be quite
difficult to install and get up and running. Obviously with time they have got better and my expertise
of using them has got better. Overall, the experience is nowadays generally positive, you don’t get asked
too many questions at installation time – well, now I know the answers – but for new users the barrier to
entry is lower than it used to be.
Q. When it comes to GUIs, there’s many different choices when it comes to the graphical interface you
could use with Linux, and while certain distributions might be your favourites, they usually have
a default during install that reverts to a particular one, but is there one in general, one that you prefer?
A. Well yes, the two main ones are KDE and Gnome. I’ve been a KDE user partly because it’s always
been well supported. It’s always been a good, by default, it’s been well configured on Suse and Suse
has been my distribution to use on the desktop, so I’ve always been a KDE user.
Q. You’ve touched upon it in terms of the barriers to entry and people being able to install much more
easily, but when it comes to actually using Linux, let’s say Suse, your distribution of choice, do you
think it’s easy to use and why?
A. Well for me it is easy to use but to give it to my grandma no it wouldn’t be easy to use, it’s a bit of
a question which applies to me. Yes, it is easy to use, sometimes you do need to know what you’re
doing to troubleshoot or fix problems - so expertise of computers in general sometimes does help a lot.
I can’t really answer your question one way or another yes or no.
Q. That’s actually good the way you answered it, as it answers the next one which is - in some circles
Linux is viewed as difficult to use, which you said for you it isn’t, but as you said if it’s someone like for
example your grandma or perhaps mine, it might be difficult, but also for someone to really get into it
then it would require substantial training time or effort to be invested – do you think that’s true?
A. No, if you give anyone a computer for the first time, the amount of training and investment is
substantial whether it’s Linux or Windows. It’s only for people like us who are brought up on Windows
- we know Windows inside out, and then move to another operating system have to learn everything
again. So yes, that is training and investment, but if we never saw a Windows machine and started with
Linux then yes five years later we’d be very familiar with Linux but wouldn’t know a thing about
Windows. So it just depends what you were initially taught to use. So irrespective, I don’t think one is
more difficult than the other or requires more training, it just depends what you were taught the first
time around and had to really learn. With Windows there’s no relearning. People are taught Windows
at school.
Q. Why do you think Linux is rarely preinstalled on a desktop or laptop?
A. There’s few people who are familiar with it, so the level of knowledge in your typical family, whereas
they’d know Windows already, or Macintosh already - but if you bring in Linux the level of knowledge
is less and secondly there’s still some areas where Linux doesn’t shine in the home.
If you have children that want to play games, the latest games might not be available on Linux, they
might be available on Windows only. So the level of software support, though good for typical
mainstream applications, but in certain sectors such as gaming are non-existent. But on the other side,
there are areas where Linux outshines windows, so for example on the server side or server applications
you find there’s a lot better support there than there is on windows.
Q. This is not a scripted question, but when you mentioned about games, do you think Steam OS is going
to help to change and provide many more games because Steam OS is using the Linux kernel?
A. It’s a good start, I’ve tried Steam OS but the graphics card on my Linux PC isn’t particularly
powerful, the one on the Windows machine is more powerful. So if I’ve got a certain game I’m going
to play, I’m still going to go to Windows. But yes it’s a good start but there’s still a lot of work to make
gaming on Linux better, but will it be successful? It’s like anything it’s difficult to say.
Q. Do you think that Linux performs well on obsolete hardware?
A. Yes it can perform well on any hardware, even if obsolete, it can be made to perform acceptably.
Q. As you mentioned about the PC you bought at Uni, that was obsolete hardware…
A. Yes I did, but nowadays I use it on decent hardware.
Q. In some cases, we’ve all had it at work, someone says I’ve messed up my laptop or PC or whatever.
Can you reload it for me? Now, do you think, without saying you are involved in software piracy, let’s
say other hypothetical IT people, do you think because they can easily install Windows on that machine
with a volume license key, does that stop the install base of desktop Linux machines increasing? – as
someone would have to pay for Windows and you could offer them say a free install of Suse, do you
think that this compounded over all the IT people in the world would make a difference?
A. I don’t think it’s down to piracy. I think it’s down to what people are already familiar with and what
they have. As you said, if you messed up your laptop and came to me, and said could you reload it, if I
told you yes sure I’ve got Linux as well, if you agreed and didn’t know what it was, I’d install it and you’d
try it and say I can’t understand it. So whether, or, I don’t think piracy has helped windows I think it’s
just the time the head start that Windows has had to entrench itself in the market which makes it more
widely available on most computers.
So, maybe piracy does contribute a few percent around the install base, so the install base doesn’t get
smaller, but ultimately it’s down to knowledge. If tomorrow I put a Linux pc in front of someone, can
they continue to do their work? Can they continue to be entertained? Can they continue to
communicate? All those things that they expect - will their Whatsapp or their Skype chat still work? –
it’s those sorts of things.
Q. That’s good, that feeds into my next question. We touched upon the problem with the gaming which
you brought up in fact, which is do you think it’s possible to do everything on a Linux desktop that you
could do on a Windows desktop? You’ve already said with gaming that there’s some deficiencies there
with the availability of games
A. Linux does have its deficiencies on the desktop - I’d say mainly down to graphics. Graphics is an
issue which when you look at the reason is down to the manufacturer of the graphics card - especially
NVIDIA and AMD. Intel is a lot better in making their hardware and the drivers open source. NVIDIA
and AMD are still very, for whatever reason, whatever they are hiding or not hiding, whether its patents
or legal issues they’re not willing to make their hardware open. Intel is a lot better with their graphics,
but obviously high end graphics you’re talking NVIDIA and AMD - so yes gaming (is) down to graphics
drivers, graphics manufacturers.
There are other areas where Linux is better. Obviously I’m sure you are aware of that, but yes there are
areas where in the corporate desktop (area), they expect to come in, they expect to find a piece of
software just there available, such as Word, Excel, their Outlook. If you give someone a PC when
someone starts a job and you give them a Linux desktop you’re going to be in major trouble. You’re
going to be spending time trying to get a person to work before they are even supposed to be doing their
work, so there are major productivity issues if Windows is not the default installation. But yes, some
areas Windows is a better choice if you don’t want to lose productivity - it’s a better choice. It doesn’t
mean it’s a technically better choice, it’s just a better choice by default.
Q. You mentioned it at the beginning when you talked about the changes through your career - do you
think that now with the prevalence of cloud, cloud based applications and web apps - do you think this
would make it less of an issue in the future?
A. It is. It is making it less of an issue. I’m quite able to have a desktop computer and do 95% of the
things I want to with just Linux, with the help of cloud apps and alternative software. I still, and
obviously my case doesn’t apply to everyone, in my case I’m able to do 95% of what I want to do with
Linux on my computer. The other 5% I have to switch on my wife’s Windows computer. There are
cloud facilities that help whether it’s email, or apps, productivity apps that are in the cloud, they help
as well, which requires just a browser but I wouldn’t say it’s happening at a fast rate but it is happening
at a reasonable rate.
Q. You probably already answered early on but just to ask the question anyway – do you think users
care what operating system is running on their desktop or laptop?
A. No, they just want to be familiar, they just want to get their job done, they just want to be able to do
it.
Q. Final question, if you were in a position to set up a network of workstations from scratch, with a
limited budget, would you consider Linux and why?
A. Yes I would consider it, many of the advantages I’ve already been into, but I would be wary that
there could be issues with sharing files or data with third parties outside. So, for example, if this was on
Office, if I would have to share files or certain documents, this would cause issues with people who
would not understand the ways to convert them. It would introduce a layer of additional unnecessary
work for some people. So in that sense I’d expect some resistance from the users, but if a case was more
limited where I didn’t have to work with 3rd parties or share files etc. then by all means yes I would
consider it – not simply because its free, but because it would make sense.
Interview 2 – Prasad KM
Prasad is the Group IT Manager for an international liquid logistics provider with multiple business
units spread across 17 geographical locations. Prasad has 25 years of IT industry experience, in several
diverse fields, and he is responsible for a user base of 1800 employees.
Q. So, like I mentioned to you before, the interview is about operating systems. Just quickly, to get some
background about you, could you tell me how old you are please?
A. I’m 47. I started my career in ’91 as a trainer with an organisation called Aptron Academy of
Learning in India
Q. Ok, and going from that point in your career, can you briefly describe that career from the start until
now?
A. Yes, it was actually the time of IT booming in India, and I took a turn from training (as a) PC
professional to development and administration by joining an organisation called Aptech. They are a
worldwide training institution. They have branches all over the world, specifically in IT training and
they have a development division as well. So I spent around 6 years with Aptech, doing IT
administration, plus a little bit of development and sales. Then, I moved to the Middle East, in Dubai,
in 1997.
In 1997, I joined an institution called NI IT, which is a global institution called the National Institute of
Information Technology, headquartered in New Delhi, India. But my assignment was to take care of
business opportunities in (the) Africa and Middle East region. I was one of the persons assigned with
starting the new business ventures in the African region. As a part of this, first I moved to Sudan in
2004 to start a new business centre, then I moved to Ghana. From there I travelled to Iran for starting
some new business opportunities there. In addition to these 3 training centres which were established
by me, I participated in the Indian government sponsored training in many other locations such as
Kenya, and in Uganda, and other locations. In fact, this program was an initiative by the government of
India, where they do the IT training programmes for other countries – in cooperation with the local
government. The participants will be government officials from different sectors like banking and
administrative staff, and other government officials. So, I’ll be moving to this place, staying there for
some time, like 2 or 3 months, doing the training as per the curriculum suggested by the NI IT. That
was my job. So, I continued from ’97 till say 2010 with this organisation called NI IT.
Q. OK, so then currently now you’re working as the Group IT Manager at Tristar?
A. Yes, I joined with this organisation Tristar from 2010 onwards. My mission was to support the United
Nations project which is happening in Sudan – they were having some system for fuel management
which is developed in Lotus Domino Server. So, my task was to manage that application – first doing
some development on this thing and generating reports as per the requirements of the UN mission. So
this was really a very challenging and interesting project, where I got a chance to travel to almost say
18 locations, within this country, Sudan – which is where most of the locations are forests and the UN
army is located there, and Tristar the company, established a fuel dispensing unit plus a refuelling
facility. So my task was moving to this location, enabling the Internet access by using VSAT
connectivity, then setting up the software for this fuel management, training the staff, and staying with
them for say another 2 or 3 days, until they are familiarised with this software system. I was able to
move around almost 18 locations around Sudan, and do the training for all these people.
Q. Excellent, so you would say that you have a good understanding of users and their habits?
A. Exactly, exactly.
Q. So, during the time from 1991 until today, can you also just discuss briefly the technological changes
that you have observed during that time?
A. It was really a very big curve, or big change over that’s happened. When I started with this thing,
say we were using this traditional operating system called MS-DOS 3.0 and then the evolution of the
graphical applications with Windows 3.1 was an amazing thing in front of us. And then say going to
the development of this OS by Microsoft of Windows 95, 98, and the other things. It made a
revolutionary change. But, in my viewpoint, what has basically happened is, when users are working
with an OS like DOS or the Unix platform, they are having a better understanding about the systems,
how they were working, and they were having a good control over the computer. Whereas, when it has
moved to the graphical applications, users are becoming a separate category where they don’t need to
have much technical knowledge, they can just be, well anyone can be an expert user where they don’t
need to have much technical knowledge on a computer. So there are two categories that have evolved -
just the users/end-users, and the technical experts.
Q. Ok, terrific. So going through your career and also in your own personal capacity as well could you
discuss, I mean you’ve touched upon already some of the operating systems you’ve been exposed to,
but are there any others you’ve been exposed to, not only in a work context but also yourself whether
in a test environment or for your own ‘playing around at home’ or something, can you just…?
A. Yes in fact, there is an OS, which is one of the OS that attracted me, which is Solaris.
Q. Ok which is Unix based…
A. Yes, Sun Solaris is one of the (most) brilliant platforms for the Oracle applications. I got a chance to
use this Sun Solaris for one of the banks located in Khartoum – their banking application is hosted using
this particular OS. So we got the project of training the bank staff, under this particular platform. It was
an Oracle training under the Oracle DBA certification program, whereas the platform should be under
Sun Solaris. So we established a server, on the Sun Solaris OS, under the Intel platform, not under Sun.
On the Intel platform we installed it, and it’s really a brilliant operating system. Compared to all other
OS I like and consider this Solaris to be the most fantastic platform for any stable database application.
In addition to that, Linux is one of the best ones which I ever saw, but right now that is only used as a
server hosting platform unfortunately. But the superiority of these OSs as an end user computing
environment is also great, but it is not marketed in that way.
Q. Ok, let’s talk very briefly about portable devices, so if you could discuss a little bit your experience
over the years with those devices, and how they’ve changed from an operating system standpoint…
A. Portable devices you mean like Android and these kinds of OSs?
Q. Yes, so I mean if it’s easier to start from the present day and go backwards
A. What I find is in another 5 years the complete, complete, computing platform will be changed with
this portable equipment. Whether in form of smartphones, or tabs, or these kinds of applications, I mean
the equipment. So, even the traditional operating system concept will be wiped out with these new
platforms for the computing environment. Even we can reach up to a level, without having these kind
of physical devices, doing communication. Like using a smartwatch or any other kind of gadgets that
we are using in our daily life will be acquiring computing ability and that will become an integrated
platform for computing. This is my expectation for the future of technology.
Q. You touched upon Android already - so are you aware of what kernel Android is using at its heart?
A. Sorry can you repeat that again.
Q. So you talked about Android, so are you aware upon what Android is based upon? What operating
system kernel is it using?
A. Yes, of course, that is a powerful programming language…
Q. But are you aware that it is actually running a Linux kernel?
A. Yes correct, Linux kernel, amongst other things which are used.
Q. And for the Apple devices? Are you aware of what they’re based upon?
A. Yes, but my thing is say the reason why Android is getting more popularity is because they’re
offering this platform for all the vendors. Whereas Apple is constrained with their own technology.
That’s the reason they are shrinking themselves.
Q. A walled garden?
A. Yes, even the computer evolution, IBM exposed their technology, and there were a lot of
manufacturers then developing the same, so that became more popular. IBM clones became popular.
Whereas in the case of a Mac, Mac is constrained so that it has reduced its market or business sector
into a smaller applicable area. It may happen this way for Android and iOS also.
Q. I like that link that you’ve drawn there between Android and IBM compatibles. I was just trying to
get out of you if you’re aware that all the Apple operating systems – be it on the desktop or on their
phones and tablets, that it's actually running BSD Unix underneath…
A. That is actually Unix?
Q. Yes, Android is Linux and iOS is Unix…
A. BSD, once it went out of the market, because Linux came up, BSD was replaced, and then say even
though Unix was more robust than any other thing, BSD was replaced, but again, it’s used in some other
kinds of devices.
Q. Moving on to desktop PCs, even though you’ve talked a little about the future you see for computing
in general. So, the sales of desktop PCs are meant to be on the wane, so reducing over the last few years
– so why do you think that is? And, do you think desktop PCs are relevant anymore?
A. The trend which is showing, or the development which is happening, will remove the use of desktop
PCs within a short time span. As like say how the landline phones are not used much. A similar way.
Because, the business requirement will be changing, the industry requirement will be changing, so
naturally, sitting in one particular place and working with that traditional concept of having a computing
front will be changed. So, it may be restricted in a very nominal area but say 75% of the computing
equipment will be replaced with mobile equipment.
Q. So during your career, and your own private life can you tell me what operating systems you’ve
installed yourself on desktops or laptops – on x86 devices – please just walk me through the operating
systems you’ve installed yourself…
A. Myself, I have performed Windows installations of Windows 7 or Windows 10 – I’ve used on my
home PCs, personally. I have a test environment with Linux also, which is for training purposes for my
relatives – my daughter is learning Linux – so one Linux installation I have. On phones naturally the
most demanded one is the Android platform – so Android is the OS which is used on my phone.
Q. You’ve already told me about your exposure to Solaris Unix, and to Linux, so what do you think
about Linux in general, what do you know about Linux in general – just tell me a little bit…
A. See Linux is, our requirement with Linux was with hosting our platform for Oracle, or the database
system. Emailing is more robust when it comes to Linux, error-free, and the safety measures, the safety
facilities which are available in Linux is much much better than Windows based OS. So these are the
few which I can recall. Let’s say coding, scripting, developing is very very easy and nice to me, when
I am working with Linux.
Q. When it comes to Linux distributions, which ones are you aware of – not necessarily the ones you’ve
used but the ones you know of?
A. We used multiple versions like Red Hat is the most common to be used, and apart from that on some
of our projects we used other versions as per the customer requirement.
Q. And what were those? Can you just identify them?
A. There were several flavours, I can’t remember the names of which were used, but that was the
customer requirement for what we used.
Q. No problem. So the next question is, are you aware that there are currently 815 unique Linux
distributions? – so where you have identified Red Hat for example of one of them, and say offshoots of
Red Hat would be CentOS and Fedora, which makes it 3, there’s another 812 different distributions –
what do you think about that?
A. Because it is an open platform, anybody will be able to use their ideas and develop their own OS.
This is actually adding more value and power to this particular OS because the contribution from
multiple people and they have the liberty to take their own ideas into this OS – and that’s the reason
why so many versions have been developed. But that doesn’t decrease the power of this OS. When you
see the hosting, such as Netcraft – out of 10, we can see that 7 servers are hosted by using Unix or
Linux. Say, one of the sites we can see this is netcraft.com – it will be showing the current statistics
like the major servers that are hosted by using which OS. So, if you list it out, you can see out of 10, 3
servers will be under Windows, and another say 3 or 4 will be using the Unix platform and the remaining
will be Linux – maybe various flavours of Linux. So that itself, is proving the power of this particular
OS.
Q. Now in terms of using Linux, you mentioned that you have a Linux setup at home which your daughter
uses, do you have, where again there’s lots of distributions, there’s lots of different graphical user
interface choices as well – is there a particular one you choose over another?
A. It is basically for the academic purpose, you know. So the manuals and books referenced they refer
to that particular Red Hat version which is installed, just for the academic purpose.
Q. So you would just use the one that comes as default with the particular installation that you use? So
for example, Red Hat is using GNOME and other ones are using KDE or LXDE etc. – so you don’t
really have a preference of one over the other?
A. No, because actually the objective is, the requirement is, just for the academic purpose she would
like to go through all these things. So as per the book which I’ve mentioned it goes through the same
graphical user interface which she is using.
Q. In your opinion, is Linux easy to use? And why?
A. In terms of user flexibility, maybe Windows would be much better than the current versions
available in Linux flavours. But, a lot of improvement is there. Comparatively, I'm comparing 3 OSs –
the Sun Solaris interface, the Windows interface and Linux – Solaris’s interface is superb and the
graphical resolution, plus flexibility – it is the highest one. Second, I’d be ranking Windows, and third
I’d be ranking Linux, whatever Red Hat, when it comes to graphical interfaces. In terms of flexibility,
it is Windows and Solaris which would be the high ranking ones.
Q. Do you think that Linux needs substantial training time and effort to be invested for someone to able
to use it?
A. It’s a matter of mindset. I found that in one of the states in India, their government education system
is recommending this particular open source platform – for all the schools. So amazingly, what I’ve
found is that all these schoolchildren, they are experts in Linux. Rather than Windows, they prefer to
have that. In addition, one of the projects, which we initiated whilst we were in New Delhi – it was
called ‘hole in the wall’. What we did is, in our company, (NIIT) we made a kiosk which is open
towards the outside of the office in the street and there we took this OS, Linux based application, Red
Hat, for the people who just walk around or just standing at the bus station, they are having the facility
to browse the net – that's what we provided at this point. And over a period of
time, there were so many people, making use of this particular facility like browsing the train timings,
and bus timings, by using this operating system.
Q. And obviously they weren’t trained to use it?
A. Exactly. So which makes a very amazing result, which is basically down to a matter of how it is
presented. So, when we compare these things, it’s a matter of mindset, how we accept that particular
OS.
Q. Moving on, why do you think Linux is rarely preinstalled on a new desktop or laptop?
A. It is required, say I found that within the last 2-3 years, many of the vendors are shipping their laptops
with the Linux system, Linux OS, free edition of Linux. But one of the challenges is that most of the
applications, say around 75% of the applications are available or programmed for Windows. So, that is
one of the challenges that means we are not able to adopt another OS.
Q. Moving on to hardware support with Linux, do you have anything you would like to talk about when
you think about issues that you may have had with hardware?
A. The thing is, compatibility is one of the challenges we face both with Linux and Solaris. Some of
the devices are not recognised, and the drivers are not available and the functionality is restricted, and
even like communication with a device, even if its configured, third party equipment will not
communicate with these things – so that way there are huge challenges when it is coming to this OS.
Q. Do you think that Linux performs well on obsolete hardware? So, on old out of date hardware?
A. Yes. When properly configured, it is well and perfect.
Q. I want to ask you something now, and this is where privacy issues might come into play, and might
necessitate your name being removed. We all know that having worked in IT that many times you will
have someone coming to you saying Prasad my laptop is not working properly, can you wipe it and
reload it for me. Now you don’t have to admit it, let’s talk about a hypothetical / another person that
has access to volume licensing keys for Windows, it’s very easy to go and reinstall Windows using what
in effect is a pirated key. Do you think that because of that, that is also preventing Linux becoming more
prevalent on the desktop? So if you said: if you want Windows you have to pay US$100 for the version,
or I can put this free Linux distribution on there. Do you think this kind of software piracy affects the
market share of Linux?
market share of Linux?
A. Software piracy is one issue, but my concern is here, Linux is offering a free edition without any
license, and as an end user people are able to use Linux whereas for Windows we need to pay a license
fee. The acceptance of the OS is not as wide as it is with Windows.
Q. Is that because it's easy to put a pirated version of Windows with a key that will activate? Is it
because of that?
A. That would be one reason. Secondly, it’s how the product is presented. That is one of the reasons.
Say, one example I can show is initially when the Apple Mac OS is introduced it is introduced as a
computer for playing games – the initial Mac computers were presented as a kind of game station. Then
they changed the scenario, it can be used for high end graphical professional work. So they still didn’t
present that particular product for business computing. This resulted into the growth of the IBM open
platform because they presented it as business centric. Like that, you’ll see that for Linux also, I think
there’s a kind of a bottleneck situation there with the way they are presenting into the market, or to
business(es) – and should be changed.
Q. Do you think it’s possible to do everything on a Linux desktop that you could do on a Windows
desktop? And why?
A. Yes, yes.
Q. Why do you say yes?
A. It is possible. Say theoretically, any of this equipment can be configured to do the same. The
challenge is the volume of effort we need to do to achieve this particular configuration is much higher
than the Windows platform. So that is the only restriction with what we may face or challenge we may
face under Linux.
Q. Are you satisfied with the availability of applications on Linux versus say Windows?
A. Not much. Applications availability is very poor under Linux, the Linux platform. With Windows a
large variety of apps are available. So that is another reason.
Q. Do you think that now with the prevalence of cloud applications and webapps, that this makes it less
of an issue?
A. Up to an extent it might be able to cover this gap.
Q. Do you think that users care what operating system runs on their desktop or laptop?
A. Yes.
Q. And why do they care?
A. Basically, the flexibility or the user experience is one of the things that matters. Say, for performing
the task – how much, how quickly can they do it, how easily can they do it. That is a factor which the
user will consider when choosing the OS (to use). Irrespective of the previous time, now, people are
aware of both what is an OS, and what OS is running in my computer and these concepts, the end user,
the bare end user, is aware about it.
Q. This is the final question, if you were on a limited budget, and had to set up a network of workstations
from scratch, would you consider Linux?
A. There is one factor. Say, the thing is for the requirement of this particular network for running what
kind of application is a factor. This app should work under Linux. And in my concept, I would surely
go for a Linux based network rather than a Windows based network, because it will be more secure, it
will be a robust system, where there will not be any kind of issue or trouble in the network.
Q. Is there anything else you would like to add that maybe the questions haven’t elicited from you?
Something that in your experience might affect why Linux is not successful on the desktop.
A. Over a period of time, the drastic change which has happened in end user computing is a big change.
But what happened is after these mobile systems, smartphones and all, the growth (on the desktop) is
not as envisaged earlier, say after these smartphones it is expected that in another 5 years, everything
will be drastically changed, but the growth, the momentum (of the desktop), is not up to that envisaged
or planned earlier.
Interview 3 – Sanjay Banerjee
Sanjay is the Head of Business Process Automation for one of the largest shipping conglomerates in
the Middle East. He has worked in the IT field for 21 years, firstly as a developer, and is now involved
in designing and managing large scale enterprise wide systems and interfaces.
Q. Ok let’s begin, can you let me have your age please?
A. That’s 3 9, 39
Q. Ok, and how long have you worked in Information Technology for?
A. Since ’95, so 21 years roughly.
Q. Could you briefly describe your career from its beginnings until now? Like what you’ve been through
in terms of job role changes…
A. So, my career has basically been a fruition of technology and shipping industry and other related
industries. So, ok on the shipping side I’ve had a broad spectrum of various shipping roles, all the way
from consultancy to working with class (Shipping classification societies), working with ERP
companies, working with ship owners like UASC. On technology side, it’s basically been a gradual
progression starting from designing and developing my own bespoke ship design and stability systems
and moving on to the corporate side – that means designing and managing large scale enterprise wide
systems and interfaces. So, again its mainly been Windows based systems. Initially, in my career there
have been a few Unix/Linux based applications, mainly PHP programming development. That was
before, there was some experience on Fortran, nowadays there’s something called OS/400 which is I
think IBM’s OS for Power Systems series.
Q. So that’s nothing to do with AS/400 right?
A. The AS/400 can work on OS/400 as well.
Q. Years ago I used to support an AS/400 mainframe, it looked like a piano, like a grand piano – it was
as big as one as well…
A. They’re supposed to be extremely reliable so we used it for financials.
Q. My experience was very much the same it was rock-solid for sure. So obviously you’ve just talked
me through your career a bit, in terms of the technological changes which you’ve also touched upon
somewhat just now as well, are there any other changes that you’ve noticed since you began your career
21 years ago?
A. The way you said 21, I sound, I feel old now.
Q. I’ve been working for 20 years, so 1 year behind you...
A. Ah ok that’s comforting. But, my experience was mostly on the application side, so when I first
started out it was again mainly using assembly level languages, rather than the high level programming
languages that we use today. At that time, we used to look down at these high level programming
languages as pseudo code and today everything is GUI based and I think yes almost everything is high
level programming and meant for portability. At the same time the technology has gotten cheaper and
it is much easier to customise the off the shelf solutions. In the old days it was quite difficult once you
created it, very much like old SAP, once you’ve created it it’s much more difficult and expensive to
modify and today things are much much faster and cheaper. Also, that time I think it was strictly client
server applications and today it's multi-tiered, web enabled and it's highly interconnected and the key
focus at the moment is more on the security side, like personal infrastructure, organisation and that
wasn’t the case, or as much of the case in the early ‘90s, especially when the Internet was still growing.
Finally, you’ve got your open source applications. They were present there but they are now
widespread, extremely widespread and popular today – GNU licenses that is.
Q. Then looking back at your career, in the time you’ve been working can you discuss the operating
systems that you’ve used both at work and also privately in your own personal capacity?
A. Operating systems – mainly Win systems all the way from Win something and then Win NT and 95
and above, for PHP programming used Linux, used it on the LAMP stack that’s the Linux AMP, Linux
Apache, MySQL, PHP. So basically was creating web apps on these languages, and currently, well I
was doing that in the early 2000’s then moved on with various employers and currently the employers
are again using Win based systems, OS/400 again that’s the IBM based on AS/400 and we do have web
app support for Android and iOS – which are based on Linux and Unix but it’s just one more variant of
Linux/Unix.
Q. Terrific, so you've answered one of my questions that was to come a couple of questions down which
was with regards to portable devices and how they’ve evolved. Obviously as you’ve correctly stated the
two main players in today’s market which is Android and iOS are using the Linux kernel and BSD Unix
at their heart. You’re already aware of that so that’s something we can then skip through. So talking
now specifically more about desktop PCs, so the sales of PCs are supposedly on the wane, so why do
you think that is? And do you think that the PC is still relevant anymore?
A. I look at it this way. One way of using a business system has gone down or waned – like the use of
floppy disks, but we are using more portable devices like laptops, tablets, phablets and since the UI is
more web based the need for desktop PCs as such is not really there. Especially desktop PCs are not
really needed when we do not have a need for severe client resources like the old systems used to. But
nonetheless there are new databases, new systems – Hadoop for example which depends on distributed
computing I think, and they, if say for example Hadoop is not on the cloud that means you’ve got it at
your site, that way desktop PCs might help in being a cluster of networked PCs to help Hadoop in its
processing.
Q. When you are looking back, I mean you probably get annoyed about these looking back through
your career questions, but have you installed an operating system before on a desktop or a laptop and
if so what was it?
A. I’ve installed but it’s all been Win. I had installed Linux decades ago so I can’t remember really
what happened with it, but it was again to do a bit of, to learn my first steps on Java or something.
Q. Can you remember why you maybe did away with that Linux installation that you had? Is there
anything that springs to mind?
A. It was a part of the training, during the training course, so the LAMP stack again – learning, to learn
MySQL and PHP you had to install, learn to install Linux and MySQL and then start using the
programming.
Q. OK, so beyond that have you had any additional exposure to Unix and Linux? Beyond what you’ve
already told me so far?
A. For installation no. For usage, extensive because iOS, Android – everyday usage. But then
development wise PHP was the last experience I had on this which was early 2000’s.
Q. What do you think about Linux in general? So if you had to state in one paragraph what you thought
about it…
A. One paragraph? OK, from what I know, it's free, it's reliable, it's definitely more stable than the
previous versions of Windows. You can create portable apps and it's helped to keep prices low,
providing competition to the proprietary markets. So, there are lots of products even today – for example
Jaspersoft is based on GNU licenses, on free licenses and that's really helped keep the market competitive
and at the same time the prices low.
Q. Good, and which Linux distributions are you aware of? So not necessarily ones you’ve only used
but ones you know exist…
A. Exist, I mean the, again iOS and Android they broke off quite some time ago (from Unix and Linux).
I’ve had some experience on FreeBSD (Unix) which was again some time ago and I was using various
other ones like Suse, Red Hat – do they have their own versions or do they have the same version of
Linux? That is something I do not know.
Q. Well actually, there are in total 800, well as of the last time I checked, there's 815 unique distributions
of Linux. So that’s excluding the BSDs and Solaris, AIX other Unix platforms so just specifically Linux
there’s 815 unique distributions, what do you think about that?
A. Wow. It’s almost like, it feels like a fragmented market. Are they compatible? In terms of trying
apps on one and not working on the other? Do you know about this?
Q. Well that’s a good question. I mean generally speaking they should be depending on the kernel
version that the particular distribution is using. Also beyond that…
A. Even now, iOS and Android you require a tool to help you convert code from iOS to Android and
vice-versa, so 815 distributions it’s a really fragmented market.
Q. Exactly, when you consider say, let’s say Windows 10, I mean I’m not sure of the exact number but
there’s probably 3-4 maybe distributions – so you’ve got the Enterprise version, Professional then
maybe a home version. So say 3 variants, when you look it’s a ridiculous amount right?
A. True, true, especially if each and every one of them is trying to hog the market.
Q. In terms of when you talked about Red Hat and Suse for example, have you used those distributions
yourself?
A. Not really, there were, I used to come across them as software. We used to get CDs with magazines
and the CDs used to contain a lot of software. So it was through that that I came to know about Red Hat
and Suse, but no personal experience on that.
Q. Then, I’m not sure if you will be able to answer this question but it would actually apply as well to BSD
and other Unix variants. Do you know the different GUIs that are available for Unix, and Unix like
operating systems? Are you aware of some of them?
A. Again, this is almost like 15 to 20 years ago – so the screens at that time were clunky and at the same
time Windows screens were clunky as well. But then today if you look at Apple screens they are the
best, they’re supposed to be the best. So I won’t be able to comment except from the viewer’s point of
view.
Q. The next question which I would have asked would have been about package management which
actually feeds into something you brought up yourself when I mentioned to you (about) the 815 flavours
of Linux. You actually asked a pertinent question of me, which is are the applications compatible across
those different variants? Now the package management question would actually feed into that but I
understand you are probably not in a position to answer that because you would need to have in depth
use of one of the distributions in order to know how to update the packages.
A. True, true.
Q. So, we can skip that one. In your opinion would you say that Linux is easy to use? And why?
A. Easy to use in terms of the end user? Development? Which way?
Q. From your own eyes, so as Sanjay…
A. So end user yes I would say looking at iOS (Unix), looking at Android (Linux) it’s easy to use. From
the application point of view, the development point of view I used it very very long ago, but I used it
on the higher level applications so PHP etc. It’s all coding, you just have to get used to it. One thing
(that is) good about Linux, Linux is much more stable. So, working or developing code on Linux was
much easier that way.
Q. This question feeds into the previous one, which is, it’s often stated that Linux involves a lot of
training time and effort to be invested in order to get the best out of it – what do you think about that
statement?
A. I think that’s standing with every new thing, because if you look at Hadoop, and you look at
Jaspersoft and look at other new things that have come, they sell that particular product but then they
also sell the associated training and support – especially they set up training libraries and sell training
courses and classes. So it is not just a learning curve in terms of paying, but it’s also a separate revenue
stream for these very organisations – so this fits hand in hand – literally.
Q. OK, then why is it that Linux is rarely preinstalled on a new desktop or laptop that you might buy in
a shop?
A. I’m assuming it’s to do with market dominance, some agreements of Microsoft with Intel, or
Microsoft with other companies, and I think at the same time the users are already comfortable with
Windows or say Excel, then the versions of Linux however, or if you look at Google’s Excel (Google
Sheets), it's pretty good but then still not as good as Microsoft Excel. So, if a standard user, who just
wants to use the software for his or her own business use then he or she will migrate to what they are
used to or whichever is easy to use. Which is why I assume they are using a flavour of Windows over
anything else.
Q. So do you think it's maybe because, well you've mentioned market dominance, what would make
you think market dominance is the reason behind the success of Windows, if you had to guess
what it might be?
A. I would put it under something called an oligopoly, which is something practiced by Microsoft. So,
once they have captured the market, 90 plus, 95% plus of the market, then they can pretty much dictate
or collude with various manufacturers to ensure that their systems get on board. On top of that, once the
users are thoroughly trained, then they, there is reluctance on their part, on their side to want to learn or
migrate to something else. The same thing applies to Apple users as well. Once they are trained on
Apple systems then they want to stick to it, and want to keep purchasing Apple products only.
Q. So you think it's that people get used to something that they know, that familiarity makes it more
comfortable for them, a fear of the unknown – would you say that?
A. It’s more like your comfort zone, so I started out development with Visual Basic. Now, it’s the
simplest of all coding languages. For me to move to the next step, was only when Visual Basic became
VB .NET which had elements of C++, it had elements of .NET technology. It was only then that I took
that stride – until then I was happy developing apps that would satisfy business needs that was using a
very simplistic tool, rather than using C++. So it all depends on the comfort zone and whether your job
can be done in the shortest possible time. And in that particular case, people who are very comfortable
with Windows would stick with Windows.
Q. I’m not sure based on what we’ve talked about already if you would be able to answer some of the
next section of questions which is about hardware compatibility and Linux – would you feel comfortable
if I asked you the questions?
A. Not really, because hardware, I’m more from the application development side.
Q. In that case then we can skip those, but there is one question from that particular section that I would
like to ask you. Now, we won’t ask it about you, but let’s say there’s a hypothetical IT person working
for a company and one of his colleagues comes and says I’ve got a problem with my private computer,
the operating system is screwed up, can you reinstall for me? Now, obviously many people that work in
a corporate environment in IT, they have access to volume licensing keys for Microsoft products so are
able to, should they wish to, sneak through a couple of installs of the latest version of Windows – which
is in essence software piracy. Now, if the user had to spend US$100 to buy that license, and they were
offered the choice by the IT person to install Linux instead – do you think that maybe this kind of piracy
affects Linux’s desktop share in the market?
A. I wouldn’t know how much it affects the desktop share, or the usage, but the thing is piracy is there
everywhere. I mean when I was in India, we used to get these pirated CD versions of a lot of software
including Linux, different variants of the Linux OS. But the thing is from what I remember, Linux, based
on GNU licenses is freely available. It’s only if you want to buy higher end, or better versions of the
same Linux that’s when you have to pay for that. So, I wouldn’t know too much on the costs, but I’m
assuming Linux would be definitely cheaper than having a full-fledged Windows OS version. So under
those circumstances it shouldn’t really be too much of a piracy issue for Linux. One more thing that
comes up is the cloud computing part – which is tomorrow, if everybody literally the entire office is
cloud computing, then its more on the corporate level or making decisions on how many licenses – how
to auto scale depending on the number of requirements than individual licenses and that can be better
controlled.
Q. That also leads me into another question which was coming further down the line which was with
the prevalence of cloud applications and web apps do you think that this will give operating system like
Linux the chance to have more desktop market share?
A. Yes definitely I think so. Mainly because it increases visibility, it hopefully also increases
transparency so that way hosting your solution on the cloud will give the vendor better chances to
market as well as control the versions of products that they are selling. On the other hand, there are,
there must be other reasons why Linux presumably is not doing as well as Windows. Maybe it is
commercial, maybe market reasons – they are not going to change, so how much is it going to help in
terms of enhancing the Linux brand I’m not really sure but cloud and web apps definitely should help.
I mean iOS and Android apps on the cloud they are definitely picking up.
Q. That’s very true. So we are almost towards the end but just a couple more questions. So, based on
your knowledge and opinion is it possible to do everything on a Linux desktop that you would do on a
Windows desktop?
A. I think so. I mean yes maybe with slightly more difficulty. At a very simplistic level working with
Excel is still easier than working with Google’s Excel (Google Sheets), so in terms of features and
functionalities and so forth. So that way even today I feel for some reason, and maybe that is because
having used Windows systems for so long, but I feel much more comfortable working with Windows
Excel than Google Excel. And I forgot, what was the question?
Q. It was about whether it is possible to do everything on a Linux machine that you could do on a
Windows one? But I’ll elaborate slightly more so that it narrows the field of questioning, you talked
about Excel as one of your examples – so do you feel that this lack of availability of applications on
Linux is a problem or do you think that there isn’t a lack of applications?
A. Again if you look at the Apple store for iOS then all the applications that end users need seem to be
there. Even though the users may or may not need it. But on the other hand if you’re I think in the
development environment Microsoft is still holding back on giving access to certain elements of ASP
or certain elements which prevent the, there’s a particular word for it, but creating, Apple creating apps
on Windows platforms or something like that – so it is more to do with again the business and the
commercial aspects than on the design and development side.
Q. Do you think users care what operating systems run on their desktop or laptop?
A. From a personal use I don’t think so. From a business use I think whatever makes them more
comfortable as long as they can deliver their reports – the standard user.
Q. Then finally if you could set up a network of workstations from scratch with a very limited budget,
would you consider Linux and why?
A. I wouldn’t know whether I would consider Linux or not. But what I would consider is can it be done
on the cloud and how much would it cost compared to having it hosted at our site, because increasingly
the focus is getting all of the infrastructure as managed services. So if I can do that on the cloud and
somebody else has to take the headache for it that’s good for me.
Interview 4 – Renjith Janardhanan
Renjith is the IT Manager for Europe and the Middle East for an international steel trading company.
Working in IT for almost 15 years, Renjith has during his career been responsible, amongst other
things, for the purchase and support of thousands of PC desktops.
Q. Let’s just start off quickly with some background questions. Can you tell me how old you are?
A. I’m 37.
Q. Ok, and how long have you worked in IT for?
A. Around, nearly 14 plus years.
Q. OK and can you describe your career from its start until now please?
A. Basically I started as a support function, facility management engineer – like desktop support,
originally started as desktop support and that comprised of around 50 to 60 computers initially and I
had to support them. From there I gradually moved onto a different company. The company was in
Mumbai, India, called JNPT – it’s a port, so I had to support 300 to 400 computers, at the desktop level,
so PC installation, desktop installation – any kind of fault identification on the desktop – it was a
complete desktop support.
From there I had gone to a different company called Zenith Computers – its basically a computer
manufacturing company. In those days the assembled computer was in demand, and these guys they
supplied computers like in a lot say a 1000, 1500 computers to different companies. So I had to, I was
actually posted to different companies where they were going to supply a 1000 computers, I had to cater
to the installation, transferring from the old computer the data, applications – from the old computer to
the new computer – it’s a kind of migration, as well as the support.
Q. So this was Zenith Data Systems?
A. Yes, Zenith Data Systems.
Q. Yes, they were in the UK as well – very big…
A. Yes they were big. From there I got an opportunity from a different company – basically a service
provider of Compaq. Those companies in those times – Compaq had some authorised service providers,
authorised service centres. So this company had allocated me to Reliance Industries, Reliance is a
chemical company in Mumbai, so they again had desktops, as well as network support. So I was slowly
getting into desktop support as well as the support functionality in the network. When I say support
functionality I mean switch configuration, connectivity, server side – because I had to support in those
days the mainframe connectivity as well and the desktop as well. So I was involved in this for quite
some time, I think 2 years, 2 and a half years.
Then I was actually posted to Reuters, it’s basically a market data service. So what they have is real
time market data services, they have to deliver computers, equipment, it’s a proprietary application they
have customisation – so even Microsoft, Microsoft is customised for Reuters. So I had to implement
those customised servers and customised desktops to banks. Specifically, in the banking industry the
trading department of the banks. So this included satellite installation as well, connectivity for the
satellites, because Reuters they have proprietary satellites where they are getting a real time feed and
they have some proprietary again, dealing system – where they can trade off between the banks, between
banks from India or some bank in the US or UK or any other countries. Once such example was BNP
Paribas, BNP is one of the banks, HSBC, Standard Chartered. So, these banks are dealing with trades,
forex trades – I was involved in that and I got a new opportunity to go into DSP Merrill Lynch.
Merrill Lynch is again a financial institution so I was again into desktop support, as well as server
support. The primary responsibility was the trade floor, so dealing systems, I had to support different
kinds of market related systems such as Reuters, Bloomberg, Teletrade – the proprietary systems of the
trade floor.
Lastly before coming to Dubai I was in ICICI Securities, I was actually involved in a complete
migration, actually a complete office move, I was involved in around 600 to 700 computers along with
around 45 servers – including, it was actually a heterogeneous environment where you had Sun, Linux,
Microsoft, and different kinds of market related operating systems and equipment. I was actually the,
responsible for the complete data centre, starting from the power – the rating, the power capacity
management, the complete rack installations. So I had moved all the servers along with the help of the
desktop support engineers, I migrated them from one location to another, from A to B – set up
everything. This was a 1-week project, a complete week project. We had migrated the services, stopped
services in the night and started the services in the day – the next day the next business day. It was
working all fine, there were no issues at all.
I moved to Dubai in 2006. I first joined CAD Gulf Computers LLC. This was desktop support. I was
completely involved with the desktops, servers, network when I was with ICICI Securities. CAD, in
Dubai it was a completely different environment, the job profile is even though it was senior customer
support engineer, but I had to deal with all kinds of printer related issues, desktop related issues. So I
was involved in all kinds of PC related, server related, network related, printer related (issues) – to
different customers. So in a day I had to complete 5 to 6 calls, that is 1 location probably in the Bur
Dubai area, another location in Karama, some locations in Ras Al Khaimah, or Jebel Ali. I had to travel
to different places and fix the issues.
Within the 6 months of being with CAD Gulf, I was recruited by DLA Piper which is a law firm, so I
was the resident engineer, again I had to support the desktop, network as well as all things server related.
There was a central IT which was located in Leeds, UK. I had to follow the policies and procedures
from Leeds, as the UK was controlling the central IT and I had to deploy based on the instructions from
them and I had to make daily reports relating to the IT calls.
I spent a year with DLA Piper, and then I got the job with Macsteel. Currently, I have been involved in
2 major office moves – the first office move from LOB 17 to JAFZA Views, it was a complete migration
of the services, including telephone infrastructure and the desktop environment. The second move was
from JAFZA Views to JLT – this was a complete infrastructure redesign and the user setup was also
different. It was a complete migration from scratch. Right now, after this move, there were several
changes that happened. Some of the services, the majority of the services are integrated in the cloud.
So now, to be honest my primary responsibilities are desktop support, network – I am dealing with the
network support for Hong Kong, Dubai and the US – as well as some functionalities such as Blackberry
support, some servers which are there in the cloud system which on a day to day basis I need to maintain.
Even though the infrastructure has moved to the cloud, the management still needs to be done
locally, for example the messaging – the Exchange server, domain, SharePoint. These servers are still
managed by myself and that’s it and I have implemented various other projects like upgrading from
Windows 7 to 8.1, the Office environment up to 2016. Integrated Skype for Business, integrated Skype
for Business for mobiles, and this has helped to reduce the cost, and enhanced, given a better path for
the business.
Q. So you have extensive experience both with servers and with desktops – which is what I have
ascertained from what you have just run me through. That’s very good because that’s something we are
going to dive into a bit now. Before that, if you can just briefly, just discuss the technological changes
that you have seen since the start of your career, like how have things changed from the day that you
first went to work and where you find yourself today?
A. Well, I would say that from a technological perspective, a lot of changes, the majority of changes
are done from an end user perspective, even though the server side yes, from the infrastructure side,
there is always changes and new inventions, redesigning of operating systems, which are actually better,
can enhance the business. But from the user side experience, the changes for the way the user uses the
desktop. So first the desktop, the assembled PC was being phased out completely, and the branded
computer came into the picture with the majority of organisations. Previously this was not affordable
for every company like buying a branded computer. Now it is very common that a branded computer is
what everyone can afford, and it's more, it's just a label part. But it's actually good that you should go
for that because it covers, the written off cost is actually covered because you will get within 3 to 4
years, 4 to 5 years it will definitely work. That’s one thing there.
The second thing I can say is that the desktop environment is slowly getting phased out and it is getting
into a different type of working environment which is probably laptop. I would say that I have seen there
also changes, but that's actually very good, everybody is using laptops instead of a desktop, which is
actually pretty good. The reason is that you don’t, when you are using a laptop mobility is good, you
can take your laptop whilst you are in the office or take it away without any issues. So, yes that is one
thing that everyone can afford, even a laptop. A desktop now costs around 2700 dirhams, I would say
a decent desktop from Lenovo with i7, SSD, you can afford that at 2700. But if you pay another 2000
dirhams more then you can get a decent laptop now. So assembled to branded, desktop to laptop and
now the latest is everyone’s laptop has gone to being an Ultrabook. So previously it was thick, a thick
laptop. The average weight for a laptop was probably about 3KG, 2.6KG, 3KG – now it is less than 1.2,
probably 1.3. I can say as an example, I have replaced all the laptops of the traders – which is actually
a key area of the business – replaced the traditional legacy laptop with the X1 Carbon; previously the
X1 Carbon was very very expensive. Now it is within the budget, it is 6500, nearly 6500, it is less than
1.2 KG, 1.3KG. So the weight of the laptop is 1.3KG and it is so thin it’s an Ultrabook, and again the
hard disk, the legacy hard drive has been replaced by SSD. So with just one touch, boom the laptop
makes a noise, when you compare that, everything the operating system, it’s the way how Apple
computers, the iPad, how it is booting up, so there is absolutely no delay. So I would say the person that
is actually working on a desktop, a legacy desktop, which is having a spinning drive, and if the same
person, if you allocated them an Ultrabook with an SSD his productivity has increased, drastically. A
substantial increase in his productivity. So, for example opening any applications, booting any operating
system, any computer functionality – it has significantly increased the productivity and will enhance
the user experience as well. So over a period everyone’s laptop would prefer this kind of hardware
rather than the traditional desktop and laptop environment.
Plus the addition of mobility, everyone would use integrated applications on their mobile so the
preference in the way they are working, either on the mobile or the laptop – its more mobility I would
say that.
Q. OK, so looking back on your career, can you briefly discuss the operating systems that you’ve used
both in a work capacity as well as in a personal capacity – tell me about the operating systems that
you’ve used…
A. I’ve used operating systems - Windows 95, a long way back. I would still consider Windows 95 as
actually a very reliable operating system. Windows 95, and then later Windows 2000. These are some
of the operating systems – Windows 98 SE, Second Edition, I would say that Windows 95, Windows
98 Second Edition, Windows 2000 and now the latest Windows 7, the desktop Windows operating
system Windows 7 – these are the operating systems I have worked on intensively and I would say
Windows 7, and Windows 98 SE, Windows 98 SE was fast, Windows 7 yes it is still the latest operating
system, and will probably last for another 3-4 years I think.
The latest operating system which I worked upon after 7, was Windows 8.1 which is substantially, the
operating system is very light, such as the booting up procedure and major enhancements are there in
the operating system. The latest operating system is Windows 10, I have used Windows 10.
Q. So beyond the Microsoft operating systems, you mentioned in your work history that you had also
used Linux as well, have you used Unix or Linux – could you tell me a bit more about those that you
may have used?
A. I have used Linux intensively, for personal, as well I have used it for official use. It’s an operating
system that seems best for business, for business purposes. Again you can treat business purposes as
the enterprise. As an infrastructure site, you can best utilise the Linux operating system. Like, I would
say as a proxy server, or any kind of server that is actually used by say several market data services like
Bloomberg, Teletrade. Another operating system like QNX, QNX it’s a flavour of Linux, Unix
environment I mean.
Q. QNX is a real time operating system, if I recall correctly…
A. Yes its used by Reuters, they have a server it’s called AMS server, it’s getting a real time feed from
the satellite of Reuters and through the server the feed is delivered to the end user, so it's used extensively.
Plus, I have worked on Sun Solaris server, Unix – I have worked on IBM AIX 5.5. So these operating
systems I wouldn’t consider, I mean they are flavours of desktop environment, but this would be best
for a network operating system, and would support best for the infrastructure side rather than the desktop
environment.
Q. What about on portable devices? Similarly where you have just taken us through the desktop side of
things, could you just tell me a bit about how you have seen portable devices such as phones, tablets,
PDAs of yesterday – how they’ve sort of evolved until today, but not too in depth just briefly….
A. So in the mobile environment, as of now the stable operating system I would say is iOS by Apple.
Well considering the adaptability or flexibility or more features I’d still go for Android because I’d go
for a lot of features you can add, there’s a lot of features that are user friendly which is not allowed in
iOS but you can see in Android. So I’d say iOS in personal, but yes for general acceptance, Android is
best.
Q. Do you know what operating system kernels those 2 operating systems you’ve mentioned, iOS and
Android, what they are based upon?
A. Android, I’m not sure about the kernel system I think it is a Unix flavour. iOS I think it’s actually a
kind of, it comes from a Unix environment, or, I’m not sure about that.
Q. So if I told you that Android used the Linux kernel and iOS is using the BSD Unix kernel, well it’s a
hybrid of that and another kernel, but basically 1 is Linux and 1 is Unix, what’s your view on that?
A. Well Unix I would prefer; I prefer a stable operating system, rather than Linux. Linux would still be
considered, but it’s freeware, Unix is something that as far as I know is not a freeware operating system.
Q. Speaking about desktops, the sales of desktops are supposedly on the wane at the moment. You’ve
already touched upon that evolution from the traditional desktop system with a spinning drive, you’ve
talked about people are going towards portability. So do you think that desktops are still relevant
anymore?
A. Well still desktops in terms of the requirements, I would say that some of the users require large
amounts of storage, where expansion is required, additional expansion – like those who have high end
graphics requirements, for editing purposes. So you can purchase a graphics card which costs
substantially less than when you go for a branded high end laptop. So that probably, the desktop will
not phase out from the market, or for the end user completely but it will be based on the requirement, it
will be still there. The difference is the demand will not be the same as before.
Q. So you talked about how you’ve been involved in desktop support a lot throughout your career. So
what desktop operating systems have you installed? If you can just give me a list, some of them you
mentioned already but if you could just give me a list of the operating systems you’ve installed on
desktops…
A. In the initial days, on a 486 I have installed Windows 3.11. Moving on I have installed Windows 95,
then Windows 98 and 98 SE, and Windows 2000, then in-between there was an operating system,
Windows Millennium Edition.
Q. I remember it very well; it was a terrible operating system…
A. Windows 2000, Windows 7 and Windows 8 and 8.1, and Windows 10
Q. Have you installed Linux on any desktops?
A. I have installed Linux but I cannot recollect what the version was. I’ve installed Linux Red Hat,
that’s the only one I have – that was on a server. On the server level also Unix and AIX.
Q. So what do you think about Linux in general?
A. Well Linux it is basically, the operating system from a booting perspective, as well as the operating
system functionality, it is faster, with less bugs, but you need an advanced level of knowledge to
troubleshoot the environment if some issues are there. It is not the same, there is a big difference with,
it’s based on how you use it, but from an end user perspective I would say that it’s still Microsoft that
is the best operating system from an end user perspective.
Q. So talking now about Linux distributions, you mentioned one of them already which is Red Hat, what
others ones are you aware of?
A. As far as my knowledge goes only Red Hat, I had not installed, or had much experience with the
Linux environment because I have not installed it.
Q. So if I told you that there were 815 unique Linux distributions what do you think about that?
A. Well, I would say that Linux environment is freeware, with different companies developing the
operating system, well I wouldn’t say from a personal experience, yes because throughout the
companies like starting from I would say 1999 onwards, I have not seen any environment which is not
Linux or completely Linux orientated desktops. So with my experience and based on the knowledge I
have leaned towards the Microsoft environment where I have seen, from 1999 until 2016 I have worked
on full time, as well as visiting 200 plus companies, different companies – at any of these companies I
have not seen a full-fledged Linux environment, so I have worked on Linux environment where on the
infrastructure side they have the Linux environment but I am surprised to see that 800 variants or
different flavours, but there are probably some companies which are still using it, but I am sure, but I
have not seen. Even if there is so much popularity there amongst companies, why have I seen companies
that have not adopted?
Q. The next question is, in your opinion is Linux easy to use and why?
A. Well, I wouldn’t first of all say, Linux is easy to use in terms of the way the operating system is, the
architecture of the operating system, but from the operating perspective I don’t think it’s so easy to
operate, unless and until you are so trained with that you will know how to use Linux. I don’t think it’s
so easy.
Q. Ok that leads me to the next question where you have touched upon the training side of things, so in
some circles Linux is viewed as difficult to use and needs substantial training and time and effort to be
invested in order to be able to use it. What’s your view on that statement?
A. Well, first of all any organisation if they adopt the Linux environment it is not just the operating
system they have. The operating system is basically probably their stepping, or should I say the base
from where they need to work, they might be having variants, so for example a travel company – air
tickets – Wings, any small company where 15 or 20 people are there – they have a different application.
They have their, I mean they are not working on an operating system, they are working on an application
to generate business and the same way, that’s how they are generating money. So the operating system
is just the tool, it’s just a base where they can install a tool which is actually the application which with
they generate money through that. SO the limitations are there and the compatibilities of various
applications. They need to first study about that, if it’s really working on the Linux environment or if
they have a variant of the Linux environment and can say yes it definitely works, then all the users need
to have training sessions for usage, because it’s totally different from a Windows perspective.
Why I would say training is required, as you know everyone has a Windows laptop, its very common,
everyone has a Windows laptop, in a house at least 2 laptops are there - minimum 1 laptop is there in
any house, even if it’s those who are living with their parents. So the natural training, the daily training
when they get into that operating system (Windows) is sufficient to get on and work in an office
environment. You don’t need to have training for Windows unless it’s some specific functionality to be
honest. Nobody needs training for the Windows environment, you don’t need it because they are so
used to it.
But, if you have changed that environment to Linux, definitely you need to have training in place.
Continuous support and assistance is required when you are changing the operating system where
everyone has globally used (something else – i.e. Windows)
Q. So why do you think Linux is rarely preinstalled on a desktop or laptop?
A. I would say that if the general acceptance level of the consumer market with Linux, well the
consumer market has not accepted Linux, mass consumers have not adapted to the Linux environment.
Every user has adopted the Windows environment. If anyone buys a laptop, anyone would go for only
a Windows operating system. Even with consumers, any business that is selling in the market they
would rather sell the Windows environment than a Linux preinstalled piece of hardware, unless of
course it’s a mobile.
Q. So in your opinion and your experience, if you’re able to answer, could you discuss hardware
support with Linux, what you are aware of when it comes to hardware support…
A. As far as I know, the majority of the hardware supports Linux, the last time I installed on a PC, I
installed a simulator, a switch simulator, it works fantastic for that operating system, because it uses the
specific memory, process – it will be, I would say, for certain functionalities the Linux environment is
best, because you can setup memory usage, utilise the resources in an effective and efficient way in a
Linux environment. That is something which is very good, and the hardware support, the hardware is
best utilised in the Linux environment, I would say that.
Q. OK, so leading on from that question, do you think that Linux also performs well on obsolete
hardware? So where you have an old machine . . .
A. Absolutely, that would be, that is 100%, on hardware which is not a quad core, has less cores in the
processor, or has less memory – not 8 or 16 or 4 GB memory. It works perfect. The reason is not many
applications load, the operating system itself is very light, when I say light it’s not having more graphics,
rather less graphics, and as a result can utilise the hardware much more effectively compared to
Windows 7.
Q. This is more a hypothetical question, I don’t want to implicate you in any wrongdoing – but from 1
IT person to another, we’ve, we’re all in those situations where a person will come to us and say my
laptop is messed up can you reload the operating system for me, and obviously let’s say the hypothetical
person let’s call him Prathap for example – let’s say he has access to volume license keys, and could
reload say put the latest version of Windows on a laptop, which would save the user the license cost of
US$100 going to buy it themselves, when you could offer them a free operating system like Linux, so do
you think that being able to provide, or an IT person being able to provide pirated versions of Windows,
do you think that eats into Linux’s market share on the desktop?
A. Well, I think I should deviate from this subject with that, because nowadays nobody is really using
a pirated operating system, unless they don’t preserve the key or the source file. Any laptop or hardware
which is shipped by the manufacture, it is coming with an operating system – it is just coming with an
operating system. So the requirement for a pirated operating system, I have not seen nowadays, the
requirement for anyone. As an example I would say that any operating system, let’s say Android, or
iOS, Apple is not having any operating system market share – that’s something that is basic, the
hardware needs to function, that’s free. Android is free.
As you know, today is the last day of the free Windows 10 upgrade, so even Microsoft is leaning
towards that direction, where the operating system is no more something which is a business generation
software. So in those terms, it’s like equal, if you consider Linux or iOS or Android, there is no
difference. So, if you have a laptop there must be an operating system, if you don’t have an operating
system key, then that is a different issue, then you will be working on pirated software. But then, the
control which is a sort of trick nowadays, as with a pirated operating system from Microsoft it is now
difficult to obtain the patches. Nowadays, nobody has pirated software. I feel like that. It’s my opinion,
nobody will be looking for any pirated software unless you are looking at the 3rd world country like
where they might be using a refurbished PC, or laptop, and it is difficult to load any operating system.
Q. Understood, so in your opinion, do you think it’s possible to do everything on a Linux desktop that
you could do on a Windows desktop?
A. It depends upon the compatibility. Some of the applications, the compatibility. The majority of the
applications are compatible with the Linux environment, which I do agree. Even SAP is also compatible
with the Linux environment. Office is compatible with the Linux environment. Any proprietary
software which is being developed they are also designing them in such a way that Linux is required,
the Linux environment is supported. So, interoperability is no more a problem for Linux I think. There
should not be any problem for the Linux environment.
Q. So you feel that the availability of applications on Linux is sufficient? That that isn’t a reason why
it might have less market share on the desktop?
A. Well to be honest, I have not enough experience on the Linux environment where I have tested many
applications. I have very limited experience where I have all the applications, but my, as far as my
information, yes the functionality and the application compatibility with Linux is better, it’s the same.
Any hardware will also support the Linux environment.
Q. Just a few more questions left then it’s over. Nowadays, you talked about the cloud yourself earlier
on, but with the prevalence of cloud applications and web applications, do you think that this potential
lack of applications on Linux would be less of a problem now? - where everything is more
browser based and cloud based, do you think that makes life easier?
A. Absolutely, 100% I would say that everything is getting to a service level where you can slice, and
get a slice of service from a vendor, from a cloud. As an example I would say that Salesforce is basically
a large CRM software which is widely used by everyone, now you don’t need to have an application
server setup. So if you have a small business, probably for example like if you have a small business of
5 or 6 persons you can go for an application like Salesforce.com, you just register, create the users and
in the usage of the Linux environment it’s perfect, there’s no hindrance for any kind of application
which is installed – which is to say license costs, hardware – it’s the best way of managing, probably,
that is the fastest way of usage – Linux uses less hardware, you don’t need a beefy machine, or beefy
laptop for an end user, you just need basically some kind of hardware just to communicate with the
internet and I’d say that side is perfect. That’s a perfect combination.
Q. If you were going to set up a network of workstations from scratch, with a very limited budget, you
would then consider Linux?
A. 100%, 100%. As a rational decision, this is the best approach where you need minimum hardware,
less expenditure on systems, on operating systems – you don’t need to buy any kind of operating system. Its
track record: Linux will work for years without any issues, and one thing which I have not explained
during this discussion is when you compare, from a hardware and operating system perspective, when
the operating system crashes or malfunctions: the rate of crashes or hardware malfunctions on Linux is
less than on Windows. I’m not comparing Windows 8.1 or Windows 10 – I’m not in production with
Windows 10, it’s still in the testing phase. You will face less
problems with this operating system (Linux) and it can enhance a business by reducing the overall cost.
Q. The last question I have is; do you think users care what operating system runs on their desktop or
laptop?
A. No.
Q. Why do you say no?
A. Well, I would say that probably, and if I ask any of the traders, unless if I have specified it to them,
they would not know what is the operating system (in use). It’s basically a start button, Internet Explorer,
application in the task bar, opening the application – what operating system, basically nobody is getting
into the operating system core capabilities. Their experience on functionality is based on core
application level experience not on the operating system. The operating system is only extensively used
for Internet Explorer. Just browsing. Everything is Internet. They just want to get Internet. Whether it
is Internet Explorer, whether it is Firefox, any operating system or browsing medium, it doesn’t matter,
they’re more inclined towards Microsoft’s Internet Explorer, or Google Chrome – any browser which
can have the functionality of a secure environment where users feel that it is safe to use, that is sufficient.
Nobody is looking at what operating system I am on, unless it is having an issue like say booting is
slow, then they want to know “what is my operating system?”
Q. So this is an unscripted question, but this topic came up in an earlier interview for me today, do you
think that there has been some point in the past a split where a computer user say in the early ‘90s,
even if they use applications they would also understand the operating system very well, whereas now
you’ve touched upon it with your users that they actually don’t have any interest in what the operating
system is, they just have a certain set of repetitive things that they are doing and they just understand
that. Do you think there’s been a split where you would have a user that knew everything about their
computer 20 or 30 years ago, and now that’s changed and you have some that know and some that
don’t know? Would you agree?
A. No, absolutely no I wouldn’t agree with that. There is no change in the user perspective because they
don’t want to know. For example, if someone comes to work first of all they boot up, once it boots up
they start working. The first thing is they open up the application. In our office, Macsteel, what I
would say is that the application that is extensively used is press the button, wait for the operating
system to load. During that time, they must be looking around, looking at the phone, having some coffee
or something, they don’t care how it comes up. When it does, they just type something into Internet
Explorer, SAP, send, that’s it. So they, I would say, when I started, where they had this work
environment it was exactly the same. There is no difference at all. There is absolutely no difference at
all apart from that the booting process has been enhanced in the operating systems, the way of handling
applications and how they open it is so fast. The transition during these years, it’s fast, everything is
getting so fast, that is actually enhancing the productivity of end users, other than that I don’t think that
anyone is really noting what is an operating system. Back then they didn’t notice what was the operating
system and now also they are not knowing that.
Interview 5 – Glen Coutinho
Glen is an electronic messaging expert working for a global oil and gas industrial services company.
Glen has 18 years of IT industry experience, and is responsible for the world-wide messaging
infrastructure of the organisation he is employed by.
Q. You know pretty much how this works, but can I just ask how old you are please?
A. 38.
Q. Ok, and how long have you worked in IT for?
A. A little over 18 years.
Q. Can you describe your career from its beginnings until now?
A. Ok, so I think I started my career in IT support, on the helpdesk, working with tickets and all. Then
as I progressed through, I got into specific technologies around messaging in particular, and from there
I went into more of an architect role – so designing solutions. I still do level 3 support but only on
escalations. Now I’m more into managing hardware and designing of messaging and collaboration
solutions.
Q. During your career so far could you also discuss a little about the technological changes that have
occurred that you’ve seen?
A. Right, so I think when I started off, from a hardware perspective I mainly worked on desktop and
laptop support and some of the technological changes I’ve seen – well (Windows) Vista and before that
time, used to be Windows 95, and email and all those (technologies) were not very popular at the time.
But as we progressed, I got into servers, worked with Windows NT and went onto Windows 2000
servers and a lot of it now, a lot of things now have obviously become mobile, so the whole mobile era
started. Now we have BYOD, PCs and tablets and mobile phones all connecting and accessing office
resources.
Q. That would have been a question further down the line, which would have been about portable
devices, so you’ve already touched on that, and that’s good. You’ve also mentioned that you’ve used
during the early part of your career Windows 95 and Windows NT – could you also tell me about any
other operating systems you’ve used either in a personal capacity on your own private machines, if we
stick to PCs or desktops – x86 stuff – that you may have also used with your employers?
A. So you’re looking at only Microsoft related? Different versions within Windows?
Q. Whatever your exposure has been, if you’ve used DOS – talk about DOS…
A. So I started off personally using DOS, it used to be DR-DOS, then MS-DOS, so then you had
Windows 95, as its own operating system. Then I’ve used Windows NT, XP, Windows 2000 – you
want just desktops and laptops, not servers right?
Q. Correct, yes….
A. Windows Vista, then Windows 7 if I remember them all. Then personally I use a Mac as well so a
little exposure to that. That’s pretty much it.
Q. So no other operating systems beyond that? So Unix or Linux?
A. Not extensively, but you know at the college we’ve had exposure to Linux. I did work a little on
Linux in my previous job when we were setting up a Squid server.
Q. Oh the proxy, Squid proxy?
A. Yes exactly.
Q. So when looking a little bit again, talking about portable devices, can you just talk a little about how
you’ve seen them change from an operating system standpoint, in your time that you’ve been involved
with them?
A. So, portable devices I assume you are talking about mobile phones and tablets. So previously we had
those portable PDAs, we used to call them PDAs, they had Palm OS – they were not very intuitive, they
used to be for one specific function like calendaring and emails, things like that. Obviously, the
graphics, the whole GUI hadn’t reached the point that we have now, but then from there, Nokia used to
have the Symbian OS, which was their downfall because other operating systems that came about were
much more intuitive. Now obviously iOS and Android are about.
Q. You’ve talked about Android and iOS. Are you aware of what operating system kernel those are
using?
A. Android uses the Linux kernel, and iOS is based on one of the Unix distributions – I think it’s BSD.
Q. Focussing more on desktops, now we are going to talk only specifically about desktops. So
when I say desktops or PCs I’m referring only to x86 based machines going forward. Desktop PCs,
their sales are supposedly on the wane currently. Can you discuss that a little bit? Why do you think
that’s happening and do you think desktops are relevant anymore?
A. I think desktops are becoming less relevant now. I mean we’ve seen those charts in our classes –
desktops versus laptops sales and now mobile phones, so as we are going into this mobile era especially,
portable devices, laptops, there’s a bit of a market but we can see things declining there. Desktops are
losing their market share for sure. I mean people want mobility. There may be specific functions, maybe
something high end workstations where you are doing some sort of engineering or drawing – things
like that, which require a lot more resources and necessitate desktops. But I think people are shifting
more towards just getting their work done.
Q. During your career have you ever installed an operating system on a desktop or a laptop? And if so,
what operating systems have you installed? Personally, yourself that you’ve had to install, what have
you installed?
A. I’ve installed obviously the Windows versions mainly Windows 95, Vista, XP, Windows 7. I
normally use Windows to be honest. Mac once or twice, you know OS X. I think Linux, I installed
Fedora when we were doing the lab testing. The live USB. I tried installing, but didn’t have much luck.
It wouldn’t boot from the USB; I couldn’t get the thing to work.
Q. It depends on the hardware obviously. If you are doing it on the Mac, you would probably have
issues because they probably…
A. Yes it was on the Mac.
Q. I think that’s probably the answer. We’ve already now touched upon the next question, which is what
has been your exposure to Unix and Linux – you’ve touched upon it that you’ve used it for Squid, and
also for university purposes – so has there been any other exposure that you can recall?
A. No, I am pretty sure it was just these two instances.
Q. What do you think about Linux in general? What’s your opinion?
A. To me, all that I have heard, read, used a little bit, its obviously marketed as something very scalable,
secure because it’s built on Unix stability. So it’s always being compared to Windows and Windows
obviously has bugs and things like that. So Linux in general it’s a stable OS, the way it handles its jobs
and the way scheduling happens – it’s more optimal than Windows. Whenever I’ve used it, I don’t know
if it’s because I’m a Windows user – when it comes to user friendliness I find it a bit difficult moving
around, navigating through stuff, and I don’t know if that’s just because I’m not so familiar with Linux.
Q. Understood, so you touched upon Fedora as one of the distributions that you’ve used or you are
aware of. Are there any others that come to mind when you think about different Linux flavours?
A. Yes, so for the uni I tried Fedora, I tried Ubuntu. Previously I had tried Red Hat for the Squid proxy,
I think Red Hat is Fedora right?
Q. That’s correct, Fedora is the cutting edge distribution of Red Hat…
A. So those are mainly the 2 that I know.
Q. If I told you that there were 815 unique Linux distributions, what would you think about that if I told
you that there were 815 different distributions?
A. I knew there were a lot, but I didn’t expect it to be that many. Definitely over 100 but that’s amazing.
I think it’s a good and bad thing Adam to be honest. When I think in terms of standardisation, when I
was trying to use Fedora and Ubuntu, the instructions I was being given – so I downloaded one distribution
and was trying to follow the instructions, then it won’t work because it was only valid for this – so I think
it’s ok for an IT person, but in terms of a regular user I think they would find it difficult if there isn’t a
common standard across these distributions. To me it’s a good and bad thing. Each person has a flavour
for what they want, or want to try. So they have many options, but in terms of standardisation and
people having to keep track of different commands and different ways to do things, that could be a
downside to it.
Q. There’s a couple of questions which I am going to skip, because you are only really familiar with a
couple of distributions. There’s fragmentation when it comes to GUIs, desktops – I’ll leave those
questions as I don’t think you’ll be able to talk about them so much. Again, similarly, different
distributions for example that you mentioned like Ubuntu and Fedora/Red Hat they also use different
package managers, so when it comes to updates and installing applications it’s also different, so that’s
something we will leave to one side. When you were using Linux for setting up the Squid proxy and the
test bed you mentioned for university, did you overall, well you said you found some navigation issues,
would you say it was easy to use or what would you say if I asked you that?
A. I won’t say user friendly. It wasn’t difficult navigating, but because I have IT experience, I know
I’m looking for services from utilities – things like that. So I’d go through the menus and find that. I
think if it’s a layman he’d probably have issues, and again, I think just because they’re so used to the
Windows UI and that’s the popular one in use. To me as an IT person it wasn’t very difficult, kind of
moderate. I wouldn’t say it was extremely user friendly as well. Especially when we had to open up
files to find certain processes and things like that – to me that was complicated. In another OS you just
look up the processes and the services, the process ID. I found I had to go through files which was
tedious.
Q. Again, this feeds into asking the same question in a different way, though I’ll drop part of it, in some
circles Linux is viewed as needing substantial training time and effort to be invested, what do you think
about that statement?
A. I would say that’s fairly accurate. Especially to a person, coming from my background, trying to
study for the labs, or even setting up the Squid proxy, it did take some time to pick it up so there is some
training, even though it was self-learning. But if you are planning to deploy this, you know say in an
office place you would need some training to get used to it. Kind of like the Mac and OS X, people find
it difficult to manage. Over the years, just because they are so used to one OS it could be down to that.
Q. Moving on, which feeds into what you just touched upon. Why do you think Linux is rarely
preinstalled on a new desktop or laptop?
A. It’s a good question. I never thought about it, but now that you’ve mentioned it, I don’t know if
there’s some sort of OEM contract in place or something like that, but one guess I would have to make
would come down to user preference – what’s the most popular OS that they are used to. So if you had
to have Linux I think there are some machines if I am not mistaken which do have an option to have
Linux installed on them, but I think that would cater more to IT professionals. So this Linux thing has
become the choice of operating system for some professionals, but for the masses, if you look at it, what
everyone’s most familiar with or aware of is Windows – which I think is what sells. So if someone’s
going out there to buy a laptop and it comes with Linux installed – I don’t think those have such a big
market share, so I think that’s one of the things.
Q. I’m not sure if you are going to be able to answer the next set of questions, but I’ll ask them and if
you are not able to you can just pass on them. In your opinion and experience, can you discuss hardware
support with Linux? Did you have issues when you were setting up the Squid proxy from a hardware
standpoint?
A. Yes, the hardware was problematic. The drivers especially. You had to look for these compatible
drivers. It wasn’t plug and play, so everything at that time I had to try and download several drivers to
find one that would work. It was problematic.
Q. Can you remember what kind of device did you have issues finding drivers for?
A. I think mostly it was printers, network cards, I think graphic cards. I remember those days, I’m not
sure if that is still the case.
Q. I won’t answer that, in keeping with phenomenology – not giving you any of my thoughts.
A. Yes, don’t influence my answers!
Q. Exactly, it’s very tempting I must admit, it’s very hard. Again this one will only be a subjective
answer that you can give but do you think Linux performs well on obsolete hardware?
A. I wouldn’t really know, to be honest.
Q. Then I have a question that I hope you find quite interesting. Let’s say there is a hypothetical
colleague, let’s call him Ranjan, and someone will come to him and say I’ve got a laptop, I need you to
wipe it and reload it for me. As we both know in our positions we have access to volume license keys
where we can install countless numbers of Windows operating systems without keeping a proper track
of the number of installs. Do you think this kind of scenario, which is, bringing it down to what it really
is – software piracy – do you think software piracy could be a reason, or have an effect on the user base
of Linux on a PC? So you put Windows on for free instead of paying a license cost of US$100, rather
than offering Linux, which is free. I wouldn’t want to give you US$100 – do you think that has an
impact at all on the user base?
A. I think if you ask me, looking at it from a broader perspective Adam, I would think it has something
to do with the compatibility of the applications which we use. If you take even now, there’s software
piracy which is there, there are certainly places where there are no copyright laws, but if you ask me if they
had the option to install something else – even if say its Mac OS X, they won’t know how to use it they
won’t be familiar with it. So they start off talking about ok what are the common applications we use,
so say Word, Excel etc., are they compatible with that? Can I use Word and Excel? Some of the features
are not exactly the same, apples for apples, and I think that’s the issue. Price wise I think there is a
factor of say that US$100, but I think the question they would ask is about applications – Can I do this?
Can I do that? Can I use Word and Excel? For me it’s a bit about the compatibility of the other
applications – Windows has that edge over the others.
Q. It’s like you anticipate the next set of questions, which is good that that is the case, as the next
question was going to be do you think it’s possible to do everything on a Linux desktop that one could
do on a Windows desktop?
A. For the most part, say I’ve got the Mac version of Office – there are a lot of functions that are missing
from that, that are only available in the Windows version right. From an admin perspective, yes you
can do probably more. But from a user perspective, I think the applications are quite limited. So you
have your own set of applications, I think Linux has it, but OS X has its own version of a word processor,
or a spreadsheet, things like that – but functionality wise it’s not up to the mark as some of the Windows
Office suite applications are.
Q. Looking now, at something you know quite well from our studies, with the prevalence of cloud and
cloud applications and webapps do you think this becomes less of an issue in the future?
A. Possibly, that’s a good point. It probably would, but I don’t know if it would have that big an impact
where you see Linux or Mac OS X taking such a big market share because people can use those
applications on the cloud. Again it comes down to the operating systems have evolved from say the
time of MS-DOS where there were only specific things they can do, but we have evolved to have
browsers built in and other bits and pieces – even the file managers – things like that. So, the problem
is the dominance of Windows has been there for so long that it becomes so familiar when using the
system. Just things like right click which on the Mac is a little different and people find that difficult,
so why I said I don’t think there will be widespread adoption is that people are so familiar with the
shortcuts and how to navigate through, I think that has an influence on their decision.
Q. Once again, you are anticipating the next question, we obviously sat together for too long. The next
question which has been partially answered already is, do users care what operating system is running
on their desktop or laptop and why?
A. Yes. In my view they do, as they want to be able to get on with what they are trying to do. If they
are given an operating system that is challenging to them, it would probably become an issue to them.
Q. This is the final question, if you were on a limited budget, and had to set up a network of workstations
from scratch, would you consider Linux?
A. Yes, I would consider Linux. But, it’s not only about considering things plainly based on cost, or
being cost-centric. There are other hidden costs, like training, bringing people up to speed and feeling
comfortable with the operating system. So I would consider it, but I would perform a cost benefit
analysis to really make sure that it was the right choice based on all the angles, not just on a direct cost
angle.
References ‘About the FreeBSD Project’ (no date). Available at: https://www.freebsd.org/doc/handbook/history.html (Accessed: 8 June 2016) ‘A Brief History of Debian’ (2015). Available at: https://www.debian.org/doc/manuals/project-history/ch-intro.en.html (Accessed: 8 June 2016) Airoldi, E.M., Blei, D.M., Fienberg, S.E., Goldenberg, A., Xing, E.P. and Zheng, A.X., (2008) ‘Statistical Network Analysis: Models, Issues, and New Directions: ICML 2006 Workshop on Statistical Network Analysis’, Springer, Pittsburgh, PA, United States. Ajila, S.A. and Wu, D., (2007). ‘Empirical study of the effects of open source adoption on software development economics’. Journal of Systems and Software, 80(9), pp.1517-1529. Anand, M., (2015). ‘BeMi: An interactive GUI for GXbuntu’. Computing for Sustainable Global Development (INDIACom), 2015 2nd International Conference on, pp. 1301-1305. IEEE. ‘Announcing Fedora Core 1’ (2003). Available at: http://www.redhat.com/archives/fedora-announce-list/2003-November/msg00000.html (Accessed: 11 June 2016) Auger, B., (2004). ‘Living with Linux’. Library Journal, pp.16-18. Aul, G., (2016) ‘Announcing Windows 10 Insider Preview Build 14316’. Available at: https://blogs.windows.com/windowsexperience/2016/04/06/announcing-windows-10-insider-preview-build-14316/ (Accessed 7 August 2016) Bach, M.J., (1986). ‘The design of the UNIX operating system (Vol. 5)’. Prentice-Hall, Englewood Cliffs, NJ, United States Bean, L., Barlow, J. and Hott, D.D., (2004). ‘Windows Woes? You May Be Ready for Linux’. Journal of Corporate Accounting & Finance, 15(5), pp.13-22. Bhartiya, S., (2016) ‘Linux Torvalds still wants Linux to take over the desktop’. Available at: http://www.cio.com/article/3053507/linux/linus-torvalds-still-wants-linux-to-take-over-the-desktop.html (Accessed 7 August 2016) Bokhari, S.H., (1995). ‘The Linux operating system’. IEEE Computer, 28(8), pp.74-79. Boykin, J. and LoVerso, S.J., (1990). 
‘Guest Editor's Introduction: Recent Developments in Operating Systems’. IEEE Computer, (5), pp.5-6. Brenton, C. and Hunt, C., (2006) ‘Mastering Network Security’, John Wiley & Sons, Hoboken, NJ, USA. Bretthauer, D., (2002). ‘Open source software: A history’. Information Technology and Libraries, 21(1), p.3. Brodkin, J., (2013) ‘Linux is king *nix of the data center – but Unix may live on forever’. Available at: http://arstechnica.com/information-technology/2013/10/linux-is-king-nix-of-the-data-center-but-unix-may-live-on-forever/ (Accessed 11 June 2016) Brothers, J.L., (1995). ‘Linux’. Linux Journal, (12es), p.1.
137
Byfield, B., (2007) ‘KDE v GNOME: Is One Better?’. Available at: http://www.datamation.com/osrc/article.php/12068_3671906_2/KDE-vs-GNOME-Is-One-Better.htm (Accessed 10 June 2016) Cai, J.Y., Nerurkar, A. and Wu, M.Y., (1998). ‘Making benchmarks uncheatable’. In Computer Performance and Dependability Symposium, 1998. IPDS'98. Proceedings. IEEE International (pp. 216-226). IEEE. Cameron, R., Woodberg, B., Giecco, P., Eberhard, T., and Quinn, J., (2010) ‘Junos Security’, O’Reilly Media Inc, Sebastopol, CA, USA. Campbell-Kelly, M. (2011) ‘Dennis Ritchie Obituary’. Available at https://www.theguardian.com/technology/2011/oct/13/dennis-ritchie (Accessed: 27 May 2016) ‘Can you explain the Net market Share methodology for collecting data?’. Available at: http://www.netmarketshare.com/faq.aspx (Accessed: 14 March 2016) Carbone, N., (2011) ‘Not So High-Tech Anymore: The First Website Ever Celebrates Its 20th Birthday’. Available at: http://newsfeed.time.com/2011/08/06/the-first-website-ever-celebrates-its-20th-birthday/ (Accessed 17 March 2016) Casadesus-Masanell, R. and Ghemawat, P., (2006). ‘Dynamic mixed duopoly: A model motivated by Linux vs. Windows’. Management Science, 52(7), pp.1072-1084. Chau, P.Y. and Tam, K.Y., (1997). ‘Factors affecting the adoption of open systems: an exploratory study’. Mis Quarterly, pp.1-24. Chaudri, A. and Patja, V., (2004).’ Windows v Lindows–have Microsoft won the battle only to lose the war?’ Computer Law & Security Review, 20(4), pp.321-323. Cheung, W.H. and Loong, A.H., (1995). ‘Exploring issues of operating systems structuring: from microkernel to extensible systems’. ACM SIGOPS Operating Systems Review, 29(4), pp.4-16. Claiborne Jr, C., (2001). ‘Making Inodes Behave’. Linux Journal, 2001(82es), p.2. ‘Company History | SUSE (2016). Available at: https://www.suse.com/company/history (Accessed: 10 June 2016) Conlon, M.P., (2012). Open Source Software in the Vertical Market: An Open Niche?. 
Journal of Information Systems Applied Research, 5(1), p.16. Cooke, D., Urban, J. and Hamilton, S., (1999). Unix and beyond: An interview with Ken Thompson. IEEE Computer, (5), pp.58-64. Corbató, F.J., Saltzer, J.H. and Clingen, C.T., (1972) ‘Multics: The first seven years’. Proceedings of the May 16-18, 1972, Spring Joint Computer Conference (pp. 571-583). ACM. Corbató, F.J. and Vyssotsky, V.A., (1965), ‘Introduction and overview of the Multics system’. Proceedings of the November 30--December 1, 1965, Fall Joint Computer Conference, part I (pp. 185-196). ACM. Coyle, K., (2008). ‘Learning to love Linux’. The Journal of Academic Librarianship, 34(1), pp.72-73.
138
Curwen, P. and Whalley, J., (2014). ‘Mobile Telecommunications Networks: Restructuring as a Response to a Challenging Environment’. Cheltenham, United Kingdom: Edward Elgar Publishing Daley, R.C. and Dennis, J.B., (1968). ‘Virtual memory, processes, and sharing in Multics’. Communications of the ACM, 11(5), pp.306-312. Dawkins, R., (1986) ‘The Blind Watchmaker’. W.W. Norton & Company, Inc New York City, USA.
Decrem, B., (2004). ‘Desktop Linux: Where Art Thou?’ Queue, 2(3), p.48. Dedeke, A.N., (2009). ‘Is Linux better than Windows software?’ IEEE software,26(3), p.104. Dedrick, J. and West, J., (2004). ‘An exploratory study into open source platform adoption’. System Sciences, 2004. Proceedings of the 37th Annual Hawaii International Conference on (pp. 1-10). IEEE. Dedrick, J. and West, J., (2003). ‘Why firms adopt open source platforms: a grounded theory of innovation and standards adoption’. Proceedings of the workshop on standard making: A critical research frontier for information systems (pp. 236-257). Seattle, WA. Delozier, E.P., (2009). ‘The GNU/Linux desktop: an open access primer for libraries’. OCLC Systems & Services: International digital library perspectives, 25(1), pp.35-42. Dettmer, R., (1999). ‘Liberte, fratenite and Linux [Unix compatible OS kernel]’. IEEE review, 45(3), pp.115-120. ‘Distrowatch: Put the fun back into computing. Use Linux, BSD’ (2016). Available at: http://distrowatch.com/weekly.php?issue=20160530#stats (Accessed: 11 June 2016) Dougherty, W.C. and Schadt, A., (2010). ‘Linux is for everyone; Librarians included!’ The Journal of Academic Librarianship, 36(2), pp.173-175. Dukan, P., Kovari, A. and Katona, J., (2014). ‘Low consumption and high performance Intel, AMD and ARM based Mini PCs’. Computational Intelligence and Informatics (CINTI), 2014 IEEE 15th International Symposium on (pp. 127-131). IEEE. El Khamlichi, M., (no date) ‘Linux Mint History and Development’. Available at: https://www.unixmen.com/linux-mint-history-development/ (Accessed 11 June 2016) Ellis, J.R., 1998. Objectifying Real-Time Systems (No. 2). Cambridge University Press, UK. Evers, J., (2004) ‘Novell counters Microsoft’s Linux ‘facts’ with ‘truth’’. 
Available at: http://www.infoworld.com/article/2682039/operating-systems/novell-counters-microsoft-s-linux--facts--with--truth-.html (Accessed 10 June 2016) Evers, J., (2005) ‘Microsoft gets more ‘facts’ for anti-Linux campaign’. Available at: http://www.infoworld.com/article/2668821/operating-systems/microsoft-gets-more--facts--for-anti-linux-campaign.html (Accessed 10 June 2016) ‘Fact Finding: Things Microsoft Doesn’t Want You To Know’ (2005). Available at: https://support.novell.com/techcenter/articles/nc2005_02c.html (Accessed: 10 June 2016)
139
Finley, K., (2013) ‘French National Police Switch 37,000 Desktop PCs to Linux’. Available at: http://www.wired.com/2013/09/gendarmerie_linux/ (Accessed 11 June 2016) Foley, M. J., (2007) ‘Microsoft kills its ‘Get the Facts’ anti-Linux site’. Available at: http://www.zdnet.com/article/microsoft-kills-its-get-the-facts-anti-linux-site/ (Accessed 10 June 2016) Gibbs, S., (2014) ‘From Windows 1 to Windows 10:29 years of Windows evolution’. Available at: https://www.theguardian.com/technology/2014/oct/02/from-windows-1-to-windows-10-29-years-of-windows-evolution (Accessed 7 August 2016) Giera, J. and Brown, A., (2004). ‘The Costs and Risks of Open Source’. Forrester Research. ‘The GNU General Public License v3.0 – GNU Project – Free Software Foundation’, (2007). Available at: http://www.gnu.org/licenses/gpl-3.0.en.html (Accessed: 16 March 2016) Goth, G., (2005). ‘Open source business models: ready for prime time’. Software, IEEE, 22(6), pp.98-100. ‘Groklaw – SCO v. IBM Timeline’ (no date). Available at: http://www.groklaw.net/staticpages/index.php?page=20031016162215566 (Accessed: 10 June 2016) Guerrini, F., (2014) ‘City of Turin decides to ditch Windows XP for Ubuntu and €6m saving’. Available at: http://www.zdnet.com/article/city-of-turin-decides-to-ditch-windows-xp-for-ubuntu-and-eur6m-saving/ (Accessed 11 June 2016) Hachman, M., (2011) ‘Linux 3.0 Released; Linus Torvalds Explains Why You Shouldn’t Care’. Available at: http://www.pcmag.com/article2/0,2817,2388926,00.asp (Accessed 11 June 2016) Har-Even, B., (2009) ‘Head to Head: Windows 7 vs Windows Vista’. Available at: http://www.itpro.co.uk/617176/head-to-head-windows-7-vs-windows-vista (Accessed 10 June 2016) Harji, A.S., Buhr, P.A. and Brecht, T., (2011). ‘Our troubles with Linux and why you should care’. Proceedings of the Second Asia-Pacific Workshop on Systems (p. 2). ACM. Harji, A.S., Buhr, P.A. and Brecht, T., (2013). ‘Our troubles with Linux Kernel upgrades and why you should care’. 
ACM SIGOPS Operating Systems Review, 47(2), pp.66-72. Hars, A. and Ou, S., (2001) ‘Working for free? Motivations of participating in open source projects’. System Sciences, Proceedings of the 34th Annual Hawaii International Conference on (pp. 9-pp). IEEE. Hauben, R., (1994). ‘Unix and Computer Science’. Linux Journal, 1994(4es), p.7.
Hayward, D., (2012) ‘The History of Linux: how time has shaped the penguin’. Available at: http://www.techradar.com/news/software/operating-systems/the-history-of-linux-how-time-has-shaped-the-penguin-1113914/2 (Accessed 4 June 2016)
Hilley, S., (2002). ‘Linux – to be or not be secure’. E-Commerce, Internet and Telecommunications Security, Elsevier. June 2002, pp.1-2.
Hiner, J., (2008) ‘The top five reasons why Windows Vista failed’. Available at: http://www.zdnet.com/article/the-top-five-reasons-why-windows-vista-failed/ (Accessed 10 June 2016)
Houghton, S., (2014) ‘Microsoft, just admit it: Windows users don’t want Windows 8’. Available at: http://www.techradar.com/news/software/operating-systems/microsoft-just-admit-it-windows-users-don-t-want-windows-8-1218497 (Accessed 10 June 2016)
Hughes, P., (1997). ‘Stop the Presses: The Linux Trademark’. Linux Journal, (40es), p.16.
Husserl, E., (1970). ‘Logical Investigations’. Routledge, New York City, NY, United States.
‘IBM100 – Linux – The Era of Open Innovation’ (no date). Available at: http://www-03.ibm.com/ibm/history/ibm100/us/en/icons/linux/ (Accessed: 4 June 2016)
‘IDC Reports Notable Growth in Shipments of Client Operating Systems in 1998’ (1999). Available at: http://www.prnewswire.com/news-releases/idc-reports-notable-growth-in-shipments-of-client-operating-environments-in-1998-73438012.html (Accessed: 10 June 2016)
‘IEEE SA – POSIX – Austin Joint Working Group’ (2016). Available at: http://standards.ieee.org/develop/wg/POSIX.html (Accessed: 4 June 2016)
‘Industry Leaders Announce Open Platform for Mobile Devices’ (2007). Available at: http://www.openhandsetalliance.com/press_110507.html (Accessed: 11 June 2016)
‘Interview with Linus Torvalds from Linux Format 163’ (2012). Available at: http://www.tuxradar.com/content/interview-linus-torvalds-linux-format-163 (Accessed: 11 June 2016)
Johnson, S.C. and Ritchie, D.M., (1978). ‘UNIX time-sharing system: Portability of C programs and the UNIX system’. Bell System Technical Journal, 57(6), pp.2021-2048.
Josey, A., (2015) ‘POSIX® 1003.1 Frequently Asked Questions (FAQ Version 1.15)’. Available at: http://www.opengroup.org/austin/papers/posix_faq.html (Accessed 4 June 2016)
Kanaracus, C. and Jackson, J., (2010) ‘Microsoft-led group to pay $450m for 882 Novell patents’. Available at: http://www.computerworld.com/article/2514243/open-source-tools/microsoft-led-group-to-pay--450m-for-882-novell-patents.html (Accessed 11 June 2016)
‘KDE – KDE Project Announced’ (1996). Available at: https://www.kde.org/announcements/announcement.php (Accessed: 10 June 2016)
Kernighan, B.W. and Ritchie, D. M., (1978). ‘The C Programming Language’. Prentice-Hall, Englewood Cliffs, NJ, United States.
Kernighan, B.W. and Ritchie, D. M., (1988). ‘The C Programming Language (Second Edition)’. Prentice-Hall, Englewood Cliffs, NJ, United States.
‘Key Open-Source Projects’ (1999). PC Magazine, March 23 1999, p.172.
Kirby, S., (2000). ‘Free to choose: the real power of Linux’. Library Hi Tech, 18(1), pp.85-88.
Kshetri, N., (2004). ‘Economics of Linux adoption in developing countries’. Software, IEEE, 21(1), pp.74-81.
Kshetri, N., (2007). ‘Increasing Returns and the Diffusion of Linux in China’. IT Professional, IEEE, 9(6), pp.24-28.
Lalani, A. F., (2016) ‘The iPhone – why is it so successful?’ University of Middlesex, Dubai Campus, UAE. (Research Paper)
Le Mignot, G., (2005) ‘The GNU Hurd’. Extended Abstract of Talk at Libre Software Meeting, Dijon, France.
Leibovitch, E., (1999). ‘The Business Case for Linux’. IEEE Software, 16(1), p.40.
Leonhard, W., (2012) ‘Windows 8 review: Yes, it’s that bad’. Available at: http://www.infoworld.com/article/2618073/microsoft-windows/windows-8-review--yes--it-s-that-bad.html (Accessed 10 June 2016)
Lewis, T., (1999). ‘The open source acid test’. Computer, 32(2), pp.125-128.
‘The Linux Foundation’ (no date). Available at: http://www.linuxfoundation.org/ (Accessed: 10 March 2016)
‘linux-historic-scripts’ (1992). Available at: https://github.com/kernelslacker/linux-historic-scripts/blob/master/changelogs/0.12.txt (Accessed: 4 June 2016)
‘LINUX is obsolete’ (1992). Available at: https://groups.google.com/forum/#!topic/comp.os.minix/wlhw16QWltI%5B1-25%5D (Accessed: 4 June 2016)
Loscocco, P. and Smalley, S., (2001). ‘Integrating flexible support for security policies into the Linux operating system’. Proceedings of the FREENIX track: USENIX Annual Technical Conference.
Love, R., (2005). ‘Linux Kernel Development (Second Edition)’. Pearson Education, Upper Saddle River, NJ, USA.
Macedonia, M., (2001). ‘Will Linux be computer games' dark horse OS?’ Computer, 34(12), pp.161-162.
Maddox, J.R. and Putnam, K., (1999). ‘Linux for accountants’. CPA Journal, 69, pp.26-31.
Maier, F., (2011) ‘LiMux Desktop Retrospective – Five Years of governmental Linux desktops in München’. Available at: https://desktopsummit.org/sites/www.desktopsummit.org/files/DS2011_LiMux_Desktop_Retrospective_2011-08-08.pdf (Accessed 11 June 2016)
Malone, T.W. and Laubacher, R.J., (1999) ‘The dawn of the e-lance economy’. In Electronic Business Engineering (pp. 13-24). Physica-Verlag HD.
Massey, B., (2005). ‘Opening the mainstream: O'Reilly Oscon 2005’. Software, IEEE, 22(6), pp.101-102.
‘Matt Welsh promoted to full professor; granted tenure’ (2010). Available at: http://www.seas.harvard.edu/news/2010/06/matt-welsh-promoted-full-professor-granted-tenure (Accessed: 5 June 2016)
McLaren, S., (2000). ‘Linux: a viable alternative or desert mirage?’ Library Hi Tech, 18(1), pp.82-84.
‘mdw.la’ (no date). Available at: http://www.mdw.la/ (Accessed: 5 June 2016)
Miller, M., (2004) ‘Interview: Michael Dell – Where Do We Go From Here?’ Available at: http://www.pcmag.com/article2/0,2817,1501476,00.asp (Accessed 18 July 2016)
Moritsugu, S., (2000) ‘Practical Unix’. Que Publishing, Seattle, WA, United States.
Mowery, D.C. and Simcoe, T., (2002). ‘Is the Internet a US invention?—an economic and technological history of computer networking’. Research Policy, 31(8), pp.1369-1387.
Mull, A.J. and Maginnis, P.T., (1991). ‘Evolutionary steps toward a distributed operating system: theory and implementation’. ACM SIGOPS Operating Systems Review, 25(4), pp.4-13.
Munga, N., Fogwill, T. and Williams, Q., (2009). ‘The adoption of open source software in business models: a Red Hat and IBM case study’. Proceedings of the 2009 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists (pp. 112-121). ACM.
Myers, M.D. and Avison, D., (2002). ‘Qualitative research in information systems: a reader’. Sage, New York City, NY, United States.
O’Mahony, S., (2003). ‘Guarding the commons: how community managed software projects protect their work’. Research Policy, 32(7), pp.1179-1198.
‘Operating system market share’ (2016). Available at: https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=8&qpcustomd=1 (Accessed: 11 June 2016)
Organick, E. I., (1975) ‘The Multics System: An Examination of Its Structure’. MIT Press, Cambridge, MA, USA.
Orlowski, A., (2002) ‘The Register: Microsoft 'killed Dell Linux' - States’. Available at: http://www.linuxtoday.com/infrastructure/2002031901026NWMSLL (Accessed 10 June 2016)
Ortega, G., (1999). ‘Linux for the international space station program’. Linux Journal, 1999(59es), p.8.
‘Our history in depth – Google company’ (no date). Available at: http://www.google.com/intl/en/about/company/history/ (Accessed: 10 June 2016)
Paterson, N., (2013) ‘Is Microsoft’s Windows 8 As Bad As Critics Claim?’ Available at: http://news.sky.com/story/1087833/is-microsofts-windows-8-as-bad-as-critics-claim (Accessed 10 June 2016)
Paul, R., (2007) ‘A visual timeline of the Microsoft-Novell controversy’. Available at: http://arstechnica.com/information-technology/2007/01/linux-20070128/ (Accessed 11 June 2016)
Paul, R., (2008) ‘Ars Technica reviews KDE 4.0’. Available at: http://arstechnica.com/information-technology/2008/01/kde-40-review/ (Accessed 10 June 2016)
Paul, R., (2012) ‘Linux kernel in 2011: 15 million total lines of code and Microsoft is a top contributor’. Available at: http://arstechnica.com/business/2012/04/linux-kernel-in-2011-15-million-total-lines-of-code-and-microsoft-is-a-top-contributor/ (Accessed 11 June 2016)
Peng, X., Babar, M.A. and Ebert, C., (2014) ‘Collaborative software development platforms for crowdsourcing’. IEEE Software, (2), pp.30-36.
Pennington, H., (1999). ‘GTK+/Gnome Application Development’. New Riders, Indianapolis, IN, United States.
Poole, H., (2005). ‘The Internet: Biographies’. ABC-Clio, Santa Barbara, CA, United States.
‘Projects’ (no date). Available at: http://developer.linuxmint.com/projects.html (Accessed: 10 June 2016)
Raymond, E.S., (1999). ‘A Brief History of Hackerdom’. Open Sources, O’Reilly Media, Farnham, Surrey, United Kingdom.
Reilly, C., (2016) ‘Facebook wants to be everywhere, except Blackberry’. Available at: http://www.cnet.com/news/facebook-whatsapp-messenger-pulls-app-support-blackberry-bb10/ (Accessed 7 August 2016)
Richardson, M., (1997). ‘Stop the Presses: Ownership of Linux Trademark Resolved’. Linux Journal, (43es), p.26.
Ritchie, D.M., (1984) ‘Reflections on software research’. Communications of the ACM, 27(8), pp.758-760.
Ritchie, D.M., (1996) ‘The development of the C programming language’. History of Programming Languages---II (pp. 671-698). ACM.
Ritchie, D.M. and Thompson, K., (1974). ‘The UNIX Time-Sharing System’. Communications of the ACM, 17(7), pp.365-375.
Rumelt, R., (2011). ‘Good Strategy/Bad Strategy: The Difference and Why It Matters’. Profile Books, London, United Kingdom.
Safee, S.B. and Viknesh, T.J., (no date). ‘Benchmarking Of Various File Systems And Evaluating Their Performances’. University of New Mexico (Research Paper).
Salah, K., Calero, J.M.A., Bernabé, J.B., Perez, J.M.M. and Zeadally, S., (2013). ‘Analyzing the security of Windows 7 and Linux for cloud computing’. Computers & Security, 34, pp.113-122.
Salus, P., (1994). ‘A Quarter Century of UNIX’. Addison-Wesley Publishing Company, Reading, MA, United States.
Sanders, J., (1998). ‘Linux, open source, and software's future’. Software, IEEE, 15(5), pp.88-91.
Schwarz, M. and Takhteyev, Y., (2011). ‘Half a Century of Public Software Institutions’. Journal of Public Economic Theory, 12(4), pp.609-639.
Severance, C., (2014). ‘Andrew S. Tanenbaum: The impact of MINIX’. IEEE Computer, 47(7), pp.7-8.
Shakespeare, W., (2011). ‘The Tempest’. Simon and Schuster, New York City, USA.
Shankland, S., (2002) ‘Linux shipments up 212 percent’. Available at: http://www.cnet.com/uk/news/linux-shipments-up-212-percent/ (Accessed 10 June 2016)
Shapiro, J.S., (2004) ‘Extracting the lessons of Multics’. ;login: The USENIX Magazine.
Shenkar, O., (2010) ‘Copycats: How Smart Companies Use Imitation to Gain a Strategic Edge’. Harvard Business Press, Boston, MA, United States.
Smart, C., (2010) ‘The Three Giants of Linux’. Available at: http://www.linux-mag.com/id/7721/ (Accessed 4 June 2016)
Spence, E., (2013) ‘Blackberry Has An App Problem With the Z10 and BB10’. Available at: http://www.forbes.com/sites/ewanspence/2013/03/20/blackberry-has-an-app-problem-with-the-z10-and-bb10/#228c16895c68 (Accessed 7 August 2016)
Stallman, R., (1998). ‘The GNU project’. Available at: http://org.noemalab.eu/sections/ideas/ideas_articles/pdf/stallman_eng.pdf (Accessed: 17 March 2016)
Stallman, R., (2002) ‘My Lisp Experiences and the Development of GNU Emacs’. Available at: https://www.gnu.org/gnu/rms-lisp.html (Accessed 4 June 2016)
Stallman, R., (2011) ‘Transcript of Richard M. Stallman's speech - Free Software: Freedom and Cooperation’. Available at: https://www.gnu.org/events/rms-nyu-2001-transcript.txt (Accessed 4 June 2016)
Stanfield, V. and Smith, R. W., (2006) ‘Linux System Administration’. John Wiley & Sons, Hoboken, NJ, USA.
‘Stanford BackRub’ (1997). Available at: https://web.archive.org/web/19971210065425/backrub.stanford.edu/backrub.htm (Accessed: 10 June 2016)
Stange, S., (2015). ‘Detecting malware across operating systems’. Network Security, 2015(6), pp.11-14.
Sterling, T.L., (2002). ‘Beowulf cluster computing with Linux’. MIT Press, Cambridge, MA, United States.
Tanenbaum, A.S., (1987) ‘A UNIX clone with source code for operating systems courses’. ACM SIGOPS Operating Systems Review, 21(1), pp.20-29.
Then, E., (2009) ‘Xfce creator talks Linux, Moblin, netbooks and open-source’. Available at: http://www.slashgear.com/xfce-creator-talks-linux-moblin-netbooks-and-open-source-0633329/ (Accessed 10 June 2016)
Thiruvathukal, G.K., (2004). ‘Gentoo Linux: the Next Generation of Linux’. Computing in Science and Engineering, 6(5), pp.66-74.
Thurrott, P., (2002) ‘Linux Market Shrinks in 2001’. Available at: http://windowsitpro.com/systems-management/linux-market-shrinks-2001 (Accessed 10 June 2016)
Thurrott, P., (2016) ‘Windows Phone is Irrelevant Today, But It Still Has a Future’. Available at: https://www.thurrott.com/mobile/windows-phone/66491/windows-phone-irrelevant-today-still-future (Accessed 7 August 2016)
Toomey, W., (2011). ‘The strange birth and long life of Unix’. Spectrum, IEEE, 48(12), pp.34-55.
Torvalds, L. B., (1993) ‘The Choice of a GNU Generation: An Interview With Linus Torvalds’. Available at: http://gondwanaland.com/meta/history/interview.html (Accessed 4 June 2016)
Torvalds, L. B., (2014) ‘The merge window being over, and things being calm made me think I should try upgrading to F21’. Available at: https://plus.google.com/+LinusTorvalds/posts/Wh3qTjMMbLC (Accessed 17 July 2016)
Tsegaye, M. and Foss, R., (2004). ‘A comparison of the Linux and Windows device driver architectures’. ACM SIGOPS Operating Systems Review, 38(2), pp.8-33.
‘Unity’ (no date). Available at: http://unity.ubuntu.com/projects/unity (Accessed: 10 June 2016)
‘Usage statistics and market share of Unix for websites’ (2016). Available at: http://w3techs.com/technologies/details/os-unix/all/all (Accessed: 14 March 2016)
Vaughan-Nichols, S. J., (2011) ‘Linus Torvalds would like to see a GNOME fork’. Available at: http://www.zdnet.com/article/linus-torvalds-would-like-to-see-a-gnome-fork/ (Accessed 10 June 2016)
Vaughan-Nichols, S. J., (2011) ‘While you shouldn’t expect Windows to be open-sourced in your life-time, Microsoft – yes, Microsoft – is the fifth largest code contributor to Linux 3.0’. Available at: http://www.zdnet.com/article/top-five-linux-contributor-microsoft/ (Accessed 11 June 2016)
Vaughan-Nichols, S. J., (2016) ‘How to get started with Ubuntu and Bash on Windows 10’. Available at: http://www.zdnet.com/article/ubuntu-and-bash-arrive-on-windows-10/ (Accessed 7 August 2016)
Ven, K., Verelst, J. and Mannaert, H., (2008). ‘Should you adopt open source software?’ Software, IEEE, 25(3), pp.54-59.
Vincent, J., (2015) ‘Android is now used by 1.4 billion people’. Available at: http://www.theverge.com/2015/9/29/9409071/google-android-stats-users-downloads-sales (Accessed 17 March 2016)
Wallen, J., (2016) ‘Ubuntu convergence finally impresses me’. Available at: http://www.techrepublic.com/article/ubuntu-convergence-finally-impresses-me/ (Accessed 7 August 2016)
Warren, T., (2015) ‘Windows Phone has a new app problem’. Available at: http://www.theverge.com/2015/10/23/9602350/microsoft-windows-phone-app-removal-windows-store (Accessed 7 August 2016)
Waugh, R., (2015) ‘Flight chaos as airport admits its air traffic control PCs still run Windows 3.1’. Available at: http://metro.co.uk/2015/11/16/flight-chaos-as-airport-admits-its-air-traffic-control-pcs-still-run-windows-3-1-5505950/ (Accessed 7 August 2016)
Welsh, M., (1995) ‘Linux Installation And Getting Started (2nd edition)’. Specialized Systems Consultants, USA.
Welsh, M., (2003) ‘Running Linux’. O’Reilly Media Inc, Sebastopol, CA, USA.
West, J. and Dedrick, J., (2001). ‘Open Source Standardization: The Rise of Linux in the Network Era’. Knowledge, Technology & Policy, 14(2), pp.88-112.
West, J. and Dedrick, J., (2001). ‘Proprietary vs. Open Standards in the Network Era: An Examination of the Linux Phenomenon’. System Sciences, 2001. Proceedings of the 34th Annual Hawaii International Conference on (pp. 10-pp). IEEE.
West, J. and Mace, M., (2010) ‘Browsing as the killer app: Explaining the rapid success of Apple’s iPhone’. Telecommunications Policy, 34(5), pp.270-286.
Whittaker, Z., (2015) ‘A 23-year-old Windows 3.1 system failure crashed Paris airport’. Available at: http://www.zdnet.com/article/a-23-year-old-windows-3-1-system-failure-crashed-paris-airport/ (Accessed 7 August 2016)
Wiegand, J., (1993). ‘The cooperative development of Linux’. Professional Communication Conference, 1993. IPCC 93 Proceedings. ‘The New Face of Technical Communication: People, Processes, Products’ (pp. 386-390). IEEE.
Wilkes, M.V., (1992). ‘The long-term future of operating systems’. Communications of the ACM, 35(11), pp.23ff.
Wu, J. and Holt, R.C., (2004). ‘Linker-based program extraction and its uses in studying software evolution’. Proceedings of the International Workshop on Foundations of Unanticipated Software Evolution (pp. 1-15).
‘Xfce Desktop Environment’ (2016). Available at: http://www.xfce.org/ (Accessed: 10 June 2016)
Young, B., (1999). ‘Counterpoint: Why Linux Is Important to You’. IEEE Software, (1), pp.37-39.
Young, R., (1994) ‘Interview with Linus, the Author of Linux’. Available at: http://www.linuxjournal.com/article/2736 (Accessed 8 June 2016)
Zeichick, A., (2008). ‘How Facebook works’. Technology Review, Jul ‘08.