
INFOLINE

EDITORIAL BOARD

EXECUTIVE COMMITTEE

Chief Patron : Thiru A.K.Ilango B.Com., M.B.A., L.L.B.,

Correspondent

Patron : Dr. N.Raman M.Com., M.B.A., M.Phil., Ph.D.,

Principal

Editor in Chief : Mr. S.Muruganantham M.Sc., M.Phil.,

Head of the Department

STAFF ADVISOR

Ms. P.Kalarani M.Sc., M.C.A., M.Phil.,

Assistant Professor, Department of Computer Technology and Information Technology

STAFF EDITOR

Mr. K.Dhiyaneshwaran M.C.A., M.Phil.,

Assistant Professor, Department of Computer Technology and Information Technology

STUDENT EDITORS

B.Akilesh III B.Sc. (Computer Technology)

V.Mohan dass III B.Sc. (Computer Technology)

R.K.Kiruthika Shivani III B.Sc. (Information Technology)

S.Arunkumar III B.Sc. (Information Technology)

B.Mano Pretha II B.Sc. (Computer Technology)

A.Uthaya Sriram II B.Sc. (Computer Technology)

P.Deepika Rani II B.Sc. (Information Technology)

R.Pradeep Rajan II B.Sc. (Information Technology)

D.Harini I B.Sc. (Computer Technology)

V.A.Jayendiran I B.Sc. (Computer Technology)

S.Karunya I B.Sc. (Information Technology)

S.Ranjith Kumar I B.Sc. (Information Technology)


CONTENT

Virtual Private Networks
Workforce Management Strategy for 2017
Smartphone Trends
Angry Birds Maker is Hatching a Game Service on Android
Facebook's Latest Experiment
Defer Updates in Windows 10
The Best Wi-Fi Router for a Home Office
Microsoft Puts Quantum Computing Higher on its Hardware Priority List
More than Half the World's People Still off the Internet
New Supercomputer with x86, Power9 and ARM Chips
Server-Based Open Networking
Top Reasons for Network Downtime


VIRTUAL PRIVATE NETWORKS

A virtual private network is a secure tunnel between two or more computers on the internet, allowing them to access each other as if on a local network. In the past, VPNs were mainly used by companies to securely link remote branches together or connect roaming employees to the office network, but today they're an important service for consumers too, protecting them from attacks when they connect to public wireless networks.

VPNs are good for your privacy and security

Open wireless networks pose a serious risk to users, because attackers sitting on the same networks can use various techniques to sniff web traffic and even hijack accounts on websites that don't use the HTTPS security protocol. In addition, some Wi-Fi network operators intentionally inject ads into web traffic, and these could lead to unwanted tracking. In some regions of the world, governments track users who visit certain websites in order to discover their political affiliations and identify dissidents - practices that threaten free speech and human rights.

By using a VPN connection, all of your traffic can be securely routed through a server located somewhere else in the world. This protects your computer from local tracking and hacking attempts and even hides your real Internet Protocol address from the websites and services you access.

Not all VPNs are created equal

There are different VPN technologies with varied encryption strengths. For example, the Point-to-Point Tunnelling Protocol (PPTP) is fast, but much less secure than other protocols such as IPsec or OpenVPN, which uses SSL/TLS (Secure Sockets Layer/Transport Layer Security). Furthermore, with TLS-based VPNs the type of encryption algorithm and key length used is also important.

While OpenVPN supports many combinations of ciphers, key exchange protocols and hashing algorithms, the most common implementation offered by VPN service providers for OpenVPN connections is AES encryption with RSA key exchange and SHA signatures. The recommended settings are AES-256 encryption with an RSA key that's at least 2048 bits long and the SHA-2 (SHA-256) cryptographic hash function, instead of SHA-1. It's worth noting that VPNs introduce overhead, so the stronger the encryption is, the bigger the impact will be on the connection speed. The choice of VPN technology and encryption strength should be made on a case-by-case basis, depending on what kind of data will be passed through it.
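For readers who want to check their own client profiles against these recommendations, the short sketch below (a minimal illustration, not a complete or provider-specific configuration) writes out matching client-side OpenVPN directives; the file name and the exact selection of directives are assumptions made for the example.

# Hedged sketch: render the client-side OpenVPN directives that reflect the
# recommendations above - AES-256 for the data channel and SHA-256 instead of
# SHA-1. The RSA key length is fixed when the server certificates are
# generated, so it appears here only as a comment.
RECOMMENDED_DIRECTIVES = """\
cipher AES-256-CBC      # AES-256 encryption for the data channel
auth SHA256             # SHA-2 (SHA-256) HMAC authentication instead of SHA-1
tls-version-min 1.2     # refuse outdated TLS versions
# server side: generate certificates with an RSA key of at least 2048 bits
"""

def write_snippet(path="client-security.ovpn"):
    """Write the directives to a file that can be appended to a client profile."""
    with open(path, "w") as fh:
        fh.write(RECOMMENDED_DIRECTIVES)
    print(f"Wrote {path}")

if __name__ == "__main__":
    write_snippet()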

The security needs of corporations are different than those of most consumers, who typically only need to protect themselves against opportunistic traffic snooping attacks, unless they're concerned about mass surveillance by the U.S. National Security Agency and similar intelligence agencies, in which case very strong encryption is needed.

VPNs can bypass geo-blocking and firewalls

Consumers also use VPNs to access online content that's not available in their region, although this depends on how well the content owners enforce restrictions. VPN service providers usually run servers in many countries around the world and allow users to easily switch between them. For example, users might connect through a U.K.-based server to access restricted BBC content or through a U.S.-based server to access Netflix content that's not available in their region. Users in countries like China or Turkey, where the governments regularly block access to certain websites for political reasons, commonly use VPNs to bypass those restrictions.

Free vs. paid

While many companies set up their own VPNs using special network appliances, consumers have a wide selection of commercial and free VPN services to choose from. Free VPN offerings usually display ads, have a more limited selection of servers, and the connection speeds are slower because those servers are overcrowded. However, for the occasional user this just might be enough. Another downside of free VPN servers, though, is that it's more likely that the IP addresses they use will be blocked or filtered on various websites: free VPN services are commonly abused by hackers, spammers and other ill-intentioned users.

Commercial VPN services work on a subscription-based model and differentiate themselves by an absence of download speed throttling or data limits. Some of them also pride themselves on not keeping any logs that could be used to identify users. A few antivirus vendors also offer VPN services, and these could serve as a middle ground between free and the more expensive commercial solutions, as users could get better deals if they also have antivirus licenses from those vendors. Also, these VPN solutions already have reasonably secure settings, so users don't have to worry about configuring them themselves.


Build your own VPN

There's the option to run your own VPN server at home so you can tunnel back and access services and devices on your home network from anywhere. This is a much better option than exposing those services directly to the internet, which is how hundreds of thousands of internet-of-things devices have recently been compromised and used to launch distributed denial-of-service attacks.

The general rule is that the fewer ports are opened in your router, the better. You should disable UPnP (Universal Plug and Play) so that your poorly designed IP camera, for example, doesn't punch a hole through your firewall and become available to the whole world. Some consumer routers have built-in VPN server functionality these days, so you don't even have to set up a separate dedicated VPN server inside your network. If your router doesn't have this sort of feature, though, a cheap minicomputer like the Raspberry Pi can do the job just fine.
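As a rough illustration of the "fewer open ports" advice, the sketch below probes a handful of common service ports on a public address from outside the network (for example, from a phone hotspot or a rented server); the address and the port list are placeholders, so substitute your own values.

# Hedged sketch: check whether common service ports on a router's public
# address accept TCP connections. PUBLIC_IP below is a documentation-only
# placeholder address, not a real host.
import socket

PUBLIC_IP = "203.0.113.10"                  # placeholder; replace with your own public IP
PORTS_TO_CHECK = [22, 80, 443, 8080, 554]   # SSH, HTTP(S), alt-HTTP, RTSP (IP cameras)

def is_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in PORTS_TO_CHECK:
    state = "OPEN" if is_open(PUBLIC_IP, port) else "closed/filtered"
    print(f"{PUBLIC_IP}:{port} -> {state}")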

J.S.RAJAMOORTHY,

III-B.Sc. (Computer Technology)

WORKFORCE MANAGEMENT STRATEGY FOR 2017

In 2016, organizations realized the importance of organizational efficiency, employee well-being and workplace wellness, as well as engagement, flexibility, career growth and planning. We asked three experts - a chief product officer, a senior technical recruiter, and a partner at a workforce management consulting firm - to share their thoughts on what lies ahead in workforce management.

1. Technology in the driver's seat

Technology will continue to impact workforce management and HR in incredible ways, says Karen Williams, chief product officer for workforce management solutions company Halogen Software. "Especially around the area of data analytics, technology is helping to drive the conversation around employee sentiment, happiness, engagement and organizational performance. It's all about making sure CIOs and other C-level executives understand how to leverage data to be more effective," Williams says.

This is important because human resources, as a general rule, has been slow to adopt technology that improves its capability to find, screen and hire talent, she says. That has been changing in recent years, and 2017 will see the further adoption of new tech, Williams says. "Some of it stems from HR not being willing to move out of its comfort zone, some of it is because of organizations not being willing to invest in new technology for their HR and recruiting departments - but now, as talent is recognized as critical, technology is seen as a way to enable things like better and faster hiring, retention, and once people are on board, performance management," she says.

2. Focus on team intelligence

Until recently, individual performance and growth have been the focus for gauging talent within organizations, says Jeanne Meister, founding partner of Future Workplace, an HR and recruiting consultancy and research firm. But now, many companies are realizing that teams are the heart of increased performance, efficiency and effectiveness; that's driving many mergers and acquisitions as larger companies poach whole teams from competitors, according to Meister.

"We've now realized that it takes high-performing teams to produce the kinds of results organizations want. So, future-focused companies will look at what makes a great team: how they communicate, how to reward and recognize them, how to push intact teams through growth and development," Meister says. This is a pretty major mindset shift for many organizations, so expect the emphasis on teams to continue through 2017 and beyond, she says.

3. User experience in the workplace

User experience has become an important metric for judging products, but look for user experience to become a major part of how companies are gauging their workplaces too, says Meister.

"Chief Marketing Officers were once the only ones concerned with the experience of users. Now, though, heads of HR are leveraging marketing tools and approaches like design thinking and sentiment analysis to create a compelling employee experience," Meister says. That includes new positions like chief employee experience officer, a role that encompasses areas as diverse as real estate, technology and marketing to make sure that employees are as engaged, motivated and productive on the job as possible. Part of the emphasis on user experience includes using technology tools like mobile and video, both for hiring and screening of candidates and for enabling remote work and flexibility.


4. The gig economy heats up

The gig economy continues to play a significant role in the workforce, especially in IT, says Mondo's Avalos. It's a great way for companies to scale their workforce based on demand, but also for workers who want to quickly add new skills to their resume by taking on short-term projects. But some companies also are developing their own internal pool of contingent labour, which is a new twist on the trend, says Future Workplace's Meister. For example, PricewaterhouseCoopers' Talent Exchange allows freelancers and independent professionals to sign up for available projects with the firm, and it benefits both sides. For the company, there might not be enough ROI to hire a full-time employee, and maybe for independent contractors, they want the flexibility and freedom to be able to work for themselves, Meister says.

J.GOWTHAM,

III-B.Sc. (Computer Technology)

SMARTPHONE TRENDS

Smartphone buyers have a lot to look forward to in 2017. Devices will be thinner, faster, and perhaps a bit more intelligent than you'd like. Virtual reality will spread to budget smartphones, and they will also have better graphics, higher resolution screens, and more storage. More than ever, you'll be using your smartphone to pay for products and log into websites. Deep learning could help smartphones get a fix on user behaviour and improve the mobile experience. Expect to see a renaissance in smartphone designs, and wireless audio could replace headphone jacks in more handsets. USB-C will replace older connectors and charging cables.

Smartphone trends to watch out for in 2017

New designs: The rumour mills are filled with new smartphone designs. The prominent rumours include Apple giving a facelift to its decade-old iPhone design and Samsung coming out with a folding smartphone. It's not certain these will happen, but like every year, expect something new. This year, the hot trend was customizable smartphones like the Moto Z and LG G5, which were partly inspired by Google's now-defunct Project Ara. Some innovations were unveiled this year, including Lenovo's CPlus, a prototype folding smartphone that can be worn like a watch. LG and Samsung have also talked about smartphones with folding displays.


Faster chips: Graphics will be smoother, and applications will run much faster on next year's smartphones. Qualcomm has already announced the Snapdragon 835, which could be installed in some premium Android smartphones from top mobile companies. Some may opt for MediaTek's Helio X30, which has 10 CPU cores, the highest number among mobile processors.

Virtual reality: The point of speeding up mobile devices is to allow them to run applications like virtual reality, which demand heavy resources. It'll be possible to plug handsets into Google's Daydream View VR headset to watch movies, play games, or roam VR worlds. VR is now limited to a few handsets like Samsung's Galaxy S7, but it'll come to more high-end and mid-range phones next year. The VR smartphones will need to have high-resolution displays to deliver a stunning visual experience.

Faster LTE: LTE speeds will get a serious boost with new modem technologies. Smartphones like the Galaxy S7 and Apple's iPhone 7 can download data over LTE networks at a maximum speed of 600Mbps (megabits per second), and upload data at 150Mbps. Download speeds could reach close to 1Gbps with Qualcomm's new Snapdragon X16 modem, which should reach devices in the second half of 2016. Achieving that speed also depends on the network capabilities of a carrier.

USB-C: This is the year USB-C will replace the aging micro-USB 2.0 ports in Android handsets. USB-C is extremely versatile -- beyond charging, it can be used to connect mobile devices to high-definition monitors, headphones, flash drives, and external storage devices.

Wireless audio: There's a good chance a majority of smartphones will still have headphone jacks, but like Apple, some may muster up the "courage" to remove it. Those handsets will switch to Bluetooth earphones. That means the extra headache of buying and recharging wireless headsets, but getting rid of the headphone jack could result in thinner and lighter handsets. Some LeEco and Motorola smartphones already have moved forward with wireless audio.

Quicker charging: Smartphones will charge much faster with USB-C cables, which can carry more power to a battery. There's also technology like Qualcomm's Quick Charge 4, which will help smartphones run for five hours after just five minutes of charging.

Device smarts: Lenovo's Phab 2 Pro augmented reality smartphone can recognize objects, map out rooms, and present relevant information about objects in sight on a handset's screen. That's a good example of how smartphones will evolve to enrich the user experience.

Deep-learning techniques in smartphones could also contribute to making smartphones friendlier. For example, a device could learn how hardware is being used by a specific application and, over time, better modulate power usage to improve battery life. Smartphones can already perform image and speech recognition via online services, but deep-learning enhancements could bring those capabilities offline.

Bluetooth 5: Devices could soon get the new Bluetooth 5 wireless specification, which will have two times the speed and four times the range of its predecessor, Bluetooth 4.2. A Bluetooth 5 connection could stretch up to 400 meters in a clear line of sight, but with obstructions, a 120-meter range is considered more realistic by analysts. You will be able to use a mobile device to operate a wireless Bluetooth speaker or unlock a car from a longer distance.

Storage: Extra storage on a smartphone never hurts, be it to store videos, photos, or games. Currently, internal storage tops out at 256GB and SD storage at 512GB.

M.KAVIN,

III-B.Sc. (Information Technology)

ANGRY BIRDS MAKER IS HATCHING A GAME SERVICE ON ANDROID

Rovio's Angry Birds heyday may long be over, but it's not out of the game just yet. Starting next year, spin-off company Hatch will launch a new subscription streaming service on Android that looks to change the way we play games on our phones. Instead of downloading what you want to play, users will select from a variety of games streaming inside the Hatch app. About 100 titles are promised at launch - including Badland, Cut the Rope 2, Leo's Fortune, and Monument Valley, as well as some Hatch originals - and there will never be any need to update or unlock via in-app purchases. If you're worried about the performance on your Galaxy S7, the company promises "highly-advanced cloud-based server technology" will keep games running smoothly as you move through levels.

The Hatch app is designed to be a true social experience. Since everything is streamed, players can join at any time, and any single-player game can be turned into a multi-player one, where gamers can collaborate and compete, as well as broadcast their sessions. The service will be available in two tiers: free with ads or as a paid subscription with additional benefits. As far as how developers will get paid, Hatch founder and CEO Juhani Hokala simply says, "Leave the monetization to us."

M.VETRIVEL,

III-B.Sc. (Computer Technology)

FACEBOOK'S LATEST EXPERIMENT

Facebook says it's not a media company, but it just might be turning into a Wi-Fi finder service. Users of the social network's iOS app report seeing a new feature in the More section that lets them find nearby public Wi-Fi access points. The feature does not appear to be widely available at the moment, which means this is probably something Facebook is only testing. The social network tests numerous features all the time, but this one is particularly notable.

As The Next Web points out, helping users find public Wi-Fi could enable more people to use Facebook Live. If your cellular connection isn't strong, a nearby Wi-Fi location can be a big help - unless, of course, your Facebook Live broadcast is dependent on your specific location.

There could be other uses for finding Wi-Fi beyond live video broadcasts. If you're desperate to upload a photo or recorded video, then locating the closest public Wi-Fi point helps. On top of that, it's just one more reason to open the Facebook app, which Facebook obviously wants to encourage as much as possible. Check where the nearest Wi-Fi hotspot is, see that unread notifications indicator at the top of the screen, and before you know it you're engrossed in the news feed.

Wi-Finding

Facebook's Wi-Fi finding feature will only be helpful if it proves accurate and taps into a database large enough to be useful. The Next Web points out that this new Wi-Fi feature comes shortly after Facebook started asking businesses with pages to voluntarily contribute Wi-Fi access point information. The database may also have come from aggregating access point information from the phones of Facebook users all over the globe. That's just speculation, but it's not uncommon.

Microsoft's Wi-Fi Sense feature uses crowdsourced information for its database of public Wi-Fi access points. On top of that, building a Wi-Fi database is something most major technology companies do in order to help their devices' location services. Google did it using its Street View cars and, later, Android phones, and Apple collected location data from users' iPhones, iPads, and Macs.

B.HARI SRUTHI,

III-B.Sc.(Computer Technology)

DEFER UPDATES IN WINDOWS 10

Forced updates in Windows 10 have their appeal. For Microsoft, it helps keep the majority of its users on the same build of Windows 10, reducing legacy support issues. For users, it keeps your system up-to-date and reduces the chances of getting hit with malware that takes advantage of unpatched systems. But some folks resent the idea of having updates forced on them, especially when some of those updates cause problems or won't install properly. If you're running Windows 10 Home, you're at the mercy of Microsoft's update schedule. Windows 10 Pro and Enterprise users, however, have the opportunity to defer certain types of updates.

How to defer Windows 10 updates

First, click on Start and select the Settings cog icon on the left side of the Start menu to open the Settings app. Now go to Update & Security > Windows Update and, under the "Update settings" sub-heading, select Advanced options. A new advanced options screen will pop up. From here, click the Defer feature updates check box.

Adjusting this setting puts you on a special update channel for Windows 10 known as the "Current branch for business." This version of Windows 10 doesn't receive feature upgrades as quickly as everyone else. Security updates, however, are delivered on the same schedule regardless of branch.
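For administrators who prefer to script the change, the same deferral can in principle be applied through the Windows Update for Business policy values in the registry. The sketch below is a hedged example only: the key path and value names are assumptions based on the documented policy names, so verify them on your build (and run the script as administrator) before relying on it.

# Hedged sketch (Windows 10 Pro/Enterprise): set the assumed Windows Update
# for Business policy values that defer feature updates. Key path and value
# names are assumptions; check them against your build's documentation.
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"  # assumed path

def defer_feature_updates(days=120):
    """Enable feature-update deferral for roughly `days` days (run as administrator)."""
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "DeferFeatureUpdates", 0, winreg.REG_DWORD, 1)
        winreg.SetValueEx(key, "DeferFeatureUpdatesPeriodInDays", 0,
                          winreg.REG_DWORD, days)

if __name__ == "__main__":
    defer_feature_updates()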

Microsoft says that when you defer upgrades you won't be forced to install feature updates (such as the upcoming Creators Update) for "several months." It's not exactly clear how long that is. It could be just two months or, as ZDNet's Ed Bott reported in July, it could be around four months. It should be enough time to make sure all the major bugs are worked out on Windows 10 Home systems, however.

Sorry, Windows 10 Home users: you may not be able to defer upgrades, but there are a couple of tricks you can employ to make forced updates more tolerable. Check out our tutorial on how to schedule when Windows 10 updates are installed, as well as a trick to prevent upgrades from downloading automatically by setting your Wi-Fi as a metered connection.

M.VIGNESWARAN,

II-B.Sc. (Computer Technology)

THE BEST WI-FI ROUTER FOR A HOME OFFICE

Does your home office Wi-Fi router's lacklustre performance hamper your productivity? Do you constantly deal with wireless connectivity issues that drive you crazy? And do the problems get worse as you add more wireless devices to the network? If you answered yes to any of these questions, an upgrade of your aging, overloaded Wi-Fi router may be the only guaranteed solution.

Before you buy that bargain basement router or even splurge on the most expensive model, it's wise to make sure you understand the technologies behind your in-home wireless so you can pick the best router for your workspace or small office.

Wi-Fi standards and technologies

Mobile devices that support Wi-Fi today conform to the 802.11 family of protocols for wireless communication. That family includes the trusted 802.11a, 802.11b and 802.11g standards, and newer Wi-Fi devices support the much faster 802.11n and 802.11ac, as well.

To enable speedier data transfer, 802.11n and 802.11ac leverage Multiple Input Multiple Output (MIMO) technology, which uses different antennas to send multiple streams of wireless data for better performance. In theory, 802.11n devices can offer as many as four separate spatial streams, and 802.11ac supports up to eight streams.

The latest "Wave 2" generation of 802.11ac Wi-Fi routers support Multi-User Multiple Input Multiple Output (MU-MIMO) technology. This new tech lets Wi-Fi routers transmit to multiple devices simultaneously instead of sequentially, which significantly speeds things up on crowded wireless networks.


Before you rush out to buy an expensive Wi-Fi router with MIMO (also known as single-user MIMO), you should know that to utilize that speedy wireless, your Wi-Fi devices must also support the tech. Unfortunately, the majority of today's Wi-Fi devices, including smartphones and tablets, only support one or two spatial streams, and they won't be able to take full advantage of Wi-Fi routers with more streams. The same thing applies to MU-MIMO routers, because only a handful of mobile devices available today support the tech.

In some cases, it may make sense to buy a more affordable Wi-Fi router that delivers optimal performance with your existing devices, and then later opt for a more advanced (and likely more expensive) router when you upgrade your mobile devices to phones, tablets or computers that support MIMO.

Dual-band vs. tri-band Wi-Fi routers

The days of single-band 2.4GHz support are in the past, and today's Wi-Fi routers typically offer dual-band support for both the 2.4GHz and 5GHz bands. Some of the most advanced modern routers tout tri-band capabilities, with simultaneous use of a single 2.4GHz and two separate 5GHz bands.

Any new Wi-Fi router you buy today should support dual-band wireless, so it's backwards compatible with older 2.4GHz devices, as well as any devices that use the 5GHz band, which is less prone to interference. Unless you plan to simultaneously use a dozen or more 5GHz devices, a cutting-edge, tri-band router probably isn't worth the money. Most of today's mobile devices can only use one band at a time, so it may be a better idea to buy a second Wi-Fi router, or roll out a business-grade Wi-Fi system, to better support large numbers of Wi-Fi devices.

To make the whole thing even more confusing, the manufacturers of some Wi-Fi routers combine the maximum theoretical speeds of the two or three bands their products support to come up with highly misleading performance numbers, such as AC1200, AC1750 and AC3200. A Wi-Fi router that offers AC1750, for instance, really supports just 450Mbps on 2.4GHz and 1,300Mbps on 5GHz (450 + 1,300 = 1,750). You may never actually be able to get 1,750Mbps on a single stream from such a router.
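A quick worked example of that arithmetic, using the AC1750 figures quoted above, makes the point: the marketing number is simply the sum of the per-band maximums, which no single client on one band will ever see.

# Tiny worked example of the marketing arithmetic described above.
band_maximums_mbps = {"2.4GHz": 450, "5GHz": 1300}   # AC1750 per-band figures from the article

ac_rating = sum(band_maximums_mbps.values())
print(f"Marketing label: AC{ac_rating}")                                  # AC1750
print(f"Best case for one client on one band: {max(band_maximums_mbps.values())}Mbps")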

Wi-Fi management features and functionality

Finding the right Wi-Fi router for you is about more than a simple performance evaluation. A good, easy-to-configure user interface can make a big difference, as well. For instance, features that let you share Wi-Fi access with visiting friends or relatives without having to reveal a security passphrase can be very convenient. You might also want to check to see if the router can set its guest network to either the 2.4GHz or 5GHz band, isolate guest devices from the rest of the gadgets on the same network, and limit the number of guests or the bandwidth they can use.

Quality of Service (QoS) settings, which let you prioritize latency-sensitive applications such as VoIP calls or streaming media, can be very important in a home or small office setting. They help ensure that the router knows how to prioritize specific types of traffic, so video or audio playback is as smooth as possible when the network is stressed.

Wi-Fi router performance and limitations

Different Wi-Fi routers funnel data from broadband Internet connections out at different speeds. The specialized chipsets that make Wi-Fi routers work come from a handful of suppliers, but a lot of the chips that process and route data packets come from additional vendors.

The hardware specifications can vary significantly and have a real impact on Internet speeds. This might be less of an issue if you have a slow broadband connection, but users with broadband speeds in excess of 50Mbps could see performance degradation over Wi-Fi. Sites such as SmallNetBuilder.com offer extensive Wi-Fi benchmarking results that can provide insight on hardware limitations. People who have gigabit Internet will unfortunately find Wi-Fi to be a bottleneck, because even the fastest Wi-Fi routers used in optimal environments cannot compare to the speed of wired gigabit connections.

Key considerations

Despite the relentless push towards an all-wireless world, it still makes sense to connect certain devices to the web via wires, such as Network-Attached Storage (NAS) appliances and desktop PCs. Doing so reserves valuable wireless bandwidth for wireless-only devices, and it can reduce intermittent issues that stem from wireless interference. Wi-Fi routers with adequate switch ports can eliminate the need for a standalone network switch, as well.

Many of the latest Wi-Fi routers have a USB port or two, which can be used to connect USB-based printers or portable storage drives, among other things. It's unlikely that newer routers will use anything older than USB 3.0, but you might want to keep an eye out for and avoid any slower USB 2.0 devices.

You may also want to look into a router with dual-WAN support, which would let you use two WAN internet connections as a way to help ensure consistent network reliability. The Synology RT1900ac, for example, supports dual WAN in both active-passive and active-active mode. The former tech uses only one WAN port at a time, but can automatically switch to the second WAN port should a connection drop. The latter allows for simultaneous use of both WAN ports.

The placement of your Wi-Fi router is also crucial to good Wi-Fi coverage in your home workspace or small office. Your router should be positioned in an elevated, central location that's set apart from potentially RF-dampening barriers, such as thick concrete beams or walls, and metallic fixtures.

The drive toward ubiquitous wireless means Wi-Fi tech will continue to evolve in leaps and bounds, and device makers will release more powerful and full-featured routers in the months and years ahead. To take advantage of all the advances and evolutions, you need to stay up to date on the various Wi-Fi tech and related hardware featured here.

N.MOHANAPRIYA,

II-B.Sc. (Computer Technology)

MICROSOFT PUTS QUANTUM COMPUTING HIGHER ON ITS HARDWARE PRIORITY LIST

Microsoft is accelerating its efforts to make a quantum computer as it looks to a future of computing beyond today's PCs and servers. Microsoft has researched quantum computing for more than a decade. Now the company's goal is to put the theory to work and create actual hardware and software.

To that effect, Microsoft has put Todd Holmdahl, who was involved in the development of Kinect, HoloLens, and Xbox, in charge of the effort to create quantum hardware and software. The company has also hired four prominent university professors to contribute to the company's research. Quantum computers, in theory, can significantly outperform today's supercomputers. The ultimate goal is to create universal quantum computers that can run all existing programs and conduct a wide range of calculations, much like today's computers. Early quantum computers can be used to run only a limited number of applications.


Companies like IBM, D-Wave, and Google are researching quantum computing. IBM researchers have said a universal quantum computer is still decades out, so their focus is on creating hardware targeted at solving specific problems. D-Wave and IBM have created quantum computers based on different theories, and the companies have bashed each other's designs. D-Wave is trying to get more programmers to test its hardware so it can be used for more applications.

It's not known when Microsoft's quantum hardware will come out. Like others, Microsoft will have to make quantum circuits on which it can test applications and tackle issues like error correction, fault tolerance, and gating. Practical hardware will be released only after a number of quantum computing issues are resolved. But Microsoft is already offering a simulation of quantum computers via a software toolkit. Conventional computers represent data in the form of 1s and 0s, but quantum computers are far more complex. At the center of quantum computers are qubits, which can harness the laws of quantum mechanics to achieve various states. A qubit can hold a one and zero simultaneously and expand to states beyond that.
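A tiny state-vector simulation (an illustration in ordinary Python with NumPy, not Microsoft's toolkit) shows what "holding a one and zero simultaneously" means: applying a Hadamard gate to the |0> state produces an equal superposition, with a 50% chance of measuring either outcome.

# Minimal qubit illustration: a qubit is a 2-element complex vector, a gate is
# a 2x2 unitary matrix, and measurement probabilities are squared amplitudes.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)                    # the |0> basis state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

superposed = H @ ket0                 # (|0> + |1>) / sqrt(2)
probs = np.abs(superposed) ** 2       # probabilities of measuring 0 and 1

print("amplitudes:", superposed)      # [0.707..., 0.707...]
print("P(0), P(1):", probs)           # [0.5, 0.5]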

Qubits allow quantum computers to calculate in parallel, making them more powerful than today's fastest computers. But qubits can be fragile, and interference from matter or electromagnetic radiation can wreck a calculation. Researchers at Microsoft are working on an entirely new topological quantum computer, which uses exotic materials to limit errors. There are still questions about the viability of such materials and outcomes, so it could take a long time for Microsoft to make practical quantum circuits.

Interest in quantum computing is growing as it becomes difficult to manufacture smaller chips to speed up PCs and servers. Neuromorphic chips and quantum circuits represent a way to move computing into the future. Microsoft's new hires include Leo Kouwenhoven, a professor at the Delft University of Technology in the Netherlands; Charles Marcus, a professor at the University of Copenhagen; Matthias Troyer, a professor at ETH Zurich; and David Reilly, a professor at the University of Sydney in Australia.

V.MANIKANDAN,

II-B.Sc. (Information Technology)

MORE THAN HALF THE WORLD'S PEOPLE STILL OFF THE INTERNET


More than half of the world's population still isn't using the Internet, although the numbers are improving, according to a United Nations report. A report released this week by the United Nations' International Telecommunication Union (ITU) found that 47.1% of the population is online, an increase from 2015's figure of 43%.

The spread of mobile networks around the globe has played an important role in increasing Internet connectivity, the report said. Mobile-broadband networks cover 84% of the world's population this year, but the number of users, at 47.1%, is well below those who have access. "While infrastructure deployment is crucial, high prices and other barriers remain important challenges to getting more people to enter the digital world," the report stated. "This suggests that many people are yet to benefit fully from the opportunities brought by the Internet."

Zeus Kerravala, an analyst with ZK Research, said the percentage of those using the Internet is too low. "The Internet is one of the great equalizers in life, and the world needs to work together to get everyone on the Internet. It's very frustrating. There are pockets of activity of bringing the Internet everywhere, but it tends to be at a country level, not a global one." Companies like Facebook and Google are working on technologies to provide Internet connectivity to rural and poor areas. "For example, in Canada they deploy fiber to every school, and then the school connection feeds the town where people don't have it," he added. "We need a plan like that in India or rural China."

While much of the growth in Internet usage comes from developing countries, many people in those areas do not own or use a mobile phone. South Korea had the highest levels of Internet connectivity and usage, while the African nations of Niger, Chad, Guinea-Bissau and South Sudan had the lowest.

Affordability still is the main barrier for people to own a mobile phone, with the cost of the phone itself being more of a challenge than receiving the service. The ITU also reported that the people most often left offline are disproportionately female, elderly, less educated, poor and living in rural areas.

N.SENTHIL KUMARAN,

I-B.Sc. (Computer Technology)

NEW SUPERCOMPUTER WITH X86, POWER9 AND ARM CHIPS


For once, there will be a ceasefire in the war between major chip architectures - x86, ARM and Power9 - which will all be used in a supercomputer being built in Barcelona. The Mare Nostrum 4 is being built by the Barcelona Supercomputing Center and will have three clusters, each of which will house Intel x86, ARM and Power9 chips. Those clusters will be linked to form a supercomputer that will deliver up to 13.7 petaflops of performance.

The three chip architectures are fundamentally different. An application written to take advantage of a specific architecture won't work on another, but server architectures are changing so different types of systems can coexist. Linux supports x86, ARM and Power, so it's possible to write applications to work across architectures.

Emerging networking and throughput interfaces like Gen-Z and OpenCAPI also make it possible for companies to install servers based on different architectures in one data center. Those standards are meant to break the stranglehold of a single architecture, and also provide a blueprint to build a multi-architecture supercomputer like Mare Nostrum 4. BSC's goal is to make a supercomputer using emerging technologies that can be used for all kinds of scientific calculations, the research institution said.

The computer will let researchers experiment with all sorts of alternative, cutting-edge computing technologies, said Scott Tease, executive director for Lenovo's Hyper Scale and High Performance Computing group, in an email. One such technology involves low-power ARM chips, which dominate smartphones but are not yet used in supercomputers. The system will share common networking and storage assets, Tease said. Lenovo is providing server and chip technologies for Mare Nostrum 4. However, the performance of Mare Nostrum 4 isn't overwhelming, especially when compared to China's Sunway TaihuLight, which is the world's fastest computer. TaihuLight delivers 93 petaflops of peak performance.

BSC has a knack for developing experimental supercomputers like Mare Nostrum 4. Starting in 2011, BSC built multiple supercomputers using ARM-based smartphone chips. The Mont-Blanc and subsequent Pedraforca computers were rooted in the premise that supercomputers with smartphone chips could be faster and more power efficient than ones built with conventional server chips like Intel's Xeon or IBM's Power, which dominate high-performance computing.

Last year, ARM developed a new high-performance computing chip design with Fujitsu that will be implemented in Mare Nostrum 4. The chip has a heavy dose of vector processing, which has been a staple of supercomputers for decades. The other ingredients of Mare Nostrum 4 include Lenovo server cabinets with Intel's current Xeon Phi supercomputing chip, code-named Knights Landing, and the upcoming chip code-named Knights Hill. It will also have racks of computing nodes with IBM Power9 chips, which will ship next year. The supercomputer will be implemented in phases and replace the existing Mare Nostrum 3. It will have storage capacity of 24 petabytes.

V.KARTHIK,

I-B.Sc. (Computer Technology)

SERVER-BASED OPEN NETWORKING

Networking using commercial off-the-shelf (COTS) servers has been around for several years, thanks to the proliferation of Linux-based servers and network technologies like Open vSwitch (OVS). The hope is that the switch world follows the servers' successful path, hence the birth and popularity of the term "open networking."

Network devices like top-of-rack (TOR) switches have traditionally been closed: the operating systems and protocols that run on the switches were proprietary, could not be disaggregated from the hardware and were not open source. Switches got disaggregated a bit when the switch vendors adopted switch ASICs from merchant silicon vendors like Broadcom. Next came OpenFlow and OpenFlow-based SDN controllers like Floodlight, which proposed that the switch control plane protocols be removed from the switch and placed in an open source controller. This in some ways disaggregated the OS from the switch box.

Subsequently, switch operating systems like Cumulus Linux came to market. These can be installed and run on switch boxes from multiple vendors, like Quanta and Dell. But such disaggregated switch OSes are not necessarily open source. More recently, open source switch operating systems like SONiC and Open Network Linux have been in the news. The open source controller ecosystem has further evolved as well, focusing on feature completeness and carrier-grade reliability (i.e., OpenDaylight and ONOS). All in all, significant action and news in the realm of open networking has been related to switches, geared toward helping the industry manage the switch supply chain more effectively and deploy efficiently, similar to the COTS server model.


Open networking on servers

What seems to get overlooked in these discussions is how open networking on servers (or server-based open networking) has evolved and enabled open networking on switches. Over the last several years, TOR switches have become simpler because data center traffic patterns have changed and network infrastructure efficiency requirements have increased. When using leaf (TOR) and spine switches, the imperative has shifted to moving east-west traffic most efficiently, which requires more bandwidth, more ports and lower latency. As a result, the feature requirements in hardware and software in leaf and spine switches have been reduced to a simpler set. This has made open networking in switches easier to implement and deploy.

However, the smarts of networking did not disappear; they just moved to the server, where such smarts are implemented using the virtual switch - preferably an open one such as OVS - and other Linux network features like iptables. Many new features related to network security and load balancing have been added to OVS. OpenStack, as an open source and centralized cloud orchestration tool, has rapidly come to prominence, with more than 60% of OpenStack networking deployed today using OVS (with OpenStack Neutron). Server-based open networking has evolved relatively quietly compared to open networking in switches, but it has made major contributions toward bringing deployment efficiencies and flexibility.
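For readers who have not worked with OVS directly, the sketch below shows roughly how a bridge is created on a Linux server and a physical NIC attached to it using the standard ovs-vsctl tool; the bridge and interface names are illustrative assumptions, and the commands require root privileges with Open vSwitch installed.

# Hedged sketch: stand up an Open vSwitch bridge and attach an uplink NIC by
# driving the ovs-vsctl CLI from Python. Names br0 and eth1 are placeholders.
import subprocess

def run(cmd):
    """Run a command, echoing it first, and fail loudly on error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def create_ovs_bridge(bridge="br0", uplink="eth1"):
    run(["ovs-vsctl", "--may-exist", "add-br", bridge])            # create the bridge if missing
    run(["ovs-vsctl", "--may-exist", "add-port", bridge, uplink])  # attach the uplink NIC
    run(["ovs-vsctl", "show"])                                     # print the resulting topology

if __name__ == "__main__":
    create_ovs_bridge()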

Today, in many high-growth cloud, SDN and NFV applications, server-based open networking is running into server sprawl and related TCO challenges. As networking bandwidths increase and the number of VMs proliferates on servers, OVS processing is taking up an increasingly large number of CPU cycles, which is limiting the number of CPU cycles available for processing applications and VMs. Data center operators cannot economically scale their server-based networking using traditional software-based virtual switches. So implementing server-based networking in x86 architectures and software is a double whammy: it increases costs as too many CPU cores are consumed, and it lowers performance as applications are starved for resources.

Offloading network processing to networking hardware is an option that has worked well in the past. However, software-defined and open source networking is evolving at a rapid pace; such innovation stops the moment data center operators look to inflexible network hardware for performance and scale.

The solution to this challenge is to offload OVS processing to an intelligent server adapter (ISA). With an ISA handling OVS processing, performance is boosted by up to 5X, and the data center operator frees as many as 11 CPU cores from network-related processing, enabling greater VM scalability and lower costs. An ISA is programmable and hence can evolve rapidly with new features, preserving the pace of innovation. Although server-based networking by itself can cause server sprawl, ISAs are making the case for efficient and flexible open networking from the COTS server side.

A.RANJITHA PRIYA,

III-B.Sc. (Information Technology)

TOP REASONS FOR NETWORK DOWNTIME

New research paints a somewhat bleak picture of network performance. Outages are frequent. Hours typically pass before an issue is reported and resolved. Protective measures are manual and error prone. The source of the data is a survey of 315 network pros at midsize and large enterprises. The survey was sponsored by Veriflow, a San Jose, Calif.-based start-up that aims to minimize the risk of network vulnerabilities and outages.

Veriflow's software is designed to catch network problems before they happen by predicting possible network-wide behaviour and continually verifying that a network model adheres to an enterprise's security and resilience policies. The survey results are interesting (with the acknowledgement that the sponsor of the survey makes software to combat network outages). Here are some of the key findings.

Incompatible changes

Network changes that are not properly evaluated are another common cause of incidents. The impact on the business varies. At the high-impact end, 5% of respondents said that network changes lead to a network outage or performance issue on a daily basis, and 7% said it happens several times a week. At the low-impact end, 2% said it never happens and 7% said it's a "once every couple years" event. The most common answer, cited by 44% of respondents, is that network changes lead to outages or performance issues "several times a year."

Manual

How do IT teams verify that the network is functioning properly after making a network change? The approach is often manual, Veriflow finds. Among respondents, 69% said they rely on manual processes, such as inspecting devices via the command line interface, inspecting configurations, and performing manual trace routes or pings.
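Even a few lines of scripting can take some of the error-prone tedium out of those manual checks. The sketch below (a minimal example with placeholder addresses and Linux-style ping flags) simply pings a list of devices after a change and reports which ones respond.

# Hedged sketch: automate a basic post-change reachability check. The addresses
# are documentation-only placeholders; the "-c" (count) and "-W" (timeout)
# flags are the Linux ping form, so adjust them on other platforms.
import subprocess

DEVICES = ["192.0.2.1", "192.0.2.2", "192.0.2.3"]   # placeholder management IPs

def is_reachable(host):
    """Return True if a single ping to the host succeeds."""
    result = subprocess.run(["ping", "-c", "1", "-W", "2", host],
                            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

for host in DEVICES:
    print(f"{host}: {'reachable' if is_reachable(host) else 'NOT reachable'}")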

Predictive monitoring: room for improvement

There’s a lot of room for improvement

when it comes to network monitoring tools’

predictive capabilities. Just 6% of respondents

said that between 90% and 100% of their

network performance issues and outages are

Page 24: INFOLINE EDITORIAL BOARD - Erode | Tamil Nadu | India EDITORIAL BOARD ... S.Arunkumar III B.Sc. (Information Technology) ... quickly add new skills to their resume by

21

predicted by their network monitoring tools.

Another 15% said their tools predict 70% to

90% of network performance issues and

outages, and 13% said tools predict 50% to

75% of those issues. The rest of the

respondents said that their monitoring tools

predict less than half of all network issues:

21% of respondents said 25% to 50% of issues

get predicted; 25% of respondents said 1% to

24% of issues get predicted; 15% of

respondents said their tools don’t predict any

issues; and 5% of respondents don’t have a

network monitoring solution.

Resolution time

When asked how long it takes to find and resolve a network issue after it's reported, some IT pros reported speedy results: 21% of respondents said it takes, on average, less than an hour to resolve networking issues.

Compliance conundrum

Roughly 76% of survey respondents said their organization has network compliance requirements in place to ensure privacy and security of data and systems. But many respondents are doubtful that their network is always compliant: 56% called themselves moderately confident; 19% said only slightly confident; and 6% said not confident at all. Just 20% said they're highly confident that their network is always compliant.

Network segmentation

Another area that divided respondents is network segmentation. When respondents were asked if they believe that network security and segmentation are properly implemented throughout their company's network, 59% said yes and 41% said no.

T.DHARINI, II-B.Sc. (Information Technology)
