
Kumar Gautam
Current Address: A4 202, Mirchandani Bellagio, Undri, Pune, India
Mobile No.: +91 992-390-5172
Email ID: [email protected]
Visa Status: US B1 visa (valid till 2021, multiple entries)

Objective: Senior technologist with strong business acumen and deep technical experience in the Big Data space. A results-oriented, decisive leader who combines an entrepreneurial spirit with corporate-refined execution in Big Data strategy. Seeking a challenging solution-development position with a strong emphasis on Big Data and Machine Learning technologies, where I can apply my current skill set, build the best solutions possible, and become an invaluable asset to the company.

Profile

• Sr. Consultant, Big Data, with 9.8 years of professional experience in the analysis, design, and development of enterprise-grade applications.
• MCSD (MapR Certified Spark Developer) certified.
• Deep expertise in Spark and the Hadoop ecosystem (MapReduce [MR1, YARN], HDFS, Pig, Hive, Sqoop, Oozie, HBase, Flume, ZooKeeper).
• Experience in processing batch, interactive, and live streaming data using Spark.
• Proficient in Core Java (data structures and algorithms), PL/SQL, R, and Python.
• Proficient in Apache Camel, Esper Complex Event Processing, and Tableau.
• Expert in Order Management Systems (Order Care) and the whole BSS suite.
• Successfully delivered a couple of initiatives (implementation and development) on Big Data analytics and large-scale data processing using the Hadoop ecosystem.
• Experience with column-family databases (HBase).
• Proficient in development methodologies such as Agile, Scrum, and Waterfall.
• Proven ability to excel in fast-paced development environments using the latest frameworks and tools.
• Able to dive deep into difficult technical problems and come out with solutions.
• Proven ability to learn quickly and apply new technologies; innovative and enthusiastic.
• Worked with end users to formulate and document business requirements.
• Previous experience in architecture design, database design, and performance management.
• Experience working closely with customers.
• Strong problem-solving and technical skills coupled with clear decision making.
• Strong knowledge and understanding of data modeling concepts, UML diagrams, and various types of design patterns.
• Won the Top Innovator Award at Ericsson in 2016 for presenting a smart-farming solution using IoT and the ELK stack.
• Received appreciation and recognition from all previous employers.
• More than 8 months of onsite exposure in the USA and Switzerland.
• Experienced in the complete SDLC.
• Highly skilled in object-oriented architectures and patterns, systems analysis, software design, effective coding practices, databases, and servers.

Industry-Wise Experience:
- Senior Consultant and Data Engineer, Ericsson Global, Pune, India (10th Feb 2014 - till date)
- Tech Lead/Application Expert, Amdocs India, Pune, India (12th Oct 2009 - 7th Feb 2014)
- Senior System Engineer, Infosys Technologies, Pune, India (20th Nov 2006 - 10th Oct 2009)

Education:
- Bachelor of Technology (ECE), North Eastern Hill University, NERIST, India, CGPA 4.5/5.0 (2006)
- Higher Secondary: 90%, NERIST (2002)
- Matriculation: 92%, ICSE Board (2000)

Certifications:
- Sun Certified Java Programmer (SCJP 5.0) (2008)
- MapR Certified Spark Developer (2016)

Technical Skills
Operating Systems: Windows, Linux & Unix
Languages: JDK 1.4/1.5/1.6/1.7, JavaScript, JSP, Servlet, PL/SQL, Unix Shell, R, Python, Spark
Hadoop Distributions: Apache, Cloudera, MapR
Big Data Technologies: Apache Hadoop (MRv1, MRv2), Hive, Pig, Sqoop, HBase, Flume, ZooKeeper, Oozie, ELK stack, Spark stack
Web Technologies: HTML, JSP, JSF, CSS, JavaScript, JSON & AJAX
Server-Side Frameworks: Struts 2, Spring, ActiveMQ, Apache Camel, Hadoop
IDEs: Eclipse, JBoss, IBM WebSphere, NetBeans
Build Tools: Maven, Ant, Gradle
Web Services: SOAP & RESTful web services
Web/App Servers: Apache Tomcat 6.0/7.0, IBM WebSphere 6.0/7.0, JBoss 4.3
Configuration Tools: SVN, GitHub, Git, VSS, IBM Rational ClearCase 7.0, Gerrit
Databases: Oracle 8i/9i/10g/11g, MySQL, HDFS, Postgres, HBase
Cloud Solutions: Amazon Web Services (AWS)
Defect Triage: Quality Center 9.2, Bugzilla, JIRA

Assignment Details
Period: Ericsson, Jan 2015 - till date
Domain: Telecom
Team Size: 52
Role: Senior Product Developer
Product Description: Ericsson Expert Analytics (EEA)

Ericsson Expert Analytics (EEA) is a multi-vendor, real-time customer-centric analytics product for mobile operators who want to capitalize on their network data. Unlike other Telecom Analytics systems, EEA measures the perceived customer experience of individual services for all customers, all the time, in real-time across the radio access and mobile core networks with high accuracy.

Ericsson Expert Analytics (EEA) is designed to be a solution that answers part of the question “What is the customer experience?” EEA collects and measures performance data and event records from the network and from subscribers' devices, translates the data into Quality Indicators (QIs), generates incidents based on defined rules, and provides various visualizations.

Responsibilities:
• Responsible for development of the HBase bulk-loading component and SLI calculation (a minimal sketch of the bulk-load pattern follows the tools list below).
• Responsible for data movement from probes and adaptors to HDFS.
• HBase performance benchmarking and tuning parameters to get the desired performance.
• Researched and decided between Impala, Drill, and Spark SQL for the EEA product.
• Worked on Drill and Impala queries and their integration with Tableau for the desired visualizations.
• End-to-end development on the data engineering side required to plot graphs and other outputs.

Tools and Technologies: Spark, HBase, Python, Java, Hadoop Core, MapReduce 2, Oozie, Greenplum, Redis, Docker
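For reference, the HBase bulk-loading pattern mentioned above is usually built as a MapReduce job that writes region-aligned HFiles, which are then handed to the region servers wholesale instead of going through the normal write path. The sketch below shows that general shape in Java against the HBase 1.x client API; the "metrics" table, the "d" column family, and the pipe-delimited input layout are illustrative assumptions, not details of the actual EEA component.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.RegionLocator;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Illustrative sketch only: table name, column family and input layout are assumptions.
    public class HBaseBulkLoadSketch {

        // Parses "rowkey|qualifier|value" lines into Puts keyed by row.
        static class ToPutMapper extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
            private static final byte[] CF = Bytes.toBytes("d"); // hypothetical column family

            @Override
            protected void map(LongWritable offset, Text line, Context ctx)
                    throws IOException, InterruptedException {
                String[] f = line.toString().split("\\|");
                byte[] row = Bytes.toBytes(f[0]);
                Put put = new Put(row);
                put.addColumn(CF, Bytes.toBytes(f[1]), Bytes.toBytes(f[2]));
                ctx.write(new ImmutableBytesWritable(row), put);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            Job job = Job.getInstance(conf, "hbase-bulk-load-sketch");
            job.setJarByClass(HBaseBulkLoadSketch.class);
            job.setMapperClass(ToPutMapper.class);
            job.setMapOutputKeyClass(ImmutableBytesWritable.class);
            job.setMapOutputValueClass(Put.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            Path out = new Path(args[1]);
            FileOutputFormat.setOutputPath(job, out);

            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("metrics")); // assumed table
                 RegionLocator locator = conn.getRegionLocator(table.getName());
                 Admin admin = conn.getAdmin()) {
                // Configures total-order partitioning so each reducer writes
                // HFiles sorted and split the way the table's regions expect.
                HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
                if (job.waitForCompletion(true)) {
                    // Moves the generated HFiles into the region servers directly,
                    // skipping the WAL/memstore write path entirely.
                    new LoadIncrementalHFiles(conf).doBulkLoad(out, admin, table, locator);
                }
            }
        }
    }

Bulk loading this way avoids hammering the WAL and memstore during large ingests, which is typically why it is preferred over plain Put-based writes for initial or high-volume loads.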

Assignment Details
Period: Ericsson, Mar 2014 - Jan 2015
Domain: Telecom
Team Size: 11
Role: Team Lead
Project Description: Swisscom Correlator and Predictor

Swisscom Operations are responsible for end-to-end service and resource management in both IT services and network (wireline and wireless) domains. One of the main focus areas of Swisscom Operations is the improvement in proactive IT services assurance.

Reactive downtime management processes are perceived as reliable and provide satisfactory results. The new focus is on the improvement and development of uptime management processes. The objective is to develop the ability to predict business IT service degradation and act to prevent the degradation from happening.

Predictive analytics monitors business service components to tease known patterns out of related events and metrics.

Patterns of event and metric data are learned from the stored historical data. The prerequisite for a pattern-based self-learning system is the ability to:
• Fetch relevant performance metrics and events from the related components of systems that provide a given business service
• Correlate fetched metrics and event data both in the context of a given business service and in time

Responsibilities:
• Involved in design and development of the Correlator application.
• Involved in developing the Esper rule engine for Complex Event Processing (a minimal EPL sketch follows the tools list below).
• Involved in development of Tableau dashboards for visualization.
• Visited the client site in Bern, Switzerland, to present a demo of the product.
• Involved in code reviews of peers' work.

Tools and Technologies: Java 1.7, PostgreSQL, MySQL, Tableau, Esper rule engine, Apache Camel
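For context, an Esper rule is typically an EPL statement registered with the engine plus a listener invoked on matches. The sketch below shows that general shape against the Esper 5.x API in Java 7 style; the MetricEvent bean, the 5-minute sliding window, and the 90% threshold are illustrative assumptions, not actual Swisscom rules.

    import com.espertech.esper.client.Configuration;
    import com.espertech.esper.client.EPServiceProvider;
    import com.espertech.esper.client.EPServiceProviderManager;
    import com.espertech.esper.client.EPStatement;
    import com.espertech.esper.client.EventBean;
    import com.espertech.esper.client.UpdateListener;

    // Illustrative sketch only: event type, window and threshold are assumptions.
    public class EsperRuleSketch {

        // Plain JavaBean event; Esper reads properties via the getters.
        public static class MetricEvent {
            private final String host;
            private final double cpu;
            public MetricEvent(String host, double cpu) { this.host = host; this.cpu = cpu; }
            public String getHost() { return host; }
            public double getCpu() { return cpu; }
        }

        public static void main(String[] args) {
            Configuration config = new Configuration();
            config.addEventType("MetricEvent", MetricEvent.class);
            EPServiceProvider engine = EPServiceProviderManager.getDefaultProvider(config);

            // Rule: per-host average CPU over a sliding 5-minute window exceeds 90%.
            EPStatement stmt = engine.getEPAdministrator().createEPL(
                "select host, avg(cpu) as avgCpu "
              + "from MetricEvent.win:time(5 min) "
              + "group by host having avg(cpu) > 90");

            // Listener fires whenever the aggregate crosses the threshold.
            stmt.addListener(new UpdateListener() {
                @Override
                public void update(EventBean[] newEvents, EventBean[] oldEvents) {
                    if (newEvents == null) return;
                    for (EventBean e : newEvents) {
                        System.out.println("ALERT host=" + e.get("host")
                                + " avgCpu=" + e.get("avgCpu"));
                    }
                }
            });

            // Feed an event into the engine; in a deployment like the one above,
            // events would arrive via an Apache Camel route rather than main().
            engine.getEPRuntime().sendEvent(new MetricEvent("db-01", 95.0));
        }
    }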

Assignment Details
Period: Amdocs, August 2012 - Feb 2014
Domain: Telecom Analytics and Big Data
Team Size: 10
Role: Tech Lead
Project Description: EDW to Big Data Platform - AT&T Lightspeed is one of the important strategic projects for AT&T, in which it provides the customer with IPTV, VoIP, and high-speed Internet services via a single connection. OMS is one of the main applications for this; it orchestrates the order flow right from the creation to the fulfillment of the order. OMS has to interact with a multitude of downstream applications, including CRM and Billing. The data collected for each order in OMS is huge, and there is a lot of reporting need on that data: to see historical trends, to predict the latest customer demands, to predict customer churn, etc. To get more value and analytics out of the data, AT&T decided to move the historical data from Teradata to HDFS. It was more of a pilot project for Lightspeed.

Responsibilities:
• Responsible for data movement from RDBMS to HDFS.
• Involved in design and development of technical specifications using Hadoop technology.
• Involved in writing MapReduce jobs and giving input to HDFS for further processing (a minimal job sketch follows the tools list below).
• Involved in writing Hive and Pig scripts for the analytics.
• Exporting the required output.

Tools and Technologies: Hadoop HDFS, MapReduce 1, Hive, Pig, Sqoop, Oozie
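As a rough illustration of the kind of MapReduce job such a migration involves, the sketch below counts orders per status from pipe-delimited order extracts landed on HDFS; the OrderStatusCount name, the field layout, and the status column position are hypothetical, not the actual Lightspeed job.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Illustrative sketch only: field layout and column positions are assumptions.
    public class OrderStatusCount {

        // Emits (status, 1) for each order record in a pipe-delimited extract.
        public static class StatusMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text status = new Text();

            @Override
            protected void map(LongWritable offset, Text line, Context ctx)
                    throws IOException, InterruptedException {
                String[] fields = line.toString().split("\\|");
                if (fields.length > 3) {          // hypothetical layout: status in column 3
                    status.set(fields[3]);
                    ctx.write(status, ONE);
                }
            }
        }

        // Sums the per-status counts; also reused as a combiner.
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> vals, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : vals) sum += v.get();
                ctx.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "order-status-count");
            job.setJarByClass(OrderStatusCount.class);
            job.setMapperClass(StatusMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The same aggregation could equally be expressed in a few lines of Hive QL once the extracts are mapped to an external table, which is the trade-off the Hive and Pig scripts mentioned above exploit.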

Assignment Details
Period: Amdocs, Oct 2009 - Aug 2012
Domain: Telecom
Team Size: 15
Role: Senior Developer & Tech Lead
Project Description: AT&T Lightspeed OMS - AT&T Lightspeed is one of the important strategic projects for AT&T, in which it provides the customer with IPTV, VoIP, and high-speed Internet services via a single connection. OMS is one of the main applications for this; it orchestrates the order flow right from the creation to the fulfillment of the order. OMS has to interact with a multitude of downstream applications, including CRM and Billing. Amdocs OMS consists of two main modules: Order Management and Product Management. Order Management responsibilities consist of: a) driving the end-to-end ordering process, including negotiation, delivery, and notification, through a built-in business process engine; b) order tracking, monitoring, and error handling; c) interfaces to external BSS and OSS systems. Product Management is responsible for offering a complete solution for the broad and flexible definition of products. All the products and services marketed by the service provider are defined in the product catalog, which is the service provider's repository of product information related to ordering.

Responsibilities:
• Worked as Senior Data Analyst for OMS. My responsibilities included finding inconsistent data in the DB, finding the root cause of the bad data, fixing the issue, and presenting the complete order flow and the count of stuck data to AT&T higher management.
• During my stay at the client site, worked as the site coordinator to understand the requirements from the client, prepare the high-level and detailed design documents, and provide solutions to the offshore team during implementation.
• Worked on the integration of OMS with the Product Catalog. This involved implementing very complex product-compatibility rules and setting product attributes at run time.
• During my last few months, also worked as data analyst and business analyst for AT&T to study various design gaps and then drive meetings with the corresponding teams to arrive at solutions.
• Worked on the integration of OMS with the billing systems (Enabler and Telegence for AT&T), and was largely responsible for analysis of out-of-sync issues between the two systems.
• Worked on Core Java implementations of the different modules of the Order Management System.

Tools and Technologies: Java/J2EE, JSP, SQL Developer, Tomcat, Ant, CVS, Eclipse, Oracle 10g, PL/SQL, Toad

Assignment Details
Period: Infosys, Nov 2006 - Oct 2009
Domain: Manufacturing
Team Size: 8
Role: Offshore Developer, Defect Prevention Analyst, Configuration Controller
Project Description: Boeing, USA, DisCo! (Distribution Control System)

The purpose of the Distribution Control System (DisCo!) is to provide a well-managed, secure, single source of distribution information concerning technical publications provided by Boeing. The documents Boeing provides along with an airplane sale are huge in number and need to be properly managed, and this is where the DisCo application comes into play. It is also a good source of income for Boeing, as any further or updated documentation requested by the client is charged duly.

Responsibilities:
• Worked as Offshore Developer, Defect Prevention Analyst, and Configuration Controller.
• The online application was developed with the InJArch architecture (Infosys Java Architecture, a customized Model-View-Controller architecture along the lines of the J2EE Blueprints).
• Responsible for development and software deliverables to clients.

Tools and Technologies: Java/J2EE, ClearCase, ClearQuest, Eclipse, SQL Developer, Tomcat, Ant

Tools and Technologies: Java/J2EE, Struts 1.1, AJAX, HTML, JSP, CSS, XML, Toad, JSP Reports, Tomcat, Ant, VSS, Eclipse, Oracle 10g

Other Information: I have been a meditation practitioner for the last 10 years. I conduct stress management workshops for students, corporates, and the general public. I am also associated with a global organization named Sahaja Yoga Foundation.

Personal Details:
Nationality: Indian
Marital Status: Married
Passport No.: G8257778 (valid till Apr 02, 2018)
Date of Birth: 30th Mar 1984
Languages Known: English, Hindi

References: Available on request