Upload: dhruv-arora

Post on 11-Apr-2017



DHRUV ARORA (647) 677-6340 [email protected]

PROFESSIONAL SUMMARY

• 16 years of experience in architecture, analysis, design, and development of databases, database administration, and maintenance using Microsoft SQL Server and the BI suite (SSAS, SSIS, SSRS), and in application development using .NET technologies (C#, ASP.NET, WCF, WinForms, Web Services). Certified Cloudera Hadoop Administrator.

• Extensive experience in relational and dimensional data modeling of data warehouses, creating logical and physical database designs and ER diagrams using data modeling tools such as MS Visio and Erwin.

• Expertise in merging data from heterogeneous sources such as Excel, text files, SharePoint lists, and data extracted through Cosmos (big data); populating dimension and fact tables in the data warehouse; and cleansing and standardizing data loaded into OLTP and DW databases using SSIS.

• Cleansing of data extracted from sources such as flat and CSV files, including data conversion, error handling and logging, and loading into staging tables and then into the data warehouse.

• Maintenance of historical data using Slowly Changing Dimensions (SCD Type 2) and marking of records as inferred members in the case of early-arriving facts.
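The SCD Type 2 and inferred-member pattern above can be sketched outside SSIS. The following Python sketch (with an in-memory list standing in for the dimension table, and invented column names) shows the two moves: a changed attribute expires the current row and inserts a new version, and an early-arriving fact inserts a placeholder flagged as an inferred member.

```python
from datetime import date

def apply_scd2(dim, business_key, attrs, load_date):
    """Close the current row if attributes changed, then insert a new
    current row (SCD Type 2). `dim` is an in-memory stand-in for the
    dimension table; column names are illustrative only."""
    current = next((r for r in dim
                    if r["business_key"] == business_key and r["is_current"]), None)
    if current and current["attrs"] == attrs and not current["is_inferred"]:
        return  # no change, nothing to do
    if current:
        current["is_current"] = False
        current["valid_to"] = load_date          # expire the old version
    dim.append({"business_key": business_key, "attrs": attrs,
                "valid_from": load_date, "valid_to": None,
                "is_current": True, "is_inferred": False})

def ensure_member(dim, business_key, load_date):
    """Early-arriving fact: the fact references a key the dimension has not
    seen yet, so insert a placeholder row flagged as an inferred member."""
    if not any(r["business_key"] == business_key and r["is_current"] for r in dim):
        dim.append({"business_key": business_key, "attrs": None,
                    "valid_from": load_date, "valid_to": None,
                    "is_current": True, "is_inferred": True})

dim = []
ensure_member(dim, "CUST-42", date(2016, 1, 1))                     # fact arrived first
apply_scd2(dim, "CUST-42", {"city": "Toronto"}, date(2016, 2, 1))   # real row arrives
apply_scd2(dim, "CUST-42", {"city": "Brampton"}, date(2016, 3, 1))  # attribute change
```

In a real load the same logic is expressed with the SSIS SCD wizard or a T-SQL MERGE; the sketch only illustrates the versioning rules.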

• Experience in creating package configurations (XML and SQL Server configurations) and logging to text files, XML, and the Windows event log.

• Expert in using the Data Profiling task to clean data.

• Scheduling of ETL packages through SQL Agent jobs.

• Skilled in SQL Server 2005 Business Intelligence tools, data warehousing, and ETL processes and strategies using SQL Server Analysis and Integration Services.

• Created and configured data sources and data source views, named queries, named calculations, dimensions, cubes, measures, partitions, KPIs, and MDX queries using SQL Server 2008 Analysis Services.

• Excellent knowledge of developing SSAS cubes, aggregations, KPIs, measures, and cube partitioning, and of deploying and processing SSAS objects.

• Experience in database maintenance, including updating statistics, table partitioning, re-indexing, structure modifications, and index analysis.

• Proficiency in database backup and recovery methodologies.

• Experienced in database administration activities: log shipping, replication, and mirroring.

• Experience in database performance monitoring and tuning.

• Well experienced in executing the project management life cycle (PMLC), planning and executing projects successfully.

• Involved in preparing System Requirement Specifications (SRS) and Functional Technical Specifications (FTS) to ensure tasks are achieved and meet requirements.

• Working knowledge of Power View and Tableau.

HADOOP

Recently cleared the Cloudera Hadoop Administrator exam and have hands-on experience in the following skills:

1. Hive/Impala
2. Sqoop
3. Oozie (built workflows to ingest data from MySQL into HDFS and Hive, and created and scheduled jobs)
4. HBase (NoSQL database)
5. Setting up clusters of machines on AWS
6. Cloudera Director for Hadoop auto-provisioning: automates setting up a Hadoop cluster with as many nodes as needed, including running scripts pre- and post-installation. Used it to automate the setup of 10 machines on AWS.
7. Performance testing of MapReduce and HDFS using TeraGen and TeraSort
8. Addition/removal of data nodes, setting up the secondary NameNode, checking logs for errors, monitoring jobs, load balancing, and adding new services
9. Sqoop job scheduling
10. Setting up Hive ODBC drivers on data analysts' machines
11. Upgrading the Hadoop cluster to a new version
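Sqoop's incremental imports (items 2 and 9 above) hinge on one piece of bookkeeping: a high-water mark on a check column, saved between runs. A minimal Python sketch of that logic (illustrative only, not Sqoop's internals; `--incremental append` with `--check-column`/`--last-value` is the real interface):

```python
def incremental_append(source_rows, last_value, check_column="id"):
    """Mimic Sqoop's --incremental append mode: import only rows whose
    check column exceeds the high-water mark saved by the previous run,
    then advance the mark. Sketch only, not Sqoop's implementation."""
    new_rows = [r for r in source_rows if r[check_column] > last_value]
    new_mark = max((r[check_column] for r in new_rows), default=last_value)
    return new_rows, new_mark

# Hypothetical MySQL table contents on two successive runs.
rows = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 3, "name": "c"}]
first, mark = incremental_append(rows, last_value=0)      # initial full import
rows.append({"id": 4, "name": "d"})                       # a new row lands
second, mark = incremental_append(rows, last_value=mark)  # only the new row moves
```

In practice Oozie (item 3) persists the mark for you via Sqoop saved jobs, which is what makes the scheduled MySQL-to-HDFS ingestion re-runnable.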

EDUCATION

• MCA (Master of Computer Applications) from Indira Gandhi National Open University, New Delhi, India
• Bachelor of Business Administration (BBA) from Annamalai University, Tamil Nadu, India

STATUS IN CANADA: Permanent Resident (PR)

BI Consultant Rogers, Brampton Nov 2016 – Present

• Business Performance Management: This project improves the performance of existing SSAS cubes.

My Role: Analyze eight pre-existing cubes, improve their performance, and rectify critical issues.
Technologies: SSAS, SQL Server, SSIS

BI Consultant TD Bank, Wellington St Toronto Apr 2016 – Oct 2016

• RISE Risk Analysis: The Risk Information Strategy for the Enterprise (RISE) program was launched by Enterprise and Operational Risk Management to conduct a current-state assessment and to develop and implement a future-state vision and high-level strategic roadmap for TD's risk information and regulatory compliance with the BCBS Principles. The RISE program aims to develop an institution-wide ability to manage risk within TD's Risk Appetite through more effective and efficient use of data and technology, while developing capabilities to improve BCBS compliance. The project has three phases: phases 1 and 2 were completed in 2013, and phase 3, started in 2014, involves implementing enhancements for key gaps, including Golden Source data consolidation, the approval and rollout of the Enterprise Risk Data Aggregation Framework, and refined governance of the credit risk data dictionary.

My Role:
(i) Designed view switching, used to change the database source for report users while the other database is loading data.
(ii) Designed ETLs using SSIS to populate data in dimension and fact tables.
(iii) Wrote stored procedures and user-defined functions.
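The view-switching technique mentioned above is essentially a blue/green swap: reports always read through a view (or synonym), the ETL loads the inactive copy, and the view is repointed in one atomic step once the load finishes. A minimal Python sketch of the idea, with invented names standing in for the two databases and the `ALTER VIEW` repoint:

```python
class ViewSwitcher:
    """Blue/green database switching: readers always go through read(),
    the loader writes only to the inactive copy, and switch() atomically
    repoints readers. Names and storage are illustrative only."""
    def __init__(self):
        self.copies = {"blue": [], "green": []}
        self.active = "blue"

    @property
    def inactive(self):
        return "green" if self.active == "blue" else "blue"

    def load(self, rows):
        self.copies[self.inactive] = list(rows)   # ETL fills the standby copy

    def switch(self):
        self.active = self.inactive               # one-step repoint, like ALTER VIEW

    def read(self):
        return self.copies[self.active]           # what report users see

vs = ViewSwitcher()
vs.load([("2016-04", 100)])   # load green while readers still see blue
vs.switch()                   # readers now see the freshly loaded copy
```

The payoff is that report users never see a half-loaded database: the swap is instantaneous from their point of view.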

BI Consultant Canadian Tire, Yonge & Eglinton Toronto

Feb 2016 – Apr 2016

• FBI (Finance Business Intelligence) AOC (Analysis of Change): Canadian Tire is known for its retail chain in Canada and has five business units: Canadian Tire Retail, Canadian Tire Petroleum, Mark's, Sport Chek, and FGL. This project built an engine that calculates the impact of change between two periods and translates it into the outcome metric's unit of measure. Driver metrics are related to each other in a parent-child fashion, and every metric has a mathematical formula based on other metrics. For example, Total Revenue is the parent metric, Organic Revenue is its child metric, and Revenue from Canadian Tire is a child metric of Organic Revenue.
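The parent-child driver-metric engine described above can be sketched as a small tree walk: the period-over-period change of each metric is computed and attributed down through its child drivers. This Python sketch assumes a purely additive model, which is a simplification; the real engine's formulas and metric names belong to the project, and only the example metrics below come from the description above.

```python
def change_impact(tree, values_p1, values_p2, metric):
    """Recursively attribute the change of `metric` between period 1 and
    period 2 to its child driver metrics. Assumes additive formulas,
    a simplification of the actual AOC engine."""
    delta = values_p2[metric] - values_p1[metric]
    report = {metric: delta}
    for child in tree.get(metric, []):
        report.update(change_impact(tree, values_p1, values_p2, child))
    return report

# Metric hierarchy from the example: Total Revenue -> Organic Revenue
# -> Revenue from Canadian Tire. The dollar figures are made up.
tree = {"Total Revenue": ["Organic Revenue"],
        "Organic Revenue": ["Revenue from Canadian Tire"]}
p1 = {"Total Revenue": 100, "Organic Revenue": 60, "Revenue from Canadian Tire": 40}
p2 = {"Total Revenue": 130, "Organic Revenue": 80, "Revenue from Canadian Tire": 55}
aoc = change_impact(tree, p1, p2, "Total Revenue")
```

Each entry in `aoc` explains how much of the parent's movement each driver contributed, which is the "analysis of change" the engine produces.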



My Role:
(i) Requirement gathering from the business.
(ii) Designed the data warehouse (dimension and fact tables).
(iii) Designed ETLs using SSIS to populate data in dimension and fact tables.
(iv) Wrote stored procedures and user-defined functions to calculate AOCs for metrics.

BI Consultant CAA SCO, Markham, ON Jul 2015 – Dec 2015

• EDW (Enterprise Data Warehouse): CAA is an insurance company dealing in vehicle and property insurance as well as roadside assistance. This project is about maintaining and adding new features to the existing data warehouse. Guidewire emits XML files for policy, claim, and billing information, which are later processed into the data warehouse. On top of the DW, SSAS cubes are maintained and consumed by SSRS reports.

My Role:
(i) Designed SSIS ETLs to process data from XML files into staging tables and then into dimension and fact tables.
(ii) Created SQL jobs to schedule ETLs.
(iii) Designed reports using SSRS that take input data from the DW and cubes.
(iv) Wrote stored procedures and fine-tuned SQL queries.
(v) Designed SSAS cubes using a star schema.
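Shredding the emitted XML into staging rows, as in role item (i), can be sketched with Python's standard XML parser. The element and column names below are invented for illustration; Guidewire's actual schema differs, and in the project this step was done with an SSIS XML Source.

```python
import xml.etree.ElementTree as ET

# Hypothetical policy feed; the real Guidewire schema is different.
POLICY_XML = """
<Policies>
  <Policy number="P-1001"><Premium>850.00</Premium><Line>Auto</Line></Policy>
  <Policy number="P-1002"><Premium>1200.50</Premium><Line>Property</Line></Policy>
</Policies>
"""

def shred_policies(xml_text):
    """Flatten policy XML into staging-table rows (tuples), the shape a
    staging load would insert before the dimension/fact ETL runs."""
    root = ET.fromstring(xml_text)
    rows = []
    for policy in root.findall("Policy"):
        rows.append((policy.get("number"),
                     float(policy.findtext("Premium")),  # cast during shredding
                     policy.findtext("Line")))
    return rows

staging_rows = shred_policies(POLICY_XML)
```

Typed rows at the staging boundary keep conversion errors out of the dimension and fact loads downstream.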

BI Consultant CAMH, Toronto, ON Feb 2015 – May 2015

• HSP360: Health Service Providers (HSPs) in Ontario rely heavily on Local Health Integration Networks (LHINs) to collect and compile data and produce the reports needed to facilitate system-wide planning. Recently, the Centre for Addiction and Mental Health (CAMH) built and implemented a "Shared Knowledge and Performance Information" system (an MS SQL Server 2012 data warehouse and a SharePoint 2013 web-based portal), called HSP360, on behalf of the Toronto Central (TC) LHIN. HSP360 provides easier access to more robust reporting in a timely manner while significantly reducing the time, cost, and effort associated with processing data and populating the reports. There are 135 communities and 18 hospitals under the TC LHIN.

My Role:
(i) Designed SSIS packages and maintained existing ETL packages.
(ii) Scheduled SSIS ETL jobs and troubleshot failed jobs.
(iii) Wrote stored procedures and maintained the data warehouse.
(iv) Designed new cubes using a star schema, maintained existing SSAS cubes, and was responsible for checking processing.
(v) Used MOLAP and ROLAP partition storage modes.
(vi) Deployed cubes to production using XMLA files.
(vii) Built dashboards using SSRS and reports using data from cubes via MDX queries, and uploaded reports to the SharePoint portal.
(viii) Backed up and restored databases.
(ix) Fine-tuned the database and improved its performance.

BI Consultant TD Bank, Markham, ON Sep 2014 – Dec 2014

• Performance Management: The goal of the O&T Balanced Scorecard is to have a set of metrics that provides a consistent method for gaining unified insight into the health of the business, the experiences of our customers, and the level of performance of our people.

My Role: Wrote stored procedures, designed reports using SSRS, and maintained existing SSIS ETL packages and SQL Server jobs.

Technical Manager HCL America Inc., USA Mar 2010 – Aug 2014



• RnR BingAds: This project accounts for revenue generated for Microsoft by the Bing search engine. Revenue comes from advertisements shown on the Bing and Yahoo search engines (Yahoo also uses the Bing engine behind the scenes). Data is fetched from different sources, including cubes and Cosmos, collected in staging tables, and later pushed to the data warehouse. A dashboard is maintained using HTML5, and reports built in ASPX are published to stakeholders by mail every morning.

My Role:
(i) Designed SSIS packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
(ii) Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming, and loading data from cubes and flat files (from COSMOS, a proprietary Microsoft big-data platform) into the data warehouse.
(iii) Maintained Slowly Changing Dimensions Type 2 and handled early-arriving facts in the data warehouse.
(iv) Prepared the complete data mapping and report definitions for the project.

• Appex Telemetry: This is part of the metro applications (such as Travel, Sports, and News) shipped with the Windows 8 release. Like Google Analytics, Appex Telemetry logs usage data such as which user used which application and for how long. This data is processed through COSMOS (internal to Microsoft, similar to Hadoop), pushed into a database using SSIS, and from there into SSAS cubes. Reports are made using SSRS and Excel.

My Role:
(i) Used ETL tools like SSIS/DTS for data flow from source files such as XML, Excel, tables, and views to other databases or files with proper mapping.
(ii) Based on business requirements, developed complex SQL queries with joins and T-SQL, stored procedures, views, and triggers to implement business rules and transformations.
(iii) Monitored performance and optimized SQL queries for maximum efficiency.
(iv) Designed cubes and automated cube processing using SSIS.

• SQL Sustained Engineering for Microsoft: Allows Microsoft to analyze issues faced by customers using SQL products across the globe. Information is captured, and the incident is assigned a ticket based on the customer's subscription type. Depending on the information, a decision is taken on whether the problem has a workaround or needs a hotfix, i.e., whether it is an RFC or an RFH. Used SSAS, SSIS, SSRS, SQL Server, Silverlight 4.0, and Entity Framework.

My Role:
(i) Used SSIS to unite data from the existing system and performed transformations on MS SQL 2008.
(ii) Designed cubes with a star schema using SQL Server Analysis Services 2008 (SSAS).
(iii) Created several dashboards and scorecards with Key Performance Indicators (KPIs) in SQL Server 2005 Analysis Services (SSAS).
(iv) Automated cube processing and partition creation using SSIS packages.
(v) Enhanced and deployed SSIS packages from the development server to the production server.
(vi) Involved in designing, developing, and deploying reports in the MS SQL Server environment using SSRS 2008 in Business Intelligence Development Studio (BIDS).

• SGVM Scorecard for Microsoft: SGVM is an intranet vendor-performance-measurement application with an SSRS reporting engine. It implements a compliance matrix that calculates and applies weightings to determine a compliance score. Designed cubes using SSAS, used as input to SSRS. Data movement from the OLTP database to the data warehouse is done by ETL (SSIS) packages, and business data is processed into OLAP storage by implementing cubes.

My Role:
(i) Designed the OLTP database and data warehouse.
(ii) Designed SSIS packages to extract, transform, and load (ETL) existing data into SQL Server from different environments for the SSAS cubes.
(iii) Involved in analyzing, designing, building, and testing OLAP cubes with SSAS 2008 and in adding calculations using MDX.
(iv) Designed and implemented the data mart, facts, dimensions, and OLAP cubes using dimensional modeling standards in SQL Server 2005/2008. Involved in the complete data warehouse development life cycle.
(v) Actively supported business users with change requests.

Eli Research India Oct 2009 – Feb 2010

Project:



• Beckett.com: This site is about the gaming-card business, which is very popular in the USA. People trade their cards on this site and can also get their cards graded through it. Grading is the process by which the condition of a card is ascertained and assigned points. A person can subscribe on this site to view pricing for different types of cards.

My Role:
(i) Designed and developed SSIS packages to extract data from various data sources such as Access databases, Excel spreadsheets, and flat files into SQL Server 2005 for further data analysis and reporting.
(ii) Responsible for T-SQL tuning, optimizing long-running report queries on SQL Server 2008.
(iii) Designed cubes and used MDX queries for SSRS reports.

Team Lead GenX Info Technologies, India Apr 2008 – Apr 2009

• LSS (Loan Servicing Solution): Loan Servicing Solution (LSS) provides a back-office portal for the day-to-day administration and special servicing of live mortgage loan accounts. LSS employs a service-oriented architecture, with all core components communicating using XML Web Services. It employs core DPR components for cross-layer communication and data retrieval and push, and provides a one-point abstract implementation for loading and saving database entities.

My Role:
(i) Responsible for creating stored procedures, views, user-defined functions, triggers, and indexes.
(ii) Built ETL packages for pushing data into the data warehouse's dimension and fact tables using the Slowly Changing Dimension (SCD) transformation.
(iii) Worked on MDX queries, calculated members, aggregations, and KPIs after building cubes in SSAS.

Sr. Software Engineer FIS Chandigarh, India Aug 2006 – Mar 2008

• Partsearch Application Management (CSR): Partsearch has developed the industry's largest and most comprehensive Master Parts Catalog (MPC), which centrally aggregates data from hundreds of manufacturers across multiple product categories.
(i) Involved in design and data modeling using a star schema.
(ii) Created several SSIS master-child packages for loading data from various heterogeneous sources, cleansing data, and populating the data marts.
(iii) Optimized packages by minimizing the use of blocking transformations, adjusting buffer sizes, executing packages in the control flow, and using proper event handling, logging, and checkpoints.
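The blocking-versus-streaming distinction behind the optimization in (iii) can be illustrated outside SSIS: a streaming (synchronous) transformation emits each row as it arrives and keeps memory flat, while a blocking transformation (such as a sort or aggregate) must buffer the entire input before emitting anything. A small Python sketch of the contrast:

```python
def streaming_upper(rows):
    """Row-by-row transform: emits each row immediately, like an SSIS
    Derived Column -- memory use stays flat regardless of input size."""
    for row in rows:
        yield row.upper()

def blocking_sort(rows):
    """Full-blocking transform: must collect every row before emitting
    any, like an SSIS Sort -- memory grows with the whole input."""
    buffered = list(rows)       # the entire input is buffered first
    return sorted(buffered)

rows = ["delta", "alpha", "charlie"]
streamed = list(streaming_upper(rows))   # rows were available incrementally
sorted_rows = blocking_sort(rows)        # nothing available until all buffered
```

This is why SSIS tuning favors pushing sorts and aggregations into the source query where possible and keeping the data flow to streaming transforms.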

Sr. Software Engineer Interglobe Technologies Pvt. Ltd, India Sep 2005 – Aug 2006

• Trip Manager (www.tripmanager.com): This website caters to the international travel-related needs of corporate companies, such as flight, hotel, car, and train reservations and ticketing.
(i) This project was in ASP.NET and used C#, JavaScript, Oracle 9i, XML, and XSLT.
(ii) I was primarily involved in architecture design and coding.

In all other previous companies, I worked as a software developer using .NET technology and VB6.

Software Engineer Iris Software Pvt. Ltd, India Jan 2004 – Sep 2005

Software Engineer Five Dimension Pvt. Ltd, India Jan 2001 – Jan 2004

Languages and Technologies

• C#, VB.NET, VB6, WCF, SQL Server 2012, SSAS, SSIS, SSRS, JavaScript, jQuery, AJAX, Oracle 9i, COSMOS
• Visual Studio; Microsoft SQL Server; Eclipse; TFS
