• Architected and designed ETL to merge 100,000 databases into a central database on an AWS Redshift cluster using Python, SSIS, and shell scripting.
• AWS: Worked on CloudFormation, S3, Data Pipeline, DynamoDB, RDS, Redshift, Lambda functions, SNS, and VPC.
• API integration with multiple data sources (Salesforce, internal MB API, Stripe), with millions of rows loaded daily.
• Data Analysis: Wrote complex, advanced SQL queries to perform data analytics and handed results over to end users.
• Data Modelling: Applied dimensional modelling and 3NF to databases for better performance of the central databases and reporting (Tableau).
• QA: Used Python to write complex ETL scripts and analytical functions for QA and testing of the AWS Redshift database.
- Data Warehouse Engineer II at MindBody Inc
- Technology Analyst at Bank of America, Infosys Limited
- Research Analyst at Florida International University
- System Engineer at Bank of America, Infosys Limited
1 year, 2 months at this Job
- Master of Science - Information Systems
- Bachelor of Technology - Computer Science
Data Representation Lead, Data Warehouse Intern Team Lead
• Designed and developed a data-metric framework and highly responsive runtime pivoting tools, giving management deeper insight into the data.
• As part of the Data Engineering team, I helped design and develop a data pipeline using Talend Integration Studio, Java, and MySQL partitioning to create a near real-time ETL process, which included reading streamed log data, staging, and creating a snowflake schema.
• As Data Warehouse Intern Lead, I was responsible for hiring, onboarding, training, and mentoring interns on their projects. The internship's ultimate goal was to ensure interns had the best time while learning and delivering projects and tools that sped up data engineers' velocity.
• Responsible for cross-team communication and coordination for data pulls and other stories and projects.
• A good team player and a quick learner.
- Data warehouse engineer at Pinger Inc
- Data warehouse intern at Pinger Inc
- Software engineer at Maven systems Pvt. Ltd
2 years, 6 months at this Job
- MS in Software Engineering - Software Engineering
- BE - Computer Engineering
Technology Stack: PLX script, Google SQL, Dremel SQL, PLX workflow, Tableau 9.5/10, BigQuery, Python
● Responsible for the implementation and onboarding of the Google DCstat BI system.
● Conducted workshops with clients for requirements gathering and analysis.
● Designed and developed ETL mappings using PLX script.
● Scheduled workflows through PLX workflow scripts.
● Wrote complex SQL using analytical functions, list-aggregate functions, virtual functions, and inline views in Google SQL and Dremel SQL.
● Performed data analysis using complex SQL.
● Created dashboards using Tableau versions 9.5 and 10.
● Created Python scripts using libraries such as NumPy, pandas, and Beautiful Soup.
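As a small illustration of the Python analysis scripting listed above, a pandas/NumPy sketch that mirrors a SQL analytic (window) function; the data and column names are invented for the example:

```python
import numpy as np
import pandas as pd

# Hypothetical revenue rows; the groupby/transform below mirrors a SQL
# analytic function such as AVG(revenue) OVER (PARTITION BY region).
df = pd.DataFrame({
    "region": ["us", "us", "eu", "eu"],
    "revenue": [100.0, 150.0, 80.0, 120.0],
})

# Per-region average attached to every row, like a window function.
df["region_avg"] = df.groupby("region")["revenue"].transform("mean")

# NumPy for a plain summary statistic across all rows.
overall = float(np.mean(df["revenue"]))
```

`transform` keeps the original row count, which is exactly the shape an analytic function returns in SQL, as opposed to a collapsing `GROUP BY`.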
- Data Warehouse Engineer at Google
- Senior Consultant at Oracle India Pvt. Ltd
- Senior Software Developer at Incedo
- IT Analyst at Tata Consultancy Services
9 months at this Job
- Bachelor of Engineering in Computer Science - Computer Science
• Successfully developed a framework for loading customers' aggregated Smart Phone and SDK TV usage info into MySQL relational databases for Reporting, Dashboarding and Ad-Hoc Analyses, which revealed ways to lower operating costs and offset the rising cost of programming.
• Created new Tableau dashboards, redesigned and converted existing reports to Tableau, and performed Tableau Server administration tasks.
• Successfully designed and created complete dynamic SSIS ETL packages to validate, extract, transform, and load data from the LeEco Le-Mall online store transaction database to the data warehouse system.
• Evaluated business requirements to produce Informatica mapping designs that adhere to Informatica standards.
• Collaborated with business analysts and data analysts to support their data warehousing and data analysis needs.
• Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved critical technical problems.
- Senior Data Warehouse Engineer at LeEco Technology Inc
- Senior ETL Engineer at First Republic Bank
- Senior Data Warehouse Consultant at Hewlett Packard
- Senior Data Warehouse Engineer at HuaChun POS Systems Inc
2 years, 6 months at this Job
- Master of Science in Management Information Systems - Management Information Systems
- BA in International Economics - International Economics
Business Intelligence Developer
• Orchestrate the data movement from Microsoft On-Premise SQL Server to Azure Blob Storage, utilize PolyBase for data loading from Blob Storage into Azure SQL Data Warehouse, and deliver transformed data into Azure SQL Database using Elastic Query.
• Create and maintain Cognos/UltiPro HR demographics reports per end user specifications.
• Utilize SSIS to consume McKesson ANSOS OneStaff Scheduling file to load, transform, and generate output of data per UltiPro specifications.
• Designed and developed SSIS (ETL) packages to process Excel and flat files into databases for internal customers' use.
• Deploy code changes on a weekly basis using Visual Studio, DBBest, and a Git repository.
- Data Warehouse Engineer at Steward HealthCare Network
- Senior Advisor at EMC Corporation
- Principal Developer at EMC Corporation
- Principal Developer at EMC Corporation
2 years at this Job
- Master of Science in Management Information Systems - Management Information Systems
- Bachelor of Science
My position at DAU encompasses several roles:
➢ The bulk of my work, as an Informatica Developer, consists of updating existing mappings and workflows to reflect changes in the underlying database schema or to satisfy new requirements for various processes.
➢ I modify sources, targets, and transformations to expose or create new data elements.
➢ In some cases, I use these new data elements to tune the mappings and workflows to run in a more efficient manner by eliminating superfluous lookups and streamlining the mapping logic.
➢ I also create new mappings and workflows to satisfy new reporting capabilities as needed. Developed a process that provides updated data to the reporting system 'Qlik'.
➢ Developed a process to identify API interfaces for Kaltura MediaSpace analytics and loaded the Kaltura historical data into the Qlik reporting system using Informatica.
Data Mart Analyst:
➢ As a Data Mart Analyst, I investigate issues in data mart performance and data quality.
➢ Worked on service requests from the customer concerning issues in the Data Mart, such as a record or set of records with incorrect data. I isolate and correct the records with either MS SQL queries or an ad hoc Informatica mapping.
➢ Then, I work with the members of the Data Integration Team or the customer who reported the issue to ensure the changes I have made have resolved it.
Informatica System Administrator:
➢ As an administrator for the Informatica system at DAU, I work with the Database Administrators to keep the Informatica PowerCenter/PowerAnalyzer software in sync with the supporting repositories and data sources and targets.
➢ In addition, I work with the Data Integration Team to resolve Informatica process failures to keep the DAU databases and the reports built off of them accurate.
➢ I was involved in the migration of metadata from Informatica PowerCenter 9.5 to Informatica PowerCenter 10.1.
➢ I have displayed willingness and ability to learn complex existing processes in order to support continuing operations.
Environment: Informatica PowerCenter 9.5/10.1, Toad, SQL Server
- Senior Informatica Developer/DATA Warehouse Engineer at Qbase/ Defense Acquisition University
- ETL Developer/ Programmer Analyst at Level 3 Communications, Inc
- Developer/Analyst at University of Maryland
3 years, 2 months at this Job
- Master of Science
• Designed, developed, tested, and implemented numerous SSIS packages that backfill CVS data with resume data.
• Database Administrator support: backed up and restored databases and created new ones utilizing Database and Service Master Key elements.
• Production support of nightly SQL jobs; fixed/upgraded SQL Server jobs so they complete without issues (50% of jobs needed tuning).
• Data modeling and UML experience redesigning new databases for the TalentAI application.
• Analysis of fuel report from Bullhorn Azure tables utilizing COGNOS.
- Database and Data Warehouse Engineer at Motion Recruitment Partners
- Sr. SSIS Consultant at BAE Systems
- Sr. SSIS Consultant at
- Sr. Data Warehouse Consultant and DBA at Harvard University
6 months at this Job
- Bachelor of Science - MIS
- Data Warehouse Engineer II at GolfNow
- Data Engineer at GolfNow
- Software Engineer at INCHARGE DEBT SOLUTIONS
- Technical Consultant at QuickPivot
8 months at this Job
- Bachelor's - Computer Science
• Involved in development and implementation of SSIS, SSAS and SSRS application solutions for various business units across the organization
• Developed SQL scripts for administration tasks
• Maintained SQL scripts, indexes, complex queries for data analysis and extraction
• Managed database and transaction log files, including auto-loan log files, using scheduled scripts
• Upgraded MS SQL Server 2008 R2 databases to MS SQL Server 2012 and MS SQL Server 2014 using Detach/Attach and Backup & Recovery
• Expertise in creating SQL data models/Queries for Power BI and Ad-hoc Reports.
• Used various SSIS tasks such as Conditional Split and Derived Column for data transformation; supported creating KPIs, calculated members, measure groups, and MDX queries
• Designed and developed reports using Tableau.
• Developed, designed, and implemented end-to-end BI solutions using the Microsoft BI suite, Tableau, and Power BI
• Experience in configuring and deploying SSRS reports onto Microsoft Office SharePoint Server
• Utilized Tableau & Power BI server to publish and share the reports with business users.
• Created data models using QlikView scripting based on a 3-tier architecture.
• Worked on AWS (Amazon Web Services) applications.
• Integrated SAP ECC, SAP BW with IBM Information Server.
• Maintained and managed all the Linux servers.
• Involved in developing database objects such as procedures, triggers, constraints, indexes, and views. Used T-SQL to build dynamic stored procedures. Installed the Netezza Performance Portal.
• Enabled communication between the Oracle and Vertica databases and created compatible queries to fetch data from the Vertica database into the Oracle database.
• Extensively worked in Service Oriented Architecture (SOA) through WCF, RESTful Services and C#.NET.
• Implementing logical and physical data modeling with STAR schema using Erwin in Data Mart.
• Built application prototype(s) using SharePoint Out of the Box features
• Installation, configuration of SharePoint Farm
• Planned and executed migration of large data from Lotus Notes to SharePoint 2010
• Developed SSIS packages for File Transfer from one location to the other using FTP task
• Used ETL to implement Slowly Changing Dimension transformations to maintain historical data in the data warehouse
• Performed Teradata and Informatica performance tuning
• Worked on MS SQL Server, Vertica, and Amazon Redshift as database tools.
• Design and development of SSAS Tabular cubes to satisfy the end user reports by creating different types of Reports and Dashboards in Power BI.
• Created complicated DAX measures in the Power BI reports as per the requirements of the Business Users.
• Extensively created and used various Teradata set tables, multiset tables, global temporary tables, and volatile tables.
• Developed several Tableau sales reports using sales data for different vendors.
• Developed modules of application in ASP.NET and involved in writing C# classes.
• Involved in development of UI and server-side code (code-behind files) using ASP.NET
• Writing/maintaining UNIX scripts for process monitoring.
• Created Power BI Dashboards with interactive views, trends and drill downs along with user level security
• Developed, created, and shared reports, charts, and dashboards by applying different filters to SSAS cubes using the Microsoft Power BI tool and the Cube View reporting tool.
• Worked on ASP.NET validation controls for validating the personal information provided by the Customer.
• Created SSAS OLAP Cubes and Partitioned Cubes. Developed Multi-Dimensional Objects (Cubes, Dimensions) using MS Analysis services (SSAS). Created Sub-Reports, Drilldown-Reports, Summary Reports, and Parameterized Reports in SSRS
• Generated ad-hoc reports using MS Reporting services and Crystal reports
• Worked on SharePoint website to communicate project release notes and technical documentation.
• Involved in Deploying the project by using AWS (i.e., Amazon Web Service)
• Worked with the Erwin data modeling tool; good knowledge of relational database modeling/loading and dimensional database modeling concepts. Developed and deployed apps with Splunk.
• Database Programming experience using Oracle 10g, 9i, 8.x/7.x, DB2, MS SQL Server 8.0/7.0/6.5, SQL, and PL/SQL.
• Performed data analysis and data profiling using complex SQL on various sources systems including Oracle and Netezza.
• Underwent complete life-cycle implementation with experience in data modeling, extraction, loading, scheduling, monitoring, reporting, and performance tuning. Used ADO.NET strongly typed datasets to insert, update, and retrieve data from a SQL Server database, and ADO.NET with DataGrid for data manipulation. In-depth knowledge of Tableau Desktop, Tableau Reader, and Tableau Server.
• Generated DDL Scripts from Physical Data Model using technique of Forward Engineering in Erwin.
• Responsible for adding and maintaining the data on the Splunk server. Created data models using QlikView scripting.
• Performed server and application checks for any alerts, warnings, and errors and resolved issues, using monitoring tools such as Splunk and Log Parser to understand the logs.
• Ran automated regression tests on SQL Server, Infobright, and Redshift
• Wrote Hive queries to fetch data
• Hosted SSRS reports on SharePoint. Exposure to UNIX commands & scripting.
• Worked exclusively with Teradata SQL Assistant to interface with Teradata.
• Experience using QlikView functions (date and time, join, string, and input fields, etc.)
• Created UNIX and PL/SQL scripts for automating resolution of various data related issues and user administration
• Involved in designing and developing logical and physical data models to best suit the requirements.
• Connected QlikView to different databases such as PostgreSQL, MS SQL Server, and MySQL.
• Used SSRS to create reports, customized Reports, on-demand reports, ad-hoc reports and involved in analyzing multi-dimensional reports in SSRS. Created reports using SSRS from OLTP and OLAP data sources and deployed on report server.
• Involved in analyzing source systems and designing the processes for Extracting Transforming and Loading the data to Teradata database. Extensively used Erwin for developing data model using star schema methodologies.
• Developed charts and Graphs using Power Pivot.
• Created Dimensional and Relational Physical & logical data modeling fact and dimensional tables using Erwin.
• Developed parameterized SSRS reports and a Power Pivot workbook using data from the Tabular model and Cube data model to prove the functionality of the database.
Environment: Windows 2003 Server/XP, WCF, UNIX, SQL Server 2014, Linux, Teradata, Vertica, DDL, SQL Server Integration Services (SSIS), Power BI, Netezza, Splunk, SQL Server Reporting Services (SSRS), SSAS, Business Intelligence Studio, Visual Studio 2008/2005, Team Foundation Server, Tableau, VB.NET, OLE DB, QlikView, DTS packages, SharePoint MOSS 2007, Crystal Reports, Erwin 4.0, HTML.
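One of the bullets above mentions Slowly Changing Dimension handling to keep history. A minimal Type 2 sketch in plain Python, assuming a hypothetical dimension layout (the real implementations ran inside SSIS/ETL tooling):

```python
from datetime import date

# Minimal Type 2 Slowly Changing Dimension update in plain Python.
# The dimension is a list of versioned rows; field names are hypothetical.

def apply_scd2(dimension, incoming, today):
    """Expire the current row for a changed business key and append a new
    current version; an unchanged key leaves the dimension untouched."""
    for row in dimension:
        if row["key"] == incoming["key"] and row["end_date"] is None:
            if row["attrs"] == incoming["attrs"]:
                return dimension          # no change -> no new version
            row["end_date"] = today       # close out the old version
            break
    dimension.append({
        "key": incoming["key"],
        "attrs": incoming["attrs"],
        "start_date": today,
        "end_date": None,                 # open-ended = current version
    })
    return dimension
```

Loading the same key with identical attributes is a no-op; a changed attribute closes the old row and opens a new one, which is what preserves history for point-in-time reporting.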
- Senior SQL Developer/ Data Warehouse Engineer at IBM
- Senior SQL Developer/ Data Warehouse Engineer at ACS Xerox Service Solutions
- SQL Developer/Data Warehouse Engineer at First Care
- ETL Developer/ MSBI Developer (SSIS, SSAS, SSRS) at First National Bank
3 years at this Job
• Developed ETL data pipelines for different Kafka topics (web clickstream data - page views/clicks/checkout, relevance, tokens, subscriptions, push notifications, mobile events, etc.) to perform data ingestion and unfurl JSON into a warehouse schema; consumed data from raw logs through Kafka, then transformed and loaded it into Teradata via TPT and Hive on reporting/ETL clusters.
• Designed and developed the complete attribution pipeline for orders, subscriptions, and claims placed on the Groupon platform, attributing different types of transactions to the corresponding paid and unpaid marketing channels by analyzing traffic events (from web and mobile platforms) for the user. The ETL pipelines were built on a Spark/Scala/PySpark framework, and data was pushed to Hadoop ETL clusters and Teradata for end-user reporting.
• Built data pipeline for marketing incentives project by extracting data from APIs using Python.
• Optimized existing ETL pipelines by converting Hive-based data pipelines to more efficient Spark/Scala-based pipelines. Also tuned Spark/Scala, Hive, and Teradata-based ETL jobs.
• Tested ETL jobs via a home-grown "zombie runner" framework, executing Python scripts and Hive/SQL/Spark tasks written in YAML.
• Troubleshot and remediated issues impacting processes in the ETL framework, modifying existing code to provide defect fixes for existing ETLs. Coordinated and collaborated with the ETL team and business users to implement ETL procedures for all new projects, maintained effective awareness of all production activities according to required standards, and provided support to all existing applications.
• Used scrum/agile methodology (weekly deployments) to constantly deliver development, maintenance, and on-call support tickets in various domains.
• Scheduled changes in OPWISE; 24x7 on-call support in a rotation model.
Environment: Big Data, Hadoop (Cloudera & Hortonworks), Spark, Scala, Hive, Sqoop, Python, Teradata, MySQL, Unix, Git.
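The pipeline bullets above mention unfurling JSON clickstream events into a flat warehouse schema. A minimal sketch of that flattening step, with hypothetical field names (the production pipelines used Spark/Scala and Kafka consumers rather than plain Python):

```python
import json

# Sketch of the "unfurl JSON" step: a nested clickstream event (as it might
# arrive from a Kafka topic) is flattened into the flat columns a warehouse
# table expects. All field names here are hypothetical.

def flatten_event(raw):
    event = json.loads(raw)
    return {
        "event_type": event.get("type"),
        "user_id": event.get("user", {}).get("id"),
        "page": event.get("context", {}).get("page"),
        "ts": event.get("ts"),
    }
```

Using `.get` with empty-dict defaults keeps the loader tolerant of partially populated events, so a missing nested field lands as NULL in the warehouse instead of failing the batch.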
- Data Engineer at Groupon
- Senior Data Warehouse Engineer at Intuit, Menlo Park
- Informatica Consultant at Salesforce
- Senior Data Warehouse Engineer at Saama Technology Inc
3 years, 6 months at this Job
- Bachelor of Engineering - Electrical and Electronics Engineering