Project: One Store Migration to Salesforce ExactTarget and Forecast B2B
Mattel, known worldwide as the home of Barbie®, Hot Wheels®, and countless other successful toy franchises, is, together with the Mattel family of companies, the worldwide leader in the design, manufacture, and marketing of toys and family products. The objective of this project is to capture customers' abandoned items (purchases that failed during the checkout process) from all the franchises around the world, assign each customer email address a single promo code, and load the data into the data warehouse for analysis. The data flows from different sources (flat files, Oracle).
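A minimal Oracle SQL sketch of the promo-code assignment described above; the table names (ABANDONED_CART_STG, PROMO_CODE_POOL, CUSTOMER_PROMO) and their columns are illustrative assumptions, not the actual Mattel warehouse schema.

```sql
-- Pair each distinct abandoned-cart email address with exactly one unused promo code.
-- All object names below are hypothetical stand-ins for the real warehouse tables.
MERGE INTO customer_promo tgt
USING (
    SELECT e.email_address, c.promo_code
    FROM  (SELECT email_address,
                  ROW_NUMBER() OVER (ORDER BY email_address) AS rn
           FROM  (SELECT DISTINCT email_address FROM abandoned_cart_stg)) e
    JOIN  (SELECT promo_code,
                  ROW_NUMBER() OVER (ORDER BY promo_code) AS rn
           FROM   promo_code_pool
           WHERE  assigned_flag = 'N') c
          ON c.rn = e.rn                        -- one unused code per email
) src
ON (tgt.email_address = src.email_address)      -- an email already assigned keeps its code
WHEN NOT MATCHED THEN
    INSERT (email_address, promo_code, assigned_dt)
    VALUES (src.email_address, src.promo_code, SYSDATE);
```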
• Interacted with users and business analysts to gather requirements on incidents and service requests, and worked with the offshore team to get issues and incidents resolved.
• Analyzed the source data and the related business processes so that new data sources could be seamlessly integrated into the existing data warehouse.
• Performed data analysis and gap analysis between Silverpop and Salesforce ExactTarget, and provided inputs for the migration from Silverpop to Salesforce ExactTarget.
• Gathered requirements and participated in solution discussions with stakeholders.
• Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
• Created highly complex, detailed technical design specifications and developed detailed analysis, design, construction, and testing specifications, ensuring technical compatibility and integration for ExactTarget.
• Designed the incremental loading process to load data into staging tables (see the incremental-load sketch after this list).
• Used dynamic filters in the mappings and the Debugger to test the mappings and fix bugs.
• Worked on XML sources to load data into .CSV files and wrote a UNIX script to remove special characters from the XML sources.
• Developed a mapping to generate email notifications and parameterized the email address.
• Worked on different workflow tasks such as Session, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
• Developed complex mappings with different transformations such as Router, Lookup, Sorter, Stored Procedure, Normalizer, Filter, Update Strategy, and Aggregator.
• Wrote shell scripts for accessing flat files from the FTP location and archiving the files.
• Worked with Pre-Session and Post-Session UNIX scripts for automation of ETL jobs and to perform operations like gunzip, remove and archive files.
• Extensively used the Informatica Debugger to identify problems in mappings; also involved in troubleshooting existing ETL bugs.
• Investigated and fixed bugs that occurred in the production environment and provided on-call support.
• Created and reviewed unit and integration test plans/scripts, and created and reviewed project deliverables according to software development life cycle methodologies.
• Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
• Wrote SQL logic for extracting the data from the Oracle DB.
• Developed a mapping where sales from the different businesses are sent as input and the forecast is populated as output.
• Created Data Extensions in ExactTarget and imported files for testing.
• Participated in client calls and prepared clarification lists to resolve open questions.
• Prepared the list of requirements and sought review inputs from key stakeholders.
• Provided excellent support during QA/UAT testing by working with multiple groups.
• Monitored batch jobs using the UC4 and CA job-scheduling systems.
• Documented and maintained code using change-control software such as PVCS.
• Provided off-hours/weekend support during production deployment and UAT.
• Informatica Data Quality (IDQ 8.6.1) was the tool used for data quality measurement.
• Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.
• Designed and performed all activities related to migration of ETL components between environments during development and deployment (e.g., Dev to SIT, SIT to UAT, and UAT to PROD repositories).
Environment: Informatica PowerCenter 9.6, UNIX, Windows Server 2003, Windows Server 2008 R2, UNIX shell programming, Cognos, Oracle 11g/10g, SQL, PL/SQL, SQL Developer, Silverpop, Salesforce ExactTarget, CA, UC4, IDQ 8.6.1
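A minimal sketch of the incremental staging load referenced above, assuming a hypothetical ETL_CONTROL high-water-mark table and ORDERS_SRC/ORDERS_STG tables rather than the actual source and staging objects.

```sql
-- Incremental (delta) load into a staging table, driven by a high-water mark.
-- ETL_CONTROL, ORDERS_SRC and ORDERS_STG are illustrative names only.
DECLARE
    v_last_run DATE;
BEGIN
    -- High-water mark captured by the previous run
    SELECT last_extract_dt INTO v_last_run
    FROM   etl_control
    WHERE  job_name = 'ORDERS_STG_LOAD';

    -- Pull only rows changed since the last run
    INSERT INTO orders_stg (order_id, email_address, order_amt, updated_dt)
    SELECT order_id, email_address, order_amt, updated_dt
    FROM   orders_src
    WHERE  updated_dt > v_last_run;

    -- Advance the high-water mark for the next run
    UPDATE etl_control
    SET    last_extract_dt = SYSDATE
    WHERE  job_name = 'ORDERS_STG_LOAD';

    COMMIT;
END;
/
```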
- Sr ETL Developer / Informatica Developer at Mattel
- ETL Developer at State of Georgia Eligibility System IES
- Sr ETL Developer / Informatica Developer at Dunkin Brands, MA
- Sr ETL Developer / Informatica Developer at Applabs
3 years, 4 months at this Job
• Responsible for developing and maintaining ETL jobs, including ETL implementation and enhancements, testing and quality assurance, troubleshooting issues and ETL/Query performance tuning
• Participate in design and analysis sessions with business analysts, source-system technical teams, and end users.
• Wrote technical documentation and provided routine production ETL process support.
• Developed new components in Informatica Data Integration Hub (DIH), one of Informatica's newer tools, with a good understanding of DIH components and concepts.
• Developed Cloud Services tasks (Replication, Synchronization, Mapping Configuration) to load data into Salesforce (SFDC) objects.
• Designed, developed, maintained, and supported data warehouse and OLTP processes via Extract, Transform, and Load (ETL) software using Informatica, shell scripts, DB2 UDB, and Autosys.
• Responsible for user administration and maintenance of the Informatica Cloud Services Secure Agent on the UNIX server for the Dev/QA environments.
• Developed Informatica mappings, mapping configuration tasks, and taskflows using Informatica Cloud Services (ICS).
• Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
• Performed data analysis, requirements gathering, and design and development of code.
• Involved in Performance tuning for sources, targets, mappings and sessions.
• Helped IT reduce the cost of maintaining the on-campus Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
• Worked with Master Data Management (MDM) team to load data from external source systems to MDM hub.
• Manage and expand current ETL framework for enhanced functionality and expanded sourcing.
• Utilization of Informatica IDQ to complete initial data profiling and matching/removing duplicate data.
• Translated business requirements into ETL and report specifications. Performed error handling using session logs.
• Analyzed data using complex SQL queries, across various databases.
• Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
• Involved in creating database objects like tables, views, procedures, triggers, and functions using T-SQL to provide definition, structure and to maintain data efficiently.
• Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
• Wrote reports using Tableau Desktop to extract data for analysis using filters based on the business use case.
• Code reviews of ETL and SQL processes. Worked on upgrading Informatica from version 9.6.1 to 10.1.
• Developed UNIX Shell scripts to execute the workflows using PMCMD utility and used Autosys scheduler for automation of ETL processes.
• Scheduling the Informatica Cloud Service jobs using Informatica Cloud task scheduler.
• Teradata stored procedure, view, and BTEQ development, and participation in code review meetings.
• Involved in the implementation of SCD Type 1 and SCD Type 2 data load strategies (see the SCD Type 2 sketch after this list).
• Designed and developed several SQL Server stored procedures, triggers, and views.
Environment: Informatica PowerCenter 9.6.1/10.1, Informatica Cloud, Informatica Data Quality (IDQ) 9.6.1, Informatica MDM 9.5, Oracle 11g, DB2, Teradata 14, Tableau Desktop 9.3, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, Autosys, Cognos, Erwin Designer, SQL, PL/SQL, UNIX, MS SQL Server 2016.
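A minimal sketch of an SCD Type 2 load as mentioned in the bullets above: the current dimension row is expired when a tracked attribute changes, then a new current version is inserted. CUSTOMER_DIM, CUSTOMER_STG, and CUSTOMER_DIM_SEQ are assumed names, not the actual warehouse objects.

```sql
-- 1. Close out current rows whose tracked attributes have changed
UPDATE customer_dim d
SET    d.current_flag     = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   customer_stg s
               WHERE  s.customer_id = d.customer_id
               AND   (s.address <> d.address OR s.segment <> d.segment));

-- 2. Insert a new current version for new and changed customers
INSERT INTO customer_dim (customer_key, customer_id, address, segment,
                          effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.segment,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```

An SCD Type 1 load would simply overwrite the changed attributes in place (e.g., via MERGE) instead of versioning the rows.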
- Sr. ETL Developer
- ETL Developer at ACS Honeywell, NJ
- Informatica Developer at Bank of America, NJ
- ETL Developer at
1 year, 10 months at this Job
- M.Tech - Control Systems and Instrumentation Engineering
- B.E - Electrical & Electronic Engineering
- ETL DEVELOPER at ACCENTURE SERVICES PVT LTD
- Repository Manager, Designer, Workflow Manager & Workflow Monitor at FANNIE MAE
- ETL DEVELOPER at HEXAWARE TECHNOLOGIES LTD
- MIDAS, LMWS, Cash Manager, AR Billing PeopleSoft at FREDDIE MAC
2 months at this Job
- B.TECH. - GALGOTIA
Malta, NY Jan 2017 - Till Date
Employer Name - GlobalFoundries Inc.
Role - Oracle PL SQL Developer / Senior ETL Developer
Project Title - GlobalFoundries Manufacturing Integration Warehouse
The MT department in GF maintains semiconductor manufacturing route data for the whole life of a wafer. A globally deployable, secure, standard data services platform facilitates access to all raw, cleansed, enriched, and calculated engineering and manufacturing data from a standard architecture and a common
● Development of sophisticated data warehouse solutions, as Application Owner, based on a good understanding of the overall system architecture, data flow, and usage within GLOBALFOUNDRIES
● Oracle Database design and development, including logical and physical schema design
● Oracle Database application performance tuning and optimization
● Oracle Performance Tuning using Oracle Exadata new features
● Define architecture and design approaches to address customer requirements.
● Manage requirements backlog (including bug fixes) against applications. This includes providing / verifying effort estimates on proposed solutions and managing a release roadmap.
● Oversee implementation work done by the external partner (prioritize maintenance work, hold design and code reviews)
● Assure creation / maintenance of required documentation, including design specifications for fixed price projects, architectural diagrams.
● ETL development work with Ab-Initio
- Oracle PL SQL Developer / Senior ETL Developer at Data Solutions Group
- at Clarity One
- Oracle PL SQL Developer / Senior ETL Developer / Data Modeler at M. D. Anderson Cancer Center
- at EResearch
2 years at this Job
• Created and Modified PL/SQL packages, Triggers, Procedures, Functions and Cursors.
• Created/updated tables, indexes, materialized views, synonyms, and sequences per requirements.
• Analyzed the source 9i database and the target 11g database before and after migration.
• Responsible for analysis of issues and creation of Functional Specification and Design documents. After proper unit testing, moved the code to the QA environment and, after user acceptance in QA, moved it to the Production environment; thus, was involved in all stages of the SDLC.
• Used PL/SQL to Extract, Transform, and Load (ETL) data into the data warehouse and developed PL/SQL scripts in accordance with the necessary business rules and procedures.
• Extracted, Transformed and Loaded data into Oracle database using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).
• Used DBMS_SQLTUNE.REPORT_SQL_MONITOR package to generate SQL monitoring report and tune the queries.
• Performance Tuning of complex SQL queries using Explain Plan to improve the performance of the application.
• Worked on SQL tuning and optimization of the Business Objects reports; generated AWR reports for performance analysis and tuning opportunities.
• Generated periodic reports based on statistical analysis of data from various time frames and divisions using SQL Server Reporting Services (SSRS).
• Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
• Created Records, Tables, Objects, Collections (Nested Tables and Varrays), and Error Handling.
• Partitioned tables using range partitioning and list partitioning and created local indexes to increase performance and make the database objects more manageable (see the partitioning sketch after this list).
• Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.
• Extensively worked with BULK COLLECT, BULK INSERT, BULK UPDATE, and BULK DELETE operations for loading, updating, and deleting huge data volumes (see the BULK COLLECT sketch after this list).
• Used Oracle analytic functions such as RANK, DENSE_RANK, LEAD, LAG, LISTAGG, and ROW_NUMBER for sorting and ranking data.
• Used SQL*Loader to load data from Excel file into temporary table and developed PL/SQL program to load data from temporary table into base Tables.
• Used PUTTY & secure shell to connect to UNIX machines.
• Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.
• Developed complex reports using multiple data providers, user defined objects, charts, synchronized queries, and created star schema in SSAS to develop ad-hoc reports for the clients as per their requirements using SSRS in MS SQL 2005.
• Experience in deploying forms and reports on the web.
• Experience in developing forms based on views, tables and procedures in tabular and form layouts.
• Worked on several UNIX/Linux wrapper shell scripts (.ksh, .csh, .sh).
• Wrote shell scripts for processing the files, calling the PL/SQL packages and SQL scripts.
• Used SQL*loader to load the input files from various external systems into the database staging tables and storing it for further processing.
• Active participation in release, deployment, Data migration and Production Cut-Over activities work.
• Responsible for creating PLSQL Programs and UNIX Scripts for Data Validation and Data Conversion.
• Prepared a UNIX shell script for the transfer of files from FTP to SFTP.
Environment: Oracle 10g, TOAD, Windows XP, PL/SQL, SQL, Cognos 8.4, UNIX Shell Scripting, Informatica, Putty, HP Quality Centre, SSRS, SSAS, Business Objects XI R3.1
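A rough illustration of the range partitioning with local indexes mentioned above; SALES_FACT is a hypothetical table name, not the actual schema.

```sql
-- Range-partitioned table: old partitions can be dropped or archived cheaply.
CREATE TABLE sales_fact (
    sale_id   NUMBER        NOT NULL,
    sale_dt   DATE          NOT NULL,
    store_cd  VARCHAR2(10),
    sale_amt  NUMBER(12,2)
)
PARTITION BY RANGE (sale_dt) (
    PARTITION p_2016 VALUES LESS THAN (DATE '2017-01-01'),
    PARTITION p_2017 VALUES LESS THAN (DATE '2018-01-01'),
    PARTITION p_max  VALUES LESS THAN (MAXVALUE)
);

-- Local index is equipartitioned with the table, keeping index maintenance
-- confined to the affected partition.
CREATE INDEX sales_fact_dt_ix ON sales_fact (sale_dt, store_cd) LOCAL;
```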
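A minimal sketch of the BULK COLLECT / FORALL pattern referenced above, fetching in batches with a LIMIT clause; SRC_TRANSACTIONS and TGT_TRANSACTIONS are illustrative names and are assumed to have matching columns.

```sql
DECLARE
    CURSOR c_src IS
        SELECT txn_id, txn_amt, txn_dt FROM src_transactions;

    TYPE t_src_tab IS TABLE OF c_src%ROWTYPE;
    l_rows t_src_tab;
BEGIN
    OPEN c_src;
    LOOP
        -- Fetch a manageable batch instead of one row at a time
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;
        EXIT WHEN l_rows.COUNT = 0;

        -- One context switch to the SQL engine for the whole batch
        FORALL i IN 1 .. l_rows.COUNT
            INSERT INTO tgt_transactions VALUES l_rows(i);

        COMMIT;
    END LOOP;
    CLOSE c_src;
END;
/
```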
- ETL Developer at Dean Health
- ETL Developer at Home Away
- Data Warehousing Designer & Developer at Wolters Kluwer
- ETL Developer at Wells Fargo
1 year, 5 months at this Job
- B.S - Engineering
Microsoft Visual Studio 2008/2010/2012.
• Created a metadata-driven ETL framework to maintain consistency across all ETL developers and company standards.
• Responsible for performance tuning of stored procedures and database tables using table partitioning, SQL Profiler, and the Database Tuning Wizard.
• Hands-on experience with the overall ETL (Extract, Transform & Load) process.
• Skilled in high-level design of ETL DTS packages for integrating data from heterogeneous sources (Excel, CSV, Oracle, MySQL, PostgreSQL, flat files, text-format data).
• Hands on experience in MS SQL Server Integration Services (SSIS), MS SQL Server Analysis Services (SSAS) and MS SQL Server Reporting Services (SSRS) using Business Intelligence development studio (BIDS).
• Worked on upgrading from DTS to SSIS packages.
• Experience in designing database models using Microsoft Visio and creating class diagrams, activity diagrams, use case diagrams, sequence diagrams, and flow charts using UML.
• Experience in Design, Development, Implementation and Documentation of business requirements in Microsoft .NET framework 2.0/3.5 using Microsoft ASP.Net, Microsoft ADO.NET, C#.Net, VB.Net, Web Applications, Windows Applications and XML.
• Worked on Notification services in setting up the Scheduled jobs and alerts.
• Hands-on experience with data warehouse Star Schema modeling, Snowflake modeling, Fact & Dimension tables, and physical and logical data modeling.
• Proficient in implementing business logic to design/develop cubes, aggregations, and measures using SQL Server Analysis Services (SSAS).
• Expertise in generating reports using SQL Server Reporting Services, Crystal Reports, and pivot charts/tables in MS Excel spreadsheets.
• Extremely motivated, diligent, conceptually strong team player with ability to take new roles and adapt quickly to new technology.
• Detail-oriented, results-driven, excellent verbal and written communication skills with interpersonal and conflict resolution skills and possesses strong analytical skills.
- Sr. BI Developer / SSIS / SSAS / SSRS
- Sr. BI developer at SSIS/ SSAS / SSRS
4 years at this Job
- Bachelor's Degree - Computer Science
Client 1: Amex - Export Blue
Duration: July 2016 till 15th Nov 2017
Environment: SQL, UNIX Shell scripts, Informatica 9.6.1, Oracle
Role/Designation: ETL Developer/Tech lead
• Understood requirements and was a major stakeholder in planning the migration strategy.
• Worked as a team and met tight development and delivery deadlines.
• Followed Informatica best practices during development and provided proper solutions to the client to overcome issues.
• Involved in Data Extraction and Loading from Flat files, Oracle Sources to Teradata and Oracle tables using Informatica.
• Properly documented unit testing for the developed code and logic.
• Used Pushdown Optimization for loading through Informatica.
• Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
• Worked effectively in an onsite/offshore work model.
• Used pre- and post-session assignment variables to pass variable values from one session to another.
• Designed workflows with many sessions along with Decision, Assignment, Event-Wait, and Event-Raise tasks, and used the Informatica scheduler to schedule jobs.
• Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
• Performed unit testing at various levels of the ETL and actively involved in team code reviews.
• Identified problems in existing production data and developed one time scripts to correct them.
• Fixed invalid mappings and troubleshot technical problems with the database.
• Developed ETL processes using Informatica, ETL control tables, error logging, auditing, data quality checks, etc.
• Implemented data cleanup procedures, transformations, scripts, stored procedures, and execution of test plans for loading the data successfully into the targets.
- ETL Developer/Tech lead at IBM India Pvt Ltd
- Data Specialist at IBM India Pvt Ltd
- BTEQ, SQL Assistant at IBM India Pvt Ltd
- BTEQ, SQL Assistant, Multiload, FastLoad at Teradata India Pvt Ltd
1 year, 4 months at this Job
- Bachelor of Engineering - Engineering
GENESIS is a strategic data acquisition hub and data warehouse of Contracts, Positions and EOD Balances designed to be used by Finance, Risk, Compliance, Treasury and Regulatory Reporting functions. For the first time in Citi, all granular data from a variety of product processors and referential data sources globally for a given product set are uniformly represented with the GENESIS standard model for that product. GENESIS has simplified and optimized the system architecture by reducing the number of systems interfaces, data reconciliation points and associated costs.
Ab Initio developer implementing the Business requirements in the Genesis application.
• Created new data environments to incorporate business requests.
• Converted business requirements into a detailed technical structure for GENESIS data mart loading.
• Design & Develop ETL Ab Initio (Co>Op Sys 3.1.5) jobs to transform and load data from multiple source systems into Teradata Datamart as per Specifications mentioned in Low-Level, High-level and Functional design documents using Ab Initio GDE (version 3.1.2).
• Involved in Ab Initio Metadata Hub implementation to maintain the reference tables in the MHUB Oracle DB.
• Involved in scheduling jobs through Operational Console and migrating them to higher environments through Ab Initio EME tags.
Tools and Technologies used: Ab Initio Co>Op Sys 3.1.5, GDE 3.1.2, Ab Initio Metadata Hub, Oracle, Netezza, Operational Console, UNIX Shell Scripting, ETL, Putty, Windows 7
- ETL Developer at Citibank - New Jersey
- at Citibank - New Jersey
- at Citibank - New Jersey
10 months at this Job
- Bachelor's in Computer Science & Engineering - Computer Science & Engineering
• Worked as an ETL developer for Randstad Enterprise Reporting project to track business aspects like Revenue, Manager activities, Recruitment activities, Financials from Franchises and different applications of Randstad IT Services.
• Worked with Pentaho BI suite to build warehouse required for reporting Randstad business aspects.
• Worked with Pentaho dashboards and created Datamarts required for certain dashboards.
• Provided ETL solutions using PDI (Pentaho Data Integration) tool to fulfill business requirements for reporting and analysis.
• Experience with Business Analyzer tool to analyze Mondrian schemas and cubes for different reports.
• Maintained Facts and Dimensions and incremental loads to feed different dashboards and reports which contain reports integrated from various applications and sources.
• Worked with database sources like SQL Server, DB2 and Oracle to bring in data and integrate for reporting as an enterprise.
• Extensively worked with Oracle SQL developer and wrote complex oracle sql queries to fulfill the business requirements involving fiscal year metrics analysis.
• Wrote and modified PL/SQL procedures and functions to handle ad-hoc processes such as creating batch IDs for the daily data warehouse load and the reporting data mart loads (see the batch-ID sketch after this list).
• Worked with CRM tools such as PeopleSoft and BullHorn FO applications to analyze their backend database sources and pull data from their replicated databases into the reporting data warehouse.
• Involved in a POC to migrate to GCP (Google Cloud Platform) and loaded data into BigQuery.
• Created Pentaho jobs and transformations using version 8.1 (with BigQuery connectors) to load data from Oracle into BigQuery datasets and tables as part of converting the existing Enterprise Reporting data warehouse from on-premise to a cloud database.
• Worked end-to-end on setting up Google Cloud BigQuery connections in Pentaho using JSON authentication files, set up environment variables, and explored options for executing BigQuery queries using standard and legacy SQL.
• Mastered the Pentaho BI suite in a short span and developed data marts for at least 4 existing dashboards on the Pentaho BI server in Randstad Enterprise Reporting today.
Environment: Pentaho BI Suite, Oracle 11g/12c, SQL Server, Google Cloud Platform, DB2, Informatica 9.x, SSIS, SSRS
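A minimal sketch of the batch-ID generation mentioned above; BATCH_CONTROL and BATCH_ID_SEQ are assumed object names rather than the actual Randstad warehouse objects.

```sql
-- Generate a batch ID for a daily warehouse or data mart load and record it.
CREATE OR REPLACE PROCEDURE create_batch_id (
    p_batch_type IN  VARCHAR2,   -- e.g. 'DW_DAILY' or 'DATAMART' (illustrative values)
    p_batch_id   OUT NUMBER
) AS
BEGIN
    SELECT batch_id_seq.NEXTVAL INTO p_batch_id FROM dual;

    INSERT INTO batch_control (batch_id, batch_type, start_dt, status)
    VALUES (p_batch_id, p_batch_type, SYSDATE, 'RUNNING');

    COMMIT;
END create_batch_id;
/
```

The ETL job would call this once at the start of the nightly run and stamp the returned batch ID on every row it loads, so reloads and audits can be scoped to a single batch.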
- BI & ETL Developer at Randstad USA
- ETL Developer - Informatica at Bank of America
- ETL Developer - Informatica at Inspirix Technologies ,MO
- Informatica ETL Developer at CSC
1 year, 2 months at this Job
- Masters in Computer Science - Computer Science
• Lead ETL Developer with 12 years of IT experience and demonstrated skills in implementing custom ETL solutions for enterprise data warehouses.
• Strong background in Database development and Data warehousing using Informatica Power Center 10.2/9.x/8.x/7.x, Informatica Data Quality (IDQ) 9.x/8.x, Informatica MDM (Master Data Management) 9.x and SSIS.
• Developed complex mappings in Informatica using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup - Connected & Unconnected, Sequence Generator, Filter, Sorter, Source Qualifier, Expression, Union, Stored Procedure transformation etc.
• Hands-on experience with Informatica Data Quality toolset and proficiency in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching and data quality exception monitoring and handling.
• Strong exposure to Master Data Management using Informatica MDM.
• Experience in the Informatica Data Director (IDD) tool for creating Tasks and writing User Exit functions.
• Proficient as an Informatica Administrator in setting up new Informatica environments and upgrading existing Informatica environments to new versions.
• Strong knowledge of Hadoop and HBase, Pig, Hive, HDFS and Big Data.
• Hands on experience with Informatica Big Data Edition (BDE).
• Expertise in Data mart, ODS, OLTP and OLAP implementations.
• Strong technical knowledge on Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, Fact and Dimension, Physical and Logical Data Modeling.
• Proficiency in data warehousing techniques for data cleansing, Slowly Changing Dimension phenomenon, surrogate key assignment and CDC (change data capture).
• Expertise in working with relational databases such as Oracle, SQL Server, Teradata, DB2-UDB, DB2-BLU and MS Access.
• Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using Oracle PL/SQL and TSQL.
• Strong exposure to the National Healthcare Coding standards and implementing Oracle Virtual Private Database (VPD) rules to secure healthcare data.
• Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
• Worked on the JMS Message Queues and Web Services using Informatica.
• Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
• Experience in using Automation Scheduling tools like Autosys, Control-M and CA ESP Scheduler.
• Expert in all phases of the Software Development Life Cycle (SDLC) - Project Analysis, Requirements, Design Documentation, Development, Unit Testing, User Acceptance Testing, Implementation, Post Implementation Support and Maintenance.
- Lead ETL Developer at Daimler Trucks North America LLC
1 year, 1 month at this Job
- Bachelor's - ESP workload automation