Tools: SQL Developer, PuTTY, WinSCP
Database: Oracle 10g, Teradata
Languages: SQL, PL/SQL, T-SQL and Unix Shell Scripting
Team: 6-member team
Project Name: PAS & RMB
Role: ETL Developer
Project Description: Marsh is one of the leading insurance companies in the US, operating mainly as a brokerage company. The Policy Administration System and Revenue Master Billing (PAS & RMB) module is part of the admin and billing report development for Marsh. PAS & RMB loads data into an Operational Data Store (ODS), pulling data from various sources such as databases and flat files. It also pulls data from the ODS for various applications on a daily, weekly and monthly basis and sends it to MassMutual. The aim of this project is to send flat files (reporting) to MassMutual and develop reports from the ODS.
Responsibilities:
❖ Primarily responsible for converting business requirements (BRDs) into ETL specifications.
❖ Worked as Onsite Coordinator.
❖ Experience in extracting data from different sources like IBM MQ, Oracle, MS Access, flat files, T-SQL and Teradata.
❖ Extensively used Informatica PowerCenter client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Server Manager.
❖ Developed complex mappings in Informatica to load data using different transformations and maintained Informatica naming standards.
❖ Implemented various transformations: Expression, Joiner, Sorter, Aggregator, Lookup, Filter, Router, Update Strategy and Transaction Control.
❖ Prepared unit test cases for developed mappings and workflows.
❖ Wrote complex SQL queries to analyze problems.
❖ Used the Data Analyst tool for data validation and for running profiles and scorecards.
❖ Developed mappings and ran data profiles using the IDQ tool.
❖ Involved in data cleansing using Informatica and the Data Analyst tool.
❖ Data modeling using Erwin to design the physical and conceptual layers.
❖ Experienced in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin.
❖ Developed shell scripts to transfer files from one server to another.
❖ Developed shell scripts for generating reports and sending them over email.
❖ Performance tuning of mappings and sessions wherever applicable.
❖ Supported UAT and fixed issues raised by users.
❖ Developed UNIX shell scripts to automate daily loads and used the Informatica scheduler to schedule workflows.
❖ Raised ClearQuest (CQ) requests to migrate code, new tables and grant requests for tables from Dev to QA, UAT and Production.
❖ Worked in incident and change management processes (SLA).
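The Update Strategy routing mentioned in the responsibilities above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the actual PAS & RMB logic; the field names (`policy_id`, `premium`) are hypothetical.

```python
# Minimal sketch of Update Strategy routing logic: decide per source row
# whether the target should see an insert, an update, or a reject.
# Field names (policy_id, premium) are hypothetical illustrations.

DD_INSERT, DD_UPDATE, DD_REJECT = "insert", "update", "reject"

def route_row(source_row, target_keys):
    """Mimic an Update Strategy expression: new keys are inserted,
    known keys are updated, rows without a key are rejected."""
    key = source_row.get("policy_id")
    if key is None:
        return DD_REJECT
    return DD_UPDATE if key in target_keys else DD_INSERT

# Example: the target already holds policies 100 and 101.
existing = {100, 101}
rows = [{"policy_id": 100, "premium": 250.0},
        {"policy_id": 102, "premium": 410.0},
        {"premium": 99.0}]
decisions = [route_row(r, existing) for r in rows]
```

In a real mapping this decision is flagged per row (DD_INSERT/DD_UPDATE/DD_REJECT) and the session's target writer acts on it.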
- Informatica ETL Developer at Marsh USA Inc
- Sr. Software Engineer at Wells Fargo
- Software Engineer at AT&T
- Software Engineer at Humana Inc
3 years, 7 months at this Job
- B.Tech in Electronics and Communication Engineering - Electronics and Communication Engineering
• Responsible for Business Analysis and Requirements gathering.
• Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
• Worked with heterogeneous sources, extracting data from Oracle databases, XML and flat files and loading it into a relational Oracle warehouse.
• Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.
• Tuned SQL queries and stored procedures for faster data extraction and to resolve and troubleshoot issues in the OLTP environment.
• Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
• Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
• Troubleshooting of long running sessions and fixing the issues related to it.
• Worked with Variables and Parameters in the mappings to pass the values between sessions.
• Involved in the development of PL/SQL stored procedures, functions and packages to process business data in OLTP system.
• Developed mapping parameters and variables to support SQL override.
• Carried out changes into Architecture and Design of Oracle Schemas for both OLAP and OLTP systems.
• Worked with the Services and Portal teams on various occasions for data issues in the OLTP system.
• Worked with the testing team to resolve bugs in ETL mappings before migrating to production.
• Created weekly project status reports, tracked task progress against schedule, and reported risks and contingency plans to management and business users.
• Involved in meetings with production team for issues related to Deployment, maintenance, future enhancements, backup and crisis management of DW.
• Worked with production team to resolve data issues in Production database of OLAP and OLTP systems.
• Designed Informatica Mappings and Workflows to be able to restart after a failure.
• Designed tables required for the execution of the ETL processes using ERwin.
• Extracted data stored in a multi-level hierarchy using Oracle Stored Procedures.
• Loaded Customer data in multiple levels (rows) using Oracle Stored Procedures and Cursors.
• Used XML Parsers and Generators to handle inbound and outbound XMLs in Informatica.
• Loaded Dimension, Fact and Exception tables and automated email generation when exceptions occurred.
• Provided excellent support during QA/UAT/Beta testing by working with multiple groups.
• Optimized the SQLs and Informatica mappings which handled millions of records.
• Improved performance using Oracle Partitions, Indexes and other Performance Tuning techniques.
• Developed re-usable components in Informatica, Oracle and UNIX.
• Actively participated in Install/Deployment plan meetings.
• Performed thorough sanity checks and smoke tests after installing new scripts used by ETL.
• Prepared Data Mapping documents, process documents, Re-startability Matrix to help on-call team reduce the turnaround time when there is an issue.
• Efficiently handled multiple projects during resource crunch.
• Provided training to the team on functional and technical knowledge. Environment: Informatica PowerCenter 9.6, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer 8.1, Toad, SQL Developer, Oracle 11g, SQL*Loader, PL/SQL, Erwin, Linux, Teradata, MicroStrategy and Tableau.
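The restart-after-failure design noted above can be illustrated with a small checkpoint sketch. This is a hedged simplification: the step names and in-memory checkpoint are hypothetical; a real workflow would persist the checkpoint (e.g. in a control table) and restart via the scheduler.

```python
# Sketch of restartable ETL orchestration: a checkpoint records the last
# completed step, so a rerun skips everything already done.
# Step names and the in-memory "checkpoint" dict are illustrative only.

def run_pipeline(steps, checkpoint, executed):
    """Run steps in order, skipping those at or before the checkpoint."""
    done = checkpoint.get("last_done")
    names = [name for name, _ in steps]
    start = names.index(done) + 1 if done in names else 0
    for name, func in steps[start:]:
        func()                       # may raise; checkpoint keeps progress
        executed.append(name)
        checkpoint["last_done"] = name
    return checkpoint

executed = []
steps = [("extract", lambda: None),
         ("transform", lambda: None),
         ("load", lambda: None)]
# Simulate a rerun after "extract" already succeeded in the failed run:
state = run_pipeline(steps, {"last_done": "extract"}, executed)
```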
- Informatica ETL Developer at USAA
- Informatica Developer at Liberty Mutual Insurance Group
- Informatica/ IDQ Developer at IGT
- Informatica/ MDM Developer at Catholic Health Services of Long Island
5 months at this Job
- Bachelor of Technology - Technology
Description: Nationwide, a Fortune 100 company based in Columbus, Ohio, is one of the largest and strongest diversified insurance and financial services organizations in the U.S. The company provides a full range of insurance and financial services, including auto, commercial, homeowners, farm and life insurance; public and private sector retirement plans, annuities and mutual funds; banking and mortgages; and pet, motorcycle and boat insurance.
Worked as a senior ETL Informatica/Salesforce developer on a CRM application used by customer service.
• Involved in full project life cycle from analysis, requirements to production implementation.
• Created detailed documentation explaining the jobs in depth, along with Incident Resolution Documents (IRDs) for operations teams to maintain the jobs.
• Developed and maintained the integration between Salesforce and Siebel with complex mappings using XML and MQ queues to maintain data integrity between both systems.
• Designed mappings for real time changes to reflect in salesforce.
• Currently working with Informatica PowerCenter 10.2.
• Assisted in designing and implementation of data warehouse solutions along with data integration and analysis.
• Provided technical assistance for architecture and framework of Informatica based ETL solutions.
• Implemented procedures for suggestion and implementation of ETL processes and architecture enhancements.
• Coordinated with technical teams for estimation and translation of client and business requirements into specific systems.
• Supported technical team members for deployment of data integration processes in production systems.
• Executed processes for designing application modules to produce required products.
• Participated in development and integration of application into company products.
• Prepared detailed technical documentation relating to designs, mappings and system administration.
• Created Integration jobs to extract the data from different internal teams and feed to salesforce.
• Worked on source to target mapping documentation.
• Identified and resolved existing tech debt.
• Conducted "show and tell" business meetings to explain and demonstrate the developed code to the business.
• Maintained code versioning using GIT.
• Developed Unix shell scripts for processing files, SLA Monitoring, scheduling jobs with scheduling tool.
• Performed ETL code reviews to maintain the Standards.
• Wrote Unit Test cases and executed unit test scripts successfully.
• Supported during QA/UAT/PROD Phases.
• Involved in resolving critical production incident tickets with the operations team.
• Actively participated in monthly releases to ensure successful code migrations. Environment: Informatica 10.1, Salesforce, Siebel, IBM InfoSphere, PL/SQL, Linux, Oracle, WinSQL, SQL Developer, ESP.
- Sr. Informatica ETL Developer at Nationwide Finance and Insurance
- Sr. Informatica ETL Developer at Ameriprise Financial Services
- Coach at Coach, Inc
- Informatica Developer/ IDQ Developer at First American Financial Corporation
1 year, 6 months at this Job
Bank of the West is a regional financial services company chartered in California. It provides a wide range of personal, commercial, wealth management, and business banking services through hundreds of locations in 23 states and digital channels. Bank of the West is a BNP Paribas company, which has a presence in 74 countries.
Worked on various projects like SAM (Suspicious Activity Monitoring), Data Quality and Transaction Recon, which cover FRAUD detection and AML activities and play a major role in data security and data placement. The project extracts data from Oracle and flat-file systems, applies business rules while sourcing, and loads the data into the data warehouse, which in turn feeds Actimize (a tool that helps identify FRAUD and AML activities suspicious to the bank). The data in the data mart is used by business users for analysis and reporting purposes.
• Collected the requirements from Business people.
• Worked on Data Recon Projects.
• Designed and developed the required tables in Oracle database.
• Designed and developed Informatica Mappings based on the business requirements.
• Used reusable mapplets developed in IDQ for the Data Quality Project.
• Performing Technical Data Quality Checks.
• Validate the data in the target tables once it was loaded.
• Attending the daily scrums and giving the updates.
• Fine-tuned existing mappings for performance optimization.
• Cleansing the data with Informatica IDQ tool.
• Profiling the data with IDQ.
• Unit and integration testing in both DEV and QA Environments.
• Worked on bug fixes on existing mappings to produce correct output.
• Developed workflows and sessions associated with the mappings using Workflow Manager.
• Ran the workflows daily and quarterly, scheduled them using Tidal and monitored them.
• Involved in writing complex queries for generating different reports as per client needs.
• Worked with all phases of the Software Development Life Cycle (SDLC).
• Extracted data from Mainframe, flat files, tables and SAP systems and loaded the data into the data warehouse using Informatica PowerCenter.
• Coordinating with source system owners, day-to-day ETL progress monitoring, Data warehouse target schema Design (Star Schema) and maintenance.
• Created shortcuts for source and target.
• Created Reusable Transformations in Shared folder.
• Worked on Informatica Power Center - Source Analyzer, Data warehousing designer, Mapping Designer & Mapplets, and Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
• Used various transformations to implement simple and complex business logic: Stored Procedure, connected & unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
• Developed application support documentation required for accepting system changes into production.
• Designed and developed a custom database (Tables, Views, Functions, Procedures, and Packages).
• Worked with Business Analysts, translating business requirements into functional requirements. Environment: Informatica IDQ, Informatica PowerCenter 9.x, MicroStrategy Desktop, SQL, Netezza, Unix, Tidal for scheduling, Windows XP.
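The source-to-target validation described in the Data Recon work above amounts to a row-count reconciliation. A minimal Python sketch of the logic, with hypothetical table names (a real check would query Oracle/Netezza for the counts):

```python
# Minimal sketch of a source-to-target row-count reconciliation, the kind
# of technical data-quality check a Data Recon project performs.
# Table names are hypothetical illustrations.

def reconcile(source_counts, target_counts):
    """Return tables whose target row count differs from the source."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches

src = {"SAM_TXN": 1000, "CUSTOMER": 250}
tgt = {"SAM_TXN": 1000, "CUSTOMER": 248}
diff = reconcile(src, tgt)
```

Any non-empty result would typically be reported to the load team before the data mart is released to business users.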
- Sr. Informatica/ETL Developer at Bank of the West, San Ramon, California
- Sr. Informatica / ETL Developer at American Express
- Sr. Informatica / ETL Developer at Petsmart
- Sr. Informatica / ETL Developer at Farmers Insurance
1 year, 2 months at this Job
Domain: Health care
Description: The project was to design, develop and maintain a data warehouse to understand the Claims, undertake trend analysis and provide better services to Existing customers & New customers. The whole process of maintaining a centralized Warehouse for the Subject Areas has been automated and the entire Extraction, Transformation and Loading (ETL) process is achieved with Informatica.
• Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad and BTEQ utilities of Teradata.
• Worked on Teradata SQL, BTEQ, MultiLoad, FastLoad and FastExport for ad-hoc queries, and built UNIX shell scripts to perform ETL interfaces with BTEQ, FastLoad or FastExport. Created numerous Volatile, Global, Set and Multiset tables.
• Developed tools to automate base tasks using Python, Shell scripting and XML.
• Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.
• Developed data mappings between source systems and target system using Mapping Designer.
• Developed shared folder architecture with reusable Mapplets and Transformations.
• Created batch jobs for Fast Export.
• Prepared Conceptual Solutions and Approach documents and gave Ballpark estimates.
• Designed and developed Amazon Redshift databases.
• Worked with XML, Redshift, Flat file connectors.
• Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD) and Requirement Traceability Matrix for each project workflow based on the information gathered from Solution Business Proposal document.
• Data modeling and design of data warehouse and data marts in star schema methodology with confirmed and granular dimensions and FACT tables.
• Used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
• Created SSIS ETL packages to get data from different sources like Flat files, MS Excel, MS Access.
• Involved in Migration projects to migrate data from data warehouses on Oracle/DB2 and migrated those to Teradata
• Responsible for optimization of SQL queries, T-SQL and SSIS Packages.
• Loading data from large data files into Hive tables.
• Importing and exporting data into HDFS and Hive using Sqoop.
• Coordinated and worked closely with legal, clients, third-party vendors, architects, DBA's, operations and business units to build and deploy.
• Used Data stage to manage the Metadata repository and for import /export for jobs.
• Worked with Connected and Unconnected Stored Procedure for pre & post load sessions
• Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
• Used PMCMD command to start, stop and ping server from UNIX and created Shell scripts to automate the process.
• Created data synchronization tasks & task flows to extract data from Salesforce and load data from files into Salesforce.
• Used Informatica PowerExchange for Mainframe to read/write VSAM files from/to the mainframe.
• Working with an Agile, Scrum methodology to ensure delivery of high quality work with every monthly iteration.
• Worked on production tickets to resolve the issues in a timely manner.
• Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing. ENVIRONMENT: Informatica PowerCenter 9.x, IDQ, Hadoop, Hive, Oracle 10g, Metadata, DataStage 8.7, Teradata, SQL Server 2008, SSIS, SSRS, T-SQL, Toad, SQL*Plus, SQL Query Analyzer, SQL Developer, MS Access, UNIX, Python, Tivoli Job Scheduler, Windows Azure.
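The FastLoad automation described above typically generates a control script from table and file names. A hedged Python sketch of that generation step; the DDL, error-file naming and session settings are illustrative only, not a production template:

```python
# Sketch of generating a Teradata FastLoad control script from table and
# file names. The column typing and error-file convention shown here are
# simplified illustrations, not a real production template.

def fastload_script(table, data_file, columns):
    cols = ", ".join(f":{c}" for c in columns)
    field_defs = ", ".join(f"{c} (VARCHAR(100))" for c in columns)
    return "\n".join([
        f"BEGIN LOADING {table} ERRORFILES {table}_e1, {table}_e2;",
        f"DEFINE {field_defs} FILE={data_file};",
        f"INSERT INTO {table} VALUES ({cols});",
        "END LOADING;",
        "LOGOFF;",
    ])

script = fastload_script("EDW.CLAIMS", "claims.txt", ["claim_id", "amount"])
```

A wrapper shell script would then feed the generated text to the `fastload` utility.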
- Informatica ETL Developer at Molina Health Care
- Informatica developer at Santander Bank
- Informatica Developer at Philips Electronics North America
- Informatica/ETL Consultant at Max New York Life Insurance
1 year, 5 months at this Job
- Bachelor of Engineering in Computer Science - Computer Science
• Develops PowerCenter Workflows and Sessions, and sets up PowerExchange connections to database and mainframe files
• Develops logical and physical data flow models for ETL applications
• Provides System and Application Administration for Informatica PowerCenter and PowerExchange
• Designs, develops, automates, and supports complex applications to extract, transform, and load data
• Identifies and manages interfaces, service levels, standards and configurations
• Demonstrates working knowledge of Informatica PWC & PWX 8.6.1 or 9.x, SQL Server 2005/2008 or equivalent databases, SQL, Oracle/GoldenGate, Windows Server 2003/2008, Autosys scheduling, Mainframe data files, Web Services and XML
• Plans work and leads a team; mentors others on the team on product functionality
• Worked with IBM Maximo Application
• Worked with Birt users
• Working closely with Onshore and offshore application development leads
• Developed ETL mappings, mapplets, workflows and worklets using Informatica PowerCenter 9.x
• Identify efficiencies and ways to improve design and development processes
• Identify ways to increase efficiency of production support - Find solutions that allow operations to better do their job without involving development resource time
• Strong knowledge of Informatica ETL and Oracle/DB2 database technologies
• Strong analytical skills and SQL proficiency
• Strong experience in DWH Technologies (Informatica, Netezza, SQL, Unix, Autosys)
• Strong oral and written communication skills
• Experience with Tableau
• Ability to identify and appropriately escalate issues/risks to management for direction
• Strong understanding of operational data staging environments, data modeling principles, and data warehousing concepts
• Strong time management and task prioritization
• Strong Data modeling and understanding of key business elements
- Sr. Informatica/ETL Developer at CITY OF AUSTIN
- Sr. Informatica/ETL Developer at BLUE CROSS BLUE SHIELD
- Sr. Informatica/ETL Developer at CHURCH PENSION GROUP
- Informatica Developer at THE VITAMIN SHOPPE
7 months at this Job
- Bachelor of Science - Computer Science
• Created reusable transformations and mapplets and used them in various mappings.
• Used various transformations like lookup, update strategy, router, filter, sequence generator, source qualifier/joiner on data and extracted according to the business rules and technical specifications.
• Prepared document for Root Cause Analysis (RCA).
• Extensively used ETL to load data using PowerCenter/PowerConnect from source systems like flat files and Excel files into staging tables, and loaded the data into the target Oracle database.
• Designed and developed ETL routines using Informatica PowerCenter, including the use of lookups, ranking, stored procedures, functions, SQL overrides in lookups and source filters in Source Qualifiers.
• Debugged and troubleshot Informatica mappings.
• Designed and developed stored procedures using PL/SQL and tuned SQL queries for better performance.
• Assisted in knowledge transfer to client personnel for maintaining the environment after implementation.
• Written test scripts and test cases for testing and enhancements which were used by QA for unit testing and user acceptance testing.
• Configured workflows, tables and indexes using the Oracle Data Warehouse Administration Console (DAC) and monitored job runs for full and incremental loads.
• Coordinated User Acceptance Testing (UAT) with the business users.
• Wrote UNIX shell scripts to execute workflows; created workflows, worklets and tasks to read parameters from parameter files.
• Documented the complete mappings, system delivery specification, test scripts and technical systems design. Environment: Informatica Power center 10.x/9.x, ETL, PL/SQL, Oracle11g, and Unix shell scripting.
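The parameter files read by the workflow tasks above follow a common Informatica convention: a `[Folder.WF:workflow]` section header followed by `$$NAME=value` lines. A minimal Python sketch of a parser for that layout (the file contents shown are hypothetical):

```python
# Sketch of reading an Informatica-style parameter file: section headers
# of the form [Folder.WF:wf_name] followed by $$NAME=value lines.
# The layout is a common convention, simplified here for illustration.

def parse_param_file(lines):
    """Return {section: {param: value}} from parameter-file lines."""
    params, section = {}, None
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

text = ["[DWH.WF:wf_daily_load]",
        "$$LOAD_DATE=2020-01-31",
        "$$SRC_DIR=/data/in"]
parsed = parse_param_file(text)
```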
- INFORMATICA ETL DEVELOPER at AMERICAN EXPRESS
- INFORMATICA ETL DEVELOPER at AMERIPRISE FINANCIAL
- INFORMATICA ETL DEVELOPER at THERMO FISHER SCIENTIFIC
- INFORMATICA DEVELOPER at HALCYON TECHNOLOGIES
6 months at this Job
- MASTER'S - COMPUTER INFORMATION SYSTEMS
- Bachelor's - computer science
Roles & Responsibilities
• Translated Business Requirements into Informatica mappings to build Data Warehouse by using Informatica Designer, which populated the data into the target.
• Created project plan for Development, QA, Informatica PowerExchange and Production PowerCenter upgrade.
• Used Informatica Power Exchange for loading/retrieving data from mainframe system.
• Extensive experience with Healthcare, Energy and Media domains.
• Working on Informatica object Migration using Repository Manager and Monitoring Informatica repository backups.
• Designed and developed the ETL framework and processes using Informatica 9.6.1/8.6 to run on the Domain, Integration Services and Nodes
• Extracted data from various source systems like Oracle, XML and flat files and loaded into relational data warehouse and flat files.
• Effectively worked on Onsite and Offshore work model.
• Extensive experience in developing Stored Procedures, Functions, Views and Triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
• Involved in analyzing different modules of the Facets system and EDI interfaces to understand the source system and source data.
• Involved in analyzing the service requests given by end user and fixing the production problems and data changes in the database and program changes for the existing systems
• Post implementation support includes solving the service request raised by the end user.
• Worked on relational databases Oracle 10g/11g, DB2, SQL Server, Teradata, Greenplum, and Amazon AWS Redshift.
• Estimates and planning of development work using Agile Software Development.
• Used Automation Scheduling tools like AutoSys and Control-M.
• Worked on AbInitio Co-Operating System, application tuning, and debugging strategies.
• Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
• The system was used as a verification/standardization and geo-coding tool for postal addresses.
• Designed and developed complex mappings, from varied transformation logic like Unconnected and Connected lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure and more.
• Used T-SQL stored procedures to transfer data from OLTP databases to staging area and finally transfer into data marts and performed action in XML.
• Created and scheduled Sessions, Jobs based on demand, run on time and run only once using Workflow Manager.
• Develop mapping to load data into normalized and de-normalized data model.
• Defined the Business Intelligence (BI) reporting and data integration strategy, architected a new high-availability global BI infrastructure to support the business, managed the BI project implementation with new directions and best practices, and built user confidence in the new system, which became a core enterprise analytic reporting system.
• Responsible for creating batches and scripts for implementing logical design to T-SQL.
• Involved in coding of the Business Rules through PL/SQL using the Functions, Cursors and Stored Procedures.
• Conducted design, code reviews with peers and QA teams.
• Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings
• Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
• Created complex SSIS packages using proper control and data flow elements with error handling.
• Worked on Informatica Metadata Manager.
• Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using AGILE methodologies.
• Maintained good interaction with analysts, Project Managers, Informatica PowerExchange architects and testers to have efficient and better results.
• Worked in a support team rotation performing night and weekend support for all production data integration services from the client site/home. Created and used PowerExchange connections.
• Responsible for cleansing the data from source systems using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter-by-Expression, Rollup.
• Used SQL tools like TOAD to run SQL queries and validate the data.
• Partially involved in writing the UNIX Shell Scripts, which triggers the workflows to run in a particular order as a part of the daily loading into the Warehouse.
• Designed and developed various kinds of reports based on user requirement. Provide weekly status report on support tasks, monthly reports and metrics on monitoring and support initiatives and quarterly reviews on existing processes.
• Provided production support for business users and documented problems and solutions for running the workflows. Environment: Informatica PowerCenter 9.1/8.6, Oracle 11g, TOAD, PL/SQL, Microsoft Visual Studio 2012/2008/2005, OBIEE, Flat Files, COBOL, MS SQL Server, Data Validation Option Tool, SQL*Loader, UNIX Shell Scripting, Control-M, AutoSys, Tidal, Windows XP
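The shell scripts above trigger workflows "in a particular order" as part of the daily load. The ordering step can be sketched as a small topological sort over workflow dependencies; the workflow names are hypothetical, and in practice each resolved workflow would be launched with `pmcmd`:

```python
# Sketch of ordering dependent workflows before triggering them (e.g. via
# pmcmd in a shell wrapper). A simple depth-first topological sort;
# workflow names and dependencies here are hypothetical.

def run_order(deps):
    """deps maps workflow -> list of workflows it depends on."""
    ordered, seen = [], set()
    def visit(wf):
        if wf in seen:
            return
        seen.add(wf)
        for d in deps.get(wf, []):
            visit(d)                 # dependencies run first
        ordered.append(wf)
    for wf in sorted(deps):
        visit(wf)
    return ordered

deps = {"wf_load_fact": ["wf_load_dims"],
        "wf_load_dims": ["wf_stage"],
        "wf_stage": []}
order = run_order(deps)
```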
- Sr. Informatica ETL Developer at Nestle
- Sr. Informatica ETL Developer at Alliance Bernstein
- ETL Developer/Analyst at EQ Financial Inc
- Informatica ETL Developer at Mede Analytics
3 years, 4 months at this Job
- Bachelor's Degree in Computer Science - Computer Science
• Extensively worked with data modelers to implement logical and physical data modeling to create an enterprise-level data warehouse.
• Extensively used Informatica Power Center 9.5.1 to extract data from various sources and load in to staging database.
• Designed and Developed Oracle PL/SQL Procedures and UNIX Shell Scripts for Data manipulations and Data Conversions.
• Created and Modified T-SQL stored procedures for data retrieval from MS SQL SERVER Staging.
• Worked with pre and post sessions, and extracted data from Transaction System into Staging Area.
• Extensively worked with Informatica Tools - Source Analyzer, Warehouse Designer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Workflow Manager, Workflow Monitor, Power Exchange, Repository server and Informatica server to load data from flat files, legacy data.
• Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
• Handled critical issues such as data masking of sensitive information
• Designed the mappings between sources (external files and databases) to operational staging targets.
• Involved in data cleansing, mapping transformations and loading activities.
• Developed Informatica mappings and mapplets and tuned them for optimum performance, dependencies and batch design.
• Implemented Database Mirroring, Log Shipping on SQL 2008 Servers for high availability and maintaining Disaster Recovery sites.
• Setting up and Automatic/Forced fail over for the production servers using Log Shipping, Database Mirroring and Replication.
• Scheduled the full backup, differential, transaction log backup of Database
• Proficient in writing SQL queries and creating PL/SQL stored procedures.
• Involved in creating and modifying UNIX Korn shell scripts and scheduling the UNIX scripts through Control-M.
• Experience with Extraction, Transformation and Load (ETL) tools such as Business Objects Data Services and SQL Server Integration Services
• Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
• Experience in integrating various data sources like SQL Server, Oracle, Teradata, flat files and DB2 Mainframes into the staging area
• Implemented Slowly Changing Dimension methodology for Historical data.
• Designing mapping templates to specify high-level approach.
• Extensive hands on XML import/export, deployment groups, query generation, migration using Informatica repository manager.
• Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
• Extensively worked on unit testing and implemented on transformations, mappings.
• Created Test cases and detailed documentation for Unit Test, System, Integration Test and UAT to check the data quality.
• Extensively worked on cloning the ETL and database code between Stash and the Git repository
• Extensively worked with various passive transformations like Expression, Lookup and Sequence Generator.
• Experience working with access management, sales, order management and fulfillment systems of a publishing company.
• Documented all old and migrated stored procedures and scripts for future reference.
• Coordinated between Development, QA and production migration teams.
• Coordinate Onsite and offshore team, managing various calls related to requirements, development, testing and migration and leading the team.
• Reports were generated using Business Objects for analysis.
• Outstanding knowledge of leading application server configuration services and capabilities. Environment: Informatica PowerCenter 9.5.1, Informatica PowerExchange 9.5, Oracle 11.6, UNIX shell scripting, TOAD, Excel Macros, MKS Integrity Client 10, WinSCP, Control-M 7.0, PL/SQL
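The Slowly Changing Dimension methodology for historical data mentioned above (Type 2: expire the current row, insert a new current version) can be sketched in Python. The column names and dates are hypothetical illustrations, not the actual warehouse schema:

```python
# Minimal sketch of SCD Type 2 handling: when a tracked attribute changes,
# close the current dimension row and insert a new current version.
# Column names (key, attr, eff_date, end_date, current) are hypothetical.

def apply_scd2(dim_rows, incoming, load_date):
    """dim_rows: list of dicts; incoming: the latest source record."""
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return dim_rows          # no change, nothing to do
            row["current"] = False       # expire the old version
            row["end_date"] = load_date
    dim_rows.append({"key": incoming["key"], "attr": incoming["attr"],
                     "eff_date": load_date, "end_date": None,
                     "current": True})
    return dim_rows

dim = [{"key": 1, "attr": "Gold", "eff_date": "2019-01-01",
        "end_date": None, "current": True}]
dim = apply_scd2(dim, {"key": 1, "attr": "Platinum"}, "2020-06-01")
```

In Informatica this is typically built with a Lookup on the dimension plus an Update Strategy splitting expire-vs-insert rows.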
- Lead Informatica/ETL Developer at JPMC
- Senior Informatica/ETL Developer cum Lead at Nomura
- Informatica/ETL Developer at Ameriprise
- ETL developer at Beal Bank
2 years, 7 months at this Job
• Perform requirement analysis through coordination with business analysts and define business and functional specifications.
• As a team, conducted gap analysis and discussions with subject matter experts to gather requirements, emphasize problem areas and define deliverables.
• Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
• Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (Oracle 11g, DB2, SQL Server and flat files), incorporating business rules using different objects and functions that the tool supports.
• Using Informatica Power Center created mappings and mapplets to transform the data according to the business rules.
• Designed the ETL mappings between sources to operational staging targets, then to the data warehouse using Power center Designer.
• Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
• Documented Informatica mappings in Excel spread sheet.
• Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center.
• Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
• Used Informatica Power Center to migrate the data from different source systems.
• Extensively used Autosys for Scheduling and monitoring.
• Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5 by using various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, Connected and unconnected look up etc.
• Extensively used various Performance tuning Techniques to improve the session performance.
• Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.
• Created business rules in Informatica Developer and imported them to Informatica Power center to load the standardized and good format of data to staging tables.
• Involved in creating new table structures and modifying existing tables and fit into the existing Data Model.
• Designed and Created data cleansing, validation and loading scripts for warehouse using Informatica Power Center 9.1.1/8.6.
• Worked with offshore team to Develop and deliver the ETL tasks.
• Monitored workflows and sessions using the PowerCenter Workflow Monitor.
• Used Informatica Scheduler for scheduling the workflows
• Developed the ETL jobs to automate the Data validation process across the systems (Web trends and In house) using Informatica and notifying the variances to stakeholders.
• Extensively used Informatica client tools - Source Analyzer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Server Manager.
• Created workflows and worklets for designed mappings.
• Provide documentation of ETL design and technical specifications, high level and low level design specifications.
• Designed and built the reconciliation process. Environment: Informatica PowerCenter 10.0/9.1.1, Teradata 14/12, Oracle 11g, MySQL, XML, Flat Files, SQL Assistant, Autosys, Toad, PL/SQL, Erwin, Unix shell scripting, Cognos, Unix, Windows
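The automated cross-system data validation described above (comparing Webtrends against in-house figures and notifying stakeholders of variances) reduces to a threshold check. A minimal Python sketch; the metric names and 5% threshold are illustrative assumptions:

```python
# Sketch of cross-system data validation: compare the same metrics from
# two systems and flag variances above a threshold for notification.
# Metric names and the 5% threshold are hypothetical illustrations.

def find_variances(web_trends, in_house, threshold_pct=5.0):
    """Return metrics whose relative difference exceeds threshold_pct."""
    variances = {}
    for metric, wt in web_trends.items():
        ih = in_house.get(metric)
        if ih is None or wt == 0:
            continue                     # cannot compare; skip
        pct = abs(wt - ih) / wt * 100
        if pct > threshold_pct:
            variances[metric] = round(pct, 1)
    return variances

wt = {"page_views": 1000, "orders": 200}
ih = {"page_views": 990, "orders": 150}
var = find_variances(wt, ih)
```

A non-empty result would be formatted into the stakeholder notification email by the surrounding job.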
- Sr. Informatica ETL Developer at Sprint
- Sr. Informatica ETL Developer at Texas Medicaid Healthcare Partnership
- ETL Informatica developer at Asurion Mobile Insurance
- Informatica Developer at TMX Finance
1 year, 3 months at this Job