Devon Energy is one of the top oil and gas companies in North America. Devon is implementing a Decision Support Center (DSC) for its production/field operators to navigate oil wells and identify wells needing attention, so lease operators can focus on finding solutions instead of looking for problems.
• DSC helps detect underperforming wells. Lease operators receive daily reports showing any anomalies, enabling faster fixes and less unproductive time.
• Interacted with data modelers, business users and data architects to understand the data mappings.
• Developed Logical Data Objects, mappings, data services, applications using Informatica Developer.
• Worked with ETL developers to create external batches that execute mappings and mapplets using Informatica Workflow Designer, integrating data from varied sources like Oracle, DB2, flat files and SQL databases and loading it into landing tables of the Informatica MDM Hub.
• Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the hub, and search queries.
• Changed and deployed the MDM Hub for DSC portals in conjunction with the user interface of the IDD application.
• Responsible for designing, testing, deploying and documenting the data quality rules, mappings, mapplets
• Performed validation, standardization and cleansing of data as part of implementing the business rules.
• Identified potential data risks and helped develop mitigation plans to resolve potential issues
• Created Tidal Jobs to trigger mappings, refresh data services
• Performed multiple tasks effectively and was involved in troubleshooting issues
• Designed reference data and data quality rules, and cleansed data, in the Informatica Data Quality (IDQ) 10.1 environment
• Re-designed multiple existing PowerCenter mappings to implement change requests (CRs) representing the updated business logic.
System Environment: Informatica Developer 10.2.0, Informatica PowerCenter 10.2.0, SQL Server, Oracle, Amazon Web Services, ServiceNow, Tidal. Team Size: 5
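The validation, standardization and cleansing rules described above can be sketched in plain Python (a hedged illustration, not the project's actual IDQ rules; field names like "well_id" and "state" and the lookup values are assumptions):

```python
# Minimal sketch of validate/standardize/cleanse rules of the kind IDQ
# implements. All field names and lookup values here are illustrative.

def standardize_state(value):
    """Map free-form state entries to two-letter codes (sample rule)."""
    lookup = {"oklahoma": "OK", "ok": "OK", "texas": "TX", "tx": "TX"}
    return lookup.get(value.strip().lower(), value.strip().upper())

def validate_record(record):
    """Return a list of rule violations for one source record."""
    errors = []
    if not record.get("well_id"):
        errors.append("well_id is required")
    state = record.get("state")
    if state and len(standardize_state(state)) != 2:
        errors.append("state could not be standardized")
    return errors

def cleanse(records):
    """Split records into clean rows and rejects, standardizing as we go."""
    clean, rejects = [], []
    for rec in records:
        errors = validate_record(rec)
        if errors:
            rejects.append({**rec, "errors": errors})
        else:
            clean.append({**rec, "state": standardize_state(rec.get("state", ""))})
    return clean, rejects
```

In an IDQ flow the same logic would live in reference tables and rule mapplets; the split into clean rows and a reject stream mirrors how bad records are diverted rather than loaded.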
- Informatica Developer at BC Forward
- Informatica Consultant at NTT Data Services
- Informatica Developer at IBM India Pvt. Ltd
- Informatica Architect at IBM India Pvt. Ltd
11 months at this Job
Responsibilities:
✓ Worked on requirements with the Business Analyst and business users, and worked with data modelers.
✓ Worked closely with data population developers, multiple business units and data solutions engineers to identify key information for implementing the data warehouses.
✓ Analyzed logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.
✓ Parsed high-level design specs into simple ETL coding and mapping standards.
✓ Used Informatica PowerCenter as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
✓ Imported mapplets and mappings from Informatica Developer (IDQ) into PowerCenter.
✓ Wrote Teradata BTEQs as well as Informatica mappings using TPT to load data from staging to base.
✓ Fine-tuned Teradata BTEQs as necessary using explain plans and collected statistics.
✓ Extensive exposure to data extraction, conversion and loading from various sources including flat files, Oracle, SQL Server, DB2, CCR and CCD.
✓ Created and used the Normalizer transformation to normalize flat files in the source data.
✓ Extensively built mappings with SCD1 and SCD2 implementations, per project requirements, to load dimension and fact tables.
✓ Used the Evaluate Expression option in the Debugger tool to validate and fix code while testing Informatica mappings.
✓ Handled initial (i.e. history) and incremental loads into the target database using mapping variables.
✓ Worked with Workflow Manager to maintain sessions, performing tasks such as monitoring, editing, scheduling, copying, aborting and deleting.
✓ Worked on performance tuning at both the Informatica and database levels by finding the bottlenecks.
✓ Developed UNIX shell scripts using the pmcmd utility to start and stop sessions and batches and to schedule workflows.
✓ Performed unit testing and created a unit test plan for the developed code; involved in system testing and integration testing as well, coordinating with the testers during integration testing.
✓ Heavily involved in production support on a rotational basis and supported the DWH system using the ticketing tool for issues raised by business users.
Environment: Informatica PowerCenter 9.5.1, Oracle, SQL Server 2008/2012, Facets, Oracle SQL Developer, Tidal Scheduler, Windows.
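The pmcmd-driven shell scripts mentioned above can be sketched as a small driver that builds the `pmcmd startworkflow` command lines. This is a hedged illustration: the service, domain, user, folder and workflow names are placeholders, and the commands are returned rather than executed so the loop is easy to test.

```python
# Sketch of a batch driver for PowerCenter's pmcmd CLI. Service/domain/user
# values are illustrative defaults, not real environment names.

def pmcmd_start(workflow, folder, runs,
                service="IntSvc_dev", domain="Domain_dev", user="etl_user"):
    """Build one `pmcmd startworkflow` argument list per requested run."""
    commands = []
    for _ in range(runs):
        commands.append([
            "pmcmd", "startworkflow",
            "-sv", service, "-d", domain, "-u", user,
            "-f", folder, "-wait", workflow,
        ])
    return commands
```

In a real script each list would be handed to the shell (e.g. `subprocess.run`), with `-wait` making the call block until the workflow finishes so exit codes can be checked.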
- Sr. ETL/Informatica Developer at Metlife
- ETL/Informatica Developer at T-Mobile
- ETL/Informatica Developer at Bank of America
- ETL Developer at Vanguard
2 years, 6 months at this Job
• Gathering Business requirements and creating technical specifications along with creating Internal and External Design documents
• Create Technical specification documents and work with Business Analysts on analysis of multiple source systems data as part of ETL mapping exercise
• Created various mappings in LDOs using transformations such as Union, Sorter, Joiner, Expression and Aggregator
• Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
• Created batch scripts to invoke Informatica workflows a number of times, and created different jobs using batch scripting to call workflows via Command tasks.
• Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer
• Executed Sessions, Sequential and concurrent batches for proper execution of mappings.
• Attended status meetings with project managers, escalated issues when necessary, attended meetings for issues resolution.
• Reported errors not related to ETL and assigned them to the person responsible for that area.
• Used parameters/variables for Powercenter mappings, sessions and workflows.
• Worked on the error-handling strategy: if any errors occurred, the mapping should fail.
• Worked extensively with sources and target like Flat files, SQL Server tables
• Built efficient ETL components to meet the technical requirements.
Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Monitor, Workflow Manager), Autosys, DB2, Oracle 12c, Siebel, TOAD, SQL*Plus, Windows, UNIX, Shell Scripting.
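The mapping/session parameters and variables mentioned above are typically fed to PowerCenter through a parameter file whose sections name the folder and workflow. A minimal sketch of rendering one such section (folder, workflow and parameter names here are illustrative placeholders):

```python
# Sketch of building a PowerCenter parameter-file section. The
# [folder.WF:workflow] header is the standard section format; the
# specific names and values below are assumptions for illustration.

def build_param_file(folder, workflow, params):
    """Render one workflow section of a PowerCenter parameter file."""
    lines = [f"[{folder}.WF:{workflow}]"]
    for name, value in params.items():
        lines.append(f"{name}={value}")
    return "\n".join(lines) + "\n"
```

Mapping variables conventionally use the `$$NAME` prefix and session parameters `$Name`; the session points at the generated file, so values can change per run without editing the mapping.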
- Informatica Developer at Nike
- Informatica Developer at Freddie Mac
- .Net Developer at Script, SSIS
2 years, 2 months at this Job
CONTRIBUTIONS
Worked as an Informatica developer at Siemens Healthineers to integrate support activities of Extract, Transform, Load (ETL) applications. This was very critical from a business perspective, as it centralized production support for Middleware/ETL applications for the entire business. Developed enhancements to increase uptime and provide cost savings for the ETL applications. Roles & Responsibilities:
• Involved in gathering requirements from the client through the JIRA board on a daily basis.
• Analyze requirements and database objects involved in the data migration.
• Migrate and integrate data from multiple source systems such as Salesforce and Oracle, then transform and load it into the PA Teradata data warehouse, which is used by the downstream reporting team.
• Understand business specification and create IDM and design document.
• Designing and deploying enterprise-wide scalable operations on Cloud Platforms.
• Develop ETL mappings implementing business rules provided by business using transformations like Expression, Lookup, Joiner, Update strategy. Create Mapplets for re-usability of logic.
• Worked on building ETL data flows that run natively on Hadoop and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
• Create Incremental load process using SCDs technique. Developed partitions for large volume data.
• Created application connections, such as connections for databases, Salesforce and various portals.
• Implement mapplets, mapping, create sessions, Worklets and workflows.
• Debug existing mapping to resolve incident tickets created by business.
• Schedule jobs using scheduler. Testing in Dev and Test with test data.
• Extensively worked on using the PDO (Push down optimization), CDC (Change data capture) mechanism.
• Involved in Rewriting Informatica mapping logic on Teradata by implementing Database Views. Consolidate multiple views of a business line into a single view, which is pulled as source in mapping to improve performance.
• Dump XML to gather mapping derivations and rewrite into SQL code as per Liberty Mutual standards. Used source PDO on Teradata and worked with different Teradata load operators.
• Write minus queries to test data between Teradata View and ETL Mapping targets.
• Create parameter files for each business line like Auto, Workers comp, Property etc.
• Check session and workflow logs for errors during execution. Deploy code to higher environments UAT and Prod as per guidelines.
• Participate in Agile scrum discussing blockers for the day, dependencies on task, status updates, and release planning and update JIRA Kanban board tasks.
• Provided production support to resolve ongoing issues and troubleshoot problems.
Environments used: Informatica PowerCenter 10.2/10.1, IICS, Teradata 15/14, Hadoop, Erwin 9.2, Oracle 12c/11g, SQL*Loader, SQL, PL/SQL, Change Data Capture (CDC), Shell script, Visio, Cloud, AWS, Test Director 7.x, ALM/Micro Focus ALM Octane, Jira, IDQ, Autosys
Professional Experiences
- Sr. Informatica Developer at Siemens Healthineers, NJ
- Informatica Developer at Northwell Health
- Informatica Developer at Mayo Clinic
- Informatica Developer at American Express
11 months at this Job
- Bachelor of Science - Information Technology
• Followed Agile-Scrum methodology to design, develop, test and deliver the code.
• Modify existing business logic in Informatica ETL flows.
• Involved in performance tuning and optimization of mapping to manage very large volume of data.
• Worked on Informatica Utilities Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
• Created mapping using various transformations like Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator and Update Strategy. Develop complex ETL mappings on Informatica 10.x platform as part of the Risk Data integration efforts.
• Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.
• Implemented error handling for invalid and rejected rows by loading them into error tables.
• Extensively worked on the batch framework used to schedule all Informatica jobs.
• Analyzed the sources, transformed the data, mapped the data and loaded the data into targets using Informatica PowerCenter Designer.
• Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the Mapping Designer.
• Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
• Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
• Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
• Implemented restart strategy and error handling techniques to recover failed sessions.
• Used Unix Shell Scripts to automate pre-session and post-session processes.
• Did performance tuning to improve Data Extraction, Data process and Load time.
• Worked with data modelers to understand financial data model and provided suggestions to the logical and physical data model.
• Designed presentations based on the test cases and obtained UAT signoffs.
• Documented test scenarios as a part of Unit testing before requesting for migration to higher environment levels and handled production deployments.
• Recorded defects as a part of Defect tracker during SIT and UAT
• Identified performance bottlenecks and suggested improvements.
• Performed Unit testing for jobs developed, to ensure that it meets the requirements.
• Handled major Production GO-LIVE and User acceptance test activities.
• Created architecture diagrams for the project based on industry standards.
• Defined escalation process metrics for any aborts and met SLAs for production support tickets.
Environment: Informatica 10.1.1, 9.5, Oracle, XML, SQL Server 2008, Web services, DB2 Mainframe, DAC (Scheduler).
- Informatica Developer at Citi Bank
- Informatica Developer at Medtronic
- Informatica Developer at KHJ TRANSCRIPTIONS
- Informatica Developer at GENPACT
1 year, 4 months at this Job
- Master of Science in Information Studies - Information Studies
- Bachelor of Technology - Computer Science Engineering
• Interacted with business representatives for Need Analysis and to verify and understand Business and Functional Specifications and participated in the Design team and user requirement gathering meetings, and prepared technical and mapping documentation.
• Designed and Developed complex mappings, reusable Transformations for ETL using Informatica Power Center 9.6.1 and performed data manipulations using various Informatica Transformations like Aggregate, Filter, Update Strategy, and Sequence Generator etc.
• Design & Develop ETL workflow using Oozie for business requirements, which includes automating the extraction of data from MySQL database into HDFS using Sqoop scripts.
• Responsible for full data loads from production to the AWS Redshift staging environment and complete data loading from PostgreSQL to the AWS Redshift data lake.
• Led B2B structured and unstructured transformations that included: resolving end user problems, training on B2B transformations and resolving system failures.
• Created complex SCD type 1 & type 2 mappings using dynamic lookup, Joiner, Router, Union, Expression and Update Transformations.
• Working on a MapR Hadoop platform to implement big data solutions using Hive, MapReduce, shell scripting and Java technologies.
• Used Teradata external loading utilities like MultiLoad, TPump, FastLoad and FastExport to extract from and load effectively into the Teradata database
• Worked on Informatica B2B 10.1, PowerCenter Unstructured data Transformation (UDT) and made use of Mapper, Parser and Streamer components for working with XML files.
• Worked on Informatica Cloud and extracted data from a Salesforce source.
• Imported Relational Data base data using Sqoop into Hive Dynamic partition tables using staging tables and imported data using Sqoop from Teradata using Teradata connector.
• Responsibilities included designing and developing complex mappings using Informatica power center and Informatica developer (IDQ) and extensively worked on Address Validator transformation in Informatica developer (IDQ).
• Designed and Developed ETL strategy to populate the Data Warehouse from various source systems such as Oracle, Teradata, Netezza, Flat files, XML, SQL Server, Amazon DynamoDB, Hbase.
• Migrated ETL jobs to Pig scripts to perform transformations, joins and some pre-aggregations before storing the data to HDFS. The data structures used by NoSQL databases differ from those used by default in relational databases, making some operations faster in NoSQL.
• Filtered XML files by using filter conditions on D9 segment and converted back the filtered xml files to EDI format using serializer in B2B data transformation.
• Used Informatica Cloud to easily connect to a variety of cloud, on-premises, mobile and social data sources.
• Extracted, transformed and loaded fixed-width and delimited files to AWS Redshift tables using Informatica.
• Extensively used Change data capture (CDC) concept in Informatica as well as in the Oracle Database to capture the changes to the datamart. Change data capture enabled us to reduce the time taken to load the data into the data mart by allowing only the changed data.
• Worked on different file formats like Sequence files, XML files and Map files using MapReduce programs, and developed multiple MapReduce jobs in Java for data cleaning and pre-processing, analyzing data in Pig.
• Involved in writing Teradata SQL bulk programs and in performance tuning of Teradata SQL statements, using Teradata EXPLAIN and PMON to analyze and improve query performance.
• Working with Data scientists on migration of traditional SAS code into Hive HQL to run on Hadoop platform with higher efficiency and less time.
• Automated code deployment and EC2 provisioning using Ansible and Terraform, and performed match/merge, ran match rules to check the effectiveness of MDM on the data, and fine-tuned match rules.
• Translated, loaded and exhibited disparate data sets in various formats and sources such as JSON, text files and Kafka queues
• Written SQL overrides in source Qualifier according to business requirements and Created Oracle Tables, Views, Materialized views and PL/SQL stored procedures and functions
• Comfortable implementing CDC (change data capture) for slowly changing dimensions of types SCD Type 1, Type 2 and Type 3. Used effective start date, end date, version and flagging to capture changes to records.
• Created the Reports using Business Objects functionalities like Multiple Data Providers, Prompts, Slice and Dice and Drill Down.
• Extensively used Aginity Netezza Workbench to perform various DML and DDL operations on the Netezza database.
• Provided estimates and planning of development work using Agile software development.
Environment: Informatica Power Center 9.6.1, Informatica BDE, Informatica B2B 10.1, Erwin r9.5, R, UNIX Shell Scripting, Oracle 12c, PL/SQL, Business Objects XI R2, SQL Server 2016/14, Korn Shell Scripting, Informatica Cloud, Hive, B2B, Hadoop, MongoDB, HBase, AWS, Teradata, Netezza, SQL, T-SQL, Teradata SQL Assistant, PostgreSQL, Autosys, Informatica B2B, SSRS, Tableau, SSIS, Amazon Redshift, Amazon DynamoDB, S3.
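The change-data-capture idea used above (load only changed data into the data mart) is often implemented by comparing a hash of each source row against the hash recorded at the last load. A hedged sketch, with illustrative key and column names:

```python
import hashlib

# Sketch of hash-based change detection for CDC. Only rows whose tracked
# columns changed since the last load flow onward. Key/column names are
# illustrative assumptions, not the project's schema.

def row_hash(row, columns):
    """Stable digest of the tracked columns of one row."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode()).hexdigest()

def detect_changes(source_rows, target_hashes, key="id",
                   columns=("name", "balance")):
    """Return inserts and updates relative to the last load."""
    inserts, updates = [], []
    for row in source_rows:
        h = row_hash(row, columns)
        old = target_hashes.get(row[key])
        if old is None:
            inserts.append(row)
        elif old != h:
            updates.append(row)
        # unchanged rows are skipped entirely
    return inserts, updates
```

Skipping unchanged rows is what yields the load-time reduction the bullet describes; database-side CDC (e.g. Oracle's) achieves the same filtering from the redo stream instead.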
- Sr. ETL/Informatica Developer at Pitney Bowes
- Sr. ETL/SSIS Developer at Bank of the West
- Sr. ETL/SSIS Developer at Truven Health Analytics
- ETL Developer at eLogic Tech Solutions India
1 year, 6 months at this Job
- Bachelor's in ECE - ECE
- Doctor - Informatica PowerCenter applications
• Develops Informatica Mappings in mapping designer, uses different types of transformations to facilitate timely loading of required data.
• Extracts data from various sources, transforms data according to the requirement and loads it into the data warehouse.
• Creates workflow in task developer, configures session properties and running Sessions based on schedule.
• Develops technical documents for workflow sessions and conduct data quality tests.
• Monitors the execution of workflows, verifying the data processed, and then loads the output data to generate data presentations
• Suggests ideas in team meetings and delivers updates on deadlines, designs and enhancement.
• Consults with team members to determine system loads and develop improvement plans.
• Discuss project progress with customers, collect feedback on different stages and directly address concerns.
• Responsible for conducting troubleshooting on databases and on workflows and sessions.
- Informatica Developer at RTG Worldwide
- SQL Developer/ Reconciliation Officer at UNITY BANK PLC
2 years, 7 months at this Job
Product Delivery Team
• Informatica Developer (PowerCenter 10.1)
• Informatica Cloud (Workday Integration Connectors)
• Database Maintenance
• Advanced Microsoft Excel
• Website content updates
• QA and Pre-Production website functionality testing
• Proficiency in SQL Queries using Database Visualizer or SQL Tools
• Working on multiple client accounts daily
- Data Analyst/Informatica Developer, Technology Solutions Group at BI WORLDWIDE
- Debt Collector at FINANCIAL RECOVERY SERVICES
8 years, 8 months at this Job
- Business Administration
CVAC is an existing claims application belonging to legacy CHUBB. After ACE and CHUBB became one company, they decided to integrate and migrate common applications between legacy ACE and CHUBB into a single application. Due to this vision, they engaged IBM to help integrate and migrate CVAC data into CDW (Claims Data Warehouse).
In this integration and migration project, the business wants to migrate the CDW application from mainframe to Informatica. CDW sources data from multiple mainframe systems such as ClaimConnect and ClaimPath, so we ingest the files from these source systems into SQL Server through Informatica. This is a multilayer process: first we ingest raw files from the mainframe systems, then cleanse the data by applying business transformations, and finally store the data in the data warehouse through Informatica. Once the data is provisioned in SQL tables, the business reads these tables to generate its reports. Responsibilities:
• Worked as a Sr. Informatica Developer designing and developing the ETL process using Informatica.
• Involved in gathering requirements from end users and designing ETL solutions for those requirements.
• Performed transformations, cleansing and filtering on imported data using multiple transformations and loaded the data into the SQL data warehouse.
• Expertise in performance tuning of Informatica mappings and sessions for better performance and meeting SLA.
• Experience in integration of various data sources like Oracle, Flat files into an operational data store.
• Accountable for all phases of the Software Development Life Cycle (SDLC) and successfully delivering solutions to end customers on numerous ETL processes using Informatica. Technology Used:
• Data warehousing: Informatica Power Center 9.x/10.x
• Databases: Microsoft SQL Server
• Scheduling: Autosys
IBM India Pvt Ltd (Client AT&T), Location: NJ (USA), Project title: SCA (Sales Compensation Application)
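The multilayer flow described above (ingest raw mainframe files, cleanse via business transformations, provision to the warehouse) can be sketched as three plain-Python stages. A hedged illustration only: the pipe delimiter, field positions and trim/filter rules are assumptions, not the project's actual logic.

```python
# Sketch of a three-stage raw -> cleansed -> warehouse flow of the kind
# the CDW ingestion implements. Field layout and rules are illustrative.

def ingest_raw(lines, delimiter="|"):
    """Stage 1: land delimited mainframe extracts as raw records."""
    return [line.rstrip("\n").split(delimiter) for line in lines]

def cleanse_records(raw_records):
    """Stage 2: trim fields and drop records missing a claim number."""
    cleansed = []
    for fields in raw_records:
        fields = [f.strip() for f in fields]
        if fields and fields[0]:      # claim number assumed in column 0
            cleansed.append(fields)
    return cleansed

def load_warehouse(cleansed, table):
    """Stage 3: provision cleansed rows as dicts the reporting layer reads."""
    for claim_no, status in cleansed:
        table.append({"claim_no": claim_no, "status": status})
    return table
```

Keeping the layers separate means a bad source file fails at cleansing, before anything reaches the tables the business reports from.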
- Senior Informatica Developer at CHUBB Insurance
- ETL/Python Developer at AT&T
- Mainframes/Big Data Developer at IBM India Pvt Ltd (Client Express Scripts)
- Senior Application Developer at Oncor Electric
1 year, 8 months at this Job
- Bachelor's in Electrical Engineering - Electrical Engineering
• Involved in understanding the business requirements, discussing with Business Analysts, analyzing the requirements and preparing business rules.
• Involved in requirement gatherings from users and translated into technical specifications.
• Involved in designing the entire data warehousing life cycle
• Analyzed/Profiled the source data and involved in the gap analysis and implemented rules to cleanse the data.
• Responsible for Functional designing / technical designing of data warehouse and introducing new FACT or Dimension Tables to the existing Model
• Modeled the Data Warehousing Data marts using Star Schema.
• Worked on Informatica tools such as Source Analyzer, Data Warehouse Designer, Transformation Designer, Mapplet Designer and Mapping Designer
• Worked on the Reports module of the project as a developer on MS SQL Server 2005 (using SSRS, T-SQL, scripts, stored procedures and views).
• Used DTS/SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts, and performed actions in XML.
• Involved in running the loads to the data warehouse and data mart involving different environments.
• Responsible for definition, development and testing of processes/programs necessary to extract data from client's operational databases, Transform and cleanse data, and Load it into data marts.
• Used the update strategy to effectively migrate data from source to target. Created mappings and Mapplets using various transformations of Lookup, Aggregator, Expression, Stored procedure, Sequence Generator, Router, Filter, and Update Strategy.
• Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules.
• Created database partitions and Materialized Views and improved the performance
• Created, configured and scheduled the sessions and batches for different mappings using Workflow Manager and UNIX scripts.
• Analyzed/debugged production issues and provided quick turnaround.
• Interacted with end users to identify key dimensions and extract quantitative metrics useful in deciding business solutions.
• Used advanced T-SQL features to design and tune T-SQL to interface with the database and other applications in the most efficient manner, and created stored procedures for the business logic using T-SQL.
• Monitored and analyzed SQL Server logs and application logs.
• Extensively worked in the performance tuning of the programs, ETL Procedures and processes.
• Interacted with the offshore team every day regarding tickets/issues and followed up with them
• Organized data in the reports by using Filters, Sorting, Ranking and highlighted data with Alerts
• Used Unix Command and Unix Shell Scripting to interact with the server and to move flat files and to load the files in the server.
• Responsible for file archival using UNIX shell scripting.
Environment: Informatica PowerCenter 9.1.0, TOAD, MS SQL, Oracle 10g/9i, PL/SQL, SQL Server, SSRS, T-SQL, UNIX.
- Informatica Developer at Qualifacts
- ETL Developer at Verizon Telematics
- Business Objects Developer at HDFC Bank Mumbai
4 years, 5 months at this Job