Sr. MSBI Developer, ETL Developer
- Sr. MSBI Developer, ETL Developer
1 year, 1 month at this Job
• Interacted with Business Analysts and Project Architects to understand and analyze requirements.
• Prepared design documents (High Level Design and Low Level Design), security and other compliance documents, and the Unit Test Plan to be executed in the Development environment once code changes were complete.
• Led the offshore team, making sure deadlines were met and work was delivered per the project timelines/phases.
• Used the Informatica PowerCenter ETL tool to develop mappings/mapplets and create sessions, tasks, and workflows for integrating data into large data warehouses on Oracle and Teradata.
• Created IDQ data profiling, scorecards, mapplets, rules, mappings, workflows, and data cleansing/standardization processes using the IDQ Developer and Informatica Analyst tools.
• Developed mappings for Change Data Capture (CDC) using Informatica PowerExchange.
• Developed mappings and workflows for extracting CDC data, scheduled in the TWS scheduler tool.
• Developed Datamaps for DB2 Data sources using Power Exchange CDC.
• Used the Informatica B2B Data Exchange MFT tool to transfer files from external vendors to internal servers.
• Used Teradata utilities such as FastLoad, MultiLoad, and BTEQ scripts to load data from flat files into staging, dimension, and fact tables that feed Cognos reporting.
• Used Pushdown Optimization (PDO) technique for performance optimization.
• Developed mappings and wrote PL/SQL scripts to load data into dimension tables (SCD Types 1, 2, 3, and 4) and fact tables; wrote shell scripts to automate Informatica workflows in the scheduling tool.
• Identified bottlenecks and performance-tuned sources, targets, mappings, and sessions where they occurred, using partitioning and pushdown optimization to improve performance.
• Performed code reviews and executed the unit test cases per the UTP.
• Provided knowledge transfer to the QA team for code changes migrating to the QA environment.
• Migrated code changes to the QA environment and provided support through QA sign-off.
• Troubleshot and fixed issues/defects raised during QA testing.
• Migrated code changes to PreProd/UAT environments, troubleshooting and fixing issues.
• Provided knowledge transfer to the Production Support team for code changes migrating to the Prod environment, and helped resolve issues.
• Provided extended support during the production warranty period after implementation, fixing any issues that arose.
• Good experience with production support activities: incidents, service requests, problem tickets, change tickets, DR activities, and maintenance activities in ServiceNow and Remedy. Environment: Informatica Data Quality (IDQ) 9.x, Informatica Analyst, Informatica PowerExchange CDC (Change Data Capture), Informatica PowerCenter 10.x/9.x, B2B MFT tool, Oracle, PL/SQL, Teradata, SQL Server, DB2, Salesforce.com (SFDC).
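The SCD loads above were implemented in Informatica mappings and PL/SQL; purely as an illustrative sketch of the Type 2 logic (expire the current row, insert a new version), not the actual code, here is a minimal Python version. The table/column names (`key`, `attr`, `eff_date`) are hypothetical.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today=date(2020, 1, 1)):
    """Apply SCD Type 2: when a tracked attribute changes, close the
    current dimension row and insert a new current version.
    dim_rows: list of dicts with key, attr, eff_date, end_date, current."""
    for rec in incoming:
        current = next((r for r in dim_rows
                        if r["key"] == rec["key"] and r["current"]), None)
        if current is None:
            # brand-new business key: first version becomes current
            dim_rows.append({"key": rec["key"], "attr": rec["attr"],
                             "eff_date": today, "end_date": None,
                             "current": True})
        elif current["attr"] != rec["attr"]:
            # attribute changed: expire the old version, open a new one
            current["end_date"] = today
            current["current"] = False
            dim_rows.append({"key": rec["key"], "attr": rec["attr"],
                             "eff_date": today, "end_date": None,
                             "current": True})
        # unchanged records are left as-is
    return dim_rows
```

In the real mappings this corresponds to the Lookup/Update Strategy pair that routes rows to insert vs. update flows.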
- Sr. Informatica ETL Developer
- Sr. Informatica ETL Developer & Technology Lead at Smith & Nephew
- Sr. Informatica ETL Developer at Linde Group NA
- Informatica Developer at TDA ETL Integration
1 year, 2 months at this Job
• Collaborated with business partners to understand their business needs and to design and implement technical solutions using BI tools.
• Gathered and documented business requirements for analytical reports. Participated in requirements meetings and data mapping sessions to understand business needs. Identified and documented detailed business rules.
• Served as a strong team player on the Business Intelligence team.
• Picked up requirements quickly and delivered work within the stipulated time.
• Planned, led, designed, coordinated, analyzed, and directed BI projects.
• Performed all SDLC phases for extract, transform, and load (ETL) processes using SQL Server Integration Services (SSIS) and T-SQL stored procedures in a SQL Server 2014 environment.
• Responsible for writing queries in T-SQL, implementing views and stored procedures to support the various BI report requirements.
• Responsible for implementing Cubes (tabular) using SSAS, implementing Power BI reports with cubes as data source for live connection.
• Troubleshot several stored procedures and queries and fine-tuned them to improve performance.
• Established Joins and created Clustered/Non-clustered Indexes wherever necessary and implemented triggers and views for simplifying major tasks.
• Created Power BI dashboards with APIs and RSS feeds as data sources.
• Developed SQL queries using stored procedures, common table expressions (CTEs), and temporary tables to support Power BI and SSRS reports.
• Developed complex calculated measures using the Data Analysis Expressions (DAX) language.
• Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
• Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
• Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
• Deployed and maintained reports on Power BI Report Server and managed user access to reports and data using roles.
• Created workspaces and content packs so business users could view the developed reports in the Power BI service. Environment: Power BI (on-premises, cloud), Microsoft SQL Server, SSIS, T-SQL, Visual Studio, Windows 7/10, Excel, Informatica PowerCenter 8.6.1, TFS.
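The CTE-based report queries mentioned above often follow the ROW_NUMBER-per-partition pattern (keep the latest row per key). As a hedged sketch of that pattern in Python, with illustrative column names:

```python
def latest_per_key(rows, key, order_by):
    """Python analog of the SQL pattern:
       WITH cte AS (SELECT *, ROW_NUMBER() OVER
         (PARTITION BY key ORDER BY order_by DESC) AS rn FROM t)
       SELECT * FROM cte WHERE rn = 1
    Keeps, for each key, the row with the highest order_by value."""
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[order_by] > best[k][order_by]:
            best[k] = row
    return sorted(best.values(), key=lambda r: r[key])
```

The single-pass dictionary here mirrors what the window function does in one scan of the table.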
- Sr. MSBI, ETL Developer
- MSBI,ETL Developer at Cigna HealthCare
- MSBI,ETL Developer at United Parcel Service, Inc
- MSBI,ETL Developer at Publix
1 year, 1 month at this Job
• Redesign, enhance the process and develop jobs
• Restart or debug/fix the failed Production jobs, i.e. Informatica, SSIS packages, SSAS Cubes and Hadoop related jobs
• Operation of data warehouse and business intelligence support
• Microsoft SQL Server 2008/2012 administration and IIS administration
• Informatica PowerCenter 8.X, 9.X Administration
• TIDAL job scheduler Administration
• Hadoop Ecosystem Administration
• Shell Scripting for the file system and Informatica folder maintenance
• Use Teradata FastLoad and Multiload utilities in conjunction with Informatica PowerCenter
• Execute BTEQ scripts using SSIS Packages
• Use Multiload utility in SSIS packages
• Use Oozie to monitor and restart failed Hadoop Hive/scripting jobs
• Deploy the Informatica, SSIS Packages, SSAS Cubes and Hadoop related jobs into PRODUCTION
• Shutting down and Starting Up of Informatica servers during planned and unplanned Outages
• Modify settings in Hive scripts to optimize long-running jobs. Environment: Microsoft SQL Server 2008/2012, Analysis Services (SSAS), Reporting Services (SSRS), Integration Services (SSIS), PowerShell, Teradata, Oracle 9i/10g/11g, Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality 8.x, SQL Navigator, TOAD, SQL*Loader, MS Office 2003/2007, MDX, PL/SQL, SQL*Plus, T-SQL, XMLA, XML, TIDAL, Windows XP, Windows 2003 Advanced Server, UNIX Solaris 8 & 10, Oozie, Hadoop, Ranger, Ambari, Big Data.
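Restarting failed production jobs (Informatica, SSIS, Hadoop) as described above normally goes through the scheduler; as a purely illustrative sketch of the restart discipline, a small retry wrapper with backoff. The job callable and the retry limits are hypothetical, not any site's actual operational policy.

```python
import time

def run_with_retries(job, max_attempts=3, delay_s=0, sleep=time.sleep):
    """Run a job callable, retrying on failure up to max_attempts.
    Returns (succeeded, attempts_used)."""
    for attempt in range(1, max_attempts + 1):
        try:
            job()
            return True, attempt
        except Exception:
            if attempt < max_attempts:
                sleep(delay_s * attempt)  # simple linear backoff between retries
    return False, max_attempts
```

In practice `job` would shell out to the scheduler (TIDAL, Oozie, pmcmd) rather than run in-process.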
- Data Warehouse Consultant, ETL Developer
- ETL Lead Developer at MVP Healthcare
- Senior ETL Developer at BCBS CareFirst
- ETL Lead Developer at Pfizer
1 year, 4 months at this Job
- Bachelor's - ....
ETL Developer Responsibilities:
• Created and Modified PL/SQL packages, Triggers, Procedures, Functions and Cursors.
• Created /updated tables, Indexes, materialized views, synonyms and sequences per requirement.
• Analyzed the source (Oracle 9i) and target (Oracle 11g) databases before and after migration.
• Responsible for issue analysis and for creating Functional Specification and Design documents. After proper unit testing, moved code to the QA environment, and after user acceptance in QA, to Production; thus involved in all stages of the SDLC.
• Used PLSQL to Extract Transform and Load (ETL) the data into Data Warehouse and Developed PL/SQL scripts in accordance with the necessary Business rules and procedures.
• Extracted, Transformed and Loaded data into Oracle database using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).
• Used DBMS_SQLTUNE.REPORT_SQL_MONITOR package to generate SQL monitoring report and tune the queries.
• Performance Tuning of complex SQL queries using Explain Plan to improve the performance of the application.
• Worked on SQL tuning and optimization of Business Objects reports; generated AWR reports for performance analysis and tuning opportunities.
• Generated periodic reports based on statistical analysis of data across various time frames and divisions using SQL Server Reporting Services (SSRS).
• Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
• Created Records, Tables, Objects, Collections (Nested Tables and Varrays), and Error Handling.
• Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable.
• Performed data analysis and data profiling using SQL and Informatica Data Explorer on various sources systems including Oracle and Teradata.
• Extensively worked with BULK COLLECT and bulk INSERT/UPDATE/DELETE operations for loading, updating, and deleting large data volumes.
• Used Oracle analytic functions such as RANK, DENSE_RANK, LEAD, LAG, LISTAGG, and ROW_NUMBER for ranking and ordering data.
• Used SQL*Loader to load data from Excel file into temporary table and developed PL/SQL program to load data from temporary table into base Tables.
• Used PUTTY & secure shell to connect to UNIX machines.
• Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.
• Developed complex reports using multiple data providers, user-defined objects, charts, and synchronized queries; created a star schema in SSAS and developed ad-hoc reports for clients per their requirements using SSRS in MS SQL Server 2005.
• Experience in deploying forms and reports on the web.
• Experience in developing forms based on views, tables and procedures in tabular and form layouts.
• Worked on several UNIX/Linux wrapper shell scripts (ksh, csh, sh).
• Written shell scripts for processing of the files calling the PL/SQL packages and SQL scripts.
• Used SQL*loader to load the input files from various external systems into the database staging tables and storing it for further processing.
• Active participation in release, deployment, Data migration and Production Cut-Over activities work.
• Responsible for creating PLSQL Programs and UNIX Scripts for Data Validation and Data Conversion.
• Prepared UNIX shell scripts to transfer files from FTP to SFTP. Environment: Oracle 10g, TOAD, Windows XP, PL/SQL, SQL, Cognos 8.4, UNIX shell scripting, Informatica, PuTTY, HP Quality Center, SSRS, SSAS, Business Objects XI R3.1
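The Oracle analytic functions listed above (LAG, LEAD, ROW_NUMBER, etc.) operate over an ordered window of rows. As a small illustration of the LAG/LEAD semantics only, with made-up data, a Python sketch:

```python
def lag_lead(values, offset=1, default=None):
    """Mirror Oracle's LAG()/LEAD() over an ordered column: each
    position sees the value `offset` rows behind / ahead, with
    `default` returned at the edges of the window."""
    n = len(values)
    lag = [values[i - offset] if i - offset >= 0 else default
           for i in range(n)]
    lead = [values[i + offset] if i + offset < n else default
            for i in range(n)]
    return lag, lead
```

In SQL the ordering comes from the OVER (ORDER BY ...) clause; here it is implied by list position.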
- ETL Developer at Dean Health
- ETL Developer at Home Away
- Data Warehousing Designer & Developer at Wolters Kluwer
- ETL Developer at Wells Fargo
1 year, 10 months at this Job
- B.S - Engineering
Its main functions are to maintain, facilitate, and improve the performance of the financial system and the entities in it across business divisions, and to enforce and give effect to the law.
The Data and Reporting team is responsible for loading data from the Registry and Regulatory OLTP systems into the DWH; Cognos reports are scheduled to run after the DWH loads and are delivered to ASIC business users.
Role - ETL Developer
➢ Involved in requirements gathering and analysis; attended workshops to finalise the requirements scope for various business users' quarterly releases.
➢ Engage with Business Users and key stakeholders to perform impact analysis of requirements and feasibility study.
➢ Create and update existing mappings to fit business requirements.
➢ Design Informatica ETL mappings to pull data from heterogeneous source systems and load it into data marts and the DW.
➢ Create mappings, worklets, sessions, and workflows to implement the required logic.
➢ Perform Tests and validate Workflows to meet requirements and prepare scripts to migrate to higher environments.
➢ Monitor Daily Production ETL Jobs which source from flat files, xml and DB into DWH.
➢ Run SQL Queries against Oracle DB & DB2 to verify load process.
➢ Debug failed Jobs and resume the load process.
➢ Unit test and Migrate mappings to TST, UAT and Production environments.
➢ Prepare technical design documents and publish them to SharePoint.
➢ Manage and co-ordinate CRQ to deploy code in Higher environments.
➢ Identify and resolve Production issues.
➢ Work on Remedy tickets. Environment: Informatica 9.6, Oracle 11g, DB2, SQL, PL/SQL, SharePoint, Remedy, Cognos, Solaris 10.0
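The "run SQL queries to verify the load process" step above usually reduces to reconciling per-table row counts between source and target after each load. A hedged sketch of that check (table names and counts are hypothetical):

```python
def reconcile_counts(source_counts, target_counts):
    """Compare per-table row counts between source and target
    systems; return the tables whose counts disagree or that are
    missing on the target, mapped to (source, target) counts."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches[table] = (src, tgt)
    return mismatches
```

In the real workflow the two dicts would come from `SELECT COUNT(*)` queries against Oracle and DB2.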
- ETL Developer at Registry and Regulatory
- at ASIC (Australian Securities and Investments Commission)
- Role - Informatica Administrator at Registry and Regulatory
- Informatica Developer at Registry and Regulatory
1 year, 5 months at this Job
- - COMPUTER SCIENCE
Tools and technologies used: Talend Data Services Platform 6.4, Microsoft SQL, REST APIs, JSON, XML, Git, JIRA
Talend ETL developer and production support for ETL batch jobs and APIs in the iTraycer backend environment:
- Worked on migration of 30+ Jitterbit jobs to Talend jobs. Jobs used REST APIs, JSON, MS SQL, XML, delimited files, etc. Engaged in technical design, development, and testing of the jobs and APIs, as well as their deployment to Talend TAC. Used Git for code version control and Nexus Repository for published deliverables.
- Created and maintained Talend batch jobs and APIs to consume input client medical tray/kit data and populate an MS SQL database based on business rules. Worked on various enhancements to Talend jobs driven by new client requirements, as well as performance improvements.
- Created and maintained Talend batch jobs for sending the needed data points in the desired format (JSON, XML, text, or Excel files) to client systems. Also created jobs for sending reports to vendors/clients based on requirements.
- Created and modified MS SQL stored procedures, tables, and views to meet new business requirements.
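The Talend jobs above consume REST API JSON and populate flat SQL tables. As an illustrative sketch only (the payload shape and field names are hypothetical, not iTraycer's actual schema), nested JSON can be flattened into a flat row with dotted column names:

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON objects into a single dict with dotted
    column names, suitable for insertion as one flat DB row."""
    row = {}
    for key, val in obj.items():
        name = f"{prefix}{key}"
        if isinstance(val, dict):
            row.update(flatten(val, name + "."))  # recurse into sub-objects
        else:
            row[name] = val
    return row
```

A Talend tExtractJSONFields component does the equivalent mapping declaratively; this just shows the transformation.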
- ETL Developer at PROFESSIONAL WORK
- Software Engineer III at Sears Holdings Corporation
1 year, 2 months at this Job
- Bachelor of Technology - (B. Tech) - Electronics & Communication
Team size: 8
Los Angeles, CA. Environment: Informatica PowerCenter 9.6, Autosys, Tableau, Oracle 10g, SQL Server
L.A. Care provides health insurance for low-income individuals in Los Angeles County through various health coverage programs. Work was tracked in JIRA and organized in sprints, with round-the-clock coverage for on-time delivery.
• Used Informatica PowerCenter as an ETL to extract data from source like MS SQL Server, Flat files, Oracle and loaded to target
• Used Lucid charts for Data modelling
• Extensively worked with the various client components of Informatica: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager, Workflow Manager, and Workflow Monitor
• Extensively used transformations such as Lookup, Joiner, Aggregator, Filter, Sorter, Expression, Update Strategy, Source Qualifier, Rank, and Router to create several mappings and mapplets
• Implemented Slowly Changing Dimensions (Type-1, Type-2) using Informatica ETL mappings
• Created mapplets and reusable sessions for performance tuning
• Worked with Informatica workflow monitor in running and debugging its components and monitoring the resulting executable version
• Involved in fine-tuning of sources, targets, mappings and sessions for performance optimization
• Monitored the previous day's mapping runs daily and fixed any issues
• Involved in unit, system and end-to-end testing of the design
• Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations
• Extensively worked on the performance tuning of the Mappings as well as the sessions
• Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions
- ETL Developer at L.A Care
- Informatica Consultant at First American
- Informatica Consultant at Verizon
- Informatica Consultant at GE Aero - GRNI
1 year, 6 months at this Job
➢ Working on the Service Center Number Expansion (SCNE) project as Sr. ETL Developer/Analyst.
➢ Responsible for gathering and analyzing requirements and translating business requirements into ETL solutions for the SCNE changes.
➢ Involved in creating the High Level Design and Detailed Design documents for the SCNE project and assessing the impact on all upstream and downstream applications and programs.
➢ Responsible for modifying Informatica PowerCenter code, stored procedures, and Teradata tables wherever the service center number is used, according to business needs.
➢ Modified code and performed end-to-end testing to ensure the changes were reflected everywhere and data was populated with the correct data types.
➢ Worked and coordinated with team members with all the phases of the project.
➢ Modified Informatica mappings so that data, including historical data, populated in the correct format.
➢ Worked in Unix servers to run and modified the scripts and parameter files.
➢ Worked on TFS DW Tool for automated Migration from Dev to QA and Prod.
➢ Learned new technologies and applications.
➢ Learned the WhereScape tool while working on this project.
➢ Worked in an Agile team environment, assisting in delivering EDW/BI business value as defined by a Product Owner. Worked with product owners to plan and define requirements, then with the implementation team to design, develop, test, and maintain high-performance data in the project.
➢ Created test plans and test cases, and documented every modification to minimize errors and risk.
➢ Flexible team player, eager to learn new technologies. Environment: Informatica PowerCenter 10.1, Informatica PowerExchange 10.1, Informatica Data Quality 10.1, Teradata 14, Oracle 12c, SQL Developer, UNIX shell scripting, SQL Server 2008, Autosys 11.3, DB2.
- Sr.ETL Developer/Analytics at Kaiser Permanente
- Sr. Informatica/ETL Developer
- Sr. Informatica/ETL Developer at JP Morgan Chase
- Informatica/ETL Senior Developer at Legal Aid Society
1 year, 2 months at this Job
- Bachelor of Arts - Economics and Persian
Microsoft Visual Studio 2008/2010/2012.
• Created a metadata-driven ETL framework to enforce company standards across all ETL developers.
• Responsible for performance tuning of stored procedures and database tables using table partitioning, SQL Profiler, and the Database Engine Tuning Advisor.
• Hands-on experience with the overall ETL (Extract, Transform & Load) process.
• Skilled in high-level design of ETL DTS packages for integrating data from heterogeneous sources (Excel, CSV, Oracle, MySQL, PostgreSQL, flat files, text-format data).
• Hands on experience in MS SQL Server Integration Services (SSIS), MS SQL Server Analysis Services (SSAS) and MS SQL Server Reporting Services (SSRS) using Business Intelligence development studio (BIDS).
• Worked on upgrading from DTS to SSIS packages.
• Experience in designing Database Models using Microsoft Visio and creating class diagrams, activity diagrams, use cases diagrams, sequence diagrams and flow charts using UML.
• Experience in design, development, implementation, and documentation of business requirements in the Microsoft .NET Framework 2.0/3.5 using ASP.NET, ADO.NET, C#.NET, VB.NET, web applications, Windows applications, and XML.
• Worked on Notification services in setting up the Scheduled jobs and alerts.
• Hands on experience on Data WarehouseStar Schema Modeling, Snow-Flake Modeling, FACT& Dimension Tables, Physical and Logical Data Modeling.
• Proficient in implementing business logic to design and develop cubes, aggregations, and measures using SQL Server Analysis Services (SSAS).
• Expertise in generating reports using SQL Server Reporting Services, Crystal Reports, and pivot charts/tables in MS Excel.
• Extremely motivated, diligent, conceptually strong team player with ability to take new roles and adapt quickly to new technology.
• Detail-oriented, results-driven, excellent verbal and written communication skills with interpersonal and conflict resolution skills and possesses strong analytical skills.
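A metadata-driven ETL framework like the one described in the first bullet typically reads a configuration table of source-to-target entries and drives a generic load loop from it. As a minimal hedged sketch (the config fields `source`, `target`, and `enabled` are illustrative, not the actual framework's schema):

```python
def run_metadata_driven(config, extract, load):
    """Drive ETL from metadata: for each enabled config entry,
    extract rows from the named source and load them into the
    named target. Returns target -> row count loaded."""
    results = {}
    for entry in config:
        if not entry.get("enabled", True):
            continue  # skip disabled feeds without failing the run
        rows = extract(entry["source"])
        load(entry["target"], rows)
        results[entry["target"]] = len(rows)
    return results
```

The point of the pattern is that adding a new feed means adding a metadata row, not writing a new package.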
- Sr. BI Developer (SSIS/SSAS/SSRS)
4 years at this Job
- Bachelor's Degree - Computer Science