Regal Beloit is a leading manufacturer of electric motors headquartered in Beloit, WI, catering to a wide range of markets from heavy industry to high technology. Regal has grown significantly through the numerous acquisitions it completes each year, which creates the need to work with many different ERP systems for managing the businesses.
Projects at Regal Beloit:
Regal's acquisition of A.O. Smith warranted integrating the reporting for EPC's Sales, Orders & AR data into the Regal framework. EPC was on a different ERP (MFG/PRO) and its reporting was based on Business Objects. This project was an effort to read data from MFG/PRO and load it into Regal's existing warehouse so that there is a common reporting platform, as with all of Regal's other businesses.
RK/PK - Customer On-Time Delivery Metrics
This project was a result of Regal's customer satisfaction initiative. Its prime objective is to measure the on-time delivery metrics of shipments from Regal to the customer. Metrics are captured from multiple ERP systems across different geographies to evaluate the two main KPIs, RK (Request Kept) and PK (Promise Kept), at Regal's different plants.
Unico - Integration Project
This project is again a result of Regal's acquisition of the Unico business, whose ERP is IFS. Through this project, data from IFS was integrated with the existing warehouse to enable reporting on Sales & Orders from Unico.
Regal has a global presence with businesses on different ERPs. The company's objective is simplification: to bring all businesses onto a single ERP system, which involves taking into account country-specific rules as well as currency conversions.
• Participated in Requirement Gathering sessions to understand business needs
• Designed the data warehouse model to suit the specific business needs
• Designed the ETL process to cater to the user requirements
• Designed and developed various mappings in Mapping Designer, and sessions and workflows in Workflow Manager, to extract data from flat files, Oracle sources & third-party ERP systems for loading into the warehouse
• Developed Transformation logic and designed various complex Mappings in the Designer
• Worked with various transformations such as Lookup, Aggregator, Expression, Router, Filter, Update Strategy, Stored Procedure, and Sequence Generator.
• Created and ran debug sessions in the Debugger to monitor and test sessions prior to their normal run in the Workflow Manager.
• Ran the PowerMart sessions through DAC.
• Created user-friendly UNIX shell functions for data quality checks and other requirements (a sketch follows this list).
• Created the Presentation layers in RPD for the reports to be built upon the data warehouse.
• Created OBIEE Dashboards and reports.
• Provided production support post go-live. Environment: Informatica Power Center 7.3.2 / 8.6 / 9.0.1, UNIX, Oracle 10g / 11i, EBS 11i / R12, Data Warehouse Administration Console (DAC), Oracle BI Dashboards, BI APPS, OBIEE 10.1.3.4.1
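As an illustration of the reusable UNIX shell functions mentioned above, the following is a minimal sketch of a data-quality check, assuming a delimited flat file staged for the warehouse load; the function name, file path, delimiter, and column count are hypothetical:

# check_row_width: flag records whose column count differs from the expected value
# usage: check_row_width <file> <delimiter> <expected_columns>
check_row_width() {
    file="$1"; delim="$2"; want="$3"
    awk -F"$delim" -v want="$want" \
        'NF != want { bad++ } END { print "bad rows: " bad+0; exit (bad+0 > 0) }' "$file"
}

# example: validate a staging extract before the Informatica load (illustrative path)
check_row_width /stage/epc_sales.dat "|" 12 || echo "data quality check failed"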
- OBIEE/Informatica Architect/Lead at Regal Beloit
- ETL Lead at Decision Support & Insight
- Global Head Count & Inventory Aging at Regal Beloit Corporation
- Sr. ETL Developer at Regal Beloit Corporation
7 years, 2 months at this Job
- Bachelor of Technology - Technology Exposure
Supporting the Defense Logistics Agency (DLA) Transaction Services Logistics Data Gateway (LDG) Data Warehouse/Data Mart Initiative. Served as the Administrator/Team Lead of the Developer Team (3 staff members). Involves training the Developers in the use and functionality of the Informatica toolset and optional plug-ins to develop/design ETL processes and EDI X.12 unstructured-data parsers (B2B Data Transformation), including Web-Service Provider and Consumer transformations. Assign tasks to Developer staff as appropriate to accomplish enhancements to existing processes, to automate manual processes, and to implement solutions to new requirements. Monitor/supervise the development cycle through specification, build, and unit testing. Deploy ETL, parsers, and web-service processes to various life-cycle staging points (e.g., Development, System-Regression Verification, End-User Acceptance, Promotion-to-Production), assisting staff in each stage. Continue to perform Change Management Administration tasks to test/verify/deploy staff-developed ETL solutions. Continue to perform the duties of the Software Engineer III slot, with the main focus of upgrading the Informatica COTS application (e.g., server configurations, performance tuning on processes) to maintain the service/support vendor contract at the latest 'generally available' version. The current GA version 10.2 requires moving the environment from HP-UX platforms to RedHat Linux, as HP-UX is no longer supported. This effort is in progress, with an implementation in a 'Development/Test-Bed' environment during Winter/Spring 2019 and proposed Production environments during C.Y. 2019.
- Senior Informatica Architect at Serco NA
- Analyst at IBM
- Software Engineer III supporting the Defense Logistics Agency
- Senior Software Engineer at Liberty Mutual Insurance
8 years at this Job
- B.S. - Computer Technology
- A.A.S. - Computer Technology
• 10 years of IT experience in the banking domain; my area of expertise has been banking customer information maintenance. My forte is Teradata, with exposure to UNIX, Informatica, and Mainframe
• Experience in Teradata utilities like BTEQ, TPT, FastExport, FastLoad, and MultiLoad to export and load data from different source systems (a BTEQ sketch follows this list)
• Managed Development, Technology re-engineering projects, involving Teradata, DB2
• Experience with Data Extraction, Transformation, and Loading (ETL) from different Data sources like DB2, Flat files using Informatica
• Involved in backend DB and batch process development, maintenance, optimization, monitoring, and support
• Experience with Agile and waterfall development and project methodologies
• Experience in trouble shooting and resolving data issues
• Ability to adapt seamlessly to new environments/people and to pick up new technologies/processes with ease
• Worked closely with client managers/business analysts of the bank to drive technical solutions, design and provide development estimates for schedule and effort
• Dynamic, hard-working, with the ability to work in groups as well as independently, the initiative to learn new technologies/tools quickly, and an emphasis on delivering quality services
• Strong ability to build productive relationships with peers, management, and clients using strong communication, interpersonal, organizational, and planning skills
Technical Skills:
Databases - Teradata
ETL Tools - Informatica, UNIX
Teradata Utilities - MLoad, FLoad, TPump, FastExport, BTEQ, TPT
Tools - Teradata SQL Assistant, Toad, PuTTY, SSH Tectia, JIRA
Operating Systems - OS/390, UNIX, Windows
Languages - COBOL, Easytrieve, JCL
Testing Software - Quality Center
Job Scheduling - CA7 Scheduler, Autosys
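As an illustration of the BTEQ usage mentioned above, a minimal export script might look like the following; the TDPID, credentials, output path, and table names are placeholders, not the bank's actual objects:

# run_bteq_export.sh - export a table to a delimited file via BTEQ (illustrative names)
bteq <<'EOF' > bteq_export.log 2>&1
.LOGON tdprod/etl_user,etl_password;
.EXPORT REPORT FILE = /data/export/customer_accounts.txt;
SELECT CAST(account_id AS VARCHAR(20)) || '|' || account_status
FROM   edw.customer_accounts;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF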
- Teradata and ETL Informatica - Architect - DW at Bank Of America
- Teradata Senior developer at Bank Of America
- Teradata Developer at Bank Of America
4 years at this Job
• Responsible for driving the overall design and determining the ETL architecture necessary for the BI team to develop and implement large complex projects
• Responsible for leading and guiding Informatica based ETL architecture to handle complex business rules while balancing the fulfillment of stringent performance requirements
• Partner with the ETL Lead and ETL developers through the development of complex ETL routines to load the data marts, the Data Warehouse, and supporting layers. Lead and guide on error-handling routines and load-balancing processes.
• Determine organizational strategies for data integrity validation processes.
• Establish policies and best practices for optimizing ETL data throughput/accessibility
• Identify opportunities for new architectural initiatives; makes recommendations on the increasing scalability and robustness of ETL platforms and solutions
• Remain current on new ETL techniques and methodologies and communicate trends and opportunities to management and other developers as needed.
• Identify opportunities for uses of those technologies to enhance current or anticipated information systems and business goals/needs
• Support ETL developers by providing technical assistance, troubleshooting and alternative development solutions.
• Enterprise MDM solution architecture and implementation approach research
• Responsible for the development and enforcement of best practices for the BI team
• Mentor ETL developers, BI analysts and other members of the BI team
• Responsible for defining and implementing Enterprise Data Catalog solution
• Enterprise Ontology management - Experience in OWL validations using Protégé
- Data Solutions Architect/ Informatica Architect at Voya Financial
- Business Solutions Consultant at Best Practice
- Sr. ETL Architect at Insurance Solutions
- Design Team Lead / Big Data Management at Evarient
5 months at this Job
• Played a key role in the team in developing mappings and workflows and troubleshooting issues.
• Analyzed the source data coming from various databases and files.
• Identified and tracked slowly changing dimensions (SCD) and used Change Data Capture (CDC) logic for loading the SCD tables in Oracle.
• Designed complex mapping logic to implement SCD Type 1 and Type 2 dimensions and worked on critical dimensional modeling, which helps structure and organize data in a uniform manner with constraints placed within the structure, a core concept of data modeling (an SCD Type 2 sketch follows this list).
• Extracted data from Oracle, flat file, Excel, XML, and COBOL sources and used Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to extract and load data into the target systems.
• Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ).
• Created custom plans for product name discrepancy checks using IDQ and incorporated the plans as a Mapplet into PowerCenter.
• Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
• Designed data quality processes using IDQ and developed several Informatica mappings.
• Designed reference data and data quality rules using IDQ and was involved in cleansing the data in the Informatica Data Quality 9.1 and 9.6.1 environments.
• Fixed invalid mappings, debugged mappings in the Designer, and performed unit and integration testing of Informatica sessions, worklets, workflows, and target data.
• Created reusable Tasks, Sessions, reusable Worklets and workflows in Workflow manager.
• Worked closely with upstream teams to fix issues, completed tasks effectively and on time, and promoted them to production.
• Very familiar with Sorter, Filter, Expression, Consolidation, Match, Exception, Association, and Address Validator transformations.
• Worked extensively in Informatica Designer and Workflow Manager to create sessions and workflows, monitor the results, and validate them against the requirements.
• Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes. Environment: Oracle 11g, PL/SQL, Informatica PowerCenter 9.x/10.x, Data Quality 9.x
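The mappings above implement the standard SCD Type 2 close-and-insert pattern. Purely as an illustration of that pattern (not the project's actual Informatica logic), the equivalent operation expressed as SQL run from a shell wrapper might look like this, with schema, table, and column names hypothetical:

# apply_scd2.sh - sketch of an SCD Type 2 close-and-insert for a customer dimension
sqlplus -s etl_user/etl_pwd@DWPROD <<'EOF'
-- expire the current version when a tracked attribute has changed
UPDATE dim_customer d
   SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address    <> d.address);

-- insert the new current version for changed or brand-new customers
INSERT INTO dim_customer (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
SELECT s.customer_id, s.address, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.address = s.address);

COMMIT;
EXIT;
EOF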
- Informatica Architect at VERIZON BUSINESS
- APPLICATION SPECIALIST V at BANK OF AMERICA, Charlotte, NC
- ETL Architect at BB&T BANK, Winston-Salem, NC
- Offshore Team Lead at HARVARD PILGRIM HEALTH CARE
9 months at this Job
- Master of Science
• Design the Data Integration process.
• Implement procedures for suggestion and implementation of ETL processes and architecture enhancements.
• Coordinated with technical teams for estimation and translation of client and business requirements into specific systems.
• Coordinate the ETL code deployment to the production environment.
• Executed methods for designing application modules to produce required products.
• Document technical details related to the ETL Integration.
• Prioritizing the ETL work from high to low depending on the business requirements.
• Standardize the ETL deployment procedure, ETL naming standards, and ETL production support documentation.
• Manage and assign the day to day tasks to the ETL developers and give an update to the business partners.
• Define Scope of the Data Conversion to the Team.
• Work with the QA team to prioritize the resources and provide updates on the analysis.
• Identifying all the Data Sources And Destination Objects for Data Conversion.
• Analyze and Profile Data and publish the reports to the SME and Business.
• Gather Requirements for Data Conversions and Provide estimates.
• Hands-on experience importing Power Exchange sources and targets using Informatica Power Exchange 9.6
• Involved in analysis, design, development, implementation, and quality assurance of DW applications and the reporting system. The data warehouse solution provides the claims department with its reports.
• Coordinate with the offshore and off-site development teams for updates.
• Detailed data analysis to determine the best way to improve the Process flow.
• Involved in Release management work, which includes creating support document, managing the process flow to achieve optimal results.
• Data imports using SQOOP table import and SQOOP import-all-tables to the Hadoop file system (a SQOOP sketch follows this list).
• Import Data into HIVE using SQOOP command line and overwrite existing tables.
• Export HDFS file to MySQL database using SQOOP export.
• Evaluate SQOOP SQL statements using the eval command.
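A minimal sketch of the SQOOP commands described above; the JDBC connection strings, schema, table, and directory names are placeholders, not the actual project objects:

# import a single table into HDFS
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user -P \
  --table CLAIMS.CLAIM_HEADER \
  --target-dir /data/raw/claim_header

# import all tables of a schema directly into Hive, overwriting existing tables
sqoop import-all-tables \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user -P \
  --hive-import --hive-overwrite --hive-database staging

# export an HDFS file back to a MySQL table
sqoop export \
  --connect jdbc:mysql://dbhost/reporting \
  --username etl_user -P \
  --table claim_summary \
  --export-dir /data/out/claim_summary

# sanity-check a SQL statement with eval before wiring it into a job
sqoop eval \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user -P \
  --query "SELECT COUNT(*) FROM CLAIMS.CLAIM_HEADER"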
- Informatica Architect at Wells Fargo
- Lead IT Developer at Ecolab/Nalco
- ETL Developer/Architect at Lenovo
- ETL Developer/Architect at Lincoln Financial Group
4 years at this Job
- ETL Informatica Architect at Pyramid Technology Solutions, Inc
12 years, 11 months at this Job
- Bachelor's - Computer Science
Project Name: Federal Reserve Bank of New York, New York, NY
Duration: Aug 2015 till date
Platform & Skills: Oracle 11g/12cR2, 12c Cloud Control, Veritas Cluster, Solaris 10, Linux 5/6, RAC/RAC-ONE (11gR2/12c), OID, 12c Goldengate, 12c/13c OEM, Informatica 9.5/9.6.1
Role: Datawarehousing Architect/Senior Oracle DBA
● Migrated Informatica source and target databases from 11g to 12c pluggable databases
● Migrated Informatica 9.5 to Informatica 9.6.1 on the new servers along with upgrading the database from 11g to 12c
● Tuned both source and target databases at various levels for ETL using informatica
● Created and modified transformations for efficient data loads and business changes
● Involved in data modeling and ETL strategy design
● Developed a strategy to migrate 11g databases into 12c as PDB databases using Data Pump (a sketch follows this list)
● Installed and created 12c RAC-ONE databases on Linux
● Developed provisioning process using 12c OEM for installing 12c GI, RAC-ONE RDBMS and create CDB/PDBs
● Extended the existing clusters to add third node and resolved the issues
● Resolved the issues with OCR/Voting disks
● Standardized the Oracle Homes for GI/RDBMS and OEM installations
● Configured ACFS
● Installed and configured 12c Goldengate on ACFS and configured high availability
● Developed provisioning for patching process with 12c OEM
● Configured dataguard for 12c RAC-One databases
● Relocated RAC-ONE databases for maintenance operations to reduce downtime
● Resolved RMAN backup issues
● Configured OLS for data security
● Created and managed wallets for RMAN backups and datapump
● Developed a strategy to migrate stats and strategy to collect stats in optimal way
● Tuned the databases at various levels
● Upgraded 12c OEM to 13c OEM
● Upgraded 11g Oracle Restart databases with dataguard in place
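As an illustration of the Data Pump based 11g-to-12c PDB migration strategy mentioned above, a minimal sketch might look like the following; the connect strings, directory object, and dump file names are placeholders:

# export the full 11g source database (directory object and dump file names are illustrative)
expdp system/password@SRC11G full=y directory=DP_DIR \
      dumpfile=src_full_%U.dmp logfile=src_full_exp.log parallel=4

# import into the target 12c pluggable database created beforehand
impdp system/password@PDB12C full=y directory=DP_DIR \
      dumpfile=src_full_%U.dmp logfile=src_full_imp.log parallel=4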
- Datawarehousing Architect/Senior Oracle DBA at Veritas Cluster
- Senior Oracle DBA at ORACLE CORP
- Datawarehousing Architect/ Senior Oracle DBA at Platform & Skills
3 years, 5 months at this Job
- B.Tech - Computer Science and Engineering
Technologies: AWS, Redshift, S3, Hadoop - EMR, Spark-SQL, Python, RDS, EC2, Athena
Reporting: Tableau 9.1, QuickSight
Database: Redshift - Big Data Analytics, Oracle 11g
Part of the fastest-growing Finance BI team, using analytics to solve business problems.
Building the next-generation data lake platform that can scale horizontally and vertically on both storage and compute with a decoupled architecture, using the AWS stack to build the data lake in the cloud (see the sketch below).
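Purely as an illustration of the decoupled storage/compute pattern described above (the bucket, prefix, database, and table names are hypothetical), landing data in S3 and querying it with Athena from the AWS CLI might look like:

# land a raw extract in the data lake bucket
aws s3 cp /data/extracts/gl_transactions.csv s3://finance-data-lake/raw/gl_transactions/

# query the decoupled storage layer with Athena, writing results back to S3
aws athena start-query-execution \
  --query-string "SELECT ledger, SUM(amount) FROM finance_raw.gl_transactions GROUP BY ledger" \
  --query-execution-context Database=finance_raw \
  --result-configuration OutputLocation=s3://finance-data-lake/athena-results/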
Project: Roche POC
Team Strength: 3
Duration: Sep-2016 to Dec-2016
Role: DWH Solution Architect
Technologies: Athena (In-House ETL)
Reporting: Tableau 9.1
Database: HP Vertica - Big Data Analytics, Oracle 11g
Laboratory Information System: Roche Infinity Lab System
Build a Data Mart to provide a laboratory analytics solution that helps Roche determine and track various KPIs, build insights, and report on Utilization Management. Reports provide insights on client order volume trends and comparisons across clients, and compare lab test order trends by physician. Report TAT (Turn Around Time) taken at various events of an order, and report outlier orders missing the TAT target time. Staffing analysis across dimensions like time and day of week to determine and predict the average volume of incoming orders and distribute staff across daily shifts. TAT and QC analysis on Roche instruments used in GenLabs, plus Sendout analysis and trends.
• BI Architect responsible for design and code review.
• Data Analysis to identify the relations and business entities from Infinity Source System.
• Build high level Data Model for POC.
• Tableau report design and ETL design
• Performance tuning at HP Vertica Database level.
- DWH Solution Architect at Amazon - FinTech
- DWH Solution Architect at U.S Laboratory System
- DWH Solution Architect at HP Vertica - Big Data Analytics
- Solution Architect at PowerExchange, HortonWorks
3 years at this Job
- Master - Computer Applications
- Bachelor - Computer Science
Background: Fiserv, Inc. is a US provider of financial services technology. The company's clients include banks, thrifts, credit unions, securities broker-dealers, leasing and finance companies, and retailers. Based in Columbus, Ohio, the role is to provide architecture and administration for products such as Informatica PowerCenter, IDQ, and Metadata Manager for Fiserv internal customers, as well as production support for the E-Payments Shared Services Team.
● Informatica Administration and initial setup in DEV/QA environment for internal customer.
● ETL Architecture and direction for internal customer.
● Linux administrative tasks such as developing shell scripts for ETL monitoring (a sketch follows this list).
● Provide production support for the E-Payments Shared Services Team and work on real-time production issues.
● Handle deployment activities and Prod Control Change requests in ServiceNow, and code walkthroughs for ETL mapping changes. Environment: Informatica Power Center 9.6.1, IDQ, Metadata Manager, IBM Netezza, SQL Server 2016, Oracle 12c, Linux, Business Objects XI, CA7 Mainframes.
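A minimal sketch of the kind of shell script for ETL monitoring referred to above; the process pattern, flag path, log path, and mail recipient are placeholders:

# etl_monitor.sh - log active Informatica DTM processes and alert on a missing load flag
RUNNING=$(ps -ef | grep -c '[p]mdtm')
echo "$(date '+%Y-%m-%d %H:%M:%S') active DTM processes: $RUNNING" >> /var/log/etl_monitor.log

# alert if the nightly load has not written its completion flag
if [ ! -f /opt/etl/flags/nightly_load_done.flag ]; then
    echo "Nightly load completion flag is missing" \
        | mailx -s "ETL monitor alert" etl-support@example.com
fi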
- Senior Software Engineer BI/Informatica Specialist at Fiserv
- Senior Informatica Architect at Kaiser Permanente
- Informatica Azure Big Data consultant at Marietta Memorial Hospital
- Senior Informatica/SAP Consultant at Utopia Global, Inc
6 months at this Job
- Bachelor's - Civil Engineering