Work as an ETL Architect, leading the architecture, design and development of ETL solutions spanning multiple disparate source systems and applications.
• Work as an ETL Architect performing assessments, capacity planning, system architecture, design, development and implementation of ETL solutions.
• Collaborate with Data Management team to develop architectural requirements to ensure that business needs are met.
• Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture.
• Identify, recommend and implement ETL processes and architecture improvements.
• Evaluate new tools and techniques and make recommendations to improve real-time and batch data access, transformation and data movement across heterogeneous technologies and platforms.
• Work with the Enterprise Architect to develop and enhance data warehouse standards and procedures.
• Troubleshoot Informatica issues and work with Informatica technical support to resolve issues in a timely manner.
• Performed POCs with Snowflake, AWS ML and Talend.
• Played a key role in migrating our Informatica environments from CenturyLink to AWS.
• Developed ETL processes to move data between AWS EC2 and S3 buckets.
• Created data pipelines for batch and near-real-time data processing.
• Developed several Python scripts to automate Informatica migrations.
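A minimal sketch of how such a migration script might wrap Informatica's `pmrep` repository CLI. The folder and object names here are hypothetical, and the exact `objectexport` flags vary by Informatica version, so treat this as an illustration rather than a drop-in tool:

```python
import subprocess

def build_export_cmd(folder, obj_name, obj_type, out_file):
    """Build a pmrep objectexport command line for one repository object.
    Flag names follow the pmrep CLI convention but should be verified
    against the installed Informatica version."""
    return ["pmrep", "objectexport",
            "-n", obj_name, "-o", obj_type,
            "-f", folder, "-u", out_file]

def export_objects(objects, runner=subprocess.run):
    """Export each (folder, name, type) tuple to an XML file via pmrep.
    The runner is injectable so the command construction can be tested
    without an Informatica installation present."""
    for folder, name, obj_type in objects:
        out_file = f"{folder}_{name}.xml"  # illustrative naming scheme
        runner(build_export_cmd(folder, name, obj_type, out_file), check=True)
```

The exported XML files can then be imported into the target environment with the matching `pmrep objectimport` command, which is what makes bulk migrations scriptable.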
- ETL Architect at OceanX, LLC
- Sr. Data Management Developer/Informatica Administrator at Guthy-Renker
- Data Warehouse Specialist at Mattel
- Informatica ETL Developer (Contract) at Toyota Motor Sales, USA, Inc
3 years at this Job
- BS - Computer Science
- A.A. - Data Processing
04/2007 - Present
ETL Architect
This project involves building a new data warehouse for the client. The new data warehouse is based on the customer's business model and is source-system independent. Data from various systems is cleansed and stored in a normalized database. Teradata is used as the database, DataStage as the ETL tool and Business Objects for report generation.
Environment: Windows, DataStage, Teradata, Business Objects.
Responsibilities:
- Gathering business requirements
- Design and development of Grouper data using Symmetry Suite 8
- Creating data mappings between the various source systems and the data warehouse objects
- Responsible for Teradata 15.0 administration
- Writing BTEQ scripts
- Developing the data model using Erwin
- Writing technical specifications for ETL programs and model metadata
- Production support
- Designing the general ETL infrastructure necessary to address common issues such as:
  ◦ Error handling and restart/reprocessing techniques
  ◦ Parameterizing the DataStage jobs and building the controls necessary to feed parameter values correctly at run time
  ◦ Auditing the ETL programs
  ◦ Scheduling the DataStage jobs
- ETL Architect at Department Of Health Care Services , CA
- at Harvard Pilgrim HealthCare, MA
- Extensively used Meta Broker at LOWES
- Data Warehouse Specialist at IBM/ FORD MOTOR COMPANY
11 years, 9 months at this Job
Role - Data Architect
Daily Units is an ecosystem that gathers data from various point-of-sale and ERP systems and transforms it into a consistent, consumable set of metrics. Daily Units reports help users see how sales and production information is tracking against budget and forecast numbers.
• Subject matter expert on the Daily Units Reporting eco-system
• Quickly learned the homegrown ETL platform used to build Daily Units.
• Responsible for the correctness of data for 70 companies and over 30 different data sources on a daily basis.
Transportation Data Project
Role - ETL Architect
The Transportation Data Mart is a data integration project that brings together point-of-sale, ticketing system, time card and ERP system data. It provided insight into construction plant and crew efficiencies and is projected to save the company a million dollars or more in the coming years.
Responsibilities:
• Designed and developed a data mart that brings in the data from disparate sources like Point of Sales, Ticketing System, Time Card and ViewPoint ERP system.
• Developed reporting solutions that helped the national transportation team track construction plant and crew efficiencies.
Environment: SQL Server 2014, SQL Server 2012, SSDT, BIDS, SSRS, SSAS, SSIS, CRH homegrown ETL platform, Oracle, Tableau
- Data/BI/ETL Architect at CRH Materials
- Data Architect at
- Sr. BI Consultant at CareerBuilder
- ETL Architect & MDM Consultant at TWC - The Weather Channel
7 months at this Job
- Doctorate in Computer Science - Computer Science
- Master of Computer Science - Computer Science
- Bachelor of Computer Science and Engineering - Computer Science and Engineering
• Architected the store setup following the IBM WebSphere Commerce data model, customizing the load process and providing customer account, product and order data for the e-commerce websites.
• Analyzed the data and designed the ETL components to implement the Working Capital project, which led to $9 M in cost savings within a span of 6 months; similar features were then rolled out to other stores.
• Architected and delivered the functionality for providing price information for all products in a customer's catalog, enhancing the customer experience.
• Architected and delivered catalog product management for the SoneBiz websites.
• Created logical data models for many Sonepar BI and eCom applications, such as Stock Management, Inventory, Customer Part Number and Para reel Builder data flows.
• Designed the email marketing data flow to send campaign emails to customers for new user registrations, special offers, etc.
• Created dashboards and visualization reports using Tableau to demonstrate POCs to the client on various BI initiatives.
• Created relational databases and tables for analytical reporting of the product model.
• Gathered requirements and performed gap analysis by interacting with the client and evaluating existing tickets for each of the BI and eCom enhancement and support tasks.
• Performed a POC for migrating one of the ERP systems from SQL Server to Teradata to handle large data volumes.
• SME for commerce platform, customer, products and sales functional data.
• Designed new ETL and data model architecture to support Reporting and Business Decision making.
• Conducted analysis by executing SQL queries against Oracle/MS SQL Server databases to validate data from the source data warehouse and verify that the data was loaded correctly into the BI data warehouse.
• Created Excel reports and PowerPoint presentations to explain trends/shortfalls from the data analysis.
• Delivered quality code adhering to agile methodologies.
• Performed dimensional modeling using Visio and SQL Power Architect.
• Led a team of 4 on-shore and 10 offshore resources for the delivery of BI and ecom projects.
• Estimated the ETL work for each of the modules/task.
• Created Project Plan, Project Estimation (hours) for planning the sprints and deliverables.
• Worked with the scrum team to assign projects/enhancements to sprints from the backlog.
• Acted as onshore-offshore coordinator, providing timely status to both client managers and the client team.
- Data Architect/ ETL Architect at Sonepar
- BI Lead at Sonepar
- BI Lead/ Sr. BI Developer at AT&T
- BI Developer/Onshore Liaison at Washington Mutual
5 years, 10 months at this Job
• Hands-on development experience using Informatica PowerCenter, Informatica BDM, BDS, big data technologies and AWS cloud.
• Hands-on experience in ETL architecture, design and development for more than 20 projects, involving detailed requirement-gathering discussions with business teams, analysis and understanding of requirements, and delivery of integrated, scalable and resource-optimized technical solutions.
• Designed and developed ETL to handle conversion of ICD-9 to ICD-10 procedure and diagnosis codes for source systems such as NASCO, Facets, NCAS and FEP.
• Converted existing MS SQL script-based extract and load queries to Informatica ETL workflows.
• Redesigned the legacy MS SQL scripts using Informatica, implementing a successful multidimensional OLAP data warehouse; designed and developed the TDS reporting layer with rollup data on which the MicroStrategy reports ran.
• Designed and developed an enrollment data layer to dynamically calculate valid daily enrollments for CF Large Group members.
• Reengineered the existing Audit, Balance and Control framework for ETL using big data technologies such as Informatica BDM, BDS, the Kafka connector and Hive.
• Experienced with ingestion and processing of real-time continuous log data: used Informatica BDS and the Kafka connector to stream and analyze Informatica runtime logs, integrated Informatica BDS with the Spark engine for real-time analysis of Informatica error logs, stored the files to HDFS in Parquet format and used Hive external tables to view and extract the processed data for downstream dashboard reporting.
• Architected AWS Elastic MapReduce (EMR) clusters for creating data lakes on S3 and building an OLAP data analytics layer for claims, hospitalization and member risk score analysis.
• Designed and developed ETL to load medical enrollment and claims data from on-premise RDBMS to Amazon S3 buckets using Informatica BDM and the Spark execution engine; created Hive external tables to view and analyze the loaded data.
• Designed and developed ETL to extract medical enrollment, product and claims information from Amazon S3, apply transformation business rules and load the data to AWS Redshift, creating a cloud-based OLAP data warehouse for analytical processing.
• Familiar with the Informatica BDM Mass Ingestion tool for bulk transfer of data from on-premise to AWS S3.
• Functional understanding of Informatica Cloud services, including the use of RESTful API calls and responses.
• Implemented data parallelism using Informatica partitioning to optimize data-heavy, long-running workflows.
• Optimized Informatica workflows using pushdown optimization, custom DTM buffer sizes and variable target commit points.
• Performance-tuned slow-running SQL queries using advanced window functions, table partitioning and optimizer hints to meet delivery and runtime SLAs.
• Performed data analysis and profiling on flat files, relational database objects and big data objects using Informatica IDQ/BDQ.
• Hands-on experience developing complex SQL queries using various analytical functions for data extraction and transformation.
• Experience working with various data sources such as flat files, COBOL copybook files, XML, Parquet and JSON files.
• Experience writing UNIX scripts for various flat-file manipulation tasks.
• Worked with the data architecture team to finalize the conceptual and logical data model designs based on ETL architectural solutions.
• Worked on a corporate initiative to build a central BI warehouse by integrating multiple source systems, modernizing the BI analytics framework with big data and cloud technologies for faster, fault-tolerant and scalable downstream integration.
• Architected the TDS and rollup layers, implementing a multidimensional OLAP system for data slicing and downstream reporting needs.
• Worked with BI architects to design the reporting data warehouse layer and the MicroStrategy schema for multiple analytical and business reports.
• Working on a proof of concept to host the existing CBIW database on Amazon RDS; the setup includes multi-AZ synchronous replication with two additional cross-region read replicas for disaster recovery and read-latency improvement, plus CloudWatch and CloudTrail for metric, audit and security monitoring.
• Created detailed system design, data flow and architecture documents.
Technical Delivery Manager
• Technical management of a team of 10 consultants on a large-scale, multi-year conversion project.
• Monitored and tracked ETL design, development and testing efforts and ensured all deliverables were completed on time and on budget.
• Successfully led a development team that upgraded 150+ SSRS reports to MicroStrategy with enhanced business functionality.
• Understood and finalized the business requirements in collaboration with the business team.
• Drafted functional and technical requirement documents in association with the data governance team.
• Collaborated with the data architecture team on drafting data architectural requirements and conceptual data models.
• Created and delivered high-level and low-level design documents.
• Handled multiple parallel projects with demanding delivery timelines.
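One of the tuning patterns mentioned above, replacing a per-row correlated subquery with a window function, can be sketched with stdlib SQLite (which supports window functions from version 3.25). The table and column names here are illustrative, not from any real claims system:

```python
import sqlite3

# Hypothetical claims table; the goal is "latest claim per member",
# computed in a single scan with ROW_NUMBER() instead of a correlated
# subquery that re-queries the table for every row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (member_id INTEGER, claim_date TEXT, amount REAL);
    INSERT INTO claims VALUES
        (1, '2020-01-05', 100.0),
        (1, '2020-03-10', 250.0),
        (2, '2020-02-01',  75.0);
""")

rows = conn.execute("""
    SELECT member_id, claim_date, amount FROM (
        SELECT member_id, claim_date, amount,
               ROW_NUMBER() OVER (PARTITION BY member_id
                                  ORDER BY claim_date DESC) AS rn
        FROM claims
    ) WHERE rn = 1
    ORDER BY member_id
""").fetchall()
print(rows)  # one row per member: the most recent claim
```

On large partitioned tables the same rewrite lets the optimizer rank within each partition in one pass, which is typically what brings a slow correlated query back inside its SLA.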
- ETL Architect at CareFirst BCBS
- ETL Architect at British Sky Broadcasting
- BI Architect at
- Project Manager at Kaiser Permanente
7 years, 1 month at this Job
- Bachelor's - Computer Science Engineering
At McDonald's, we have a 10TB finance Oracle data warehouse designed primarily for financial reporting (OBIEE), with Oracle Financials (EBS) as the main data source, highly integrated with the Hyperion planning system (forecasting) and with legacy file-based and mainframe systems (for sales, real estate and QCR data). The data warehouse also sources master data (accounts, business locations, employees, etc.) from MDM and DRM.
Summary of responsibilities in the current role:
• Manage/mentor technical teams
• Responsible for the design/architect and maintenance of a 10TB finance data warehouse
• Providing technical leadership/mentorship
• Work as a data expert (SME). Responsible for all BI/ETL related matters
• Work closely with business stakeholders to gather business needs and interpret them into the functional/technical specifications
• Expert in ETL tools (Informatica and others) and PL/SQL
• Create complex ETL solutions
• Design/create finance dashboards
• Data modeling/design
• System development life cycle using Agile methodologies
• Production support, resolution of data load/data accuracy/data performance issues.
• Resolving production issues under tight deadlines as per the SLA
• Creating functional/technical documentation as per the company standards
• Recommending efficient/scalable data solutions for all new business processes/enhancements
• Work closely with all business data owners to understand data models and help in devising the new data models
• Creating and maintaining the data lineage (source to target) documents
• Business user training/presentations
• Working with the DBA/infrastructure team on installs/deployments; creating deployment methodologies/procedures/documentation
• Creating and enforcing best practices
• Guiding QA team (testers) with new data logic and helping them with the test cases
• ETL performance tuning
- Data/ETL Architect/Manager at McDonald's Corporation
- Data warehouse Lead at Toyota Financials
- Senior ETL/ERP specialist at Mobyz/SE Technologies
- Programmer at Rail Coach Factory
8 years, 1 month at this Job
- Master of Science - Math and Computer Science
SQL Server - SSIS/SSRS/SSAS - Data Warehouse
Penumbra, Inc. is a global healthcare company that designs, develops, manufactures and markets innovative therapeutic devices. The SQL Data Warehouse Analytics project builds a BI solution by pulling data from ERP systems and different API sources for modules such as Invoice, Sales, Inventory, GL, Clinical and Certify. The resulting data warehouse centralizes quality data for all subject areas, with reports published in Tableau and Power BI.
• Anchored the requirement discussion workshops with business and suggested metrics and reporting framework which are critical to business needs.
• Analyzed the specifications provided by the legacy system.
• Created various mappings for populating the data into star schema.
• Developed stored procedures, functions, views, triggers, constraints
• Implemented best practices, query optimization and performance tuning.
• Built ETL packages using different transformations and tasks in SSIS.
• Provided unit test case documents for unit testing of BI solutions.
• Supported deployment and scheduling of the BI solutions.
• Handled major and minor enhancements, bug fixes and production support.
• Attended client calls on a daily basis.
• Performed code reviews for the team.
Environment: SSIS, Power BI, Tableau, SQL Server 2017, C#.Net, VS.Net, Excel, Flat Files, CSV, Windows 10, AWS API, SFTPs
- ETL Architect at Penumbra Inc
- MSBI Team Lead at First Quality, USA
- MSBI Team Lead at Gallagher Bassett Services, USA
- MSBI Team Lead at Pfizer
8 months at this Job
- MSc in Information Technology - Information Technology
- Bachelor of Commerce - Commerce
The main objective of the project is to automate and maintain data in the central data repository, an EDM solution that enables better data quality management.
• Technical Architect, responsible for all the projects (EDM/Ecommerce/NFA/BRS) and delivery from onshore.
• Directly interact with the VP of the client development team on upcoming projects and change requests.
• Provided technical leadership to the project team by designing the ETL architecture, gathering requirements, designing ETL processes and workflows, and managing scope.
• Collect requirements from the business departments for new SSIS designs or enhancements to existing SSIS packages.
• Create SSIS package to populate data into data warehouse.
• Design SSIS package to generate data file for accounting database system.
• Monitor the SSIS packages jobs during the Month End and check loading status.
• Provide business support for the finance and accounting departments.
• Perform data validation and conversion; ensure data quality and accuracy.
• Write SQL queries for different source databases by interpreting their data models
• Created reports for business users using SSRS
• Generated test data and tested the database against the functional deliverables in the project documentation and specifications
- EDW/ ETL Architect at Pioneer Investments
- Datawarehouse Developer at Bank of America
- SSIS Developer at NOMURA Securities International
- DBA/SSIS/SSRS Developer at Department of Sanitation New York
1 year, 3 months at this Job
• Reviewed the current data warehouse environment and provided and implemented recommendations based on my findings and observations across the core functional and ETL process layer, the operational layer (to improve efficiency/performance) and the ABC layer (to tighten security and to set up control checks, balances and audits as necessary)
• Developed a data strategy and Business Process Re-engineering approach
• Built a new version of the legacy Enterprise Data Warehouse, as well as a data mart to automate generation of Treasury reports and other deposit-related reports
• Worked on numerous upgrade cycles (of source systems, databases, ETL and warehouse)
• Worked as part of ETL Team to build a larger Enterprise Data Warehouse, Data Mart and Reporting Platform
- ETL Architect at National Cooperative Bank
- MITRE staff at MITRE Corporation
- at Bearing Point/ASCAP, NY
- at General Mills, MN
6 years, 6 months at this Job
- Master's - computer science
Project: Retail Data Lake Iterations
Data has been sourced from many applications into the Retail Pharmacy Data Warehouse, and a Retail Data Lake (RDL) has been built in Hadoop to store data from across CVS Health, including PBM. All data types, including XML files, delimited flat files, fixed-width files and data from Teradata and Oracle tables, have been ingested into the RDL so analytics can run on it. The project provides a single data repository for the analytics team, enabling better data comparison and visualization.
• Built Sqoop scripts to ingest data into the Hadoop (HDFS) file system
• Prepared scripts for loading historical data into Hadoop using Sqoop, consistent with the incremental files
• Updated the control framework for Hadoop loads
• Built Hive tables using HQL and added partitions to load data from HDFS files
• Developed a Python script to parse XML files
• Supported production go-live and provided warranty support
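A minimal sketch of the XML-parsing step, assuming a simple record-oriented feed: flatten the elements into delimited rows suitable for landing in HDFS and querying through a Hive external table. The tag names (`record`, `id`, `name`) and the pipe delimiter are illustrative assumptions, not details from the actual project:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical sample feed standing in for the real source XML.
SAMPLE = """<feed>
  <record><id>1</id><name>aspirin</name></record>
  <record><id>2</id><name>ibuprofen</name></record>
</feed>"""

def xml_to_rows(xml_text):
    """Parse <record> elements into (id, name) tuples of strings."""
    root = ET.fromstring(xml_text)
    return [(r.findtext("id"), r.findtext("name")) for r in root.iter("record")]

def rows_to_delimited(rows, delimiter="|"):
    """Serialize rows as pipe-delimited text, one record per line,
    ready to be copied to HDFS behind a Hive external table."""
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter, lineterminator="\n").writerows(rows)
    return buf.getvalue()

print(rows_to_delimited(xml_to_rows(SAMPLE)))
```

For large feeds, `ET.iterparse` would replace `fromstring` so records stream through without loading the whole document into memory.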
- ETL Architect at CVS Caremark
7 months at this Job
- Bachelor of Engineering in Electrical and Electronics Engineering - Electrical and Electronics Engineering