As an ETL Architect, works closely with different application teams to provide DataStage ETL application and design consulting; provides Subject Matter Expert (SME) support for DataStage-based ETL solutions, estimating tools, development standards, recommendations, performance tuning, and best practices.
- ETL Architect at Cognizant Technology Solutions
- MCA - Computer Applications
• Worked closely with business users and BI groups; gathered business requirements by understanding business processes and needs, in an agile environment with daily Scrum and Jira ticket updates.
• Identified data sources and types based on requirements. Involved in designing the logical and physical data warehouse with analysts, data architects, and DBAs.
• Developed connection managers for multiple external vendors and corporate source systems including Oracle, Teradata, PostgreSQL, SAP ERP, IBM DB2, AWS, MS Azure, and MS Dynamics 365, as well as ADO.NET connections.
• Created FTP/SFTP tasks and batch files to receive inbound source files. Also wrote shell scripts to call REST APIs.
• Implemented Pentaho transformations including Row Normalizer, Row Denormalizer, Database Lookup, Database Join, Calculator, Add Sequence, Add Constants, and various types of inputs and outputs for various data sources.
• Managed the metadata and designed cross-reference tables and temp tables.
• Dealt with different slowly changing dimension (SCD) types and multi-level hierarchical dimensions in Stage, ODS, EDW, and Mart layers. Created variables and dataflow flags, and parameterized the ETL process.
• Used Python numpy, pandas, and matplotlib to do data analysis, visualization, pre-processing, and predictive modeling.
• Used expressions and tasks to convert and standardize data. Wrote C# or Python scripts to pivot/unpivot and combine records, and to meet other custom requirements.
• Developed mappings that perform extraction, transformation, and load of source data into the Derived Masters schema using various Informatica PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to meet the business logic in the mappings.
• Designed log tables and columns, generated error log files and created automatic email alert tasks.
• Conducted gap analysis and developed data migration and integration plans for consolidating company systems into a centralized parent EDW.
• Architected solutions using MS Azure PaaS services such as Azure SQL Server alongside AWS system environments.
• Reviewed and deployed SSIS projects to TFS (MS Team Foundation Server) and the SQL Server SSISDB catalog, then created SQL Server Agent jobs to execute scheduled jobs in DEV/QA/TEST/PREPROD/PROD environments. Used Oracle Enterprise Manager and MS SQL Server Management Studio for troubleshooting, monitoring, and optimizing with developers and QA.
• Delivered SSRS report and dashboard subscriptions through the web server via SSIS packages; supported general business reports and benefit-plan administration processes.
• Optimized performance: used joins and sub-queries to simplify complex queries involving multiple tables, created database objects (tables/views/stored procedures/triggers/functions), removed unnecessary columns, eliminated redundant and inconsistent data, normalized tables, established joins, and created clustered and non-clustered indexes wherever necessary.
• Responsible for creating tests for unit, integration, and end-to-end validation. Troubleshot production failures and provided root-cause analysis. Worked on emergency code fixes to production. Environment: SQL Server 2016, Visual Studio 2017, Informatica PowerCenter 9, Pentaho Data Integration Spoon 5, Oracle Data Integrator 11g, Eclipse IDE 4, WebSphere, Postman, SQuirreL SQL 3, WinSCP, PostgreSQL, SAP ERP, Jupyter, Teradata Studio 16, IBM DB2, AWS, Azure PaaS, MS Azure, MS Dynamics 365, SCRUM, Jira, SharePoint
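As an illustration of the SCD handling mentioned above (a minimal pure-Python sketch, not the original implementation; the field names are assumed for the example), an SCD Type 2 update expires the current row for a changed business key and inserts a new current row, preserving full history:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today=None):
    """Apply a Slowly Changing Dimension Type 2 update: expire the
    current row for a changed business key and insert a new current
    row, so the dimension keeps full history."""
    today = today or date.today()
    out = list(dim_rows)
    for new in incoming:
        current = next((r for r in out
                        if r["key"] == new["key"] and r["is_current"]), None)
        if current is None:
            # Brand-new key: insert as the current row.
            out.append({**new, "valid_from": today,
                        "valid_to": None, "is_current": True})
        elif current["attrs"] != new["attrs"]:
            # Changed attributes: close out the old row, add a new one.
            current["valid_to"] = today
            current["is_current"] = False
            out.append({**new, "valid_from": today,
                        "valid_to": None, "is_current": True})
        # Unchanged rows are left alone.
    return out
```

An SCD Type 1 variant would instead overwrite `attrs` in place, trading history for simplicity.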
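The pivot/unpivot scripting mentioned above could be sketched in pandas (the column names here are illustrative, not taken from the original work):

```python
import pandas as pd

# Long ("unpivoted") records: one row per (account, metric) pair.
long_df = pd.DataFrame({
    "account": ["A1", "A1", "A2", "A2"],
    "metric":  ["balance", "limit", "balance", "limit"],
    "value":   [100, 500, 250, 750],
})

# Pivot: one row per account, with each metric as its own column.
wide_df = long_df.pivot(index="account", columns="metric",
                        values="value").reset_index()

# Unpivot (melt): back to one row per (account, metric) pair.
back_to_long = wide_df.melt(id_vars="account", var_name="metric",
                            value_name="value")
```

The same round trip is what an SSIS Pivot/Unpivot transformation or a Pentaho Row Normalizer/Denormalizer pair performs inside an ETL flow.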
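The index-tuning work described above can be sketched with SQLite's EXPLAIN QUERY PLAN (the table and index names are hypothetical): adding a secondary index on a frequently filtered column lets the optimizer use an index search instead of a full table scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A table that reporting queries filter on frequently.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(1000)])

# A non-clustered-style secondary index on the filter column.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# The query plan now shows an index search rather than a table scan.
plan = cur.execute("EXPLAIN QUERY PLAN "
                   "SELECT SUM(amount) FROM orders WHERE customer_id = 42")
plan_text = " ".join(str(row) for row in plan.fetchall())
```

In SQL Server terms the clustered index orders the table itself, while additional non-clustered indexes like the one above serve selective lookups.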
- ETL Architect at Sonepar
- SQL Server (ETL/SSIS, SSRS) Developer at XCG Design Corp
- Energy Consultant at U.S. Department of
- Data Analyst at Transamerica
1 year, 4 months at this Job
- M.S. in Mechanical Engineering and Mathematics - Mechanical Engineering and Mathematics
- B.S. in Flying Vehicle Power Engineering - Research
Led the BI team as a Data/ETL Architect to provide a sustainable solution for integrating various data sources into an on-premise and cloud-based data warehouse, enabling an ecommerce platform.
• Sonepar has a budget to acquire new companies into its portfolio; with each acquisition, a huge amount of data must be brought into the data warehouse and, in turn, onto the ecommerce platform.
• Designed and developed a modern data architecture practice for data integration into a $10B business.
• Created logical and physical data models to illustrate the data flow into the BI data warehouse from acquisitions, using SQL Power Architect and ERWIN.
• Developed Informatica objects to enable the data loading for analytical reporting using BI tools.
• Provided data solutions for ecommerce websites built on the IBM WebSphere Commerce data model, customizing the load process and supplying customer account, product, and order data. This commerce platform enablement generates sales of around $3B.
• Analyzed the inventory data and designed the ETL components to implement the Working Capital project, which led to $9M in cost savings within a span of 6 months, and rolled out similar features to other companies.
• Sonepar wanted to build an MDM for both products and customers; to achieve this, implemented the Informatica IDQ tool, ETIM standards, and the UPK suite of applications.
• Created and architected data models to demonstrate the data flow.
• Developed Informatica Objects to build the MDM data warehouse.
• Integrated the MDM data into the BI data warehouse to enrich web product sales and customer data.
• Sonepar appointed me as a Data Steward supporting the commerce platform.
• Designed and architected the store setup following the IBM WebSphere Commerce data model, loading data into the Oracle database using the mass-load and data-load utilities.
• Improved the commerce store setup process over the years by enabling stores at single-market, multiple-market, and store-catalog levels. For demonstration purposes, built a single-market model so the features could be explained to new companies and improvements could be made as enhancements. By doing this, store setup time went from 3 months to 15 days, enabling sales through the commerce channel quickly.
• Designed the Para Reel Builder data flow from ERP to the commerce platform; with this enhancement, Sonepar was able to meet long-standing customer requirements to build para reels from the website, enabling $40M in revenue in this segment.
• Designed the Catalog Pricing Export functionality as a cost-effective solution using Informatica and stored procedures to get the HTTP response from a Mass Product Inquiry service created by the commerce team. The client gave the team kudos for delivering the solution well under budget.
• Designed the email marketing data flow for targeting customers with new-user registrations, pending cart details, and recommendations for related products based on purchase history.
• With recent cloud technologies overtaking on-prem data warehouse solutions, Sonepar started transforming its on-prem data warehouse to cloud solutions while supporting the traditional warehouse.
• Created a new MDM platform using Stibo suite of applications hosted in AWS, wherein product data integrations from multiple systems were mastered for product enrichment.
• Extracted files from AWS using command scripts and loaded them into the BI data warehouse.
• Performed data analysis by generating vendor scorecard reports using BI tools.
• Built a new near-real-time data warehouse solution on-prem and built the BI data warehouse on Snowflake hosted on Microsoft Azure, using Fivetran for data lakes, making it a hybrid solution.
• Used Microsoft SSIS for loading data into the On-Prem solution.
• Created ad hoc reports using Power BI and Tableau to present data in basic tabular reports.
• Performed various management roles over the years:
• Conducted daily SCRUM calls to track project status and followed up with individuals to meet sprint goals.
• Created project plans and project estimations for planning sprints and deliverables.
• Acted as a coordinator between onshore and offshore to get the deliverables delivered to the client.
• Managed a team of around 4 onshore and 10 offshore resources, and assessed resource performance.
- Data Architect/ ETL Architect at Sonepar
- ETL Architect Support at
- Data Migration Consultant - EH&S at DOW Chemicals
- BI Lead at Sonepar
1 year, 8 months at this Job
The Data Supply Chain is designed to be the centralized information repository of KeyBank systems. The scope of the Data Factory is to build the centralized history data warehouse and data marts for subject areas such as Deposits, Loans, Cards, Core Banking, and Marketing, to provide effective reporting to power users. This is a data warehouse development project and involves developing applications related to sourcing and consumption refactoring to feed data into the data supply chain, and developing consumption-specific extracts for end-user reporting applications. Responsibilities:
• Working as an ETL Ab Initio tech lead and project delivery manager at onshore in an agile project methodology, coordinating with the offshore team for smooth and efficient project deliverables.
• Involved in sprint planning and grooming sessions to understand new stories for the upcoming sprint release.
• Working with Scrum master and product owner to estimate development efforts required for new stories for the current sprint.
• Understanding new requirements and coming up with appropriate ETL design approaches for new development.
• Work with the team closely to ensure all required processes and tools are in place to meet the requirements of the business and customers.
• Creating high-level as well as low-level design documents for new development and handing them over to the offshore team for their understanding and smooth development.
• Taking part in code reviews of newly developed graphs per KeyBank coding and naming standards.
• Reviewing unit test cases for each Ab Initio ETL process that has been developed and tested.
• Coordinating with the testing team for smooth testing and data validation; if any issues are found during the testing phase, working to close those defects as soon as possible.
• During project integration testing, working with the LOB to ensure that data loaded by the Ab Initio ETL is correct and fully and functionally meets project requirements. Environment: Ab Initio, SQL
- ETL Architect at Key Bank
- ETL Architect at Bank of America
- Senior Datastage Developer at Bank of America
- Technical Lead at Bank of America
1 year, 8 months at this Job
GCC is the largest U.S. provider of international vacations for Americans aged 50 and older. We're a family of three brands: Grand Circle Cruise Line, Overseas Adventure Travel, and Grand Circle Travel. Our small group sizes, unsurpassed value and excellence, and unique itineraries position us as the industry leader in travel and discovery. Responsibilities:
• Involved in Designing ETL Architecture framework for new project using Informatica ETL Tool, Oracle, UNIX and ERWIN, Active Batch/Informatica Scheduler
• Worked supportively and collaboratively with other teams, building relationships and trust with key stakeholders to support programme delivery and adoption of integration standards. Communicated in line with project and CoE guidelines with technical teams and wider stakeholders. Led the project integration development team to deliver effectively; no direct line-management responsibilities were planned for this role.
• Provided leadership in working with business subject matter experts, developers, and quality assurance staff to ensure that deliverables were met on a prescribed timeline.
• Led and managed teams by providing work direction, coaching, mentoring, and performance feedback.
• Primary responsibilities: worked as part of the Integration Centre of Excellence, reporting to the Integration Vice President as a trusted technical lead and advisor for all integration tasks within enterprise projects as they arose.
• Proactively ensured that any spillovers or risks highlighted or reported by component teams (outside the program boundary) were addressed, with no impact on program timelines.
• Managed the change process into existing operations, from acquisitions and new-team onboarding in current businesses; developed and strengthened relationships with all lines of the business.
• Designed the data architecture framework for EDW projects.
• Expertise in the healthcare domain, including Medicare, Medicaid, and claims processing within HIPAA regulations and requirements.
• Created Oracle Functions/Stored Procedures to implement complex business logic for good performance and extensively used Informatica Stored Procedure transformation.
• Performed client liaison, requirement analysis, data quality analysis, gap analysis, and software architecture documentation. Environment: Informatica PowerCenter/PowerExchange 10.1.1, ICEDQ, Oracle 11g, Tableau, AWS, PL/SQL, Flat Files (XML/XSD, CSV, Excel), ERWIN, UNIX, Active Batch/Informatica Scheduler, TFS, SharePoint.
- ETL Architect at Grand Circle Corporation
- Principal Consultant at Fresenius Medical Care
- Sr. Datawarehouse Engineer at Sunovion Pharmaceuticals
- CRO IT Credit Risk at Credit Suisse
1 year at this Job
The main objective of the project is to automate and maintain the data in the central data repository, an EDM solution that yields better data quality management.
• Technical Architect, responsible for all the projects (EDM/Ecommerce/NFA/BRS) and delivery from onshore.
• Directly interacting with the VP of the client development team on any incoming projects and change requests.
• Provided technical leadership to the project team by designing the ETL architecture, gathering requirements, designing ETL processes and workflows, and managing scope.
• Collect requirements from the business department for new SSIS designs or enhancements to existing SSIS packages.
• Create SSIS package to populate data into data warehouse.
• Design SSIS package to generate data file for accounting database system.
• Monitor SSIS package jobs during month end and check loading status.
• Provide business support for the finance and accounting departments.
• Perform data validation and conversion; ensure data quality and accuracy.
• Write SQL queries for different source databases by interpreting their data models
• Created reports for business users using SSRS.
• Generated test data and tested databases to meet the functional deliverables in the project documentation and specifications.
- EDW/ ETL Architect at Pioneer Investments
- Datawarehouse Developer at Bank of America
- SSIS Developer at NOMURA Securities International
- DBA/SSIS/SSRS Developer at Department of Sanitation New York
1 year, 11 months at this Job
Universal Service Administrative Co. (USAC) is currently in the process of building an Enterprise Data Warehouse (EDW). Currently the data resides in multiple systems across various platforms and in multiple formats. The proposed architecture includes building a Data Lake, an ODS, and a semantic layer for reporting purposes. Responsibilities:
• Proposed and built a generic ETL framework for extracting data from multiple sources and loading them into the target Vertica database.
• Resolved performance bottlenecks and proposed solutions for increasing the throughput which resulted in faster execution times.
• Proposed database structures for ETL job processing and standardized guidelines for seamless execution.
• Interacted with the various business partners on identifying non-public information and enhanced methods for merging the data.
• Designed and developed jobs for updating the current execution status. Environment: Pentaho 8.2, Oracle 12c, Vertica 9.2, SQL Server 2012, Postgres, Jira
- ETL Architect at USAC
- Team Lead at FANNIEMAE
- ETL Architect at SOCIAL SECURITY ADMINISTRATION
- Sr.Developer at FANNIEMAE
4 months at this Job
• Designed ETL processes in Pentaho and Informatica Cloud to pull insurance policy and claim data from Oracle and MSSQL source systems into analytical databases including Redshift, Vertica, MongoDB, and Oracle Exadata
• Led efforts to learn and implement MongoDB for the company's first POC implementation of this document DB
• Led efforts to architect and decide on best solution to store geospatial data ranging from US state down to the city block level, sourced from USPS and Census GeoJSON shapefile data
• Led efforts to combine this data with FEMA flood data for cost- and risk-projection purposes
• Advised data modeling and MDM / data governance teams on Enterprise DW design decisions
• Served on a team of senior architects combining data from Kemper Corporation and Infinity Insurance Corporation data into one cloud solution after the completed merger of the two large companies
- Senior ETL Architect at Kemper Corporation
- Senior Data Engineer at Daxko
- Programmer Analyst III at Regions Financial Corporation
- Business Intelligence Developer at LHC Group Inc
1 year, 6 months at this Job
- Bachelor of Science in Computer Science - Business Intelligence
Responsibilities: Drove development, testing, and deployment to production.
- ETL Architect at Navy Federal Credit Union
- Sr. ETL Developer at DXC Technology
- ETL Consultant at AT&T
- Lead ETL Consultant at GE Corporate
2 years, 11 months at this Job
- Master's in Information Systems - Information Systems
- Bachelors in Electronics and Communications Engineering - Electronics and Communications Engineering
• Project Description: Northern Trust had undertaken the CMRM (Capital Markets Risk Management) program to replace the existing end-of-life version of IBM Algorithmics with Murex to support market and counterparty credit-risk functions. The reporting work stream is part of the overall implementation plan to build the enrichment layer (CDW) and multi-dimensional data warehouse (MDW) to replace the existing reports. Responsibilities:
• Involved in meetings with business users for requirement gathering.
• Involved in designing and building the ETL process to feed data to the Cognos reporting tool.
• Translated business strategy/drivers into IT requirements to solve complex problems and proposed suitable technical solutions.
• Data warehousing with huge volumes of data (50 million records per day).
• Prepared high-level and low-level designs, and source-to-target data mappings.
• Designed and developed DataStage 11.3 parallel jobs for extracting, transforming, integrating and loading data into Target data warehouse tables.
• Participated in weekly status and monthly status meetings for project deliverables and regulatory reports.
• Involved in SIT, Regression and UAT testing cycles.
• Involved in peer review, defect analysis and fixing.
• Involved in designing Control-M jobs.
- ETL Architect at Hexaware Technologies Inc
- at Hexaware Technologies Inc
- System Analyst at Hexaware Technologies
- Software Engineer at Magna Infotech Pvt Ltd
1 year, 7 months at this Job
- Bachelor of Technology - Technology