Integrated Customer Service Mart (ICSM): In Progress
The purpose of this project is to initiate automated collection of vendor data to minimize, and eventually replace, the manual effort currently required for Interactive Customer Service reporting. The future end state is a data mart that displays metrics for the various categories of customer interactions to facilitate operational and strategic planning and, ultimately, provide insight into 'Customer Journeys' across all company interactions.
Collections Insights:
Collections Insights provides an end-to-end view of delinquent customers, reporting on defined stages and tactics in delinquency for each legacy MSO (Bright House, Charter, and Time Warner).
More than five source systems (Artiva, FRS, XDW1, FF, etc.) provide the data, which is integrated into the core layer per defined business rules.
Data Flow Diagram

QMATIC:
QMATIC is Spectrum store data. The purpose of this initiative is to analyze Spectrum store data with respect to the questions below, so that leadership can see a customer's visits to a Spectrum store:
➢ Reason for visiting
➢ How long they waited to be helped by an associate
➢ Overall duration of their trip
➢ Transaction Count/Volume

Reach Business Intelligence (RBI):
Reach Business Intelligence (RBI) will combine three legacy companies into a single database, permitting unified reporting. This new BI system will contain more granular detail about the business, drilling down to individual contract lines and insertion-level detail. The inclusion of Eclipse/WideOrbit avails activity and non-verified activity (e.g., auto-fill and PSAs) will consolidate all information pertinent to linear inventory and facilitate a truer view of how inventory is utilized across the Spectrum Reach footprint.

POC on AWS (Alexa Skill):
As part of this PoC, I created a voice-driven BI application in which specific KPIs are answered by an Amazon Echo. The Echo reads the data from DynamoDB (a NoSQL database) through an Alexa Skill and a Lambda function (serverless execution of the program, Node.js in this case). Data is populated into DynamoDB from an S3 bucket in an automated way: if any new data arrives in S3, it is reflected in the database. I also used AWS Glue for ETL work and massaging the data. (A minimal sketch of the Lambda handler follows the responsibilities list below.)

Client Business Operations (CBO):
The Client Business Operations (CBO) project deals with business reports (automated and self-serve), trend analysis, and business usage of campaign and ad event data for the linear and digital streams, which is used to make business decisions and market strategy and to sell ads for both online and TV. These self-serve reports are interactive, and business users can explore them to do their own analysis.

Audience Measurement (AM - Viewership):
The Audience Measurement (AM) data validation project deals with data validation, trend analysis, and business usage of tuning, ad, and VOD event data, as well as subscriber and programming reference information, which is used to make business decisions and market strategy and to sell data. AM provides a highly flexible data analytics platform to accommodate a wide variety of Media Sales, Marketing, Programming, and Operations business needs via the secure ingest of normalized and anonymized subscriber, programming, and network configuration data into a scalable data management solution. For these reasons, TWC initiated the Audience Measurement Dimensional Data Store (AM/DDS) project to architect a comprehensive system that supports these business needs. The project involves a series of changes to an existing data warehouse: ETL scripts, database objects, and the data extracted from different source systems, the transformation logic applied to the extracted data, and the loading of transformed data into the different data marts are analyzed, validated, and tested in detail on technical and functional aspects. The data is also analyzed extensively to ensure that changes have not deviated from the business requirements.

Tools/Technologies Used: Hadoop Data Lake, Podium/Talend, Hive, Teradata 15.10, Unix scripting, Jira, SVN

Responsibilities:
* Worked with business users to design, develop, test, and implement business intelligence solutions in the Enterprise Data Warehouse.
* Created a robust framework to extract and ingest data into the DW layer irrespective of source system (Oracle, SQL Server, etc.).
* Created process/data flow diagrams and documentation.
* Created coding standards, process, and deployment guideline documents.
* Designed and developed SCD1 and SCD2 code for loading data into the DW layer.
* Designed and implemented an audit framework for data governance.
* Designed and implemented an automatic metadata-capture framework.
* Conducted data analysis (SQL, Excel, data discovery, etc.) on legacy systems and new data sources.
* Reviewed and validated QA test plans and supported the QA team during test execution.
* Participated in designing data transformations (source-to-target mapping/ETL) and data warehouse models.
* Migrated the CBO project from Netezza to Teradata.
* Ensured change control and change management procedures were followed within the program/project as they relate to requirements.
* Conducted impact analysis as part of the change management process.
* Conducted unit testing on ETL and reviewed peers' code.
* Provided support during the UAT phase and production implementation.
* Coordinated with other dependent teams as required.
* Made sure that all processes and deliverables were well documented.
* Provided on-call support for production failures.
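For illustration, here is a minimal sketch of the kind of Lambda handler behind the Alexa PoC described above. The PoC itself used Node.js; this sketch is in Python to match the other examples in this document, and the table name, slot name, and KPI attribute are illustrative assumptions rather than the project's actual values.

```python
import boto3

# hypothetical DynamoDB table holding one row per KPI
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("kpi_metrics")

def lambda_handler(event, context):
    # Alexa sends the invoked intent; the KPI name arrives as a slot value
    intent = event["request"]["intent"]
    kpi_name = intent["slots"]["kpi"]["value"]
    item = table.get_item(Key={"kpi_name": kpi_name}).get("Item")
    speech = (f"The current value of {kpi_name} is {item['value']}."
              if item else f"I could not find a KPI named {kpi_name}.")
    # standard Alexa custom-skill response envelope
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```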
Cognizant Technology Solutions / T-Systems, Pune/Germany (Deutsche Telekom)
- ETL Architect at Cognizant Technology Solutions U.S. Corporation
- at DWH Kunde
- at Hewitt Associates-AON Hewitt
- at MindTree Consulting Pvt. Ltd
5 years, 2 months at this Job
Actively involved in various tasks on a daily basis:
1) Maintain the flow of transactional and master data elements going into, and displayed on, Avnet's e-commerce site.
2) Implement and maintain PIM, Avnet's in-house Product Information Management system.
3) Maintain the Avnet HR system integration to Workday.
4) Implement and lead new projects to enhance Avnet's e-commerce site.
5) Build a data warehouse for business users to access customer, sales, and product master data.
6) Provide Avnet inventory feeds for various vertical search engines.
7) Lead the effort in implementing various ERP-to-open-systems integration projects.
8) Serve as ETL architect/thread lead for the new SAP implementation.
9) Integrate the Avnet ERP with Salesforce.
10) Integrate the Avnet ERP with third-party price engines.
11) Informatica Administration - maintaining users, roles, groups, LDAP integration, server health checks, version upgrades.
• Involved in gathering business requirements and translating them into technical requirements.
• Extracted source data from different systems like iConnect, SAP, Fidelity, Salesforce, SAP HANA, and data marts.
• Built interface programs to exchange data between the Avnet DB and external vendors for nightly updates (a sketch follows the environment line below).
• Actively worked with code changes, code promotions and production issues.
• Designed Informatica mappings to proactively handle business and technical exceptions.
• Responsible for code migrations from Dev to QA to Production.
• Heavily involved in implementation and maintenance of PIM, the Product Information Management system for the e-commerce site.
• Ensured all code was adequately documented, and that all documentation was kept up-to-date.
• Escalated potential issues, roadblocks, and risks to the management team to enable early risk management and resource adjustment.
• For the SAP implementation, converted and designed requirements into technical documents for the developers.
• Reviewed designs and implemented best practices across new, current, and legacy code bases for better performance.
• Architect for the landscape of 30 in-house Informatica servers.
• Managed the production support team 24/7.
Environment: Informatica PowerCenter 10.2/9.5.1/9.1.0/8.5.1/8.1.1, Oracle 12c/11g/10g, ADP (HR, Payroll, Time Management, Benefits), SQL Server 2005, PL/SQL, Linux, SAP R/3, Salesforce.
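As a concrete illustration of the nightly vendor-feed bullet above, here is a hedged Python sketch of such an exchange job; the query, view name, host, credentials, and file layout are all invented for the example, not Avnet's actual interfaces.

```python
import csv
import datetime

import cx_Oracle  # assumed Oracle client library
import paramiko   # assumed SFTP library

def extract(conn, path):
    # dump the vendor-facing rows from a hypothetical feed view to CSV
    cur = conn.cursor()
    cur.execute("SELECT part_no, qty_on_hand, price FROM inventory_feed_v")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])
        writer.writerows(cur)

def upload(path, remote_name):
    # drop the file in the vendor's SFTP inbound directory
    transport = paramiko.Transport(("vendor.example.com", 22))
    transport.connect(username="feed_user",
                      pkey=paramiko.RSAKey.from_private_key_file("feed_key"))
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put(path, remote_name)
    transport.close()

if __name__ == "__main__":
    stamp = datetime.date.today().strftime("%Y%m%d")
    local = f"inventory_{stamp}.csv"
    with cx_Oracle.connect("feed_user/secret@AVNETDB") as conn:  # placeholder DSN
        extract(conn, local)
    upload(local, f"/inbound/inventory_{stamp}.csv")
```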
- Sr. Systems Analyst/ETL Architect at Avnet Inc
- Sr. Informatica Developer / Lead at BNSF Railways
- Data Analyst at Canberra Inc
- ETL Lead at Bose Corporation
10 years, 1 month at this Job
- M.S. - Computer Science
VISA Inc. - Migration of Ab Initio Jobs to the Hadoop Ecosystem for VELTSTG
Technologies: Hive, Pig, Sqoop, Python, Ab Initio
Responsibilities:
• Converted existing Ab Initio graphs/plans to the Hadoop HDFS ecosystem using Hive, Sqoop, and Pig.
• Extracted data from DB2, Oracle, and various relational sources into HDFS using Sqoop and Unix scripts.
• Created and worked on Sqoop (version 1.4.6) jobs with incremental loads to populate Hive tables.
• Wrote UNIX scripts for finding out space usage, collecting statistics, and data archival and clean-up processes.
• Created scripts to automate alert generation and email using Python (a sketch of the record-count alert follows this section).
• Interacted with clients on technical complexity and provided end-to-end solutions.
• Led end-to-end project activities in both functional and non-functional areas, including front-door project planning/forecasting, resource estimation (time, resources, environment), and regular status reports to various stakeholders.

VISA Inc. - Continuous Data Streaming and Automatic Alert Generation for Record-Count Drops
This system receives the transaction log feed from the Visa VIP and delivers it in near-real time to subscribers, including MP. The UB receives the VIP logs using the existing VIP-APF UMF transfer protocol.
Technologies: Hive, Sqoop, Python, Ab Initio
Responsibilities:
• Designed and developed new Ab Initio Continuous Flows graphs.
• For the CPU-intensive transformations, found optimized code that consumes less Ab Initio SPECint; this also required extensive testing.
• Designed the architecture and solution to minimize delay in case of downtime or server maintenance.
• Designed and developed automated failover to BRP systems and automatic recovery with no downtime.
• Built a new project under a Git repository, checked in all code through Git, with turnover done through Jenkins.
• Read data in XML/JSON format; analyzed, cleansed, and refined the data before loading it to the warehouse.
• Implemented best practices for optimizing ETL data throughput/accessibility.
• Built requirement understanding through several discussions with the client and product office.
• Enhanced processes/graphs for performance improvement to produce a real-time feed to downstream subscribers.
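To make the record-count alerting concrete, here is a minimal Python sketch of the kind of check described above; the threshold, counts, and addresses are illustrative assumptions, and the production version ran against the actual feed statistics.

```python
import smtplib
from email.message import EmailMessage

DROP_THRESHOLD = 0.30  # alert if today's count falls 30% below the trailing average

def check_counts(today_count, trailing_counts):
    baseline = sum(trailing_counts) / len(trailing_counts)
    return today_count < baseline * (1 - DROP_THRESHOLD), baseline

def send_alert(today_count, baseline):
    msg = EmailMessage()
    msg["Subject"] = f"ALERT: record count dropped to {today_count} (baseline {baseline:.0f})"
    msg["From"] = "etl-monitor@example.com"   # placeholder addresses
    msg["To"] = "oncall@example.com"
    msg.set_content("Transaction log feed volume fell below threshold; please investigate.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    dropped, baseline = check_counts(today_count=65_000,
                                     trailing_counts=[100_000, 98_000, 102_000])
    if dropped:
        send_alert(65_000, baseline)
```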
- ETL Architect at Wipro Technologies
- Assistant Technical Consultant at RS Software Ltd
- Assistant Systems Engineer at Tata Consultancy Services
3 years, 5 months at this Job
- Education
• Collaborate with internal customers and stakeholders to gather requirements for the Integrated Budget and Acquisition Planning (IBAPS) reporting solution
• Architect, design, and develop the data warehouse using a star schema model
• Develop ETL pipelines using ODI 11g to load data from Hyperion Planning, Oracle, and MS SQL Server sources and CSV/flat files into Oracle data marts and the data warehouse (a flat-file load sketch follows the environment line below)
• Building data warehouse for Hyperion Planning Application
• Building Dashboards and Reports using OBIEE 12c
• Customizing the ODI knowledge modules for efficient data loading
• Building ad hoc reports using OBIEE and Oracle SQL per users' requests
Environment: ODI 11g, OBIEE 12c, Oracle Loans, Oracle 12c, PL/SQL, Oracle SQL Developer, SQL Data Modeler, Windows, Linux
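For illustration, here is a minimal Python sketch of the kind of flat-file-to-Oracle staging load listed above; in the project this was implemented in ODI 11g, and the table, columns, and credentials here are invented.

```python
import csv

import cx_Oracle  # assumed Oracle client library

def load_csv(conn, path):
    # stage the flat-file rows; downstream mart logic takes over from here
    with open(path, newline="") as f:
        rows = [(r["budget_year"], r["org_code"], float(r["amount"]))
                for r in csv.DictReader(f)]
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO stg_budget (budget_year, org_code, amount) "
        "VALUES (:1, :2, :3)", rows)
    conn.commit()

if __name__ == "__main__":
    with cx_Oracle.connect("ibaps_etl/secret@IBAPSDM") as conn:  # placeholder DSN
        load_csv(conn, "budget_feed.csv")
```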
- Data/ETL Architect at Food and Drug Administration
- Reporting Team Lead at Export Import Bank of America
- ODI Specialist/OBIEE Developer/SME at IBM/Toshiba GCS
- Oracle/ODI/OBIEE Lead Developer at COMCAST
1 year, 9 months at this Job
GCC is the largest U.S. provider of international vacations for Americans aged 50 and older. We're a family of three brands: Grand Circle Cruise Line, Overseas Adventure Travel, and Grand Circle Travel. Our small group sizes, unsurpassed value and excellence, and unique itineraries position us as the industry leader in travel and discovery. Responsibilities:
• Involved in designing the ETL architecture framework for a new project using the Informatica ETL tool, Oracle, UNIX, ERWIN, and Active Batch/Informatica Scheduler
• Conceptual, logical, and physical data model design for large-scale projects
• Work supportively and collaboratively with other teams; build relationships and trust with key stakeholders to support programme delivery and the adoption of integration standards. Communicate in line with project and CoE guidelines with technical teams and wider stakeholders. Lead the project integration development team to deliver effectively. No direct line-management responsibilities are planned for this role.
• Provided leadership in working with business subject matter experts, developers, and quality assurance staff to ensure that deliverables were met on the prescribed timeline.
• Leads and manages teams by providing work direction, coaching and mentoring, and performance feedback
• Primary responsibilities: work as part of the Integration Centre of Excellence, reporting to the Integration Vice President as a trusted technical lead and advisor for all integration tasks within enterprise projects as they arise.
• Proactive in ensuring that any spill-overs or risks highlighted or reported by component teams (outside the program boundary) are addressed, so there is no impact on program timelines.
• Manage the change process into existing operations, from acquisitions and new team onboarding in current businesses; develop and strengthen relationships with all lines of the business.
• Design Data Architecture framework for EDW projects
• Worked closely with other IT team members, business partners, data stewards, stakeholders, steering committee members, and executive sponsors on all data quality and data governance activities.
• Created Oracle functions/stored procedures to implement complex business logic with good performance, and extensively used the Informatica Stored Procedure transformation (a call sketch follows the environment line below).
• Performed client liaison, requirement analysis, data quality analysis, gap analysis, and software architecture documentation.
Environment: Informatica PowerCenter/PowerExchange 10.1.1, ICEDQ, Oracle 11g, Tableau, AWS, PL/SQL, flat files (XML/XSD, CSV, Excel), ERWIN, UNIX, Active Batch/Informatica Scheduler, TFS, SharePoint.
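As a hedged illustration of the stored-procedure bullet above, this Python snippet shows how such an Oracle function might be invoked for a smoke test outside Informatica; the package, function, argument, and DSN are hypothetical.

```python
import cx_Oracle  # assumed Oracle client library

# placeholder credentials/DSN; in the project the call ran inside an
# Informatica Stored Procedure transformation rather than a script
with cx_Oracle.connect("etl_user/secret@GCCDW") as conn:
    cur = conn.cursor()
    # hypothetical pricing function returning a status code for one booking
    status = cur.callfunc("pkg_pricing.apply_rules", cx_Oracle.NUMBER, [12345])
    print("rule status:", status)
```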
- ETL Architect at GRAND CIRCLE CORPORATION
- Principal Consultant at LOGANBRITTON INC
- Senior Data Warehouse Engineer at PANTAR SOLUTIONS INC
- Team Lead at ONDEMANDAGILITY SOLUTIONS PTE LTD
1 year, 1 month at this Job
The main objective of the project is to automate and maintain data in the central data repository, an EDM solution that enables better data quality management.
• Technical architect, responsible for all projects (EDM/Ecommerce/NFA/BRS) and delivery from onshore.
• Directly interacted with the VP of the client development team on incoming projects and change requests.
• Provided technical leadership to the project team by designing the ETL architecture, gathering requirements, designing ETL processes and workflows, and managing scope.
• Collect requirements from business departments for new SSIS designs or enhancements to existing SSIS packages.
• Create SSIS packages to populate data into the data warehouse.
• Design SSIS packages to generate data files for the accounting database system.
• Monitor SSIS package jobs during month-end and check loading status.
• Provide business support for the finance and accounting departments.
• Perform data validation and conversion; ensure data quality and accuracy (a validation sketch follows this list).
• Write SQL queries for different source databases by interpreting their data models
• Created reports for business users using SSRS.
• Generated test data and tested the database to meet the functional deliverables in the project documentation and specifications.
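As a sketch of the data-validation bullet above, the following compares row counts and a simple checksum between a staging table and its warehouse target; the DSN and table/column names are assumptions, and CHECKSUM is the SQL Server function available in this environment.

```python
import pyodbc  # assumes a configured ODBC DSN

def fingerprint(cur, table):
    # row count plus an order-independent checksum over key columns
    cur.execute(
        f"SELECT COUNT(*), SUM(CAST(CHECKSUM(account_id, amount) AS BIGINT)) FROM {table}")
    return tuple(cur.fetchone())

conn = pyodbc.connect("DSN=EDM")  # placeholder DSN
cur = conn.cursor()
src = fingerprint(cur, "stg.daily_positions")   # hypothetical staging table
tgt = fingerprint(cur, "dw.fact_positions")     # hypothetical warehouse table
if src != tgt:
    raise RuntimeError(f"validation failed: staging {src} vs warehouse {tgt}")
```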
- EDW/ ETL Architect at Pioneer Investments
- Datawarehouse Developer at Bank of America
- SSIS Developer at NOMURA Securities International
- DBA/SSIS/SSRS Developer at Department of Sanitation New York
1 year, 5 months at this Job
• Led design, development, and implementation of ETL projects end to end.
• Worked as a BA to understand requirements and create design documents.
• Responsible for projects estimates, design documents, resource utilization and allocations.
• Interpreted logical and physical data models for business users to determine common data definitions.
• Set up ODBC, relational, native, and FTP connections for Oracle, DB2, SQL Server, VSAM, and flat files.
• Developed Informatica workflows/worklets/sessions associated with the mappings across various sources like XML, COBOL, flat files, web services, and Salesforce.
• Responsible for mentoring Developers and Code Review
• Responsible for determining bottlenecks and fixing them through performance tuning.
• Designed and developed several mappings to load data from Source systems to ODS and then to Data Mart.
• Work with the offshore/onsite team, lead the project, and assign tasks appropriately to the team members.
• Interacted with and assigned development work to offshore developers, guiding them to implement logic and troubleshoot the issues they were experiencing.
• Worked with cleanse, parse, standardization, validation, and scorecard transformations.
• Worked with transformations such as Source Qualifier, Update Strategy, XML transformation, SQL transformation, Web Services, Java transformation, and Lookup (connected and unconnected).
• Worked on SQL tuning in Exadata performance-testing and production environments using hints and SQL tuning sets.
• Worked on UNIX shell scripting for file processing to third-party vendors through SFTP, with encryption and decryption.
• Worked with different scheduling tools like Tidal, Tivoli, Control-M, and AutoSys.
• Used IDQ to extract, transform, and load (ETL) rules and source-to-target mappings to derive additional business rules for data quality checks.
• Ran profiles and generated scorecards using IDE to validate cleansed data and standardize it against reference tables in IDQ.
• Migrated IDQ and Power Center mappings and mapplets into production environments.
• Coordinate with business unit users and SMEs on setting up business rules for IDE and IDQ.
• Involved in designing of star schema based data model with dimensions and facts.
• Involved in upgrading from Informatica 9.1 to Informatica 9.5
• Developed BTEQ and MultiLoad scripts to load Teradata tables (a BTEQ sketch follows the environment line below).
• Worked extensively with Teradata utilities (MultiLoad, TPump, and FastLoad) to load data.
• Worked extensively with Netezza scripts to load data from flat files into the Netezza database.
• Used NZSQL scripts and NZLOAD commands to load the data.
• Used Informatica Developer to perform data quality tasks like data profiling, validation, and cleansing.
Environment: Informatica PowerCenter 9.1/9.5/10.1, Informatica PowerExchange 8.6.1/9.1, Informatica B2B Data Transformation Studio, IDQ 9.1/9.0, IDE 9.1/9.0, Informatica MDM Hub 9.1, WebLogic, Address Doctor, UNIX, PL/SQL, MS SQL Server 2008, Oracle Exadata 11g, Teradata 14, Tidal, PuTTY.
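For illustration, a minimal sketch of driving a BTEQ import from Python, in the spirit of the BTEQ bullet above; the logon, file path, and table are placeholders, and production scripts also handled error tables and restarts.

```python
import subprocess
import textwrap

# BTEQ import script: read a comma-delimited file and insert row by row
script = textwrap.dedent("""\
    .LOGON tdprod/etl_user,secret
    .IMPORT VARTEXT ',' FILE = /data/in/claims.csv
    .QUIET ON
    .REPEAT *
    USING (claim_id VARCHAR(18), amount VARCHAR(12))
    INSERT INTO stg.claims (claim_id, amount)
    VALUES (:claim_id, :amount);
    .LOGOFF
    .QUIT
""")

# bteq reads commands from stdin; a non-zero exit code flags load errors
result = subprocess.run(["bteq"], input=script, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError("BTEQ load failed")
```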
- ETL Architect /Lead at WellPoint
- ETL Architect /Lead at PepsiCo Inc
- Lead/Senior Informatica developer at J.C. Penney
- Lead/ Senior Informatica Consultant at AETNA
1 year, 9 months at this Job
D.C.
* Created generalized Ab Initio graphs at the component level and built dynamic PSETs for interfaces with similar functionality.
* Worked with Ab Initio Corporation on customizing the RWI (Records With Issues) module for the client; this process involves some metaprogramming and is highly dynamic in nature.
* Used Express>IT for data quality validations and common reformats.
* Replaced multiple graph modules in shell scripts by using Conduct>IT.
* Performed the ETL design and architecture for inbound and outbound interfaces.
* Although the production scheduler is Stone Branch, Control Center is used in the lower environments for refreshing data, etc.
Environment: Oracle 12c, Ab Initio 3.2, SunOS, Bash and Korn shell scripts, EME, Stone Branch
- ETL Architect at Meridian
- Data Warehouse Architect at NT Concepts
- Sr. Data Warehouse Developer at Fannie
- Sr. Program Analyst at AOL Inc
2 years, 2 months at this Job
Ameren is a utility company that provides electricity and gas to the State of Missouri and the southern part of Illinois. EIM is the analytical platform built in Teradata into which data from several applications flows, including CSS (Customer Service Survey), MMSE (Meter Management System Electric), MMSG (Meter Management System Gas), CC (Command Center), and other meter-related systems. We have a semantic layer, PUB, where all the views are built; these are used for reporting purposes with OBIEE and other reporting tools. To pull data from the various systems we use the following Informatica tools: Informatica PowerCenter and Informatica Data Replicator (IDR). PowerCenter is the crux of the EIM ETL, where all the logic is built to transform data into our ODS layer. IDR is used to replicate data as-is from our Oracle sources (CSS, MMSE, and MMSG) and also provides change data capture (CDC) for maintaining historical information in the ODS layer. Responsibilities:
• Extracted data from the archive logs of various Oracle systems and loaded it to the Teradata platform using the Informatica Data Replicator (IDR) tool.
• Developed ETL processes using the PowerCenter Designer tools (mappings, workflows).
• For better performance, implemented pushdown optimization for ETL processes where the source and target systems reside in the same database.
• Designed and developed mappings using all the transformations, and implemented file listing for sources involving flat files.
• Involved in creating and reviewing the database objects for any new or existing process along with the Data modelers.
• Worked as Informatica administrator and performed the upgrade from Informatica 9.5.1 to 10.1.1 through production.
• Implemented Teradata Temporal (maintains history based on a period with time zone) in the ODS and Utility Model.
• Created security layers, performed migrations, applied hotfixes, and carried out other Informatica administration activities.
• Involved in enhancement and maintenance activities for the existing warehouse components.
• Worked with various sources, flat files, JMS (messaging queue), XML files, COBOL files, Oracle, and Teradata systems, to extract data and store it in the DWH.
• Creating test cases and involved in code reviews before promoting to higher environments.
• Involved in requirement gathering and reviewing the business requirement document (BRD) for any new interface or extract process.
• Involved in writing Unix scripts for file listing, formatting files for ETL purposes, maintenance scripts for purging logs etc.
• Implemented Teradata Views and Stored Procedure in Semantic/Reporting Layer for Portals and Reporting purposes.
• Involved in working with offshore teams which involves status, reviews, technical and knowledge sharing discussions.
• Used session parameters and mapping variables/parameters, and created parameter files for flexible runs of workflows based on changing variable values (a parameter-file sketch follows the environment line below).
• Introduced to S3 file systems and the Redshift database in Amazon Web Services (AWS); we are in the process of introducing the cloud and building data lakes, doing a POC on an existing database that resides in Teradata.
• Scheduling jobs using Autosys.
• Production support
Environment: Informatica 10.1.1/9.5.1, IDR 9.6.3/9.5.1, Oracle, Teradata, UNIX shell scripting, flat files, AutoSys, ServiceNow.
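As a hedged illustration of the parameter-file bullet above, this Python sketch generates an Informatica parameter file per run; the folder, workflow, and parameter names are invented for the example.

```python
import datetime

run_date = datetime.date.today()

# Informatica parameter-file format: [Folder.WF:workflow] followed by $$variables
parm = f"""[EIM_Folder.WF:wf_load_meter_reads]
$$RUN_DATE={run_date:%Y-%m-%d}
$$SOURCE_SYSTEM=MMSE
$InputFile_meter=/data/in/meter_reads_{run_date:%Y%m%d}.csv
"""

with open("/etl/parms/wf_load_meter_reads.parm", "w") as f:
    f.write(parm)
```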
- ETL Architect/Lead at Ameren
- Informatica Lead/ Developer at Affinion Group
- Informatica Lead/ Developer at Hyatt Hotels Corporation
- Informatica Lead/ Developer (Offshore,India) at NBC Universal
4 years, 11 months at this Job
- B.Tech - Electronics and Communication Engineering
• Provide expertise on various data management initiatives, including data warehousing and business intelligence. Requirement elicitation and feasibility analysis with the client's stakeholders and subject-matter experts to understand business needs.
• Assist in analysis and documentation of those requirements by resolving ambiguities and conflicts and ensuring requirements are signed off within a given ETL scope. Impact analysis of existing systems and client applications.
• Analyze functional specifications and translate them into technical design documents. Development of complex ETL programs using Informatica PowerCenter, performance optimization of Informatica programs, and organization of implementation planning and workflow scheduling.
• Extensive hands-on exposure to most of the complex SSIS / Informatica PowerCenter transformations, such as Aggregator, Sorter, Router, Filter, Joiner, Lookup, mapplet, Transaction Control, SQL transformation, Stored Procedure, XML transformation, CDC, and other regular transformations.
• Handle PowerCenter workflow/mapping parameters and variables along with XML Schema, XSD, XSLT/XPath, and B2B transformations (an XSD/XPath sketch follows the environment line below). Design and review ETL data-mapping specifications with SMEs, data architects, and technology specialists.
• Responsible for designing ETL architecture using Informatica PowerCenter applications in UNIX and Windows environments. Develop and optimize complex SQL programs (stored procedures, customized functions/packages, triggers/indexes/cursors, etc.).
• Develop complex KPIs, dashboards, and scorecards using Tableau. Perform data/query analysis and suggest performance-improvement strategies to DBAs/architects. Mentor junior team members on data integration strategies, Informatica PowerCenter, and SQL programming.
• Conduct and support the change control process during the production roll-out phase; responsible for production support and enhancements. Perform data validation and defect management as part of QA by writing test cases, executing them, and supporting user acceptance testing.
• Design and review relational and dimensional data models; document metadata processes and details relevant to technical specifications. Research, analyze, and recommend 'information integration' strategies across hospital-management systems and integrated health-care systems.
Environment: Informatica PowerCenter 10.1/9.6, MS SQL Server, DB2, ER/Studio Data Architect 9.5, shell scripting, MS Office, Tidal scheduling, Agile methodologies.
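A minimal sketch of the XSD/XPath handling mentioned above, assuming the lxml library and invented schema and element names:

```python
from lxml import etree

# validate an inbound feed against its XSD before mapping it
schema = etree.XMLSchema(etree.parse("claim_feed.xsd"))  # hypothetical schema
doc = etree.parse("claim_feed.xml")                      # hypothetical feed
schema.assertValid(doc)  # raises if the feed violates the XSD

# pull member ids with XPath for downstream source-to-target mapping
member_ids = doc.xpath("//claim/member/@id")
print(len(member_ids), "claims in feed")
```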
- Senior ETL Architect- Data Warehousing & Business Analytics, Senior ETL Designer at BlueCrossBlueShield of Nebraska
- Senior ETL Architect- Business Intelligence Practice at Spectrum Health
- Senior ETL Developer, Sector at Cognizant Technology Solutions
- Senior ETL Developer at Cognizant Technology Solutions
3 years, 9 months at this Job
- Bachelor of Technology - Computer Science and Engineering