Role: Design and maintain conceptual, logical, and physical enterprise data models, infrastructure architecture, and the overall health of the processes that populate, use, and maintain the Enterprise Data Warehouse (EDW). Facilitate business initiatives through careful, thoughtful, and forward-looking data architecture and designs that meet or exceed business needs while balancing model scalability and performance, in a manner that is efficient and mindful of both the corporate vision and the occasional expedited projects that support fast-paced business requests.
Accomplishments:
• Completed data architecture for complex projects while collaborating with business end-users, data analysts, project managers, ETL developers, data warehouse DBAs, report developers, BI administrators and senior management to deliver integrated data warehousing and enterprise reporting solutions
• Single-handedly modeled Salesforce application data (Customer and Partner Care business verticals) into EDW
• Deployed and published major data models including Shipment Audit (Supply Chain), Customer Pain Point (Sourcing), Freight Claim (Finance/Sourcing), Customer Event (Marketing) and Subcategory Attribute (Marketing)
• Introduced a new Design Review phase into the architectural workflow to standardize the collaboration framework and improve cooperation with the data consumers and ETL developers participating in the development life cycle
• Introduced a new easy-to-complete, one-page, four-question, business-focused and technology-independent EDW Requirements Document to improve alignment with industry standard SDLC framework
• Published several data governance policies and documents on the Confluence documentation platform, including:
  * A new and revived EDW Architecture home page with a pictorial representation of EDW and its various components, their value, purpose, and interaction with one another and with the peripheral systems that feed into and off of EDW
  * EDW policy for Operational Process Depot (OPD) and End User Input (EUI)
  * ETL Process Scheduling Guidelines document to facilitate reporting needs with varying frequencies
  * Policy for end-user access to EDW
  * Guidelines for working through the EDW workflow from requirements gathering through production deployment
  * Guidelines for drafting definitions of data objects
  * Guidelines for decommissioning an ETL process no longer required due to discontinued business needs
Environment: ER/Studio Data Architect 10.0.2, ER/Studio Team Server 4.0.3, Confluence 5.8.17, Teradata 15.0, SQL, UNIX shell scripting, Perl, BTEQ, GoldenGate, DMExpress 9/8, Oracle, SQL Server, Control-M 9.0/7.0, Subversion
- Enterprise Data Warehouse Architect at Overstock.com (retail industry)
- Principal ETL Developer at Overstock.com (retail industry)
- Senior ETL Developer at Overstock.com (retail industry)
- ETL Developer at Grand Circle Corporation
1 year, 8 months at this Job
- Bachelor of Engineering - Electronics and Communication Engineering
• 6 years of experience in Data Warehouse applications
• Expertise in Requirement analysis, Design, Development, Deployment and Production Support
• Domain knowledge of Media and Entertainment business
• Experience managing small teams and mentoring associates
- Data Warehouse Architect at Tata Consultancy Services (TCS)
7 years, 10 months at this Job
- Bachelor of Technology - Electronics and Communication Engineering
Description: New York Times web portal logs are imported into Advertising and Reporting data marts along with data from various in-house and third-party systems. These facilitate reporting and BI for end users, optimizing the use of online advertising space. Data is imported from Hadoop, Sugar CRM, Amazon Redshift, vendor files, etc., using Pentaho Data Integrator, and cubes are generated on these data marts for reporting. Simultaneously, Google Analytics is implemented on a subset of the advertising portals.
Responsibilities:
• Requirement Gathering from the Business Users in Advertising and Reporting
• Defining source-to-target mappings for ETL in Pentaho and a dependency matrix for ETLs and reporting.
• Incorporating reconciliation using Google Analytics and source-to-target data profiling.
• Dynamic ETL in Pentaho using Sugar CRM metadata to keep pace with the numerous changes to CRM Analytics, ensuring that ETL development is no longer a bottleneck for business users.
• Incorporating data from Redshift and web logs in Hadoop.
• Scripting in Pig to create rollups of weblogs for feeding data into the data warehouse.
Environment: Pentaho BI Suite, Redshift, Oracle, Google Analytics.
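The weblog rollup described above can be sketched as follows. This is an illustrative Python stand-in (the original work used Pig Latin over Hadoop), and the field names and sample values are hypothetical:

```python
from collections import Counter
from datetime import date

# Hypothetical raw weblog records: (hit date, page URL, visitor id).
weblogs = [
    (date(2015, 3, 1), "/home", "u1"),
    (date(2015, 3, 1), "/home", "u2"),
    (date(2015, 3, 1), "/ads", "u1"),
    (date(2015, 3, 2), "/home", "u3"),
]

# Roll raw hits up to daily page-level counts -- the same shape a Pig
# GROUP BY (day, page) + COUNT would produce before the warehouse load.
rollup = Counter((day, page) for day, page, _visitor in weblogs)

for (day, page), hits in sorted(rollup.items()):
    print(day, page, hits)
```

In production the equivalent Pig script would run over the raw logs in HDFS and write the aggregates back out for the warehouse load, keeping per-hit detail out of the data marts.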
- Data Warehouse Architect at New York Times
- Sr. Architect Consultant at Altria
- Sr. Architect - Consultant at Wachovia Corporation
- Director of Data Engineering/Architect at Muze Inc
1 year, 11 months at this Job
§ Over 15 years of experience in business analysis, database architecture, database design, and database modeling across various business intelligence, data warehouse, and data integration projects.
§ Highly competent Business Intelligence/Data Warehouse expert in OBIEE/Siebel Analytics, OBIA (out-of-the-box prebuilt applications), Oracle BI tools, ERPs, and OLTP and OLAP applications.
§ Involved in end-to-end implementations of BI solutions for Oracle Financial Analytics, Procurement and Spend Analytics, HR, Supply Chain and Order Management Analytics, and Project Analytics.
§ Involved in implementing prebuilt solutions converting data from ERP systems such as Oracle EBS, PeopleSoft, and Siebel CRM into the Enterprise Business Analytics Warehouse (BAW).
§ Strong experience in project methodologies, SDLC, DW and BI infrastructure and architecture using OBIEE and OBIA middleware, data loading, scheduling and monitoring workflows/jobs, and dimensional modeling using star schemas across the data warehouse life cycle.
§ Experience serving industry segments including Finance, Insurance, HR, Sales and Marketing, Manufacturing, and Services in implementing BI solutions.
- Data Architect / Data Warehouse Architect at Charter Communications
at this Job
• Serving as the Data Warehouse Architect for the Enterprise Data Warehouse (EDW) initiative; EDW is one of the top 5 projects for the TWC state agency.
• Designed and implemented multi-tier data architecture
• Developed a flexible software architecture to forward-engineer the entire data model for the four tiers: Staging, Raw Vault, Data Vault, Biz Vault.
• Designed and developed ETL software architecture suitable for daily runs of 5+ TB/day.
• Technologies:
  ◦ Oracle 12c
  ◦ C#/.NET / System.Data.SqlClient.SqlBulkCopy
  ◦ Data Vault 2.0 (Dan Linstedt and Michael Olschimke)
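As a rough illustration of what forward-engineering Data Vault tables from model metadata involves, here is a minimal sketch in Python (the original implementation above was in C#/.NET; the entity and column names are hypothetical):

```python
def hub_ddl(entity: str, business_key: str) -> str:
    """Generate CREATE TABLE DDL for a Data Vault 2.0 hub.

    A DV2 hub carries the business key plus standard audit columns:
    a surrogate hash key, a load timestamp, and the record source.
    """
    return (
        f"CREATE TABLE hub_{entity} (\n"
        f"    hub_{entity}_hkey CHAR(32) NOT NULL PRIMARY KEY,\n"
        f"    {business_key} VARCHAR2(100) NOT NULL,\n"
        f"    load_dts TIMESTAMP NOT NULL,\n"
        f"    record_source VARCHAR2(50) NOT NULL\n"
        f")"
    )

# One such generator per construct (hubs, links, satellites) lets the
# whole model be regenerated from metadata instead of hand-written DDL.
print(hub_ddl("customer", "customer_number"))
```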
- Enterprise Data Warehouse Architect at State Of Texas Workforce Commission
- Chief Technology Officer at Krinock Software Services, LLC
- Senior Data Warehouse & SOA Architect at Eagle Creek Software Services
- Senior Data Warehouse & SOA Architect at State of Nevada
11 months at this Job
- Bachelor's Degree - Business Administration and Computer-Based Information Systems
- - Music
Worked for various US/UK and Indian clients for complete implementation of BI solutions.
A recent project is with Macy's, one of the largest retail giants. The Fulfillment Business Intelligence team (FBIT) is one of the most crucial teams, capturing day-to-day operational data from all Macy's stores in a fully operational data warehouse. Daily operational data such as shipments, reservations, and orders is captured through TIBCO and processed through ODI into an Oracle warehouse. Responsibilities during my current employment for various clients included:
● Providing Strategic roadmap for migration projects
● Performance tuning for all ODI jobs
● Changing system-generated I$ tables (customizing Knowledge Modules) to CTAS
● Tuning long-running queries; capturing all ODI metadata objects for re-platforming BI to big data
● Experienced in writing advanced PL/SQL scripts and with scheduling tools such as CAWA and the ODI Scheduler
● Implementing changes across cross-functional teams; ensuring optimal solution delivery
● Implemented Hive queries in a MapR 4.0 environment
● Involved in identifying the artifacts for re-platforming the existing BI environment
● ETL implementations; GoldenGate implementation and integrations
● Involved in tuning Hive queries; advanced shell scripting for server maintenance
● Good understanding of predictive models built in Python
● Involved in migrating on-premise applications to the cloud
Environment: ODI 12c, Oracle 12c, WebLogic 12c, RHEL 7.7, MapR 5.5, Hive, Sqoop, Kafka, Windows
Applications: Impala, Oracle Data Integrator 11g/ODI 12c, Hadoop Cluster, Oracle Transport Management, WebLogic, OBIEE 12c, Oracle SOA
BI Tools: ODI, Oracle GoldenGate, OBIEE, WebLogic
Hadoop Cluster: MapR 5.6, Sqoop, Hive, Scala
ETL Support Tools: ODI, Sqoop
Relational Databases: Oracle 11g
Data Modeling Tools: Erwin
Operating Systems: RHEL 7.5, OEL, Solaris, Windows Server
Languages: SQL, PL/SQL, shell scripting, Python
- Data warehouse Architect at
- BI Architect at K12 Inc
- Technical Lead at Iron Mountain
- Member Technical Staff at Oracle at www.oracle.com
10 months at this Job
- B.Tech - distinction
Client: Whole Foods Market Tools: Erwin Data Modeler, Altova XMLSpy, DBeaver, Teradata SQL Assistant
Environments: Amazon Redshift, Amazon EMR, Teradata
• Introduced and promoted the Data Vault 2.0 (DV2) methodology for warehouse development, resulting in the company shifting from an Inmon-style warehouse to DV2 for the new warehouse.
• Introduced warehouse automation tools which are under evaluation for adoption.
• Played key role as lead model developer on first DV2 project involving complex transaction data.
• Defined and promoted best practices for development process, warehousing techniques, and data governance.
• Coordinated with management to improve partner team communications and joint development practices.
• Developed data models for both legacy and new warehouse using Erwin Data Modeler.
• Wrote scripts for new functionality on Teradata legacy warehouse.
- Data Warehouse Architect at Brooksource
- Data Warehouse Architect at Allied Consultants, Inc.
- Data Architect at INTERSYS CONSULTING
- Data Architect at AKILI
1 year at this Job
- MS - Aerospace Engineering
- BS - Mechanical Engineering
Roles: Restructuring, designing, building, and supporting the current data warehouse for the State. Feasibility study for migrating the in-house data warehouse to the cloud. Identifying, designing, and establishing technical requirements for modernizing the data warehouse. Implementation of KAPSCH as a new enterprise back-office system.
Environment: SQL Server 2017, Visual Studio (SSIS/SSAS), Tableau Server, SharePoint, Microsoft Azure.
- Data Warehouse Architect at State Road and Tollway Authority
- Senior Business Intelligence Consultant at Box Beauty Co
- MSBI Project Manager at Orchid Pharmed Co
- BA/Report Developer at Dorsa Group Co
8 months at this Job
- MBA - Strategic Management
- Bachelor of Science - Computer Engineering
Role: Data/Integration Architect
Responsibilities:
· Administer and maintain the Informatica environment
· Architect and develop processes to provide data support to both operational and analytical teams
· Develop, enhance, and maintain existing data marts, and provide data support to the BI team
· Define standards and utilize best practices to ensure ETL performance
· Informatica team lead: mentor other developers by architecting solutions and reviewing code
· Architected and developed solutions to recover from failures, reducing overhead for the on-call team
· Created scripts to restart Informatica CDC processes daily to improve server performance and reclaim disk space
· Provide solutions for complex integration requirements, incorporating technologies such as BizTalk, SQL Server, and Informatica along with web services
· Create, maintain, and enhance ETL processes, both real-time Change Data Capture and scheduled batch processes
· Maintain, enhance, and expand the current ODS using Informatica ETL jobs
· Provide after-hours on-call support to the data team
· Ensure performance of business-critical Informatica processes; document business requirements and technical specs
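The daily maintenance scripting mentioned above (restarting processes, reclaiming disk space) hinges on a simple usage check. A minimal Python sketch of that pattern follows; the source does not show the actual scripts, and the 90% threshold is a hypothetical example:

```python
import shutil

def disk_usage_pct(path: str) -> float:
    """Return the percentage of disk space used on the filesystem holding path."""
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

def needs_cleanup(path: str, threshold_pct: float = 90.0) -> bool:
    """Flag a filesystem for cleanup once usage crosses the threshold --
    the kind of check a daily maintenance job runs before reclaiming space."""
    return disk_usage_pct(path) >= threshold_pct

print(f"current dir: {disk_usage_pct('.'):.1f}% used; cleanup needed: {needs_cleanup('.')}")
```

In a real deployment this check would be scheduled (e.g. via cron) and followed by log purges or archive moves rather than a print.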
- Sr. Data Warehouse Architect at U.S. Xpress
- Software Developer -Data Warehousing at Jewelry Television
- Programmer Analyst at AET Solutions, Inc
6 years, 8 months at this Job
- Master of Science - Electrical Engineering
Array provides new and emerging technologies and leading-edge solutions; application development and support services are governed and implemented under the most common industry standards, such as CMMI and ISO. The Keystone decision support system provides web-based access to a data warehouse that integrates logistics and financial source-system data; ad hoc analysis capabilities; improved timeliness of data availability; forecasting of future sales activity; performance forecasting; and identification of unusual sales activity.
Business Objects is the BI tool used by management and end users to access data and create ad hoc and batch reports.
• Working as a Data Architect which involves defining the logical data model, physical data model and data mappings.
• Installation and configuration of the Informatica PowerCenter server; setting up the complete environment (services, repositories, user access) and client tools (Designer, Workflow Manager, Monitor, Repository Manager)
• Performed upgrades of the PowerCenter server from v9.1 to v9.6 and from v9.6 to v10.2, creating a new repository with no content in the new environment and associating it with the same schema in the old environment
• Managing multiple environments (Dev, Test, Prod) with high availability
• Scheduled and monitored daily backups of the Informatica environment, and restored when needed
• Administer projects, roles, user access, and privileges across all three environments.
• Create technical design, high-level and low-level design documents, ETL specifications, and test-case documents for all new mappings developed for production applications, in compliance with SDLC processes
• Developed, and assisted developers in creating, Oracle SQL scripts and stored procedures to fix urgent and highly critical production data quality issues
• Daily interaction and brainstorming sessions with functional business analysts, end users, and SMEs to gain a detailed understanding of existing functionality and system requirements
• Monitor daily, weekly, monthly Data Warehouse load processes and other DW related processes
• Extensively worked on UNIX shell scripts for automation (auto-restart of servers, disk space utilization, Informatica server monitoring, and file system maintenance and cleanup), and on scripts using the Informatica command-line utilities.
• Root cause analysis of ETL job failures, and data issues raised by end users and the testing team.
• Define standards and Processes to be followed by ETL technical Teams
• Migration of Informatica Mappings/Sessions/Workflows from Dev, Test to Prod environments.
• Monitor Informatica platforms Performance across all three Environments and taking necessary actions in case of File system/permission/space issues
• Streamlined and tuned ETL mappings and workflows to improve performance
Work History
- Senior Informatica ETL Data warehouse Architect at Array Information Technology
- Senior INFORMATICA ADMIN - ETL Developer at CACI International Inc
- Technical Lead at WorldCom-UUNET Technologies
- ETL data warehouse Developer at Great American Insurance Group
3 years, 8 months at this Job
- Master of Science - Computer Science
- Bachelor of Science - Computer Engineering