• Serving as the Data Warehouse Architect for the Enterprise Data Warehouse (EDW) initiative; EDW is one of the top five projects for the TWC state agency.
• Designed and implemented multi-tier data architecture
• Developed a highly flexible software architecture to forward-engineer the entire data model across four tiers: Staging, Raw Vault, Data Vault, and Business Vault.
• Designed and developed an ETL software architecture suitable for daily loads of 5+ TB.
• Technologies: ◦ Oracle 12c ◦ C#/.NET / System.SqlBulkCopy ◦ Data Vault 2.0 - Dan Linstedt and Michael Olschimke
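The forward-engineering approach above can be sketched as a small generator that emits Data Vault DDL from entity metadata. This is a minimal, illustrative sketch only; the table and column names are hypothetical, not the actual TWC model.

```python
# Minimal sketch of forward-engineering a Data Vault 2.0 hub table
# from metadata. Entity and column names are illustrative.

def hub_ddl(entity: str, business_key: str) -> str:
    """Generate CREATE TABLE DDL for a Data Vault 2.0 hub."""
    table = f"HUB_{entity.upper()}"
    return (
        f"CREATE TABLE {table} (\n"
        f"  {table}_HK RAW(16) NOT NULL,          -- hash key of the business key\n"
        f"  {business_key.upper()} VARCHAR2(50) NOT NULL,  -- business key\n"
        f"  LOAD_DTS TIMESTAMP NOT NULL,          -- load timestamp\n"
        f"  RECORD_SRC VARCHAR2(30) NOT NULL,     -- record source\n"
        f"  CONSTRAINT PK_{table} PRIMARY KEY ({table}_HK)\n"
        f")"
    )

print(hub_ddl("employer", "employer_id"))
```

The same pattern extends to links and satellites, which is what makes a fully metadata-driven four-tier build possible.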
- Enterprise Data Warehouse Architect at State of Texas Workforce Commission
- Chief Technology Officer at Krinock Software Services, LLC
- Senior Data Warehouse & SOA Architect at Eagle Creek Software Services
- Senior Data Warehouse & SOA Architect at State of Nevada
9 months at this Job
- Bachelor's Degree - Business Administration and Computer Based Information Systems
- - Music
Data Warehouse Architect
Environment worked on:
• Informatica 9.6.0 Hotfix 3 on Windows Server 2008 R2
• Informatica repository and our data warehouse on an Oracle 11g database
• Performed upgrade from PowerCenter 8.5 to 9.1.0.
• Put together the presentation to management and the technical group that determined what server we could get, showing what we needed for the present and what we would need for the future. Because of that presentation, we were able to get the servers we needed.
• Part of my job was to determine when we would upgrade our software: ◦ Reviewed what was fixed in new releases and whether any of our critical needs had been addressed. ◦ Considered which projects were in flight and how an upgrade would affect them. ◦ Checked whether deferring an upgrade would put us out of compliance with Informatica.
• Part of my job was to install Informatica's metadata tooling and teach the group how to use it.
• When the Informatica Data Validation Option (DVO) was not meeting our needs, I presented this information to upper management and got them to stop paying for maintenance, saving us money.
• Worked with management to bring in the third-party scheduling software TIDAL, which we used to schedule Informatica jobs as well as other jobs.
• Wrote our standards manual for Informatica and created the needed forms. Maintained our procedure for promoting Informatica code and non-Informatica objects to multiple environments so that we followed best practices.
• As the Informatica administrator: ◦ Maintained Informatica security for user, group, and folder access, keeping everything up to date. ◦ Validated that repositories and services were backed up daily. ◦ When a repository needed to be restored, used our backups to perform the restore. ◦ Performed the backup for the domain. ◦ Kept all new connection information in Informatica up to date.
• Was the lead person for all Informatica projects.
• Oversaw the work of 7 developers on the EPIC project.
• Wrote mapping specifications for many of the jobs. This included writing specifications on how to convert our existing data to the new format.
• Designed our master work flows for loading our Production data.
• I'm an expert troubleshooter. I worked all Informatica issues and, if the developers couldn't solve a development issue, I would work with them to get them what was needed.
• Worked in an agile environment.
- Data Warehouse Architect at PARTNERS HEALTH CARE
- ETL Architect / Lead at RCG Information Technology
- Database Marketing Technical Architect at Harte-Hanks Inc
- Senior Data Warehousing Consultant at
9 years at this Job
- - Computer Information Systems
- - Computer Programming
- - Physics
Aug 2008 - present (10 years, 5 months)
Sr Data Warehouse Architect - BI/ETL
Development, enhancement, implementation, and support for internal HJF departments and external customers/programs, sourcing data from Oracle E-Business Suite and the PeopleSoft Human Resources system. The reporting tools used for development and deployment are Oracle BI Discoverer, Actuate eReport Designer, Crystal Reports, and the web-based Logi Analytics Business Intelligence system. Tasks include delivering tested Oracle package procedures for data extraction.
• The data warehouse, built from the PeopleSoft Financials system and Oracle E-Business Suite R12, is the data source for BI report development. Database packages/procedures are created and scheduled to run periodically.
• Identify the data table objects, required fields/columns, and data filters/criteria, and develop the SQL.
• Data validation/testing using SQL and with end users.
• Define report parameters, design data layouts, and conduct user testing.
• Identify the chart/graph types for reports/dashboards.
• Created data warehouse extract procedures and scheduled them via crontab; the executive dashboard is published from this data source.
• Identify the display formats/formulas for data; output formats such as PDF, XLS, and DOC.
• Deployed reports to the test database/web environment for user testing, then moved them into production.
• Provided post-deployment support to users for enhancements and data issues.
• Evaluating tools/cloud software to protect legacy systems and other applications.
Additional Project/Responsibility - Applications Security Engineer
• Collaborate with developers to adopt DevOps practices in our development and IT operations environment to improve Continuous Integration/Delivery (CI/CD) of projects.
• Planning to automate the security vulnerability scan process to identify risks and report them to application/asset owners.
• Preparing to educate the software development team to put software security first, applying the DevSecOps process in the Software Development Life Cycle (SDLC).
• Vulnerability assessment based on the OWASP Top 10 and SANS Top 25. Conduct periodic penetration tests of all web applications using tools such as OWASP ZAP, Burp Suite, and other commercial tools.
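The planned scan-report automation above could start as simply as grouping exported findings by asset owner and severity. A hypothetical sketch (the finding fields and owner names are illustrative, not from any real scan export):

```python
# Sketch: aggregate raw vulnerability findings (e.g., exported from a
# scanner such as OWASP ZAP) into a per-owner risk summary.
# Field names and values are hypothetical examples.
from collections import defaultdict

def risk_report(findings):
    """findings: list of dicts with 'owner', 'asset', 'severity' keys."""
    report = defaultdict(lambda: defaultdict(int))
    for f in findings:
        report[f["owner"]][f["severity"]] += 1
    return {owner: dict(sev) for owner, sev in report.items()}

findings = [
    {"owner": "team-a", "asset": "portal", "severity": "High"},
    {"owner": "team-a", "asset": "portal", "severity": "Low"},
    {"owner": "team-b", "asset": "api", "severity": "High"},
]
print(risk_report(findings))
# {'team-a': {'High': 1, 'Low': 1}, 'team-b': {'High': 1}}
```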
- Sr Data Warehouse Architect at Henry M. Jackson Foundation for the Advancement of Military Medicine
- Oracle Technical consultant at EPCO, Inc
- Senior Oracle developer/ ETL Developer - IBM DataStage at Bahrain Petroleum Company
- System Analyst/Programmer at Baharat Earth Movers Limited
10 years, 5 months at this Job
- Master's - Computer Applications
- Bachelor of Science - BSc Physics
Solely designed, built, and populated a production data server and required databases from scratch. Later led a team for subsequent data supplementation from a wide variety of data sources.
● Integrated data from a breadth of sources in a wide range of formats to create a data warehouse for staging data used to generate sensible questions, answers, and false answers for multiple-choice trivia questions.
● Some data source examples: TiVo - movie, TV, music, sports, book, game, and entertainment data in various formats, including various delimited and JSON formats, with historical, full, and delta data; a well-defined large dataset with many tables/relations. Oxford Dictionaries - over 50 dictionaries/encyclopedias, loaded primarily from XML. Webster's Dictionary - English word definitions, related words, and rhymes. Kelley Blue Book (KBB) - automotive data. Various curated question/answer sets in a variety of formats such as Excel, delimited flat files, and JSON.
● Data integration from a staging database to a consistent production database, and on to other outside stores used by the front-end application, such as MongoDB and MariaDB.
● Data cleansing and data analysis: created methods to identify "bad" questions/answers and clean them up to make the most of our data. Later led a team for data QA.
● SSIS, T-SQL, MySQL, Microsoft SQL Server Management Studio, Microsoft Visual Studio, Data Warehouse, JSON, XML, MongoDB, MariaDB, Trello, Harvest
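Normalizing those heterogeneous sources into one staging shape is the core of the integration work described above. A minimal sketch, assuming hypothetical field names (`q`/`a`) and source labels that are not from the actual feeds:

```python
# Sketch: normalize trivia content from two hypothetical source formats
# (JSON feed, pipe-delimited flat file) into one staging row shape.
import csv
import io
import json

def from_json(text):
    """Parse a JSON feed of {'q': ..., 'a': ...} records."""
    return [{"question": r["q"], "answer": r["a"], "source": "json_feed"}
            for r in json.loads(text)]

def from_delimited(text, delim="|"):
    """Parse pipe-delimited 'question|answer' lines."""
    reader = csv.reader(io.StringIO(text), delimiter=delim)
    return [{"question": q, "answer": a, "source": "flat_file"}
            for q, a in reader]

staged = from_json('[{"q": "Capital of France?", "a": "Paris"}]') \
       + from_delimited("Largest planet?|Jupiter")
print(staged)
```

Once every source lands in the same staging shape, downstream cleansing and "bad question" detection can run uniformly regardless of origin.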
- Data Warehouse Architect (Contractor) at International Knowledge League
- Data Integration Architect (Contractor) at iVHR (Cleveland Clinic Innovations)
- Senior Programmer/Data Analyst (Contractor) at CVS Caremark
- Senior Business/Programmer Analyst (Contractor) at University Hospitals (UH)
2 years at this Job
- - Computer Science
Buckeye Partners, LLC - Using the Microsoft Integration Services stack, designed, developed, and implemented ETL technology landing three terabytes of data into the enterprise data warehouse. This ETL methodology consists of the following features, implemented at Buckeye worldwide:
1. A Lift feature that moves data from source tables to the staging database tables within the data warehouse.
2. A Lift Audit feature that captures the record count for each source and staging table.
3. A Load feature that moves data from staging tables to the Data Lake and/or Master Data database tables within the data warehouse.
4. A Load Audit feature that captures the record count for each staging table and/or Data Lake and Master Data database table.
5. A Registration .NET feature that allows maintenance of administrative tables - the implementation of Master Data Services.
6. A Next Incremental feature that calculates and captures the next Lift and Load run date.
7. An AI data warehouse search engine tool, developed in SQL and .NET.
This ETL methodology supports both incremental and flush-and-fill data movement operations. Key accomplishments:
1. Data warehouse environment development.
2. Star schema development.
3. Dimensional modeling, dimension development, and fact table creation.
4. Transactional and aggregate query development.
5. SQL PIVOT development.
6. Dashboard creation.
7. Q&A engagement.
8. SQL performance tuning.
9. End-user training.
10. Data warehouse documentation.
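The Lift Audit idea above amounts to capturing and comparing row counts for each source/staging table pair. A minimal sketch, with illustrative table names and counts (the real implementation was SSIS/SQL-based):

```python
# Sketch of a Lift Audit: record source vs. staging row counts per table
# and flag mismatches. Table names and counts are illustrative.
from datetime import datetime, timezone

def lift_audit(source_counts, staging_counts):
    """Return one audit row per table with a pass/fail match flag."""
    rows = []
    for table, src in source_counts.items():
        stg = staging_counts.get(table, 0)
        rows.append({
            "table": table,
            "source_count": src,
            "staging_count": stg,
            "match": src == stg,
            "audited_at": datetime.now(timezone.utc).isoformat(),
        })
    return rows

audit = lift_audit({"ORDERS": 1500, "SHIPMENTS": 320},
                   {"ORDERS": 1500, "SHIPMENTS": 318})
print([(r["table"], r["match"]) for r in audit])
# [('ORDERS', True), ('SHIPMENTS', False)]
```

The same pattern applies to the Load Audit, comparing staging counts against the Data Lake / Master Data targets.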
- Data Warehouse Architect at Buckeye Partners, LLC
- SQL Developer/ Team Lead at Hewlett Packard (hp)
- SQL Server BI Developer at Hewlett Packard (hp)
- SQL Server & Crystal Reports Developer at Universal American
2 years, 10 months at this Job
- - Business Administration
- - Business Administration
- - Computer Science
Warehouse design and implementation lead for the mortgage data warehouse. Worked extensively with business-line and technical resources to implement analytic and operational reporting solutions across the mortgage business.
- Senior Data Warehouse Architect at US Bank National Association
- Senior Business Intelligence Consultant at Daugherty Business Solutions
- Senior Consultant at Accenture
8 years, 6 months at this Job
- Bachelor's - Computer Science
Emory University - Big Data Architect / Data Warehouse Architect / Cloud Architect / Data Architect / Team Lead. Primary roles: on-premise DBA, Cloud Infrastructure Architect, Configuration Management Architect, Cloud DBA. Secondary roles: Consultant, Developer, BI & Data Warehousing Architect.
• As a Big Data Architect for AWS big data center builds: provided consulting and advisory support in the creation of AWS EMR clusters, including transient as well as permanent clusters, for processing and converting large data sets so the Data Solutions team could run reports using Jupyter Notebooks.
• As a Data Architect, created a denormalized star schema data model to stabilize the University's security portal and make it a scalable, reliable application. Partitioned several Oracle tables using interval partitioning and wrote PL/SQL scripts to drop older partitions and archive data. Currently in the process of upgrading Oracle to 12.2.
• As a Big Data Architect, built a Cassandra service and installed, created, partitioned, queried, and backed up several Cassandra nodes in a cluster. Recently helped migrate Cassandra 2.2.5 to 3.1.1; added and decommissioned nodes to resize the cluster using nodetool.
• Managing a standalone Mongo service: for the Child Health and Mortality Prevention (CHAMPS) program, built and monitored a MongoDB service with data-at-rest encryption for the customer, using the Vormetric security appliance and MongoDB Cloud Manager. Helped the customer troubleshoot JSON queries. Completed upgrading MongoDB from 3.2.9 to 3.6. The database was primarily used to ingest unstructured data.
• AWS security control policy enablement: built test cases to verify that data is secure in the cloud for the following AWS services: AWS RDS (Oracle, SQL Server, MySQL, PostgreSQL, MariaDB), DynamoDB, Aurora clusters (MySQL and PostgreSQL), Amazon Redshift, and ElastiCache. Enabled SSE-KMS for data at rest and downloaded certificates for data in motion.
• AWS cloud DB creation cookbooks: created a low-cost AWS footprint with new AWS CLI-based scripts so that development template databases in AWS RDS (Oracle, SQL Server, MySQL, PostgreSQL, MariaDB), DynamoDB, and Aurora can be built on the fly and shut down during off-peak hours.
• As Configuration Management Architect: compared Chef and Ansible across various categories so that a proof of concept could be executed to choose a product for the University.
• Managing development, maintenance, and ETL for School of Medicine applications using Oracle on Windows, with PowerShell and PL/SQL scripts.
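The on-the-fly create / off-peak shutdown pattern above can be sketched as a small generator for the underlying AWS CLI commands. This is a hedged sketch: the instance identifier, class, and credentials are placeholder examples, not the actual scripts.

```python
# Sketch: build AWS CLI commands for creating a development RDS instance
# on the fly and stopping it off-peak. All identifiers/sizes are examples.

def create_cmd(identifier, engine, instance_class="db.t3.micro", storage_gb=20):
    """Build an 'aws rds create-db-instance' command string."""
    return (
        "aws rds create-db-instance"
        f" --db-instance-identifier {identifier}"
        f" --engine {engine}"
        f" --db-instance-class {instance_class}"
        f" --allocated-storage {storage_gb}"
        " --master-username devadmin"       # placeholder credentials only
        " --master-user-password CHANGE_ME"
    )

def stop_cmd(identifier):
    """Build an 'aws rds stop-db-instance' command string."""
    return f"aws rds stop-db-instance --db-instance-identifier {identifier}"

print(create_cmd("dev-template-pg", "postgres"))
print(stop_cmd("dev-template-pg"))  # e.g., invoked from cron off-peak
```

Running the stop command on a schedule is what keeps the development footprint low-cost.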
- Big Data Architect/ Data Warehouse Architect/Cloud Architect/ Data Engineer at Emory University
- Team Lead/Data Architect and Mentor at Ericsson
- Data Warehouse Architect/ Data Architect at Sprint Nextel Corp
- Senior Principal Consultant at Oracle Corp
4 years, 8 months at this Job
- Master's - Analytics
- Master's - Business Administration and Management
New York Times web portal logs are imported into advertising and reporting data marts, along with data from various in-house and third-party systems. These facilitate reporting and BI for end users to optimize the usage of online advertising space. Data is imported from Hadoop, SugarCRM, Amazon Redshift, vendor files, etc., using Pentaho Data Integration, and cubes are generated on these data marts for reporting. Google Analytics is also implemented on a subset of advertising portals.
• Requirement Gathering from the Business Users in Advertising and Reporting
• Defining source-to-target mappings for ETL in Pentaho and a dependency matrix for ETLs and reporting.
• Incorporating reconciliation using Google Analytics and source-to-target data profiling.
• Dynamic ETL in Pentaho using SugarCRM metadata to keep pace with the numerous changes in CRM analytics, ensuring that ETL development is no longer a bottleneck for business users.
• Incorporating data from Redshift and web logs in Hadoop.
• Scripting in Pig to create rollups of weblogs for feeding data into the data warehouse. Environment: Pentaho BI Suite, Redshift, Oracle, Google Analytics.
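The weblog rollup described above (in production, a Pig GROUP BY over raw logs) can be sketched in Python; the field layout is hypothetical:

```python
# Sketch of the weblog rollup: aggregate raw page-view events into
# daily counts per URL before loading into the warehouse.
# The (date, url) event layout is a hypothetical example.
from collections import Counter

def daily_rollup(events):
    """events: iterable of (date, url) tuples -> {(date, url): views}."""
    return Counter(events)

events = [
    ("2015-06-01", "/section/sports"),
    ("2015-06-01", "/section/sports"),
    ("2015-06-01", "/section/world"),
]
print(daily_rollup(events))
# Counter({('2015-06-01', '/section/sports'): 2, ('2015-06-01', '/section/world'): 1})
```

Pre-aggregating this way keeps the warehouse fact tables at a manageable grain instead of storing every raw log line.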
- Data Warehouse Architect at New York Times
- Sr. Architect Consultant at Altria
- Sr. Architect - Consultant at Wachovia Corporation
- Director of Data Engineering/Architect at Muze Inc
2 years, 3 months at this Job
• Part of a team maintaining and developing OBIEE reports and dashboards for United Airlines, specifically for Reliability Engineering.
• The OBIEE solution for Reliability Engineering measured how effectively and efficiently the airline used aircraft parts.
• Responsible for working with the Teradata data warehouse team to ensure the appropriate data was available for use with OBIEE and that the appropriate data architectures were in place to support efficient reporting.
• Worked with data warehouse team to develop new data measures to support new user requirements
• Responsible for maintaining and pushing changes to the RPD and incorporating any new data requirements to satisfy new user requirements
• Developed a MDM strategy to harmonize data after the merger of United and Continental Airlines
• Developed a supporting data warehouse architecture to support the harmonized data
• Responsible for updating reports per user requirements with the new data measures as incorporated in the RPD.
• Responsible for user interactions and developing enhanced reports per user requests.
• Worked with users to update reports and dashboards with new analytical measures, enhancing report offerings to executives and stakeholders, including the FAA, interested in airline performance.
- Senior OBIEE/Data Warehouse Architect at Mphasis
- Senior Business Intelligence Developer at Oracle Federal Financials
- Senior Business Intelligence and Data Warehouse Architect at Code Plus
- BI Consultant at Panoramic Staffing
8 months at this Job
- Ph.D. - Economics
- M.A. - Economics
- B.A. - Economics and Mathematics
• Developed and led the data migration approach for the Conversion process from the Legacy Professional Audit Support System to the next generation Gentax Application using InfoSphere DataStage and SSIS on DB2 10.1 and SQL Server 2012 Platforms.
• Teamed up with DOF technical teams and vendors (IBM) to coordinate the installation, setup, and configuration of all new P770 hardware across four environments (Development, Staging, Production, and Disaster Recovery/Failover), while leading the software upgrades of InfoSphere DataStage from 7.5 to 9.1 (and ultimately to 11.5), DB2 from 8.2 to 10.1, and AIX from 5.3 to 7.1.
• Managed, advised and assisted in the DOF Data Warehouse implementation: database design, extraction, cleansing, mapping, harmonization, enrichment, consolidation, loading and reconciliation.
- Data Warehouse Architect at CGI Group
- Data Architect / Technical Lead at CGI Group
- Technical Lead at CGI Group
- Data Architect at CGI Group
4 years, 11 months at this Job
- BS - Computer Science
- Associates Degree - Business