As a data architect, I am responsible for the design and optimization of data and integration systems for the healthcare and banking industries. The systems are built on various Microsoft technologies, including both SQL Server and Azure components.
• Design data models and integration for Customer Journey Analytics using Azure technologies including Azure Table Storage, Azure Data Lake, Azure Data Factory, and Azure SQL Database.
• Maintain and enhance existing ETL for multiple data marts and LOB applications using SSIS
• Migrate ETL from SSIS to Azure Data Factory for line of business applications
• Assist in DevOps practices by building CI/CD pipelines and configuring work tracking and test plans using Azure DevOps
• Develop POCs to migrate legacy code to cloud technologies.
• Build, integrate, and deploy reports and dashboards using Power BI.
• Mentor developers in areas of schema design, T-SQL, SSAS, SSIS, and SSRS development and implementation.
- Data Architect at Argo Data
- Data Architect at Service King
- Data Architect at Associa, Inc
- Database Engineer at Match.com
5 months at this Job
- A.A.S - Business Computer Information Systems
500 Griswold Street, Suite 1200, Detroit, MI 48226, United States
04/2016 - Present | Hours per week: 40
Data Architect
Duties, Accomplishments and Related Skills: Improved the system for managing documentation for the Hardest Hit Fund program. Responsible for programming the primary land records management system for the City of Detroit. Successfully integrated cross-departmental file management. Manage a database of all City of Detroit parcels and report information to a citizen portal: https://detroitmi.gov/departments/detroit-building-authority/detroit-demolition-program
Tetra Tech, 2301 Lucien Way #110, Maitland, FL 32751, United States
- Data Architect at Detroit Land Bank Authority
- Data Manager III at Tetra Tech
2 years, 11 months at this Job
- - Engineering
Played a vital role as a Data Architect: collected business requirements, identified resources and data points, and produced the data model. Performed extensive data analysis and maintained logical and physical models of the applications using Erwin and Toad for Oracle. Involved in developing and maintaining database scripts (PL/SQL, T-SQL, functions, triggers, and table partitioning). Key developments:
· Developed and designed the data model for the export feature in the RECON system, allowing report configuration data to be stored in the database and reports to be generated on multiple servers.
· Improved security exception data reporting by creating stored procedures that perform dynamic table partitioning, move the previous day's exceptions to history, and let the start-of-day APL AUTOSYS job kick off the daily exception load process.
· Developed a plan to centralize database logins for production users, increasing security and reducing the risk of data manipulation in production.
· Migrated the legacy RECON (Security & Account Reconciliation) system from flat denormalized tables to normalized and, where required, partitioned data, boosting performance and making it scalable across multiple applications. Designed and modeled a report export feature that accommodates multi-user requests for one-off or on-demand report generation in Excel or PDF format.
- Data Architect at Fiserv
- Senior Vice President at CITIGROUP
- Sr Software Engineer at LORD ABBETT & CO LLC
- Project Manager at
1 year, 1 month at this Job
- Bachelor of Science - Chemistry
- Advanced Diploma - Computer Software System Analysis and Design
- Diploma - Public Speaking
- Certification - Udemy Online Website
● Point person for 2018 Enterprise Information Catalog Collection (EICC) project.
● Designed tables and databases for the CNO Unified Repository (CURe) project.
● Served as member of Data Architect Working Group team, setting standards for data design and usage within organization.
● Designed and introduced an ETL application to read files published by the Data Governance Council and push results into CNO's curated database (CURe). Became familiar with Informatica PowerCenter Designer and Informatica Workflow Manager. Accomplished between March and September 2018, including two input file revisions and three output file revisions.
● Designed and maintained Oracle databases for various administrative systems and customer interfaces.
- Data Architect at CNO FINANCIAL GROUP
- at CNO FINANCIAL GROUP
- ACORD Data Architect at CNO FINANCIAL GROUP
- Analysis Programmer at COMPASS CARE, INC
2 years, 2 months at this Job
- - GRADUATE STUDIES, Educational Administration
- MASTERS IN ARTS - Planetarium Education
- BACHELOR OF ARTS - Physics
Brown Brothers Harriman 12/15 –
Data Architect – Compliance Systems
• Design, develop and implement a Compliance Governance Data warehouse using Oracle Exadata Platform
• Specialized in Artificial Intelligence and knowledge- and rule-based implementations for trade surveillance, fraud investigation, and AML (Anti-Money Laundering) monitoring activities for Compliance business communities within BBH.
• In-depth knowledge of AML commercial applications such as Actimize, Fircosoft and Fortent
• Design, develop and implement a blockchain representation for trade transactions and a KYC (Know Your Customer) documentation and dossier management system. Business data is encrypted using Oracle's Transparent Data Encryption technology, and the blockchain container is implemented with its native AES-128 encryption.
• Built a Compliance data architecture to support business stakeholders and their communities with a metadata-driven framework handling multiple data sources and data feeds, with version control and audit tracking for all Critical Data Elements identified by business analysts. All procedures and code in the audit-tracking metadata repository are auto-generated without manual intervention.
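Auto-generating the audit-tracking code implies a metadata-driven code generator: given the Critical Data Elements registered for a table, the audit objects are emitted mechanically. A toy illustration of the idea, with hypothetical table and column names (the real system targeted Oracle):

```python
def generate_audit_ddl(table, critical_columns):
    """Emit DDL for an audit table covering the registered Critical Data
    Elements plus versioning/tracking columns, driven purely by metadata."""
    cols = ",\n".join(f"  {c} VARCHAR2(4000)" for c in critical_columns)
    return (
        f"CREATE TABLE {table}_AUDIT (\n"
        f"{cols},\n"
        "  VERSION_NO NUMBER,\n"
        "  CHANGED_BY VARCHAR2(128),\n"
        "  CHANGED_AT TIMESTAMP\n"
        ")"
    )
```

Because every audit object is derived from the metadata repository, adding a Critical Data Element automatically propagates to the generated code.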
- Data Architect at Brown Brothers Harriman & Co
- Senior IT Engineer - Database Technology & Engineering at TIAA-CREF
- Manager - Architecture, CREF Investment at TIAA-CREF Investment
- Vice President - Database Systems Administrator, Citibank Global Asset Management at CITIBANK N.A
3 years, 3 months at this Job
- - Electrical Engineering
- - General Electric
- B.S - Industrial Engineering/Computer Science
Contribute to the modernization and digitalization of 7-Eleven business domains, specifically covering merchandising, logistics solutions, analytics, and Master Data Micro Services (MDMS). Utilize agile methodologies for project management. Define business domain scope, architect end-to-end solutions, partner with data stewards, and establish business and data cleansing rules and processes.
• Lead team of 7 data analysts.
• Serve as Principal Data Architect across 8 business domains, working with ~40 project team members.
• Developed MDMS architecture, including data, infrastructure, integration, and API endpoints.
• Achieved 200ms API response time, on par with Google search time performance.
• Rolled out 7-Now application, a new food delivery system.
• Integrated real-time fuel prices with the third-party GasBuddy application.
• Built a blockchain proof of concept (POC) to track items from time of manufacture to delivery, simplifying manufacturer recalls.
• Built Nutrition and Location Services master data products from scratch and rolled out an MVP with highly accurate source-of-truth data.
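At its core, a track-and-trace blockchain POC like the one above is a tamper-evident hash chain of custody events. A minimal self-contained sketch (illustrative only, not the actual 7-Eleven implementation):

```python
import hashlib
import json

def _digest(event, prev_hash):
    """Hash an event together with the previous block's hash."""
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def add_event(chain, event):
    """Append a custody event, linked to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"event": event, "prev": prev_hash,
                  "hash": _digest(event, prev_hash)})
    return chain

def verify(chain):
    """Recompute every link; any tampered block breaks the chain."""
    prev_hash = "0" * 64
    for block in chain:
        if block["prev"] != prev_hash or block["hash"] != _digest(block["event"], prev_hash):
            return False
        prev_hash = block["hash"]
    return True
```

Altering any recorded event (say, a shipping step) invalidates every later link, which is what makes the provenance trail useful for recalls.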
- Principal Data Architect at 7-Eleven
- Enterprise Data Architect at Brinker International
- Data Architect at Nationstar Mortgage
- Senior Database Administrator/BI Developer at SCAN Health Plan
1 year, 2 months at this Job
- Master of Science - Computer Technology
- Bachelor of Engineering - Computer Science
Work Location: Tampa, FL / Telecommuter
Role: Data Architect - Big Data
Responsibility Summary: Requirement Analysis, Functional & Technical Specifications, Data Modelling, Architect Reporting layer and design schema objects, Dashboard Development, Estimation & Implementation, Customization & Configuration, Business Demos, End User Training, Project delivery, Change Management, Validation of Data Quality Summary and Detail Reports, Data Reconciliation, Responsible for all System Integration Testing (SIT) deliverables and timelines, Issue Management and Defect Resolution, Support CDH Upgrade Testing Activities, Coordination with GPA and vendor support for EAP BI tools, Work in Agile sprints
Project 1: Citi AML NAM Retail Broker Dealer (RBD) and Trade & Treasury Services (TTS)
The major objective of the RBD CIFS project is to enhance monitoring of NAM LSB by implementing additional scenarios for better AML risk coverage for the CGMI and CIFS lines of business, and to move from a vendor-dependent platform to the Enterprise Application Platform (EAP) using Big Data technologies. The project will also build efficient DQ control reports for CGMI and CIFS, aligned with the Global MIP.
● Analyze DIS 5.9 guide version and provide ETL mapping logic across EAP Levels
● Review FRD and coordinate with business to understand data elements and feed file format received from source and file arrival frequency
● Provide E2E Hive ETL design and custom SCD Type 2 design for Global Data Warehouse
● Design Data Model across all EAP layers with data standardization, functions, parsing logic, delimiters, Orphan data handling across dimension tables and identify Hive audit & partitions for staging and final tables
● Design and develop the model and mappings in alignment with the L2 Global Data Warehouse
● Develop data models across all EAP layers from ingestion, staging through reporting layer
● Design and develop Scenarios and define thresholds for Alert generation
● Expertise in defining Interface between EAP and outbound Alerts to CitiAIP (NextGen) system
● Develop DQ control reports Data Models specific to RBD
● Proven efficiency in gap analysis to replicate MIP Global platform for TTS and across other LOBs
● Expertise in fact and dimensions design and data flow diagrams to capture entity relationship
● Define and develop transformation logic to fulfill scenario development and alert generation
● Expertise working in Agile sprints for planned activities while also catering to ad hoc requests

Project 2: Citi AML Global Monitoring Intelligence Platform (MIP)
Project Description: This project aims to implement a new AML monitoring solution for the NAM retail business that will be the primary channel for detecting potentially suspicious activity and will improve monitoring capabilities. MIP will introduce the next generation of AML monitoring by leveraging Behavioral Analytics on Citi's strategic EAP Big Data platform, in addition to existing rule-based monitoring, by engaging AML Technology with other AML business stakeholders. The potentially suspicious behaviors/cases identified will be sent to the Case Management System (NextGen) for further investigation.
Responsibility:
● Coordinate with various data providers to establish feed file format, file frequency and file arrival schedule
● Implement SCD Type 2 for all dimension tables in Hive and design a bridge table for better query performance and reduced processing time
● Analyze, design and develop parsing logic to fetch from DIS (Data Interface Specification) 5.7 Mantas (legacy) to migrate to Enterprise Application Platform in CDH environment
● Define level 1 structures to replicate input data format, design Level 2 standardized Global DWH Model and Mapping to serve as Enterprise Data Warehouse, design and Model Level 4 to cater to project specific requests, design and Model Level 5 Serving layer for outbound interfaces
● Define Data types, delimiters, parsing and transformation logic, orphan data handling, SCD Type 2 design, audit and partition columns for Hive tables
● End to end design and orchestration from ingestion, staging transformation up to reporting layer.
● Analyze data, model and develop mapping logic and functions to fetch inbound Case Disposition Feedback loop (CDFL) data elements from Nextgen AIP to EAP platform
● Analyze legacy CMT tables and attributes mappings with new Nextgen AIP platform and develop transformation logic for legacy cases migration
● Design Data model and mappings for Behavior Analytics model scoring to generate customer risk score based on various behavior groups, typologies and red flags from AML Compliance
● Develop transformation logic for CitiKYC (Know Your Customer) generic feed specification and develop data models
● Analyze IMRs in the existing system and perform impact assessment for the MIP platform data pipeline
● Coordinate and provide walkthrough for project stakeholders and support till UAT sign off
● Define Data load sequence across all facts and dimension tables and establish dependencies
● Build standardized lookup tables in the staging area - jurisdiction codes, transaction product types, country, currency, geography risk score, watch list entities
● Design a standardized Data Model per Portfolio Data Architect recommendations
● Prepare ER diagrams, models and Data dictionary to represent overall data workflow
● Identify critical data elements for mandatory checks and recommend optimized DQ rules
● Propose various options to reduce processing time during ETL as part of Data Architecture
● Fine-tune queries for the DQ team to improve performance and disable redundant queries

Project 3: Citi AML Coverage Framework, Quarterly Reference Data, Waterfall Reports, ING Control, Behaviour Detection Control, Referral Volume Control and Reconciliation Framework Reports
Project Description: The scope of this project is to implement data ingestion checks and controls to ensure data feed integrity, perform data quality checks, standardize data in the AML standard data model, and provision datasets for feature calculations, behavior analytics model executions and case generation.
Responsibility:
● Instrumental in design and development of Data Model and mappings, reporting layer design for Coverage Framework Reports (Customer, Account, Behavior Group), Quarterly Reference Data and Waterfall reports (Transactions and Cases) from Mantas application (legacy) to Hive EAP
● Developed the design and model for the Ingestion Control report to verify that ingestion, ETL logic, and target data loads complete without data loss at the granularity of transaction product types
● Developed E2E design, data model and mapping logic for BDC report to reconcile Case Disposition Feedback loop from Case Management system with EAP generated cases
● Expertise in data model design for the RVC report to forecast case volume for every PA model run, with logic to calculate 4-run IQR, 12-run IQR, running average, delta percentage, etc.
● Instrumental in design and development of report template and reporting layer mapping logic for all the reports in Hive
● Configure the report delivery mechanism through an email distribution list and SharePoint upload

Project 4: Global Cards AML Transaction Monitoring Strategic Solution (FRB CO)
Project Description: This project involves enhancing monitoring of Citi's Global Consumer Bank (GCB) Cards businesses, Card Issuing (consisting of Branded, Co-Branded, Retail Services, and Small/Medium Business Cards) and Merchant Acquiring, by developing a globally consistent Anti-Money Laundering (AML) Transaction Monitoring (TM) future-state model for Credit Cards. A discipline in Stage 1 of the enhanced monitoring program is the development and global implementation of a segmentation model (rule-based system) for the Card Issuing business, a work stream owned by the Global AML Optimization team using Big Data Hadoop technologies (Cloudera) on Citi's Enterprise Analytical Platform (EAP). The future-state Cards AML Transaction Monitoring solution will include both alert generation and post-alert-generation activities and will add interactive Global Cards Data Quality dashboards for Segmentation, Transaction Monitoring, and a Strategic Alerts Dashboard.
Responsibility:
● Expert analysis skills on Citi's Anti Money Laundering (AML) Cards Transaction Monitoring data
● Strong expertise in developing Data Model for Cards Data Quality Mart
● Prepared Technical Design Document for AML Cards TM E2E Reconciliation, DQ and Alert Dashboard solution including - ETL design, DQ process design and MicroStrategy technical design
● Proven expertise in developing AML Cards Transaction Monitoring Solution Dashboard, Data Quality Dashboard and Tactical Strategic Alerts dashboard for Citi CMT using MicroStrategy Developer, MSTR Web, Datameer- Workbooks & Infographics, Platfora Lens & Vizboards and Analytics using Paxata
● Expert knowledge on data preparation and modeling techniques for Citi's Segmentation and Alert generation rules for Transactions, Payment and Wire activities for Global and regional clusters
● Proficient in Data Modeling and report development on AML Compliance Cards Data for regional and cross border rules
● Established MicroStrategy connectivity with Hive using ODBC connector
● Performed analysis on MicroStrategy 10.9 Library, Dossier, Collaborate and Table of Contents
● Proven development skills using MicroStrategy Developer and Web. Design and architect Schema objects, Warehouse Catalog design, interactive Dashboard style documents with various charts and widgets
● Expertise in designing MicroStrategy Intelligent cubes based Report Services Documents and interactive Dashboards using VI editor. Developed freeform SQL Reports
● Proficiency in SSO-LDAP integration for MicroStrategy developer and web
● Proven experience working in onsite - offshore model leading as Reporting layer Architect for Business Intelligence Analytics using various reporting tools in Hadoop Big Data platform
● Strong development experience in the Hadoop ecosystem: MapReduce, HDFS, Hive, Sqoop, Datameer, Platfora, Paxata, Greenplum, Spotfire, Jasper, Zeppelin and MicroStrategy
Environment: Cloudera (CDH 5.7/5.9), Datameer 5.11/6, Hive 1.1, Platfora 5.3, Paxata 2.14, MicroStrategy Developer 10.5, MicroStrategy Architect 10.5, Hue, Zeppelin, Impala, Spark, Oozie, Autosys, Unix, Talend, Collaborate, SharePoint, Shell scripting, Visio 2016, Microsoft Project 2010
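The custom SCD Type 2 designs recurring in these projects follow one standard pattern: when a tracked attribute changes, close the current dimension row and open a new version. A sketch of that logic in Python (the projects implemented it in Hive; field names here are illustrative):

```python
def scd2_upsert(dim, key, attrs, load_date):
    """SCD Type 2: when tracked attributes change, expire the current
    version (set its end_date) and append a new open-ended version."""
    current = next(
        (row for row in dim if row["key"] == key and row["end_date"] is None),
        None,
    )
    if current is not None and current["attrs"] == attrs:
        return dim  # no change: keep the open version as-is
    if current is not None:
        current["end_date"] = load_date  # close the superseded version
    dim.append({"key": key, "attrs": attrs,
                "start_date": load_date, "end_date": None})
    return dim
```

A Hive implementation typically achieves the same effect with an INSERT OVERWRITE over the union of unchanged, expired, and newly opened rows.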
- Data Architect - Big Data at Mitchell Martin Inc
- BI Data Analyst - Big Data at Horizon Soft Solutions Inc
- at UPS
- SAP SD Business System Validation (BSV) Analyst at Navayuga Infotech LLC
3 years, 7 months at this Job
Technology Stack: SAP, JD Edwards, AWS Cloud, Erwin, Python-Scikit Learn, Tableau, Hadoop, NoSQL, SQL Server, Oracle
• Data Strategy & Roadmap: Responsible for assessing the current state of data platform & architecture, defining the target state and the transition to the target state to meet the business needs.
• Data Lake Implementation: Lead a team of developers to implement a Data Lake using AWS cloud services to manage business data and develop advanced analytics. Key AWS services used include S3, EC2, EMR, Glue, Athena, QuickSight, and Redshift.
• Data Ingestion pipeline: Design and architect the integration patterns to ingest data from multiple ERP (SAP, JD Edwards) systems and non-ERP systems into ODS to facilitate transactional reporting.
• Streamlined Reporting process and advanced analytics solution: Design and architect data flows based on various use cases to streamline reporting requirements and develop advanced analytics for the business.
• Data Model Implementation: Lead a team of data modelers to develop and maintain a model mart using ERWIN for all the data elements in the ODS. As part of the analytics team, lead efforts on model driven architecture, data catalog, governance, and quality regarding complex analytics and business intelligence projects. Define standards and procedures for data modeling, database design, reference data management, and ETL requirements and design.
• Data Science POCs: Lead a team of data science developers, responsible for conceptualizing, implementing, and testing statistical models for preventive device maintenance using logistic regression.
• Team Building & Management: Responsible for hiring and training new talent. Manage a team of developers and contractors with a focus on development and training. Responsible for performance evaluations, resource utilization, and one-on-one discussions. Accountable for developing talent for the benefit of the organization by understanding individual development needs and facilitating the appropriate development resources.
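The preventive-maintenance POC above rests on logistic regression. A self-contained sketch of the core model, using plain-Python gradient descent in place of the scikit-learn pipeline such a team would normally use (the feature and labels below are invented for illustration):

```python
import math

def train_logistic(X, y, lr=0.5, epochs=500):
    """Fit logistic regression with stochastic gradient descent.
    X: rows of numeric features; y: 0/1 labels (1 = device failed)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted failure probability
            err = p - yi                    # gradient of the log loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Classify: 1 if predicted failure probability >= 0.5."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0
```

In the actual POC the same fit would be one call to scikit-learn's LogisticRegression; the value of the model is the calibrated failure probability, which lets maintenance be scheduled before the predicted failure.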
- Data Architect at Analytics, Employer
- Data Architect, Employer at SSTech, Client
- Data Architect, Employer at SSTech, Client
- Data Architect, Employer at Capital One
1 year, 11 months at this Job
- MS-Information Technology - Information Technology
Developed the business plan, marketing plan, marketing strategies, and operational procedures for Strategic Trader. Performed data analysis and market research for various instruments. As part of my technical improvement, attended numerous training sessions on Data Modeling and Data Architecture.
Mar 2015 - Jun 2016, USPTO, Alexandria, VA
Performed database model reviews based on enhancements of existing models. Reverse engineered existing models to add new entities and relationships. Supported the development team with implementation issues. Worked with developers to ensure requirements and objectives were met. Performed analysis and gathered user requirements for new projects. Developed logical and physical models for new projects. Developed several star schemas for the USPTO reporting group. Performed data model conversion from IBM Data Architect to ER Studio.
- IBM Data Architect at Business Intelligence Tech, Inc
- at Business Intelligence Tech, Inc
- Data Architect/ Data Analyst/Data Modeler and Business Process Analyst at Department of the Treasury
- Data Architect and Business Process Analyst/ Project Lead at AARP
4 months at this Job
- Masters - Information Systems
- Bachelor - Electrical and Electronics Engineering
Promoted to Enterprise Data Architect, with a focus on bigger, higher-visibility projects. Planning and executing plans for extensive BI growth also became a focal point, along with mentoring a Data Architect and providing some delegation and supervision.
- Established as the lead Enterprise Data Architect for the biggest IT project underway at Boyd Gaming.
- Built a successful Azure SQL and Power BI POC: on-prem data was refreshed nightly into an Azure SQL Database, and users could analyze the data via Power BI.
- Delivered an Azure SQL Data Warehouse POC that was successful and praised by managers and executives.
• Built pipelines in Azure Data Factory with the Data Management Gateway to transfer on-prem transactional flat files into Azure Storage. PolyBase was used to read the transferred data from Azure Blob Storage into Azure SQL Data Warehouse tables.
• Successfully built an ETL solution enabling a new visualization tool to consume data from properties running different systems under one data schema.
- Enterprise Data Architect at Boyd Gaming
- Consultant at
- Data Architect at Boyd Gaming
- SSRS Developer / PHP/MySQL Developer at R5 Ensino Profissional, Dracena
3 years, 2 months at this Job