- Sr. Data Modeler/Data Analyst at Werner Enterprises
7 months at this Job
- Master's - Business Management
Data Modeler Responsibilities:
• Participated in JAD sessions with business users, sponsors, and subject matter experts to understand the business requirements document.
• Translated business requirements into detailed, production-level technical specifications and new features, and created conceptual models.
• Performed data profiling and identified gaps through gap analysis.
• Created Logical and Physical Data Models for central model consolidation.
• Worked with DBAs to create a best fit physical data model from the logical data model.
• Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
• Involved in Logical and Physical Designs and transformed logical models into physical implementations.
• Created 3NF business-area data models with de-normalized physical implementations for data and information requirements analysis using the Erwin tool.
• Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
• Involved in extensive Data Analysis on Teradata and Oracle systems, querying and writing SQL in Toad.
• Used the ETL tool Informatica to populate the database and migrate data from the old database to the new one on Oracle and SQL Server.
• Involved in the creation and maintenance of the Data Warehouse and repositories containing metadata.
• Developed Star and Snowflake schema-based dimensional models to build the data warehouse.
• Involved in the critical design review of the finalized database model.
• Involved in studying the business logic and understanding the physical system and the terms and conditions for the database.
• Used Reverse Engineering to connect to existing database and developed process methodology for the Reverse Engineering phase of the project.
• Generated Data Models for the Information Governance Catalog (IGC).
• Created documentation and test cases, worked with users for new module enhancements and testing.
• Designed and developed Oracle database tables, views, and indexes with proper privileges, and maintained the database by purging old data.
• Performed Unit Testing and tuned for better performance.
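The forward-engineering bullet above (generating DDL that fits the logical model) can be sketched minimally; the `shipment` entity, its attributes, and the helper function below are hypothetical illustrations, not artifacts of any actual project.

```python
# Hypothetical sketch: forward-engineering a CREATE TABLE statement from a
# logical entity definition. Entity and attribute names are invented.

def generate_ddl(entity, attributes, primary_key):
    """Emit a CREATE TABLE statement from a logical entity definition."""
    cols = [f"    {name} {dtype}" for name, dtype in attributes.items()]
    cols.append(f"    PRIMARY KEY ({primary_key})")
    return f"CREATE TABLE {entity} (\n" + ",\n".join(cols) + "\n);"

ddl = generate_ddl(
    "shipment",
    {"shipment_id": "INTEGER", "origin": "VARCHAR(50)", "ship_date": "DATE"},
    "shipment_id",
)
print(ddl)
```

Real modeling tools such as Erwin do this mapping with full type translation and constraint handling; the sketch only shows the shape of the transformation.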
- Data Modeler at Blue Shield of California
- Data Modeler/ Analyst at Harman International
- Data Modeler/Analyst at CVS Health
- Data Modeler/ Data Analyst at ITP Software Solutions
2 years, 3 months at this Job
Data Modeler Responsibilities:
• Performed DB changes in specific subject areas such as pricing, deals, contracts, material management etc., in collaboration with SMEs and architects.
• Developed source to target ETL mapping specs based on data structures obtained in various formats such as XML/XSD/WSDL/JSON files and SAP proprietary document formats.
• Extracted and validated metadata from SAP GUI such as relationships, business rules, key constraints, descriptions, sample values, etc., to design future-state enterprise logical model.
• Developed customized modeling tool macros to semi-automate modeling activities such as metadata imports and exports, model consistency checks etc.
• Built data lineage between enterprise logical data model and the dimensional models.
• Involved in ontology development to address inconsistencies in semantics among business domains in a globally distributed database environment.
• Identified KPIs and metrics across data domains to be migrated to data lake to support business reporting and analytics.
• Designed future-state data models to be implemented on Amazon EMR (Hive/S3 on AWS) for applicable subject areas, starting with a Minimum Viable Product (MVP) to foster development.
• Built current state mappings based on data catalog from Source system (SAP) to staging to Data Vault to EDW.
• Incorporated ETL rules required to cleanse, harmonize and transform data such as current row logic from raw layer to serving layer in the data lake.
• Enriched existing dimensional models to synchronize upstream SAP application and downstream reporting environments based out of AWS through aggregated fact tables.
• Incorporated best design practices such as sort keys, distribution styles, and partitioning to improve query execution plans. Environment: SAP, SQL Server 2016, Amazon Redshift, Aginity Workbench for Redshift, Amazon EMR, AWS Athena, QlikView
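As a hedged illustration of the sort/distribution practice named in the last bullet, a Redshift-style fact table might be declared as below; the `sales_fact` table and its columns are invented for illustration, not taken from the actual environment.

```python
# Hypothetical Redshift-style DDL showing distribution and sort keys, which
# drive join co-location and block pruning in query execution plans.
ddl = """
CREATE TABLE sales_fact (
    sale_id     BIGINT,
    customer_id INTEGER,
    sale_date   DATE,
    amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)   -- co-locate rows joined on customer_id
SORTKEY (sale_date);    -- prune blocks on date-range predicates
""".strip()
print(ddl)
```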
- Data Modeler at Hewlett-Packard Enterprise
- at CIGNA
- Sparx Enterprise Architect at American Family Insurance
- Data Analyst/Modeler at GAP inc
1 year, 9 months at this Job
• Experienced and skilled in data modeling through the Conceptual, Logical and Physical modeling stages.
• Created domains and identified Entities, attributes, and cardinalities.
• Incorporated proper model designs fit for the database application, taking into consideration the concept of Sub-types.
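The sub-type concept mentioned above is commonly physicalized as a super-type table plus sub-type tables sharing its key; the sketch below uses an invented party/person/organization example, not tables from the actual engagement.

```python
# Hypothetical sub-type/super-type pattern: sub-type tables reuse the
# super-type's primary key as both their PK and a FK back to it.
ddl = """
CREATE TABLE party (
    party_id   INTEGER PRIMARY KEY,
    party_type CHAR(1) NOT NULL      -- 'P' person, 'O' organization
);
CREATE TABLE person (
    party_id   INTEGER PRIMARY KEY REFERENCES party (party_id),
    birth_date DATE
);
CREATE TABLE organization (
    party_id   INTEGER PRIMARY KEY REFERENCES party (party_id),
    tax_id     VARCHAR(20)
);
""".strip()
print(ddl)
```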
- Data Modeler at M & T Bank
- Data Modeler/ Reporting Analyst at Cigna Healthcare
- at Inovalon Healthcare Empowered
3 years, 2 months at this Job
- Doctor of Medicine - Medicine
- - Sciences
- - Pre-medicine
- School Certificate
I worked at Comcast Corporation, an American telecommunications conglomerate headquartered in Philadelphia, Pennsylvania.
Data Modeler/Data Analyst Responsibilities:
❖ As a Data Modeler/Data Analyst, was responsible for all data-related aspects of the project.
❖ Worked on the Software Development Life Cycle (SDLC) with good working knowledge of testing and Agile methodology.
❖ Extensively used Erwin for data modeling; created Staging and Target Models for the Enterprise Data Warehouse.
❖ Configured Hadoop ecosystems to read transaction data from HDFS and Hive.
❖ Worked on data loads using Azure Data Factory with the external table approach.
❖ Developed a data warehouse using Azure SQL Data Warehouse for casting & budgeting data in the cloud.
❖ Designed Data Lake, Master Data, Security, Data Hub, and data mart layers.
❖ Involved in creating Pipelines and Datasets to load the data into the data warehouse.
❖ Performed reverse engineering of the current application using Erwin and developed Logical and Physical data models for Central Model consolidation.
❖ Part of the team conducting logical data analysis and data modeling JAD sessions; communicated data-related standards.
❖ Created E/R Diagrams and Data Flow Diagrams, grouped and created tables, and validated the data for lookup tables.
❖ Created a template SSIS package that replicates about 200 processes to load data using Azure SQL.
❖ Developed Semantic models in Azure Analysis Services to encapsulate all necessary user data to be easily queried in a drag-and-drop experience.
❖ Designed both 3NF data models for ODS/OLTP systems and dimensional data models using Star and Snowflake Schemas.
❖ Involved in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.
❖ Created PL/SQL procedures to aid business functionalities.
❖ Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL-like access to Hadoop data.
❖ Used Erwin for reverse engineering to connect to the existing database and ODS.
❖ Designed and developed Oracle PL/SQL and shell scripts, data import/export, data conversions, and data cleansing.
❖ Created, altered, and managed Databases, Tables, Views, Indexes, and Constraints with business rules.
❖ Assisted in oversight of compliance with Enterprise Data Standards, data governance, and data quality.
❖ Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
❖ Designed OLTP and OLAP system environments and maintained documentation of metadata.
❖ Prepared reports to summarize daily data quality status and work activities.
❖ Performed ad-hoc analyses as needed.
Environment: Agile, Erwin 9.7, Hadoop 3.0, Azure, SSIS, OLTP, 3NF, PL/SQL, SQL, SSRS, OLAP
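The bullet above about SQL-like access to HDFS data usually means a Hive external table over an existing directory; the sketch below is illustrative only, with an invented `web_events` table and path.

```python
# Hypothetical Hive external-table DDL: the table is a schema laid over files
# already in HDFS, so no data is moved or copied.
hive_ddl = """
CREATE EXTERNAL TABLE IF NOT EXISTS web_events (
    event_id   BIGINT,
    user_id    STRING,
    event_time TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'
STORED AS TEXTFILE
LOCATION '/data/raw/web_events';  -- existing HDFS directory
""".strip()
print(hive_ddl)
```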
- Data Modeler/Data Analyst at Comcast
- at J. B. Hunt
- Data Analyst/Data Modeler at
- Data Analyst at Prime Soft Solutions
1 year, 1 month at this Job
• Created Logical/Physical Models and performed forward engineering using ERwin for Shell Na Kika Project to perform a database build based on Microsoft SQL server for PowerBI and Cloud integration
• Performed reverse engineering of physical data models for Shell's Surveillance database using ERwin to generate relational model schemas and generate DDL
• Coordinated with the DB2 team on the database build and table normalization and de-normalization
• Conducted brainstorming sessions with application developers and DBAs to discuss various de-normalization, partitioning, and indexing schemes for the Physical Model
• Involved in reviewing business requirements and analyzing data sources from Excel/SQL Server for design, development, testing, and production rollover of reporting and analysis projects
• Maintained the company's EBIS international market & competitive intelligence products; responsible for data wrangling, validation, digitization, and mapping
• Worked toward the growth of EBIS products in a fast-paced, startup-oriented environment
• Spearheaded efforts to digitize Oil & Gas (O&G) assets for input into company's EBIS database
- Data Modeler at SETLD Inc
- Data Modeler / Data Analyst at Drillinginfo, Inc
- Reservoir Engineering Analyst at Drilling Info
- Production/Completion Engineering Intern at Saudi Aramco
1 year at this Job
- Bachelor of Science in Petroleum Engineering - Petroleum Engineering
• Heavily involved in a Data Modeler/Analyst role reviewing business requirements and composing source-to-target data mapping documents.
• Designed the business requirement collection approach based on the project scope and SDLC methodology.
• Designed and deployed scalable, highly available, and fault tolerant systems on Azure.
• Conducted data modeling JAD sessions and communicated data-related standards.
• Extensively used Agile Method for daily scrum to discuss the project related information.
• Worked on Teradata SQL queries, Teradata Indexes, and load utilities such as MultiLoad, TPump, FastLoad, and FastExport.
• Created DataStage jobs (ETL processes) to populate the data warehouse continuously from different source systems.
• Performed reverse engineering using Erwin to redefine entities, attributes, and relationships in the existing database.
• Developed complex mapping to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, and Applications.
• Involved in writing T-SQL working on SSIS, SSAS, Data Cleansing, Data Scrubbing and Data Migration.
• Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
• Worked on the Enterprise Metadata repositories for updating the metadata and involved in Master Data Management [MDM].
• Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions, and measured facts.
• Wrote Python scripts to parse XML documents and load the data in database.
• Wrote DDL and DML statements for creating and altering tables and converting characters into numeric values.
• Identified the Entities, attributes and designed a relational database system (RDBMS).
• Well versed in system analysis, ER/Dimensional Modeling, Database design and implementing RDBMS specific features.
• Translated business concepts into XML vocabularies by designing XML Schemas with UML.
• Extensively worked on enterprise data warehouse development by building data marts, staging, and pre-staging areas.
• Used reverse engineering for a wide variety of RDBMS, including MS Access, Oracle and Teradata to connect to existing database and create graphical representation using Erwin.
• Worked on data loads using Azure Data Factory with the external table approach.
• Designed and developed Oracle PL/SQL and performed data import/export, data conversions, and data cleansing.
• Implemented end-to-end systems for Data Analytics, Data Automation and integrated with custom visualization tools using MongoDB.
• Worked on reporting requirements and was involved in generating reports for the Data Model using Crystal Reports.
• Developed several detailed reports for trend analysis over time using SQL Server Reporting Services (SSRS).
• Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents.
• Extensively used MS Access to pull data from various databases and integrate it.
• Wrote and optimized T-SQL queries in SQL Server and Teradata.
• Wrote and executed customized SQL code for ad-hoc reporting duties and used other tools for routine report generation. Environment: Erwin 9.8, Oracle 12c, SQL, PL/SQL, Agile, Teradata r15, ETL, SSIS, SSAS, MDM, XML, Azure, MongoDB 4.0, T-SQL.
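The Python-scripts-to-parse-XML bullet above can be sketched as follows; the XML layout, table, and column names are invented, and sqlite3 stands in for the production database.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical sketch: parse an XML document and load its rows into a
# database table. Document structure and table are illustrative only.
XML = """
<orders>
  <order id="1"><customer>Acme</customer><total>250.00</total></order>
  <order id="2"><customer>Globex</customer><total>99.50</total></order>
</orders>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")

root = ET.fromstring(XML)
rows = [
    (int(o.get("id")), o.findtext("customer"), float(o.findtext("total")))
    for o in root.findall("order")
]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 2
```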
- Data Modeler/Data Analyst at Thermo
- Data Modeler/Data Analyst at T-Mobile - WA
- at Medapati Technologies
1 year at this Job
Project: Summit Materials is a leading construction company growing rapidly by acquiring smaller companies; as part of that growth, worked on building a central repository called the Summit Data Platform (SDP), a multifaceted integration system for Summit Materials business data. As a Sr. Data Modeler in the firm, was responsible for designing a Data Lake, Master Data, and data marts for applications such as POS, Catavolt, Production Volumes, Invoicing, and Downtime. Gathered data from multiple POS systems such as JWS Apex, Libra Systems, and Command Series, and integrated the various lines' POS data into a data mart, which also helped build and enhance the performance of reports such as Daily Sales, Customer Order Performance, Material Sales, and Order Management to provide a strong decision support system at each level of management. Responsibilities:
➢ Responsible for data warehousing, data modeling, data governance, data architecture standards, methodologies, guidelines and techniques
➢ Partnered with various business stakeholders and technology leaders, gathered requirements, and converted them into scalable technical and system requirement documents
➢ Designed rule engine to handle complicated data conversion requirements when syncing data among multiple POS systems and the centralized ERP system
➢ Designed Data Lake, Master Data, Security, Data Hub, and data warehouse/data mart layers.
➢ Created logical and physical models according to the requirements and physicalized them.
➢ Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
➢ Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines and designs.
➢ Performed Reverse engineering of the source systems using Oracle Data modeler.
➢ Involved in capturing Data Lineage, Table and Column Data Definitions, Valid Values and others necessary information in the data models.
➢ Identified Facts & Dimensions Tables and established the Grain of Fact for Dimensional Models.
➢ Generated the DDL for the target data model and attached it to the Jira ticket for deployment in different environments.
➢ Tuned DB queries/processes and improved performance
➢ Reverse engineered Crystal Reports (Command Performance) and SSRS reports to identify logic/business rules for the Driver's Performance Metrics, Customer Order Performance, Order Management, and Daily Sales reports. Created a data mart based on multiple POS systems for Power BI dashboards/reports.
➢ Worked on data loads using Azure Data Factory with the external table approach.
➢ Involved in creating Pipelines and Datasets to load the data into the data warehouse.
➢ Worked closely with ETL SSIS developers to walk through the logic of complex transformations.
➢ Created ETL jobs and Custom Transfer Components to move data from transaction systems to a centralized area (Azure SQL Data Warehouse) to meet deadlines.
➢ Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task, and Send Mail Task.
➢ Developed SSIS packages to load data from various source systems to the Data Warehouse. Environment: Oracle Data Modeler, Visio, Microsoft Outlook, Adobe PDF, DQ Analyzer, SQL Server, Azure Data Factory, Power BI, Microsoft Teams, Microsoft Visual Studio.
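The SSIS transformations listed above follow a common pattern (Lookup against a reference table, Derived Column, Conditional Split on match failure); this plain-Python sketch only mimics that data flow, with invented plant and order data, and is not SSIS itself.

```python
# Hypothetical sketch of an SSIS-style flow: Lookup, Derived Column, and
# Conditional Split over in-memory rows. All data is illustrative.
plants = {"P01": "North Plant", "P02": "South Plant"}  # lookup reference

source_rows = [
    {"plant_id": "P01", "qty": 120, "unit_price": 4.0},
    {"plant_id": "P99", "qty": 10, "unit_price": 2.5},  # no lookup match
]

matched, unmatched = [], []
for row in source_rows:
    name = plants.get(row["plant_id"])                  # Lookup
    row["line_total"] = row["qty"] * row["unit_price"]  # Derived Column
    if name is None:                                    # Conditional Split
        unmatched.append(row)                           # route to error output
    else:
        row["plant_name"] = name
        matched.append(row)

print(len(matched), len(unmatched))  # 1 1
```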
- Sr. Data Modeler at Summit Materials
- Sr. Data Modeler/Sr. Data Analyst at United Health Care
- Sr. Data Analyst/Data Modeler at Waddell & Reed Inc
- Data Analyst/Data Modeler at Texas Health Resources
1 year, 7 months at this Job
- Bachelor of Science in Information Technology - Information Technology
- Master of Science in Engineering Management - Engineering Management
Responsibilities:
* Heavily involved in a Data Modeler/Analyst role reviewing business requirements and composing source-to-target data mapping documents.
* Designed the business requirement collection approach based on the project scope and SDLC methodology.
* Configured Spark Streaming to receive real-time data from Apache Kafka and store the stream data to HDFS using Scala.
* Developed Sqoop scripts to handle the interaction between Hive and the Vertica database.
* Extracted data from multiple sources and the data lake; preprocessed, cleaned, sorted, transformed, imputed, and normalized it to prepare model populations using Python Pandas, NumPy, data frames, CSV files, SQL, etc.
* Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala.
* Prepared model data from the data warehouse by testing the data populations, created ad-hoc reports, and applied Linear and Logistic Regression, KNN, and K-Means Clustering machine learning algorithms as part of the data mining process.
* Worked on Teradata SQL queries, Teradata Indexes, and load utilities such as MultiLoad, TPump, FastLoad, and FastExport.
* Created DataStage jobs (ETL processes) to populate the data warehouse continuously from different source systems.
* Performed reverse engineering using Erwin to redefine entities, attributes, and relationships in the existing database.
* Evaluated models using confidence intervals, Confusion Matrix, ROC/AUC curves, AIC, p-values, RMSE, adjusted R2, K-fold cross-validation, and classification accuracy, and built graphs/charts using Python matplotlib.
* Involved in writing T-SQL and working on SSIS, SSAS, Data Cleansing, Data Scrubbing, and Data Migration.
* Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
* Worked on the Enterprise Metadata repositories, updating metadata, and was involved in Master Data Management (MDM).
* Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions, and measured facts.
* Wrote Python scripts to parse XML documents and load the data into the database.
* Wrote DDL and DML statements for creating and altering tables and converting characters into numeric values.
* Translated business concepts into XML vocabularies by designing XML Schemas with UML.
* Extensively worked on enterprise data warehouse development by building data marts, staging, and pre-staging areas.
* Worked on data loads using Azure Data Factory with the external table approach.
* Automated recurring reports using SQL and Python and visualized them on BI platforms like Tableau.
* Used advanced database queries to extract patients' ICU lab results from a PostgreSQL database, then transformed/formatted them using Python to feed into a machine learning program.
* Developed various QlikView data models by extracting and using data from various source files, Excel, and flat files.
* Designed and generated various dashboards and reports using Tableau visualizations.
* Downloaded data via API calls, stored it in MongoDB, parsed the JSON data, and loaded it into PostgreSQL.
* Designed and developed Oracle PL/SQL and performed data import/export, data conversions, and data cleansing.
* Implemented end-to-end systems for Data Analytics and Data Automation, integrated with custom visualization tools using MongoDB.
* Worked on reporting requirements and was involved in generating reports for the Data Model using Crystal Reports.
* Developed several detailed reports for trend analysis over time using SQL Server Reporting Services (SSRS).
* Resolved data type inconsistencies between the source systems and the target system using the Mapping Documents.
* Developed Python, Shell/Perl, and PowerShell scripts for automation and component unit testing using the Azure Emulator.
* Extensively used MS Access to pull data from various databases and integrate it.
* Wrote and optimized T-SQL queries in SQL Server and Teradata.
* Wrote and executed customized SQL code for ad-hoc reporting duties and used other tools for routine report generation.
Environment: Erwin 9.8, Oracle 12c, SQL, PL/SQL, Agile, Teradata r15, ETL, SSIS, SSAS, MDM, XML, Azure, Tableau 10.2, MongoDB 4.0, T-SQL.
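The model-evaluation bullet above (Confusion Matrix, classification accuracy) reduces to a small computation; the labels below are invented toy data, and plain Python stands in for the evaluation libraries actually used.

```python
# Hypothetical sketch: confusion-matrix counts and classification accuracy
# from binary predictions, computed with plain Python.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = (tp + tn) / len(y_true)
print(tp, tn, fp, fn, accuracy)  # 3 3 1 1 0.75
```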
- Sr. Data Modeler/Mapper at BCBS
- Data Modeler/Data Analyst at T-Mobile
- at Comcast
- Data Analyst/Data Modeler at Sutter Health
1 year, 10 months at this Job
Hired as a permanent Sr. Data Modeler following an initial consulting role, working with developers, DBAs, and management to maintain data in the company's OLTP databases. Implemented production data changes and developed data models for new and existing applications. Oversaw the integrity of production data through application of database standards, including database instance and object naming and established data structure options, and provided developer guidance in PL/SQL, SQL, shell scripting, and other technical assistance whenever needed. Available for on-call support. Key Contributions:
● Reengineered production release procedure so that data model changes were guaranteed to be implemented in dependency order and optimized for speed and accuracy through integration with the Serena SBM system.
● Optimized data model implementations through data-driven automation of production release steps, attaining a greater than 200% speed increase.
● Streamlined the database refresh of development and test environments from production by working with the DBA group and coordination with concerned user areas.
● Primary developer of the database side of phase I of the NPI data masking "purge" project, including design and implementation of all database structures and PL/SQL code.
● Designed and produced ERDs, provided on request to programming areas, individual developers, and all new developers.
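Guaranteeing that data model changes are implemented in dependency order, as in the release procedure above, is at heart a topological sort; the sketch below uses invented object names and Python's standard `graphlib`, not the Serena SBM integration itself.

```python
from graphlib import TopologicalSorter

# Hypothetical sketch: order data-model changes so every object is created
# after everything it depends on. Each key maps to the objects it requires.
deps = {
    "customer_fk_index": {"customer_table"},
    "order_table": {"customer_table"},
    "order_view": {"order_table", "customer_table"},
    "customer_table": set(),
}

release_order = list(TopologicalSorter(deps).static_order())
print(release_order)
```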
- Sr. Data Modeler at Essent Guaranty, Inc.
- Sr. Data Modeler at Essent Guaranty, Inc
- Data Admin - Contractor at Essent Guaranty, Inc.
- Banner Consultant - Contractor at Department of Information Technology
4 years at this Job
- Bachelor of Science in Computer Science - Software Engineering