- Engineer, Data Warehouse Engineer at Linden Lab
- Conch and Wheel DBA at Freelancing
- Senior Technologist at Clear Ink
- Technology Consultant at Exponent Partners
8 years, 3 months at this Job
SQL BI Responsibilities:
• Support a large global provider of integrated technology and services for the legal profession, including electronic discovery, bankruptcy, class action and mass tort administration, federal regulatory actions, and data breach responses.
• Provide innovative solutions designed to streamline the administration of litigation, investigations, financial transactions, and other legal matters.
• Analyze KPIs based on end-user requirements and develop dashboards using Power BI.
• Extract business data from different OLTP systems and load it into the data warehouse using SQL Server Integration Services (SSIS).
• Implement row-level security in Power BI to restrict data access for various users.
• Publish Power BI reports and dashboards to the Power BI server and schedule dataset refreshes to keep data live.
• Develop custom calculated columns and measures using the DAX formula language in Power BI to satisfy business needs.
• Develop SQL queries using stored procedures, common table expressions (CTEs), and temporary tables to support Power BI reports.
• Work closely with Client Services, Data Services and Document Control teams to understand project requirements and advise on best practices for project setup.
• Create maintenance plans, stored procedures, and functions using Transact-SQL to increase performance and simplify data manipulation.
• Create and implement Microsoft Office Suite custom applications that give attorneys access to data, and automate applications to assist with the creation of document production sets and privilege logs.
• Provide application support and troubleshooting assistance to attorneys.
• Use SQL DML, DDL, and DCL statements to set up, create, maintain, and manage SQL database back-end functions daily.
• Integrate data from multiple internal sources into Azure SQL Server with Azure Data Factory and Azure Automation.
• Use Azure SQL databases, data lakes, SSIS, and SSAS tabular models to capture features used to train risk-attribute models that score and identify high-risk deals.
• Use TFS (Team Foundation Server) and Visual Studio Team Services (VSTS) to share work and code and to track exceptions raised during development.
• Create documentation to help the team transfer project duties to one another effectively.
• Troubleshoot technical issues with internal tools such as image loading, OCR, document production, production-history tracking, and image tracking.
• Identify and implement new procedures to simplify and improve processes. Environment: MS SQL Server Management Studio 2014, SSIS, SSRS, T-SQL, Power BI Desktop, Power BI Dashboard, DAX Functions, Azure Data Factory (ADF), TFS, Visual Studio Data Tools 2015, PL/SQL, Crystal Reports, .NET, C#, Visual Basic, HTML.
- Sr. Data Warehouse Engineer at Epiq Systems
- Sr. Data Warehouse Engineer at Garden City Group
- Sr. Data Warehouse Engineer at Citi Group
- SQL Developer at Gsquire Tech
1 year, 2 months at this Job
- Master of Computer Applications - Computer Applications
- Bachelors in Computer Science - Computer Science
• As part of refactoring projects requested by business users, designed and created data marts and trained GAIG ETL/BI resources on best practices.
• Determined operational objectives by studying business functions, gathering information from different teams, and evaluating requirements and formats.
• Worked with the BI Director, leads, and analysts to review and assess business requirements relating to the data load process, and proposed solutions to enterprise architects.
• Identified a new ETL design pattern for the claims analytics team to remove EDW deficiencies in processing claims data into the enterprise claims repository used by the Corporate Claims and strategic comp teams; identified SLA breaches within the Claims Data Mart (CDM) and the Enterprise Data Warehouse (EDW) PSAR (Policy Search and Retrieval) projects.
• Delivered consulting expertise to design MarkLogic data ingestion workflows that extract and load data on a daily basis.
• Supported a Data Governance and Risk Compliance platform built on MarkLogic.
• Optimized processes, programs, and functions to improve data quality, data security, and data consistency.
• Consulted and coordinated with various teams across the organization to consolidate BI reporting needs and ensure underlying changes were reflected effectively in the reporting layer.
- Sr. Data Warehouse Engineer Consultant at Great American Insurance
- ETL Lead consultant at Care Source
- ETL Developer II at OSF Healthcare System
- ETL Developer consultant at CVS Caremark
2 years, 4 months at this Job
Created an enterprise-class data warehouse from scratch, including orchestration, ETL, and data quality processes. Responsible for setting coding and processing standards on both the data warehouse and OLTP sides. Mentored and guided BI developers on data warehouse best practices.
• One View of the Rocket: this data mart provides a daily manufacturing state of the rocket, modeled as a hierarchical tree of parts that includes an As-Built part and an As-Planned section. The data mart included financial metrics, issues, and a number of labor-related metrics for each built part. The financial piece alone saved the company one person-month per flight. This data is also used in the Issue Maintenance section of the ERP system. Responsible for design, development, test, and maintenance of this project.
• Production Data Mart: provides metrics on the Work Order, Operation and Labor Ticket levels (which are aggregated on top of each other). This data mart eliminates the need for the many departments to calculate commonly used metrics on all three levels, providing a 'Single Version of the Truth' that is used throughout the entire enterprise. Responsible for design, development, test, deployment and maintenance of this Data Mart.
• Tooling Data Mart: for the Request Part for Tooling business process, this data mart provides insight into which phase any requested part is in. This provided valuable insight for both the Planning and Purchasing departments. Responsible for design, development, test, and deployment of this data mart; assisted with designing an SSAS cube based on it.
• Other duties:
• Responsible for Data modeling (using Erwin, Embarcadero and Visio) on the Data Warehouse.
• Performed database modeling, code, and process reviews for both OLTP and OLAP.
• Monitoring the state of the Data Warehouse and its processes.
• Data Warehouse Processes:
• Fast Transform & Load: mentored the development of, and enhanced, an orchestration tool that executes stored procedures, SSIS packages, or SQL Server jobs based on a set of dependencies. This eliminated the need to do all orchestration in a difficult-to-maintain SSIS package. By being able to fine-tune dependencies between the population of fact tables and the population of the dimension tables a fact table depends on, the total time of the ETL process was reduced from 6 hours to 2 hours.
• Data Sync: maintained and enhanced a tool that copies data from any SQL Server database to the ODS database. Instead of writing an SSIS package for every single table to be copied, each table is configured in a set of database tables. This eliminated the need to write and maintain an SSIS package for this purpose, as column additions or removals are detected automatically.
• Quantitative Data Quality: created a process that counts the number of records inserted per day in any table in the source system and compares this with the corresponding dimension or fact table in the data warehouse and/or replicated database. Any discrepancy in these numbers quickly identifies where an error occurred (Data Sync, ETL, SQL Server Replication/Always On). This tool severely reduced the time spent troubleshooting problems with the data feed to the data warehouse.
• Qualitative Data Quality: maintained a process that runs a set of queries against predetermined result sets (e.g., retrieve all employees without a department set). This tool provides business users with a measure of data sanity.
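A quantitative check like the one described above can be sketched in a few lines. This is an illustrative analogue only, assuming hypothetical table names and a SQLite connection in place of the actual SQL Server source and warehouse:

```python
# Illustrative sketch of a quantitative data-quality check: compare daily
# row counts between a source table and its warehouse counterpart.
# Table names are hypothetical; a real check would query the OLTP source
# and the warehouse over separate connections.
import sqlite3

def daily_counts(conn, table):
    """Return {insert_date: row_count} for the given table."""
    rows = conn.execute(
        f"SELECT insert_date, COUNT(*) FROM {table} GROUP BY insert_date"
    )
    return dict(rows)

def find_discrepancies(source_counts, warehouse_counts):
    """Return dates where the warehouse count differs from the source."""
    return {
        day: (n_src, warehouse_counts.get(day, 0))
        for day, n_src in source_counts.items()
        if warehouse_counts.get(day, 0) != n_src
    }

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src_orders (insert_date TEXT, id INTEGER)")
    conn.execute("CREATE TABLE dw_fact_orders (insert_date TEXT, id INTEGER)")
    conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                     [("2024-01-01", 1), ("2024-01-01", 2), ("2024-01-02", 3)])
    conn.executemany("INSERT INTO dw_fact_orders VALUES (?, ?)",
                     [("2024-01-01", 1), ("2024-01-01", 2)])  # 01-02 missing
    src = daily_counts(conn, "src_orders")
    dw = daily_counts(conn, "dw_fact_orders")
    print(find_discrepancies(src, dw))  # → {'2024-01-02': (1, 0)}
```

A mismatch on a given date points directly at the stage (sync, ETL, or replication) that dropped or duplicated rows for that day.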
- Data Warehouse Engineer at SPACE EXPLORATION (SPACEX)
- Scrum Master - Database Engineer at CREATIVE CHANNEL SERVICES
- Lead Data Warehouse Engineer - Data Warehouse at MYSPACE
- Sr. Data Engineer - Data Warehouse at ADCONION MEDIA GROUP
7 years, 8 months at this Job
- Bachelor of Science - Computer Science
Managed the Data Warehouse cloud migration from a local Postgres 9.6 database and Talend to Azure SQL Server (2019).
◦ Created Azure objects such as servers, SQL databases, Data Factory, pipelines, Databricks, and connections to migrate ETL for Salesforce, AWS, a legacy ERP, and NetSuite to the Azure back end.
◦ Migrated Data Warehouse ETL written in Perl/SQL scripts to Talend 6.2 (2017).
◦ Wrote integrations in Talend pulling from 1) the Barracuda OLTP Postgres database, 2) Salesforce, 3) AWS, 4) a remote MySQL billing DB, and 5) Pardot, as well as other data sources.
◦ Configured and implemented Salesforce integration using DBSync.
◦ Engineered Salesforce integration using Talend and Salesforce API objects for objects not available through DBSync.
◦ Created custom data sources for Tableau reporting for Sales and Finance.
◦ Created and maintained data pulls from AWS Marketplace and AWS Customer Support using Jenkins.
◦ Created and maintained a serial-lineage data cube that enabled financial reporting such as churn metrics.
◦ Created a data quality suite of tests for the Data Warehouse.
- Data Warehouse Engineer at Barracuda Networks
- Data Warehouse Engineer at YouSendIt
- Senior Software Developer at KEMA
- Web Developer at Silicon Image
4 years, 10 months at this Job
- BA - History
Responsible for development and support of system functionality and implemented solutions. Contributing member of the Information Architecture team, chartered to understand detailed data definitions as well as the strategic direction of the environment, ensuring development solutions align with the corporate technology roadmap.
• Directly support and troubleshoot a high-volume production enterprise environment, including failures in scheduled jobs, identifying and remediating errors, and coordinating deployment of fixes for production issues.
• Support the Information Architecture - ETL team, including review and revision of deployment plans, coordination of deployments with the DevOps team, management of releases from QA to Production environments, and serving as point of contact for business users.
• Compile and collaborate on Standard Operating Procedures for support within the team, documenting solutions to recurring requests and common issues around production environment support.
- Data Warehouse Engineer I at BCD Travel
- Business Intelligence Developer at CAREERBUILDER
- Executive Administration/Leadership at VARIOUS
2 years, 2 months at this Job
- B.A. - Sociology
As a Sr. Data Warehouse Developer, I developed data marts to provide key metrics, insights, and reporting infrastructure for finance, marketing, and product development.
• Gathered business requirements from the finance, marketing, and product groups. Identified data items and data sources, and performed data discovery on source systems that included MySQL, Oracle, and flat-file systems (web server logs).
• Designed a logical data model that enabled users to better understand site traffic, marketing performance, and user engagement. Designed and implemented the physical model on Oracle 11g to maximize query performance and maintain scalability.
• Designed ETL process flows that integrated data from multiple sources. Created custom scripts to collect, organize, and maintain data from multiple data sources. Coded ETL processes using SQL and scripting languages. Resolved data issues.
• Developed BI analytics and reporting using SAS BI Tools and Cognos (v10) for the Marketing and Finance teams. Worked with business users to translate complex transactional data into valuable analytical information. Performed ad hoc analyses.
• Maintained and enhanced existing in-house reporting applications. Tuned queries and applications to improve performance.
- Sr. Data Warehouse Engineer, Data Warehouse and BI at An Internet Ecommerce Company
- Sr. Database Engineer, Database, Data Warehouse and BI at OVI.COM, Nokia
- Data Warehouse and BI Engineer at Yahoo! Inc
- Sr. Software Engineer, Field Engineer at OPENWAVE SYSTEMS INC
11 months at this Job
- Master of Science - Information Systems
- Bachelor of Science - Information Systems and Actuarial Science
Summary: Built data preparation macros for the Alteryx Toolbar, as well as automated
workflow application architecture, to improve accuracy and save months of development.
o Core technology used: Alteryx, SQL, Batch Scripting.
o Created macro tools for the Alteryx toolbar that resolve the most common, repetitive and time-consuming data tasks applicable to every workflow.
o Used sophisticated Alteryx architecture to engineer macro tools with simple interfaces needing no
coding or syntax.
o Built the following Alteryx toolbar tools, which ensure data quality and save valuable time:
o Browse Change - compares any 2 files or data streams, displaying all data differences.
o Time Machine - performs myriad Date & DateTime operations, eliminating the Formula Tool.
o Validate Row Count - quickly alerts users to the most common data preparation errors.
o Validate Data - evaluates and flags data based on numerics, population, uniqueness, or custom criteria.
o And many others.
o Created an "Automated Data Lifecycle Framework": a set of reusable ETL templates and parameterized architecture to automate the full analytic data lifecycle, resulting in:
o Tens of thousands in savings vs. Alteryx Server or API licenses.
o Business users empowered with analytic applications, visualizations and reports.
o Rapid availability of new, standardized, 100% reliable data.
o Dramatic time savings in development and maintenance.
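A comparison tool like Browse Change above can be approximated in a few lines. The actual tool is an Alteryx macro; this is only a minimal Python analogue of the idea, with made-up sample rows:

```python
# Illustrative sketch of a Browse Change-style comparison: given two
# row streams, report rows present in only one of them. The real tool
# is an Alteryx macro; this is a minimal Python analogue.

def diff_streams(left, right):
    """Return (only_in_left, only_in_right) for two row iterables."""
    left_set, right_set = set(left), set(right)
    return sorted(left_set - right_set), sorted(right_set - left_set)

before = [("a", 1), ("b", 2), ("c", 3)]
after = [("a", 1), ("b", 5), ("c", 3)]

only_before, only_after = diff_streams(before, after)
print(only_before)  # → [('b', 2)]
print(only_after)   # → [('b', 5)]
```

Surfacing only the differing rows is what makes validating a workflow change fast: identical streams produce empty output.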
- Lead Data Engineer at Uunu Data
- Senior Data Engineer at Motorcar Parts of America
- Senior Product Engineer & Solutions Architect at Alteryx Inc
- Data Development Director at iConstituent, LLC
1 year, 8 months at this Job
- Business Intelligence Data Warehouse Certificate
Data warehouse engineer supporting quantitative analysts, with an emphasis on data quality. Effectively led a small team in North Carolina supporting a business unit in Boston in a hybrid development/QA role.
➢ Performed expert-level PL/SQL programming to implement and validate data warehouse solutions in support of quantitative analysis.
➢ Developed Informatica ETL workflows and verified results using SQL in Oracle databases.
➢ Member of small delivery team using Agile methodology. Responsible for all phases of software development, including design, development and testing.
- Software Engineer at Fidelity Investments
- Software Engineer at IBM/Seterus
- Test Analyst at Dex Media
- QA Analyst at Northrop Grumman IT
3 years, 4 months at this Job
- Bachelor of Science - Computer Science
Having been instrumental in migrating the CLIMATE data warehouse from the London to the New York datacenter, was given ownership of the platform and tasked with its transformation. This includes a paradigm shift in data ingestion using Apache Kafka data streaming and Hadoop-based file stores for both input and output of the data topics (many other groups within Morgan Stanley already use Kafka; our group will implement and engineer a vendor-supported version). Applications subscribed to CLIMATE will flow user "click stream" data via Kafka or other queuing mechanisms; this will be combined with reference data sources to track the client experience, with the data stored and delivered for both the ISB and WM groups for use in predictive analytics and regulatory audit compliance. It is highly likely this data will be made available through a new reporting portal driven by a columnar RDBMS backend such as DB2 BLU or Greenplum, with front-end dashboards (e.g., Tableau). Currently, the legacy platform still uses Informatica workflows and ksh/Perl scripts for ETL processing; these will be maintained and updated as the newer technologies are engineered and integrated.
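The enrichment step described above, joining click-stream events with reference data before storage, can be sketched as follows. This is illustrative only: the field names are hypothetical, and in production the events would be consumed from a Kafka topic rather than an in-memory list:

```python
# Illustrative sketch: enrich click-stream events with client reference
# data before loading them for analytics. Field names are hypothetical;
# a real pipeline would consume events from a Kafka topic.

def enrich(events, reference):
    """Join each click event with reference data by client_id."""
    enriched = []
    for event in events:
        ref = reference.get(event["client_id"], {})
        enriched.append({**event, **ref})
    return enriched

clicks = [
    {"client_id": "c1", "page": "/portfolio", "ts": "2020-01-01T10:00:00"},
    {"client_id": "c2", "page": "/research", "ts": "2020-01-01T10:00:05"},
]
reference = {"c1": {"segment": "ISB"}, "c2": {"segment": "WM"}}

for row in enrich(clicks, reference):
    print(row["client_id"], row["segment"], row["page"])
# → c1 ISB /portfolio
# → c2 WM /research
```

The enriched records carry both the raw client action and its business context, which is what makes downstream predictive analytics and audit reporting possible from a single stream.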
- Data Warehouse Manager at Morgan Stanley
- Project Technical Lead at Morgan Stanley
- DB Engineer at Morgan Stanley
- Application Support Analyst at Morgan Stanley
4 years, 8 months at this Job
- Bachelor of Business Administration - Business Administration