Technologies used: Power BI, SQL Server, T-SQL, Oracle, AWS Cloud services including Redshift, RDS/EC2, S3, Machine Learning, Microsoft Azure, SSRS, SSAS, DbVisualizer, QueryBuilder, SQL, Visual Basic, FileMaker Pro, Python
Tasks/Achievements:
• Worked in various capacities including Team Lead, Data / Business Analyst, developer, UAT, and production support
• Ability to coordinate and work with multiple teams, architects, managers and business users
• Extracted, interpreted and analyzed data to identify key metrics and transform raw data into meaningful, actionable information
• Clean and manipulate complex healthcare datasets in order to create the data foundation for further analytics and the development of key insights (MSSQL server, R, Excel)
• Reporting and dashboarding analytical results to clients using Power BI and Excel PivotTables
• Prototype functionality in proof of concepts (POCs) on emerging technologies like Cloud Computing, Big Data and IoT to help customers validate use cases
• Developed and migrated life cycle solutions across various engagements to production in private and public cloud environments
• Led a team of developers (onsite/offshore) on an IoT/Big Data platform (YANTRA) built using AWS services
• Strong dimensional data modeling skills
• Used various available features of Selenium to develop efficient and error-free automated test scripts.
• Conduct ad hoc analysis for special projects for the client and various internal groups.
• Collecting, validating, and organizing core data for use in metrics and process improvement.
• Mentoring other team members.
SQL Responsibilities:
• Involved in creating multiple database objects such as stored procedures, user defined functions, views, and indexes
• Extracted data from multiple sources using SSIS
• Utilized star schema to design the data mart
• Involved in creating complex measures, KPIs, aggregation by DAX in SSAS cubes as per business requirements
• Extensively worked on SSAS cube performance optimization
Projects:
DentaQuest: Analyst on an enterprise data warehouse for healthcare and insurance data in a private and public cloud deployment architecture.
Circor International Inc: Built a master dataset of Customers, Products, and Suppliers across the enterprise in a private and public cloud deployment architecture on AWS in 2012, during the infancy of cloud computing. The BI system greatly helped them evaluate on-time delivery and supplier performance, develop forecast models, and assess overall corporate performance.
Retail CloudHouse Enterprise Solutions: Designed, modeled, and crafted an enterprise solution for the multi-unit retail and quick food service industry. The leading-edge AWS cloud solution kept performance high and costs low using Amazon Redshift.
NewBalance: Enterprise data warehouse design, development, and support in a private and public cloud deployment architecture.
Staples: AWS cloud-based predictive analytics solution to streamline business development initiatives for the corporate business unit.
PCS-CIM (Pipette Calibration Services - Customer Information Manager): Helped conduct JAD sessions with customers and key project team members. Worked on the analysis, design, and testing of a digital receipt for completed work orders and provided options to integrate work orders with QuickBooks to generate invoices. Helped drastically reduce manual operations and centralize data across teams while maintaining reliability and ease of access. Played a key role in communicating regularly with users, technical teams, and senior management to collect requirements and describe software product features and product strategy.
Quadgraphics: Responsible for researching historical cost information when submitting lane bid requests to customers for potential business.
The Bid Quoting tool systematically researches historical actual cost information, compares these costs to Q/G's target cost baselines, generates a baseline quote, and retains this information in a centrally established and easily accessible location.
Miscellaneous: Developed and introduced many automation tools for reporting and analyzing heterogeneous datasets. Developed a tool for creating XML files for negotiated lane contracts in MercuryGate.
Other Technologies used: Oracle, OBIEE, SQL Server, SSIS, SSRS, SSAS, Visual Basic, Microsoft Project, RUP (Rational Unified Process), UML (Unified Modeling Language), SQL, QlikView, Microsoft Access, ASP.NET, T-SQL, JavaScript, Java, HTML, CSS, Selenium WebDriver, Cucumber, Eclipse IDE
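The bid-quoting logic described above can be sketched as follows. This is a minimal illustration, not the actual Quadgraphics tool: the lane data, the flat margin, and all names are hypothetical.

```python
from statistics import mean

# Hypothetical historical actual costs per shipping lane (origin, destination).
HISTORICAL_COSTS = {
    ("Chicago", "Dallas"): [1450.0, 1390.0, 1510.0],
    ("Chicago", "Atlanta"): [980.0, 1020.0],
}

# Hypothetical target cost baselines for the same lanes.
TARGET_BASELINES = {
    ("Chicago", "Dallas"): 1400.0,
    ("Chicago", "Atlanta"): 1000.0,
}

def baseline_quote(lane, margin=0.10):
    """Compare the historical actual cost to the target baseline and
    quote from whichever is higher, plus a margin."""
    actual = mean(HISTORICAL_COSTS[lane])
    target = TARGET_BASELINES[lane]
    cost_basis = max(actual, target)
    return {
        "lane": lane,
        "historical_avg": round(actual, 2),
        "target_baseline": target,
        "quote": round(cost_basis * (1 + margin), 2),
    }
```

In practice the historical costs would be researched from a central database rather than an in-memory dictionary, as the tool description states.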
- Cloud Engineer / Data Architect / SQL Architect
- Developer / Business Systems Analyst at Sigma Communications
- Business Systems Analyst / Software Developer at Subaru Isuzu Automotives
12 years, 1 month at this Job
• Responsible for the overall design, updates and support of the Alliance Enterprise Data Warehouse
• Responsible for ensuring the appropriate use of identifiable and de-identified patient data, and ensuring compliance with privacy and HIPAA requirements
• Created DW entities for state GEF (Global Eligibility File), IRIS (Incident reporting), EDI models (837,834,999), census data, and others
• Responsible for EDW covering 4 catchment counties of IDD/Mental Healthcare data
• Architect for the MicroStrategy layer of the EDW (MSTR Developer/Architect)
• Other EDW responsibilities include SSIS, SQL jobs/scheduling, supporting Analysts/Report Writers
• Current projects include Tailored Plan initiatives and NDC drug modeling
• Optimized DW stored procedures and assisted Analysts/Scientists with reporting stored procedure creation and logic.
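A common pattern behind stored-procedure optimizations like those above is replacing row-by-row cursor loops with indexed, set-based aggregation. A minimal sqlite3 sketch of that pattern; the table, columns, and data are hypothetical, not the Alliance schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical claims table standing in for a DW fact table.
cur.execute("CREATE TABLE claim (claim_id INTEGER PRIMARY KEY, member_id TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO claim (member_id, amount) VALUES (?, ?)",
    [(f"M{i % 100:03d}", float(i)) for i in range(1000)],
)

# Covering index so per-member aggregation avoids a full table scan.
cur.execute("CREATE INDEX ix_claim_member ON claim (member_id, amount)")

# One set-based aggregation instead of a row-by-row cursor loop.
cur.execute(
    "SELECT member_id, SUM(amount) FROM claim GROUP BY member_id ORDER BY member_id"
)
totals = dict(cur.fetchall())
```

The same idea carries over to SQL Server, where the optimization would be expressed inside the stored procedure with an appropriate covering index.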
- Data Architect at Alliance Behavioral Healthcare
- BI Data Architect at TMW Systems
- Lead SQL Developer at TMW Systems
- Datawarehouse Developer at Sensus
11 months at this Job
- B.S. in Business Management - Management Information Systems
Project: North American Exploration & Production - Bakersfield, CA
Assigned Role: Data Architect
Project & Assignment Description: Produced the Data Mapping document for the ForeSite Well and Component Assembly application. ForeSite comprises three domain subject areas: the Component Assembly Bill of Material, the Well, and Subsurface Jobs. In the first domain, a Manufacturing Catalog Item can have multiple Components, each of which can have multiple Assembly Components. In the Well domain, a Well Origin tracks the initial characteristics of a Well, and subsequent events for a Well are captured in Well Origin Notes. When a Well becomes enabled to pump, the event is captured as a Wellbore Completion. In the third domain, an Authorization for Expense initiates the Sub-Surface Job beneath land or water. Any event or status change is captured in the SubSurfaceJobStatusChangeLog.
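The three ForeSite domains described above can be sketched as a small object model. This is a hypothetical illustration of the entity relationships only; the field names are assumptions, not the actual mapping document:

```python
from dataclasses import dataclass, field
from typing import List

# Component Assembly Bill of Material domain: a Catalog Item has
# Components, each of which has Assembly Components.
@dataclass
class AssemblyComponent:
    part_number: str

@dataclass
class Component:
    name: str
    assembly_components: List[AssemblyComponent] = field(default_factory=list)

@dataclass
class CatalogItem:
    item_id: str
    components: List[Component] = field(default_factory=list)

# Well domain: a Well Origin tracks initial characteristics; later
# events are Well Origin Notes; readiness to pump is a Wellbore Completion.
@dataclass
class WellOrigin:
    well_id: str
    notes: List[str] = field(default_factory=list)
    completed: bool = False  # set True when a Wellbore Completion is recorded

# Sub-Surface Job domain: an Authorization for Expense initiates the job;
# every status change is appended to a change log.
@dataclass
class SubSurfaceJob:
    afe_id: str
    status_change_log: List[str] = field(default_factory=list)

    def change_status(self, status: str) -> None:
        self.status_change_log.append(status)
```

The one-to-many nesting mirrors the "can have multiple" relationships the mapping document describes in each domain.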
- Data Architect at Chevron
- Data Architect at BBVA Bank
- Solution Architecture Practitioner at BBVA Bank
- Business Analyst at Liberty Mutual
8 months at this Job
- B.A. - Classics
- M.A. - Psychology
• Responsible for technical data governance, enterprise-wide data modeling, and database design; developed multi-dimensional data models to support BI solutions as well as common industry data from external systems.
• Working with business partners and team members, gathered and analyzed requirements, translating these into database designs supporting transactional system data integration, reports, spreadsheets, and dashboards.
• Involved in planning, defining, and designing databases using Erwin based on business requirements, and provided documentation.
• Responsible for Big data initiatives and engagement including analysis, brainstorming, POC, and architecture.
• Implemented Kafka high-level consumers to get data from Kafka partitions and move it into HDFS
• Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data
• Created Complex SSAS cubes with multiple Fact and Measure groups, and multiple dimension hierarchies based on the OLAP reporting needs.
• Applied advanced information management and new data processing techniques to extract the value locked up in this data using Hadoop (HDFS), processing large data sets in parallel across a Hadoop cluster with the Hadoop MapReduce framework.
• Created an SSAS tabular semantic model in DirectQuery mode with multiple partitions, KPIs, hierarchies, and calculated measures using DAX per business requirements.
• Designed facts and dimension tables and defined relationship between facts and dimensions with Star Schema and Snowflake Schema in SSAS.
• Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using MS Azure.
• Researched and developed hosting solutions using Azure and other 3rd party hosting and software as a service solution.
• Used Spark Streaming to receive real-time data from Kafka and store the streamed data in HDFS using Scala, and in NoSQL databases such as HBase and Cassandra.
• Used SQL on the newer AWS databases, Redshift and Relational Database Service (RDS), and worked with various RDBMS including Oracle 11g, SQL Server, DB2 UDB, Teradata 14.1, and Netezza.
• Created SQL tables with referential integrity and developed SQL queries using SQL Server and Toad
• Worked with Azure Machine Learning, Azure Event Hubs, Azure Stream Analytics, and PivotTables, handling multi-table data sets of up to 140 million records in SQL (MS SQL Server, SAS PROC SQL, etc.)
• Created Tabular Data Models and implemented Power BI for a POC in a SharePoint environment
• Partnered directly with the Data Architect, clients, ETL developers, other technical data warehouse team members and database administrators to design and develop high performing databases and maintain consistent data element definitions
• Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
• Created logical and physical data models using Erwin and reviewed these models with business team and data architecture team.
• Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
• Created SQL scripts to find data quality issues and to identify keys, data anomalies, and data validation issues.
• Responsible for full data loads from production to the AWS Redshift staging environment; responsible for creating Hive tables, loading data, and writing Hive queries.
• Designed different type of STAR schemas for detailed data marts and plan data marts in the OLAP environment.
• Produced and enforced data standards and maintain a repository of data architecture artifacts and procedures.
• Provided architectures, patterns, tooling choices, and standards for master data and hierarchy life cycle management.
Environment: Erwin 9.6, Informatica v10, Power Pivot, SQL, Microsoft Azure, MS Excel, MS Visio, Rational Rose, SSAS, Pig, Hive, CSV files, Hadoop, MongoDB, HBase, Kafka, Sqoop, AWS S3, AWS EMR, AWS Redshift, Python, XML files, Linux, AWK, Aginity, Teradata SQL Assistant, Oracle 12c.
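Data-quality SQL scripts like those described above typically check candidate-key uniqueness and required-column completeness. A minimal sqlite3 sketch of both checks; the staging table and its contents are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical staging table with deliberate quality problems.
cur.execute("CREATE TABLE stg_customer (customer_id TEXT, email TEXT)")
cur.executemany("INSERT INTO stg_customer VALUES (?, ?)", [
    ("C1", "a@x.com"),
    ("C1", "a@x.com"),   # duplicate natural key
    ("C2", None),        # missing required attribute
    ("C3", "c@x.com"),
])

# Candidate-key check: natural keys that occur more than once.
cur.execute("""SELECT customer_id, COUNT(*) AS n
               FROM stg_customer
               GROUP BY customer_id
               HAVING COUNT(*) > 1""")
duplicate_keys = cur.fetchall()

# Completeness check: rows with NULL in a required column.
cur.execute("SELECT COUNT(*) FROM stg_customer WHERE email IS NULL")
null_emails = cur.fetchone()[0]
```

The same GROUP BY/HAVING and IS NULL patterns apply unchanged on SQL Server, Redshift, or Hive.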
- Sr. Data Architect at Data Modeler
- Sr. Data Architect at Data Modeler
- Sr. Data Modeler/Data Analyst at Data Modeler
- Data Analyst at Data Modeler
2 years, 2 months at this Job
Sr. Data Architect, Modeler.
- Sr. Data Architect, Modeler at Kansas City Southern Railways
- Informatica Data Director for supporting management of MDM data at Multi Load
- Sr. Data Analyst, Modeler at JPMorgan Chase
- Data Analyst, Modeler at Team Health
1 year, 2 months at this Job
• Assisted in standardization of Master Data connection points and naming principles
• Point data architect for 5 year strategic planning initiatives (multi-continent)
• Worked with business process owners to design workshops aimed at re-engineering mission critical activities
• Served as an advocate to promote better data management tendencies and habits
- Data Architect at Exxon Mobil
- Client Services Senior Associate at
- Personal Banker (February-July) at Wells Fargo
5 months at this Job
- Bachelor of Business Administration in Management Information Systems - Management Information Systems
Brown Brothers Harriman, New Jersey.
• As a Technical Data Architect, responsible for all data related aspects in BBH engagement
• Work with technology and business stakeholders to understand problems / pain areas, assess the current state gaps, visualize target state solution
• Define high level solution and recommendations for future state architecture
• Gather tactical and strategic requirements for data integration, data warehouse / data lake, data preparation, and data visualization
• Develop conceptual & logical data models for data platform
• Define and document data transformation rules
• Assist other team leads / team members from data perspective
• Lead a team of hands-on data engineers working on the latest data technologies, Cloud, Big Data, etc.
• Work with Enterprise Data team and other central teams to align with organization's standards around data modeling and data governance
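Documented data transformation rules like those mentioned above are often kept as a small declarative source-to-target mapping that the pipeline applies uniformly. A hypothetical sketch; the field names and transforms are illustrative assumptions, not the BBH rules:

```python
# Each rule maps a source field to a target field via a transform.
# All names here are hypothetical illustrations.
TRANSFORMATION_RULES = [
    {"source": "cust_nm",  "target": "customer_name", "transform": str.strip},
    {"source": "cntry_cd", "target": "country_code",  "transform": str.upper},
    {"source": "bal_amt",  "target": "balance",       "transform": float},
]

def apply_rules(source_row: dict) -> dict:
    """Apply each documented rule to produce a target-model row."""
    return {
        rule["target"]: rule["transform"](source_row[rule["source"]])
        for rule in TRANSFORMATION_RULES
    }
```

Keeping the rules as data rather than code means the documented mapping and the executed mapping stay in sync.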
- Sr Data Architect at Enterprise Data Strategy
- Sr Data Architect at Brown Brothers
- Project Manager / DW Architect at Corporate Solutions
- Project Manager / DW Architect at CapitalOne
1 year, 5 months at this Job
- Bachelor of Technologies - Computer Science
Develop, validate, publish, and maintain logical and physical data models required to migrate existing independent data sources into the Data Warehouse.
Translate data models to database structures optimized for analytical reporting and business analysis requirements.
Teradata, Hadoop/Hive, SQL, ERWin, TOAD, SVN, MS Windows, Unix/Linux.
Key Projects:
Indirect Material Management - GM's Indirect Material Management Team spent weeks each month compiling analytic reports to manage material orders, set re-order points, and plan material dispositions. As the Data Architect on an improvement project, we automated the process using Data Warehouse resources, delivering an enhanced set of reports generated automatically overnight at month-end. Between labor savings and better indirect material management, the project saved tens of millions of dollars per year.
Maven Analytic Reporting - Maven is a short-term vehicle rental service pioneered by GM. Maven's third-party transactional systems did not include analytical reporting functionality. I designed the database structures for ingesting Maven data into the Data Warehouse. Using this platform, optimized for analytical reporting, Data Scientists began analyzing trends, volumes, and strategic decisions to enhance Maven's profitability.
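The automated month-end reporting described above boils down to scheduled aggregations plus exception flags. A minimal sqlite3 sketch of that shape; the order table, the 2017 months, and the re-order point of 30 are all hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical indirect-material order history.
cur.execute("""CREATE TABLE material_order (
    material_id TEXT, order_month TEXT, qty INTEGER)""")
cur.executemany("INSERT INTO material_order VALUES (?, ?, ?)", [
    ("GLOVES", "2017-05", 120),
    ("GLOVES", "2017-06", 180),
    ("OIL",    "2017-05", 40),
    ("OIL",    "2017-06", 20),
])

# Month-end report: total usage per material.
cur.execute("""SELECT material_id, SUM(qty) AS total_qty
               FROM material_order
               GROUP BY material_id
               ORDER BY material_id""")
report = cur.fetchall()

# Exception flag: materials whose latest monthly usage fell
# below a hypothetical re-order point of 30.
cur.execute("""SELECT material_id FROM material_order
               WHERE order_month = '2017-06' AND qty < 30""")
reorder_flags = [r[0] for r in cur.fetchall()]
```

In the actual project these queries ran against the Teradata warehouse on an overnight month-end schedule rather than in-memory SQLite.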
- Data Architect at GENERAL MOTORS
- Business Analyst at Farm Credit Bank of Texas
- Sr Analyst Programmer at EMERSON PROCESS MANAGEMENT
- Oracle Developer at HIGHLINE DATA
2 years, 9 months at this Job
- Bachelor of Arts in Computer Information Systems - Computer Information Systems
- Associate in Business Management - Business Management
Data architect and team lead overseeing the Quantitative Finance Data Team; analyzed incoming needs to build out requirements and solutions, then delegated them to the team for completion.
• Manage the master set of ETLs to extract data from Bank of America Enterprise authorized data sources to build the Current Position as well as Actuals.
• Work with forecast analysts to design and build automated controls to ensure the accuracy and overall completeness of data used in the forecasts.
• Experience in designing and developing automated ETL systems that improve daily and monthly processes, reducing unnecessary manual intervention and improving overall quality and performance.
• Broad knowledge of consumer and commercial financial products including deposit accounts, credit cards, investment securities, loans, and debt issuance.
• Actively work with analysts to explain and understand changes to the month-over-month balance sheet position.
• Extensive experience designing, developing, and deploying data processing solutions using SQL Server, SQL Server Data Tools, and Visual Studio.
• Experience in both the waterfall and Agile methodology, with added experience as the scrum master.
• Extensive experience in adapting to meet the needs of a rapidly changing environment.
• Excellent at multitasking; able to efficiently plan and prioritize projects.
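Automated accuracy and completeness controls like those described above typically reconcile record counts and balance totals between a source and its load target. A hypothetical sketch of that control, not the Bank of America implementation:

```python
def reconcile(source_rows, target_rows, amount_key="balance"):
    """Control check: the target must match the source in both row
    count and total balance; returns pass/fail results per control."""
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        # Small tolerance absorbs floating-point rounding in sums.
        "total_match": abs(src_total - tgt_total) < 0.01,
        "source_total": src_total,
        "target_total": tgt_total,
    }
```

Running such a check after each ETL load is what lets a failure surface automatically instead of through manual month-over-month inspection.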
- Lead Data Architect at Bank of America
- Senior Business Intelligence Engineer at Evicore Healthcare
- SQL Server Developer at Nielsen
4 years, 5 months at this Job
- BS - Computer Science
Developed the business plan, marketing plan, marketing strategies, and operational procedures for Strategic Trader. Performed data analysis and market research for various instruments. As part of my technical development, attended numerous training sessions on Data Modeling and Data Architecture.
Mar 2015 - Jun 2016, USPTO, Alexandria, VA: Performed database model reviews based on enhancements to existing models. Reverse engineered existing models to add new entities and relationships. Supported the development team with implementation issues. Worked with developers to ensure requirements and objectives were met. Performed analysis and gathered user requirements for new projects. Developed logical and physical models for new projects. Developed several star schemas for the USPTO reporting group. Performed data model conversion from IBM Data Architect to ER Studio.
- IBM Data Architect at Business Intelligence Tech, Inc
- at Business Intelligence Tech, Inc
- Data Architect/ Data Analyst/Data Modeler and Business Process Analyst at Department of the Treasury
- Data Architect and Business Process Analyst/ Project Lead at AARP
4 months at this Job
- Masters - Information Systems
- Bachelor - Electrical and Electronics Engineering