Anthem Blue Cross is one of the nation's leading health benefits companies, offering a broad range of medical and specialty products, with nearly 40 million members and net income of $960 million.
• Developed and managed a Risk Adjustment Data Validation (RADV) database to track performance metrics for the annual RADV audit, increasing chart collection rates by 45%.
• Performed a claims data analysis of members selected for the RADV audit to identify the medical records most likely to validate each member's condition, increasing validation rates by 18%.
• Streamlined a Microsoft Excel VBA program that captures screenshots from the mainframe system for the RADV enrollment validation audit, achieving a 100% validation rate, a 15% increase over the previous year.
• Developed and managed a High Priority MIA HPO database for Medical Investigations Analysts (MIA) to reach out to members needing to see their providers, closing approximately 25,000 care gaps for members in this bucket.
• Created a Commercial and Medicaid Provider Incentive data file for Finance to send incentive checks to providers, saving the department approximately $1,000 each month in Information Technology resources.
• Performed ad hoc reporting for business partners to obtain the medical records providers needed to submit to CMS for transfer payments, helping the department increase revenue by 39%.
• Performed quality control audits on Commercial Risk Adjustment (CRA) Plan Presidents reports, identifying data errors and ensuring 100% report accuracy month over month.
- Data Warehouse Analyst at Anthem Blue Cross
- New Item/Speed to Shelf Analyst at Unified Grocers
- Consumer Product Strategy Analyst III at Bank of America
2 years, 7 months at this Job
- B.S. - Business Administration and Management
Assist with the transition from assorted regional ERP systems to one company-wide SAP ERP system. Document current workflows and mandatory fields. Validate invoices in the SAP Transportation Management module. Assist the Master Data Manager with an assessment of the current state of ACCO's master data landscape. Create Excel graphs that turn data into easy-to-understand pictures, answering questions and enabling quick decisions.
- Global Data Warehouse Analyst at SALEM GROUP assigned to ACCO Brands
- Reporting Services Developer at Data Specialists, Inc.
- Application Analyst at Foremost Farms USA
- PROGRAMMER ANALYST at Collaborative LLC
2 years, 2 months at this Job
- B.B.A. - Management Computer Systems
At Lockheed Martin Rotary and Mission Systems, we collaborate across functions in a multinational, multi-company environment. Actively assist in updating and developing F-35 Support Equipment mechanisms for monitoring program progress, intervening, and problem-solving with team leads, working with multifunctional departments such as project management and planning, the project team, and other integrated project teams.
Provide lifecycle patient support with the following responsibilities:
• Work with multifunctional departments, such as project management and planning, the project team, and other integrated project teams, to maintain support equipment data integrity.
• Develop resolutions to meet productivity goals and manage database projects, assisting internal and external clients.
• Develop new systems that assist in monitoring changes to the support equipment database, defining requirements for database operations and configuring the database for use by other members of the support equipment team.
• Work with team leads to coordinate across multifunctional departments and map all country-specific data to one common dataset used by all countries, corporations, and departments involved in support equipment logistics.
• Work with team leads to develop alternative methods that improve time management and administrative efficiency across all support equipment integration and monitoring mechanisms.
- Data Warehouse Analyst Associate at Rotary and Mission Systems - Lockheed Martin
- Certified Pharmacy Technician at CVS Pharmacy in Target
- Clinical Patient Service Coordinator at Cook Children's Home Health
- Barista at Starbucks
1 year, 10 months at this Job
- Bachelor of Arts - Interdisciplinary Studies
Data Team (Contractor)
• Led SME interview meetings to create a Data Requirement Document (DRD) for 15+ banking systems slated to be decommissioned or re-coded.
• Analyzed SMEs' responses to the DRD for each individual system, covering all inbound data elements, business name/definition, physical name, granularity, frequency, etc.
• The team collectively consolidated 30+ DRDs and created a unique superset of data elements.
- Senior Data Warehouse Analyst at Union Bank
- Senior Business Technology Consultant at Union Bank
- Business Intelligence Manager - Capital Markets at ReadyCap Commercial, LLC
- Manager/Director of Business Intelligence/EDW Team at Ten
4 months at this Job
- Computer Programming and Information Systems - Designing & Implementing a SQL Server
• Experience with T-SQL, SSRS, SSIS, SSAS, Tableau, SharePoint, and ASP.NET web forms.
• Worked primarily with SQL Server databases but also gained some experience with PostgreSQL databases.
• Promoted to BI Team Lead after successfully completing multiple projects and providing excellent customer service.
- BI TEAM LEAD / DATA WAREHOUSE ANALYST at Data Coordinating Center
- PROGRAM ANALYST at WREGIS
- DATA COORDINATOR / DATA ANALYST at Western Electricity Coordinating Council (WECC)
- SERVICE COORDINATOR / COMMERCIAL SOLUTIONS & RESPONSE MANAGER at Contact Me
1 year, 8 months at this Job
- Master's - Software Engineering and Database Technologies
- Bachelor's - English
- Bachelor's - Meteorology
- Data warehouse analyst at Honeywell
- Senior Analyst at Accenture
- Systems Engineer at Tata Consultancy Services Limited
- Senior Developer at TCS-Infra Lab /SBI/EIS
3 years, 2 months at this Job
- Diploma - Embedded & Robotics
- Bachelor Of Engineering - Mechatronics
- - Research and development
Member of the IT, MIS & Data Warehouse department of Santander Private Banking International (SPBI). Led offshore development of new modules for the informational and operational systems with US stakeholders based in the Miami office. Updated and optimized the ETL process to ensure data quality. Part of the team in charge of the SPBI data warehouse. Supported Database Administrator functions.
- Senior IT Data warehouse Analyst at Santander Bank Switzerland
- Development team leader at everis Spain
- Analyst programmer at everis Spain
4 years, 8 months at this Job
- Bachelor's - Computer Science Engineering
Employed by DoubleStar, Inc. as a Consultant to the Pennsylvania State System of Higher Education in Harrisburg, PA
• Project management
• BI strategy design and implementation
• Training plan design and implementation
• Documentation, briefs and elevator pitches
• Project Marketing and Communication
• Data Governance & Quality Assurance
• Customer Service software selection
• Intra-departmental Data Sharing agreements
- Project Manager and Data Warehouse Analyst at DoubleStar, Inc
- Site Manager and Business Intelligence Consultant at DoubleStar, Inc.
- Project Manager and Business Intelligence Analyst at DoubleStar, Inc.
- Data Warehouse Project Manager and Team Lead at DoubleStar, Inc
6 years, 2 months at this Job
- BBA - Computer Science/Accounting
- Certificate - Project Management
- Certificate - Human Resources
• Extracting personnel information from ERP/SAP for various applications including maintaining the metadata used to support Broward Schools’ Single Sign-On Application Launchpad, Identity Management Process and Web Reporting.
• Meeting with various clients to document user specifications and coordinate with technical department to determine best practice in meeting requirements and deliverables
• Responsible for maintaining SyncSort's MIMIX Share integrity during the system conversion to Power 9 iSeries
• Creating dashboards for various data analytics using Power BI
• Creation and maintenance of DB2 tables, associated keys and indexes, lookup tables, and various data views that make up the Broward Schools' Data Warehouse, used by various District applications including Virtual Counselor, Pinnacle and B.A.S.I.S. (Behavioral & Academic Support Information System)
• Work with Production Control to ensure program accountability via Source Check In/Out process.
• Communicate with the Service Desk to address customers' requests and Data Warehouse accessibility
• Maintenance of the District’s web reporting application (Hyperion)
• Responsible for synchronizing the Data Warehouse via Batch and ‘Real-Time’ processing
• Creating and/or updating data tables in Microsoft SQL Server for use in various web applications and AS400/iSeries environment
• Assuring reliable and consistent availability of data tables and various web components used in accessing data tables via ODBC connectivity
• Perform ETL processing via nightly cycle and vendor purchased ‘Real-Time’ replication application
• Responsible for archiving data on a yearly basis to maintain integrity, consistency, and faster query response times
• Work with the AS400/iSeries Systems Analyst to optimize system resources and coordinate installation of vendor software
• Collaborate with various members of the district to coordinate yearly ‘ROLLOVER’ process
• Trained various employees in overall Data Warehouse methodology
- Data Warehouse Analyst/Senior Programmer at School Board of Broward County
- Senior Computer Programmer Analyst at Florida Atlantic University
- Computer Programmer Analyst at Miami-Dade County Public School Board
14 years, 5 months at this Job
- Bachelor of Science degree - Computer Information Systems
While engaged at Guitar Center, CA from 2011 to Present:
• Analyze, develop and implement best practices in the data warehouse environment.
• Develop and evaluate data designs.
• Review and validate data loads into the data warehouse for accuracy.
• Review documentation and peer code, and conduct code walkthroughs.
• Gather requirements and design the data warehouse.
• Document and test all queries used to pull data for reporting purposes.
• Interact effectively with the user community to produce analysis results and analytics requirements.
• Ensure data integrity and timely output of daily, weekly and monthly scheduled jobs.
• Perform data replication, extraction, loading, cleansing and data modeling for data warehouses.
• Work on Netezza performance tuning, using optimization techniques to avoid overhead on source, target and intermediate components.
• Provide on-call support for the production system.
• Create shell scripts to simplify database tasks.
• Work with system supporters, developers and DBAs on automation.
• Implement best practices to ensure optimal performance.
• Design reusable, scalable and maintainable ELT templates.
• Assist users with acceptance testing and supervise system and unit tests.
• Address data-related problems involving systems integration, compatibility and multi-platform integration.
• Identify and develop opportunities for data reuse and migration.
• Environment: DataStage ETL, Oracle 10g/11g production databases on Linux 5 and 6, and data warehousing on IBM Netezza 7.2.05/126.96.36.199/6.0.

While engaged on the Netezza PureData Upgrade Project: As part of the PureData upgrade, we migrated the Netezza TwinFin server to a PureData server, setting up all environments for the new server and establishing the connection with DataStage.
• Prepare the DB migration checklist for go-live.
• Copy all data from TwinFin to the PureData server.
• Validate data between TwinFin and PureData.
• Validate all views in every environment between TwinFin and PureData.
• Migrate all jobs with issues found during the parallel load (refer to the ETL Jobs tab).
• Migrate the EDW_LOAD_STS_CNTRL table.
• Test jobs and connections from DataStage Dev (10.192.4.43) to TwinFin.
• Test job connections from Watson to DataStage Dev (10.192.4.43).
• Test jobs and connections from DataStage Prod (10.192.4.44) to the LP host (NTZ-PRX-001-LP).
• Test jobs and connections from the LP host (NTZ-PRX-001-LP) to DataStage Prod (10.192.4.44); in the odbc.ini file, change the server name from TwinFin to PureData (refer to the Configuration Steps sheet).
• Log in to DataStage Administrator and change the NzServer value, which currently points to TwinFin, to NTZ-PRX-001-LP (proxy), per the Configuration Steps sheet.
• Change the Nz-Usr value from petl to "-o ServerAliveInterval=30 petl".
• Analyze the requirements.
• Prepare EDD and mapping documents for all development/enhancement changes.
• Build and unit test, share the results with the GC customer for sign-off, and move the code to higher environments.

While engaged on the Riversand PIM Enterprise Project: As part of the GC.com-to-ATG go-live, GC.com promotions will be created in the ATG Promo Engine (PEN) rather than in the JBS Promo Wizard. The purpose of this project is to get GC.com site promotion master data from ATG; JBS will continue to send the M&A and WWBW promotions, while the MF, M123 and GC promotions will be loaded from ATG. All Vendor, Taxonomy and SKU data coming from Heiler will be decommissioned and will instead flow from Riversand through TIBCO to the EDW.
• Analyze the requirements/CRs from the analyst.
• Prepare EDD and mapping documents for all development/enhancement changes.
• Build and unit test, share the results with the GC customer for sign-off, and move the code to higher environments.
• Deliver the code and validate the checklist and sanity checks in the system testing environment.
• Fix defects raised during Dev, QA and UAT validation, and promote the code to the PRODUCTION environment.

While engaged on the Metric Standardization Project: This project was initiated to capture new metrics required by the business for reporting.
• Analyze the impact of requirements throughout the project, including downstream systems.
• Work on development and implement zone maps in the process.
• Review documentation and peer code, and conduct code walkthroughs.
• Prepare unit test cases and test the additional information in the flow.

While engaged on the Performance Tuning Project: To optimize and reduce the overall load retention on the server, performance tuning was performed on the ETL process for several reports and dashboards.
• Analyze the run-time statistics of the process.
• Optimize the SQL used in the process.
• Prepare unit test cases and test the additional information in the flow.
• Achieved an 80% improvement in run time and performance for the entire process.

While engaged on the Chessys Export Project: Built a new export process for the third party 'Chessys' to generate store-wise Sales Balance information for the last 8-day period.
• Gather requirements and prepare the design document for the new process.
• Develop the required DB objects and ETL jobs for generating the files.
• Prepare unit test cases and test all downstream systems.
• Develop UNIX shell scripts for transferring the files to the third-party server.
• Software and tools used: Netezza, DataStage 8.5, UNIX.

While engaged on the Time Dimension Project: The Time Dimension is used by all facts in Guitar Center Retail and other processes to capture and calculate metrics on a timely basis. This project was initiated to extend the Fiscal and Merchandise calendars by three years.
• Perform impact analysis of the calendar extension on the Time Dimension process and other processes.
• Manually map the Fiscal Calendar provided by the source system to the Merchandise Calendar.
• Load the extended calendar into the process.
• Prepare unit test cases and test all downstream systems.
• Software and tools used: Netezza, DataStage 8.5, UNIX.
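The odbc.ini server-name swap described in the PureData migration steps can be sketched as follows. This is a minimal sketch only: the DSN name EDW_NZ, the driver path, and the database name are assumptions for illustration; the source specifies only that the ServerName entry changes from the TwinFin host to the PureData proxy NTZ-PRX-001-LP.

```shell
# Hypothetical odbc.ini fragment after the migration. Before the cutover,
# ServerName pointed at the TwinFin host; afterwards it points at the
# PureData proxy (NTZ-PRX-001-LP). DSN name, driver path, and database
# name below are illustrative assumptions, not from the source.
cat > odbc.ini <<'EOF'
[EDW_NZ]
Driver     = /usr/local/nz/lib64/libnzodbc.so
ServerName = NTZ-PRX-001-LP
Database   = EDW
EOF

# Confirm the server name now points at the PureData proxy
grep ServerName odbc.ini
```

After a change like this, re-running the DataStage connection tests listed above would confirm the DSN resolves to the new server.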
- ETL Developer/Tester and Data Warehouse Analyst at Syntel Inc
8 years, 2 months at this Job