Work From Home Data Warehouse Developer.
• Using Informatica PowerCenter v10.0, with emphasis on ETL, Hadoop training, Big Data training, and Teradata Studio (all applications accessed from the Nutanix dashboard). Ability to work independently and manage one's time. Knowledge of the full software development lifecycle: from business/systems analysis, through requirements gathering and functional specification authoring, to development, testing, and delivery. Ability to troubleshoot issues and make system changes as needed to resolve them.
• Following Agile practices, including daily stand-ups and meetings with other team members.
• Proficient with Teradata Studio for data warehousing; SQL experience; experience in large enterprise data warehouse environments.
• Working in a healthcare environment.
• Modernization system development: standing up the Teradata data warehouse and Hadoop, and revamping Aster; sustainment of the Stabilization platform, including critical CR/TR fixes, until the transition to the Modernization platform; operations and infrastructure support across all environments (development, test, and production); DaVINCI development on the HSDW platform.
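Troubleshooting ETL issues of the kind described above often begins with a source-to-target reconciliation check. A minimal sketch in Python against SQLite (illustrative only: the role used Informatica and Teradata, and the table and column names here are invented):

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_column):
    """Compare row counts and distinct keys between a source and target table.

    Illustrative only: a real check would run against Teradata through its
    own client, not SQLite; table and column names are hypothetical.
    """
    cur = conn.cursor()
    counts = {}
    for table in (source_table, target_table):
        cur.execute(f"SELECT COUNT(*), COUNT(DISTINCT {key_column}) FROM {table}")
        counts[table] = cur.fetchone()
    src, tgt = counts[source_table], counts[target_table]
    return {
        "rows_match": src[0] == tgt[0],
        "keys_match": src[1] == tgt[1],
        "source": src,
        "target": tgt,
    }

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER);
    CREATE TABLE dw_orders  (order_id INTEGER);
    INSERT INTO stg_orders VALUES (1), (2), (3);
    INSERT INTO dw_orders  VALUES (1), (2);  -- one row missing in the target
""")
print(reconcile(conn, "stg_orders", "dw_orders", "order_id"))
```

A mismatch in either count is the usual starting point for tracing a failed or partial load.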
- Data Warehouse Developer - SSI at Super System Inc
9 months at this Job
- MS - Information Technology Project Management
- BS - Business
- A.A.S. - Business Data Processing
Description: The University of Chicago is one of the world's great intellectual communities. The university has several reporting systems: the Integrated Reporting Facility (IRF), the Research Application (ERA), the student system (SDW), and the donor system (Griffin), to name a few, each serving its own user community. Worked on the Business Information Services team, which belongs to ITS.
Environment: OWB (10g/11g R2), ODI 12c, Oracle Database (10g/11g/12c), Workday, Toad, Visio, Sybase PowerDesigner, CA-ESP, FOCUS code, MS SQL Server 2008/2005, Business Objects XI R4/XI R3/XI R2, Crystal Reports XI R2, Designer, CMC, Web Intelligence XI R4/XI R3, OBIEE 11g, Cloudera 5.4.7, Spark, RStudio.
• Received the "Excellence Award" from the University of Chicago ITS department in 2012 for outstanding performance across multiple skills.
• Attended Cloudera Training "Developer Training For Spark and Hadoop".
• Attended ODI training from Oracle.
• Attended OWB Developer Training.
• Attended in-house Crystal Reports and Universe Designer training from SAP.
• Student Information Analytics (SIA): worked with the implementers to install Oracle BI Applications and OBIEE for student information analytics. Good knowledge of the OBIA architecture and installation.
• OWB-to-ODI migration for the GEMS and AURA projects, including the Data Move, Data Load, and Data Copy modules. Applied the patch for OWB, ran the migration to import the OWB maps into ODI, and created the topology, locations, and data models in ODI for the AURA and GEMS subject-area data warehouses.
• Resolved migration errors for complex and critical maps in ODI.
• Created packages and scenarios in ODI, built load plans for Move/Load/Copy, and scheduled them.
• Lead integration developer for the Workday-IRF-SDW integration project, now completed.
• Lead ETL developer for the SDW and IRF-SDW integration projects.
• Lead ETL developer for the Student Data Warehouse registration rollout; lead architect designing and building the ETL flow for the graduation project, currently in progress.
• Extensive use of the OWB for the ETL development for the Student Data Warehouse.
• Extensive knowledge of the student lifecycle and how data flows from source to target systems.
• Extensively used various transformations in OWB maps, including pre-process and post-process transformations, Pivot and Unpivot, etc.
• Used Oracle analytic functions such as LISTAGG, FIRST_VALUE, and LAST_VALUE for complex and critical data transformations in OWB maps.
• Extensively analyzed the source data to understand the data flow.
• Designed ETL specifications in Visio to document clearly how data flows from source to target; these serve as the ETL specs for the developers.
• Worked on the Base Column mapping to understand the data flow from source to target.
• For IRF-SDW, analyzed the legacy code written 30 years ago in FOCUS/SAS, rewrote the ETL specs, and recoded the logic in OWB maps.
• Extensive use of stored procedures in the IRF-SDW integration project to pull data from IRF, send it to the SDW system, and then feed the BO pilot reporting system.
• Created tables, views, indexes, stored procedures, and functions in Oracle and in SQL Server 2008.
• Extensively validated tables after each new ETL development by writing SQL scripts.
• Strong debugging skills for resolving issues with the nightly loads.
• Worked closely with the senior data architect to resolve ETL and table design issues.
• Used PowerDesigner for data modeling and business process flow diagrams, with instructions on how to code them in ESP.
• Worked closely with the DBA, production shop (ESP team), student systems team, and web services team to achieve various goals.
• Involved in the Business Objects upgrade project from XIR2 SP2 to XI R3, migrating universes, users, reports and security groups for all systems.
• Trained power users and other business users on the upgraded versions of the query tools WebI and DeskI XI R2.
• Worked as a bridge to solve and rectify business issues /needs by interacting with the end users and providing prompt solutions.
• Experience in universe design and development for various reporting systems using BO Designer XIR2.
• Experience in implementing Joins, Contexts, Alias, Derived Tables, Aggregate Aware, Index Aware and other key designer functionalities to resolve data and performance issues.
• Experience in developing the ETL in SQL Server 2005 for Business Objects reporting.
• Tested the reports/universes in the new BO XIR3 environment for possible object mismatch errors and performance issues.
• Represented the Business Intelligence team for status and update meetings to transfer information between the development team and the end users.
• Enhanced the existing student universe by including new entities including Billing and Financial Aid information.
• Documented business specific reporting standards for universe, reports, design, and security.
• Created and documented business procedures with respect to Business Objects administration.
• Provided production support for the existing Business Objects systems.
• Experience in deployment of universe and reports across multiple universe and document domains.
• Attended the annual TDWI conferences in Chicago for deeper knowledge of data warehousing projects.
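The analytic-function work above (LISTAGG, FIRST_VALUE, LAST_VALUE) can be illustrated outside OWB. A small sketch using SQLite window functions from Python, with invented table and column names; Oracle's syntax differs in places (e.g. LISTAGG vs. SQLite's group_concat), so this is an analogue rather than the original maps:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE enrollment (student TEXT, term INTEGER, program TEXT);
    INSERT INTO enrollment VALUES
        ('alice', 1, 'BIO'), ('alice', 2, 'BIO'), ('alice', 3, 'CHEM'),
        ('bob',   1, 'CS');
""")
# FIRST_VALUE/LAST_VALUE attach each student's first and latest program to
# every row; LAST_VALUE needs an explicit full frame to see the final row.
rows = conn.execute("""
    SELECT DISTINCT student,
           FIRST_VALUE(program) OVER w AS first_program,
           LAST_VALUE(program)  OVER (PARTITION BY student ORDER BY term
               ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
               AS latest_program
    FROM enrollment
    WINDOW w AS (PARTITION BY student ORDER BY term)
    ORDER BY student
""").fetchall()
print(rows)  # [('alice', 'BIO', 'CHEM'), ('bob', 'CS', 'CS')]
```

The explicit ROWS frame on LAST_VALUE is the classic gotcha: with the default frame it would just return the current row's value.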
- Sr Data Warehouse Developer at The University Of Chicago
- Business Intelligence Analyst at Cars.com / Classified Ventures, LLC
- Business Objects Developer at AMERICAN IMAGING MANAGEMENT /WellPoint Inc
- Decision Support Analyst at VOLKSWAGEN CREDIT INC
9 years, 4 months at this Job
Member of the data warehousing team involved in designing, developing, and documenting the ETL (extract, transform, and load) strategy to populate the data warehouse from various source system feeds using SQL, PL/SQL scripts, Informatica, and Talend.
• Implemented slowly changing dimensions (SCD) to update the dimensional schema using the ETL tool Informatica
• Extensively worked on Informatica Designer components such as Source Analyzer, Transformation Developer, command tasks, Mapplet and Mapping Designer, and the Informatica Data Quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ
• Strong experience in ETL and data warehousing using the Informatica PowerCenter clients Designer, Repository Manager, Workflow Manager, and Workflow Monitor
• Used DataStage Parallel Extender stages, namely Sequential, Lookup, Change Capture, Funnel, Transformer, Column Export, and Row Generator, in accomplishing the ETL coding
• Developed Talend ETL jobs to migrate data from flat files to the database using Writestatus, tFileDelete, tPrejob, tFileList, tJavaRow, tSchemaComplianceCheck, and tVerticaOutputBulk components, as the project was migrating from Informatica to Talend
• Involved in the continuous enhancements and fixing of production problems. Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances
• Developed PL/SQL triggers and master tables for automatic creation of primary keys. Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart
• Performed SQL, PL/SQL tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE
• Used bulk collection for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines. Created PL/SQL scripts to extract the data from operational database in to simple flat files using UTL_FILE package
• Developed Unix Shell scripts in order to run PL/SQL scripts and developed JIL scripts to schedule Unix Shell scripts
• Excellent knowledge of SQL database/instance tuning, optimizing complex SQL statements for best possible performance
• Worked on creating Autosys Jil scripts and CA Workload Automation for scheduling Talend jobs
• Involved in the design process for dimensional modeling for Core layer using Star Schema and also developed logical data models and physical data models using ER-Studio
• Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement
• Created indexes, including bitmap and B-tree indexes, on database tables
• Optimized stored procedures and long-running queries using indexes and execution plans
• Gained expertise with migrating data to Vertica from Oracle and SQL Server Database
• Documented and mapped interaction (source-to-target, source-to-ETL mapping/STTM) between business processes, information, and data for projects which are of medium to high complexity and moderate to high risk
• Prepared end-user documentation, technical documentation, dimension map, and contact list and published them into production
• Worked with Agile/Scrum methodologies to ensure the delivery of high-quality work. Tableau Desktop: performed descriptive analysis by developing interactive dashboards and stories using Tableau
• Created charts such as box plots, scatter plots, line graphs, bar charts, and histograms by connecting to databases such as Oracle and Vertica
• Performed geospatial analysis to show the frequency distributions of the various quantities
• Comprehensive knowledge of parameters, filters, dashboard actions, calculated fields, and Level of Detail (LOD) expressions. Tableau Server: background in installing and upgrading Tableau Server and tuning server performance for optimization
• Experience with creation of Users, Groups, Projects, Workbooks and the appropriate authentications for Tableau Server logons and security checks
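The Type 2 slowly-changing-dimension work mentioned above follows an expire-and-insert pattern that Informatica mappings typically generate. A minimal Python/SQLite illustration, with invented table and column names, not the actual warehouse schema:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER, city TEXT,
        eff_date TEXT, end_date TEXT, is_current INTEGER
    );
    INSERT INTO dim_customer VALUES (42, 'Chicago', '2020-01-01', NULL, 1);
""")

def scd2_update(conn, customer_id, new_city, as_of):
    """Type 2 change: expire the current row, then insert a new current row,
    preserving the full history of attribute values."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET end_date=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, as_of))

scd2_update(conn, 42, 'Denver', str(date(2021, 6, 1)))
print(conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date").fetchall())
# [('Chicago', 0), ('Denver', 1)]
```

A Type 1 dimension would instead overwrite the city in place, losing the history; the end_date/is_current pair is what lets fact rows join to the version in effect at load time.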
- Data Warehouse Developer at Wells Fargo Bank
- Data Warehouse Developer at Agility Technologies LLC
- Data Warehouse Developer at Agility Technologies LLC
- Data Warehouse Developer at Tanla Solutions LTD
2 years at this Job
- B-TECH in Information Technology - Information Technology
• Collaborated with Data Warehouse Architect and Data Science Directors to implement DW and ETL solutions in a high volume TV advertising data analytics platform.
• Supported 3TB data warehouse, dozens of Agent jobs and SSIS packages.
• Tripled the performance of ETL which consumed vendor flat files for daily national ratings data using advanced SSIS skills to parallelize objects. Threads could be parameterized at the job level.
• Supported data science and BI teams by developing Power BI dashboards and the source data stored procedures for analytics.
• Enhanced reporting for finance and internal teams by creating Power BI solutions, including DAX calculations and measures.
• Assisted OLTP teams by performance tuning queries and reducing blocking and job failures.
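The SSIS parallelization described above (tripling ETL throughput on the vendor flat files) rests on running independent file loads concurrently, with the degree of parallelism parameterized at the job level. A rough Python analogue of that idea, with hypothetical file names standing in for the ratings feeds:

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def load_file(path):
    """Stand-in for one SSIS data-flow task: process a single vendor flat
    file. Here it only counts rows; a real load would write to the warehouse."""
    with open(path) as f:
        return os.path.basename(path), sum(1 for _ in f)

def load_all(paths, threads=4):
    # `threads` plays the role of the job-level parallelism parameter.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return dict(pool.map(load_file, paths))

# Demo with throwaway files standing in for the daily national ratings feeds.
with tempfile.TemporaryDirectory() as d:
    paths = []
    for name, rows in [("natl_ratings_mon.txt", 3), ("natl_ratings_tue.txt", 5)]:
        p = os.path.join(d, name)
        with open(p, "w") as f:
            f.write("row\n" * rows)
        paths.append(p)
    print(load_all(paths, threads=2))
```

This only pays off when the loads are independent, which is why the SSIS work involved restructuring objects so they had no shared ordering dependency.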
- Data Warehouse Developer/DBA at Havas Edge
1 year, 7 months at this Job
Experience with the full development cycle of a data warehouse, including requirements gathering, design, implementation, and maintenance.
• Worked with Type 1 and 2 dimensions, Fact Tables, Star Schema design, Operational Data Store (ODS), leveling and other Data Warehouse concepts.
• Data cleansing, data quality tracking and process balancing checkpoints.
• Created flexible, scalable, and reusable data model designs while emphasizing performance and business needs.
• Extracted data from the Cosmos database, converted it to .tsv files, and landed them on ADLS.
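The Cosmos-to-ADLS extract above boils down to writing query results as tab-separated text. A minimal Python sketch of the .tsv conversion step; the sample rows and field names are invented, and the real pipeline would use Cosmos and ADLS tooling rather than an in-memory buffer:

```python
import csv
import io

# Rows as they might come back from the source query (invented sample data).
rows = [
    {"user_id": 1, "event": "click", "ts": "2021-05-01T10:00:00"},
    {"user_id": 2, "event": "view",  "ts": "2021-05-01T10:01:00"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["user_id", "event", "ts"],
                        delimiter="\t", lineterminator="\n")
writer.writeheader()
writer.writerows(rows)
tsv = buf.getvalue()  # in the real job this content is uploaded to ADLS
print(tsv.splitlines()[0])  # header row: the three field names, tab-separated
```

Using the csv module with a tab delimiter, rather than joining strings by hand, handles quoting of embedded tabs and newlines in the data.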
- Azure Data warehouse Developer at Wipro Technologies Limited
- Technology Analyst at Infosys Limited
- SQL Server BI Developer at Pacific Life Insurance
- SQL Server BI Developer at Hot chalk (Remote)
3 months at this Job
- Bachelor's - Computer Science and Information Technology
Served as a programmer and analyst in the development of Homesite's data warehouse and business intelligence solutions/reports requested by Homesite Insurance management, statutory entities, and other internal and external customers. Responsible for developing and enhancing the data warehouse's internal and external data requests and the analytic capabilities of the company. Responsibilities included developing, maintaining, and scheduling SSRS reports using Microsoft Visual Studio/BIDS and deploying them to SharePoint; developing and maintaining ETL processes using the Microsoft BI stack (SSRS, SSIS) and database administration skills; documenting technical requirements and modifications for the design of new reports; analyzing, developing, and automating DOI data requests and internal and external data calls/reports using SQL, SSIS, SSRS, and SharePoint; working with guidance from architects to develop an improved ETL process; partnering with business users to scope programming efforts and develop time estimates; assessing potential limitations and risks associated with the solutions; providing recommendations on the evolution of the architecture of the overall data warehouse and business intelligence platform; continual performance tuning and capacity planning for future growth; and working with Homesite's development team to implement solutions. Able to work both independently and in groups, to solve problems related to data, and to handle the intricacies of working with financial data. Worked collaboratively across business units and IT to facilitate business requirements gathering for reports, analyzed new and existing business processes using metrics and analytics tools, made suggestions for change, and evaluated and interpreted end-user information requirements to develop and implement appropriate technical solutions.
- Data Warehouse Developer at Homesite Insurance
- Senior ETL (SSIS/SSRS Developer) at BCBS OF MA
- Data Analyst (Digital Information Systems) at STATE STREET CORPORATION
- Databases Developer at FISH & RICHARDSON COMPANY
6 months at this Job
Profile and understand large amounts of available source data, including structured and semi-structured activity data. Work with data originators and the client's data analysts to fill gaps in collected data, transform source data, facilitate analysis of the data, and load the data into the data warehouse. Design and develop ETL code to extract, transform, and load the source data from various formats into a SQL Server data warehouse. Provide day-to-day support and mentoring to clients who are interacting with their data. Participate in and facilitate requirements gathering, analysis, and design. Participate in client meetings and technical requirements gathering, and create technical documentation.
- Senior Data Warehouse Developer at Exigo
- Senior Business Intelligence Analyst/Developer at TD Ameritrade
- Business Intelligence Analyst/Developer II at TD Ameritrade
- Database/Reports Developer at PSR
10 months at this Job
- DBA Certificate
- Bachelor of Science - Information Systems
• Wrote new T-SQL, PL/SQL, and Teradata queries for table design and creation.
• Involved in data warehouse design and development using the star schema design method.
• Migrated data from SQL Server 2012 to SQL Server 2014.
• Implemented stored procedures with proper Set Commands and enforced business rules via checks and constraints.
• Wrote T-SQL, PL/SQL, and MySQL joins, correlated and non-correlated subqueries to reduce the complexity of joining data sets, common table expressions (CTEs) for temporary result sets, temporary tables, pivoted tables, and functions.
• Performance-tuned existing queries to speed them up: removed unnecessary columns, eliminated redundant and inconsistent data by applying normalization techniques, established joins, and created indexes where necessary.
• Reviewed and modified the current database design for better performance.
• Implemented database management tasks: backup and restore, recovery, disaster recovery, and high availability.
• Troubleshot and resolved database integrity issues, performance issues, blocking and deadlocking issues, replication issues, connectivity issues, security issues, etc.
• Created SSIS packages and modified/optimized existing SSIS packages to accomplish the ETL process for data warehouse development.
• Designed, developed, and deployed custom reports to Report Manager in an MS SQL Server environment, using SQL Server 2012 Reporting Services (SSRS) to create data-driven, drill-through, drill-down, tabular, and matrix reports with charts and graphs.
• Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
• Connected to different data sources to blend and model data within Tableau Server 9.0, creating interactive dashboards for reporting and analysis.
• Worked with Power BI mobile app and created mobile layouts to view reports and dashboards effectively
• Installed On premise data gateway and scheduled daily data refresh in Power BI.
• Detail-oriented financial analyst with hands-on experience in quantitative/statistical analysis, budgeting, accounting, and forecasting with Excel VBA.
• Developed new Excel VBA code and modified/optimized existing VBA (macro) code as necessary, including pivot tables.
• Worked with SharePoint Designer to create workflow InfoPath forms using SharePoint 2013.
• Created attributes, metrics, security filters, and other report objects; defined drilling and hierarchies using MicroStrategy Desktop.
• Upgraded the MicroStrategy 8.0.1 version on Windows to the 9.0.1 Version on UNIX.
• Subscribed and connected to SQL Azure databases and wrote scripts to create database objects within SQL Azure.
• Interacted with front-end applications through web and client/server application development using the Microsoft .NET Framework, with C# and VB.NET as programming languages.
• Worked with Linux and UNIX operating systems and shell scripting in the migration process.
• Proven experience working with large teams in Agile/Scrum models. Environment: MS SQL Server 2012/2014, T-SQL, SSIS 2008/2012, SSRS 2012/2014, MySQL 5.0.22 and 5.1.24, Oracle 11g/10g, Oracle 9i/8i, Tableau Desktop 9.0
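The CTE and subquery techniques listed above can be shown concretely. A small SQLite example run from Python, in which a common table expression holds a per-group aggregate as a temporary result set instead of re-running a correlated subquery per row; the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (member_id INTEGER, amount REAL);
    INSERT INTO claims VALUES (1, 100), (1, 250), (2, 75), (3, 300);
""")
# The CTE computes each member's total once; the outer query then filters
# on it, which is easier to read (and often to optimize) than nesting a
# correlated subquery inside the WHERE clause.
rows = conn.execute("""
    WITH member_totals AS (
        SELECT member_id, SUM(amount) AS total
        FROM claims
        GROUP BY member_id
    )
    SELECT member_id, total
    FROM member_totals
    WHERE total > 150
    ORDER BY member_id
""").fetchall()
print(rows)  # [(1, 350.0), (3, 300.0)]
```

The same WITH syntax works in T-SQL, Oracle, and Teradata, which is what makes CTEs a convenient common denominator across the platforms listed above.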
- Data warehouse Developer at CVS Health
- SQL BI Developer at Florida Power & Light
- SQL BI Developer at Active Health Management
- SQL Server SSIS/SSRS/SSAS Developer at National City Bank
1 month at this Job
- Bachelors of Engineering - Engineering
• Successfully delivered cross-functional applications for the Enterprise Data Warehouse (EDW) for a Fortune 50 company
• Led multiple enterprise projects for the EDW such as Payment Card Industry (PCI) and DB2 Upgrade
• Collaborated with business, management, and technical stakeholders during the System Development Lifecycle (SDLC)
• Skilled in data integration tools such as Ab Initio, Informatica, and Talend
• Skilled in data analytic tools such as SQL, Tableau, Excel, Google Cloud Platform (GCP), and R
• Reputation as a team player with a customer-service focus that delivers
- Data Warehouse Developer at THE HOME DEPOT, INC
- EDW Data Integration Lead at THE HOME DEPOT, INC
- EDW Senior Developer at THE HOME DEPOT, INC
- Senior Developer at THE HOME DEPOT, INC
25 years, 4 months at this Job
- MBA - Decision Support Systems
- BS - Computer Science
Project: Central Data Warehouse & Liquidity Risk Reporting Datamart
● Analyzed business requirement documents, and transformed functional requirements received from business users into technical requirement documents.
● Created ETL processes sourcing data from SMBC's central trading data warehouse into the Liquidity Risk Data Mart, which is designed specifically to cater to risk reporting.
● Developed Oracle procedures, functions, triggers, and packages using Oracle PL/SQL (Toad) to create datasets consumed by the regulatory and risk reporting data layer.
● Provided technical support, bug fixes, enhancements, and query performance improvements to the application. Environment: Oracle 11g, PL/SQL, data warehousing, UNIX, Perl, Linux
- Senior Data warehouse Developer at Sumitomo Mitsui Banking Corporation
- Senior Datawarehouse Developer and Systems Analyst at Fidelity Investments
- Senior Datawarehouse Developer and Systems Analyst at Wellington Management LLP
- Senior Data warehouse Developer at Citco Fund Services
1 year, 5 months at this Job