Over 11 years of experience in Information Technology, with technical and functional experience in building and maintaining data warehouses/marts. Proficient in Azure Data Factory and related tools, Azure cloud components, data warehousing, ETL processes, OLAP systems, and Business Intelligence.
- Extensively used the ELT methodology for data extraction, loading, and transformation in a corporate-wide solution using Azure Data Factory, Azure Data Lake, Databricks Delta Lake, Azure SQL Database, U-SQL, Databricks, PySpark, PowerShell, Azure Synapse, and Azure Storage Explorer.
- Managed Azure Storage components, Data Lake Store (Gen1, Gen2), access policies, and container/blob access and security audits for different operations.
- Handled AAD group creation and RBAC role assignment for access provisioning and security.
- Actively involved in designing and building processes to ingest and process data through ADF pipelines for Salesforce and other source systems. Supported the Integration Runtime (IR) service and monitoring for high availability of 24x7 tumbling-window and scheduled jobs.
- Built ARM templates for pipelines, Data Lake catalog objects, and other PowerShell script deployments using VSTS (Azure DevOps).
- Interacted with clients on data anomalies, data modelling, and Power BI report issues with respect to transactional and on-premises data.
- Experience in the complete project life cycle (design, development, testing, and implementation) of client-server and web applications.
- Expert in building, deploying, and maintaining applications.
- Experienced in preparing and executing unit test plans and unit test cases after software development.
- Worked in an Agile Scrum methodology spanning the client's on-premises and cloud data platform development, using VSTS (Azure DevOps) for version control.
- Helped with various POCs in the big data ecosystem using Databricks, PySpark, and other Azure big data components per customer requirements.
- Used the ETL methodology for data extraction, transformation, and loading in a corporate-wide ETL solution using different versions of Informatica PowerCenter.
- Proficient with Informatica client tools such as Designer, Workflow Manager, and Workflow Monitor.
- Experienced with Informatica Designer tools including Source Analyzer, Transformation Developer, Mapping Designer, and Mapplet Designer.
- Used full pushdown optimization to achieve performance gains in end-to-end job execution in the production environment.
- Well versed in dimensional modeling (star and snowflake schemas) for building data warehouses.
- Good experience working with various data sources such as flat files and databases.
- Created data mappings to extract data from different source files, transform it using Filter, Lookup, Update Strategy, Aggregator, Expression, and Joiner transformations, and load it into the data warehouse.
- Good acquaintance with RDBMS concepts and PL/SQL, and extensive knowledge of data warehousing concepts.
- Implemented Type 1 and Type 2 slowly changing dimension mappings per business requirements.
- Performed various kinds of testing, including unit testing and UAT.
- Tuned mappings to improve performance.
- Worked with various workflow tasks such as Command, Event Wait, and Decision tasks; knowledgeable in UNIX shell scripting.
- Worked with Teradata external loaders and utilities such as MultiLoad (MLoad) and FastLoad; modified BTEQ scripts.
- Good analytical, interpersonal, problem-solving, and communication skills, with the ability to perform independently and as part of a team.
- Performed core ETL testing for Informatica and database migration projects, preparing test cases, scenarios, and documentation during unit and system testing for proper validation.
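The Type 2 slowly changing dimension logic mentioned above can be sketched in plain Python. This is a minimal illustration only, not the actual Informatica mapping: the business key (`cust_id`), tracked attribute (`addr`), and date columns are hypothetical names chosen for the example.

```python
# Sketch of a Type 2 SCD merge: when a tracked attribute changes,
# expire the current row and insert a new version; new keys get a
# fresh row. Column names here are illustrative assumptions.
from datetime import date

def scd_type2_merge(dim_rows, incoming, today):
    """Return the dimension rows after applying Type 2 logic."""
    out = list(dim_rows)
    # Current (open) version per business key: end_date is None.
    current = {r["cust_id"]: r for r in out if r["end_date"] is None}
    for rec in incoming:
        cur = current.get(rec["cust_id"])
        if cur is None:
            # Brand-new key: insert the first version.
            out.append({**rec, "start_date": today, "end_date": None})
        elif cur["addr"] != rec["addr"]:
            # Tracked attribute changed: expire old row, add new version.
            cur["end_date"] = today
            out.append({**rec, "start_date": today, "end_date": None})
    return out
```

A Type 1 mapping would instead overwrite `addr` in place with no history; the Type 2 version above preserves every historical row with its validity dates.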