Sunil (RID : 6v6vlsd1c3lb)

  • Azure Data Engineer
  • Hyderabad, India

Rate

₹ 98,000 (Monthly)

Experience

4 Years

Availability

Immediate

Work From

Any

Skills

Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Blob Storage, PySpark, Azure SQL, MS SQL Server

Description

Sunil Taaduri

CAREER OBJECTIVES
Azure Data Engineer with experience in ETL transformations using Azure cloud technologies such as Azure Data Factory, Azure Databricks, Azure SQL, Azure Data Lake Storage Gen1 & Gen2, Azure Data Studio, and Azure Data Explorer.

PROFESSIONAL SUMMARY
● 4+ years of total experience in the IT industry, including 3.2 years as a Data Engineer on projects with extensive usage of ADF, ADB, and ADLS.
● Strong working experience with Azure services such as Azure Data Factory, Azure Databricks, Azure Blob Storage, Azure Data Lake Storage Gen1/Gen2, Key Vault, Azure SQL DB, and Synapse DWH.
● Hands-on experience with Azure Data Factory and its core concepts such as linked services, datasets, data flows, pipelines, activities, and triggers.
● Designed and developed data ingestion pipelines from on-premises sources into different layers of ADLS using Azure Data Factory (ADF V2).
● Excellent knowledge of ADF building components such as Integration Runtime and Self-Hosted Integration Runtime, including the different integration runtime types.
● Experience integrating data from multiple data sources.
● Experience with Azure Cosmos DB and REST APIs.
● Knowledge of data extraction from on-premises sources and of delta extraction methods from source systems to ADLS.
● Implemented a dynamic pipeline that extracts multiple files into multiple targets with a single parameterized pipeline.
● Strong knowledge of parameterization of linked services, datasets, pipelines, and activities.
● Worked with Get Metadata, Lookup, Stored Procedure, ForEach, If Condition, and Execute Pipeline activities.
● Managed data recovery for Azure Data Factory pipelines.
● Implemented incremental loading and slowly changing dimension (SCD) pipelines (see the PySpark sketch after this list).
● Very good experience implementing PolyBase and slowly changing dimension mechanisms.
● Familiar with pipeline execution methods (Debug runs vs. triggers).
● Good knowledge of Azure Databricks notebooks.
● Good knowledge of star schema/snowflake schema and fact and dimension tables.
● Handled Microsoft Azure Virtual Machines; monitored and managed Azure Data Factory.
● Working experience with Spark architecture.
● Quick learner, willing to take on new challenges and technologies.
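As an illustration of the incremental loading pattern mentioned above, here is a minimal PySpark/Delta Lake sketch of a watermark-driven upsert. All paths, table names, and column names are hypothetical placeholders, not taken from any specific project.

  # Minimal sketch: watermark-based incremental load upserted into a Delta table.
  # All paths, tables, and columns below are hypothetical.
  from pyspark.sql import SparkSession
  from delta.tables import DeltaTable

  spark = SparkSession.builder.getOrCreate()

  # Read the last high-watermark stored for this source (hypothetical metadata table).
  last_watermark = (
      spark.read.format("delta").load("/mnt/meta/watermarks")
      .filter("table_name = 'orders'")
      .first()["last_modified"]
  )

  # Pull only the rows changed since the watermark from the source database.
  incoming = (
      spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://<server>.database.windows.net;database=<db>")
      .option("dbtable",
              f"(SELECT * FROM dbo.orders WHERE modified_date > '{last_watermark}') src")
      # plus authentication options ("user"/"password") from a secret store
      .load()
  )

  # Upsert the changes into the curated Delta table (SCD Type 1 style overwrite).
  target = DeltaTable.forPath(spark, "/mnt/curated/orders")
  (
      target.alias("t")
      .merge(incoming.alias("s"), "t.order_id = s.order_id")
      .whenMatchedUpdateAll()
      .whenNotMatchedInsertAll()
      .execute()
  )

An SCD Type 2 variant of the same merge would insert a new row version instead of updating in place, tracking validity dates on each record.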

● Created mount points to access ADLS from Azure Databricks (sketched below); very good working experience with Delta Lake tables.
● Strong exposure to Spark architecture and cluster creation.
● Experience with real-time streaming analytical data.
● Good knowledge of data warehousing concepts such as OLTP and OLAP.
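A minimal Databricks-notebook sketch of such a mount point, assuming a Key Vault-backed secret scope; the storage account, container, secret names, and <tenant-id> are placeholders.

  # Mount an ADLS Gen2 container in Databricks via a service principal.
  # Scope, secret, account, container, and tenant values are hypothetical.
  configs = {
      "fs.azure.account.auth.type": "OAuth",
      "fs.azure.account.oauth.provider.type":
          "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
      "fs.azure.account.oauth2.client.id": dbutils.secrets.get("kv-scope", "sp-client-id"),
      "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("kv-scope", "sp-client-secret"),
      "fs.azure.account.oauth2.client.endpoint":
          "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
  }

  dbutils.fs.mount(
      source="abfss://raw@mystorageacct.dfs.core.windows.net/",
      mount_point="/mnt/raw",
      extra_configs=configs,
  )

  # Once mounted, Delta Lake tables can be read through the mount point.
  df = spark.read.format("delta").load("/mnt/raw/orders")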

TECHNICAL SKILLS
Azure Technologies: Azure Data Factory (ADF), Azure Databricks (ADB), Azure Data Lake, Azure Blob Storage, PySpark & Azure SQL Database.
Cloud Data Warehouse: Azure Synapse Analytics.
Databases: Microsoft SQL Server, MySQL, Azure SQL.

EXPERIENCE
● Worked at Cognizant as an Azure Data Engineer from Aug 2019 to Nov 2023.

EDUCATION
● B.Tech from Jawaharlal Nehru Technological University, Hyderabad, in 2019
PROJECTS

1. Project Name: Corporate Mobility Portal
Role: Azure Data Engineer
Environment: MS SQL, System Integration

Mind Wireless (MW) is a client that provides cellular services to US-based companies, with customers such as BMW, Coke, and Danaher. We developed portals for this client through which end users can make use of MW services. MW acts as a mediator between service providers and cell phone consumers for the enrollment and procurement of cellular devices. MW portals support user roles such as Administrator, User, Special User, and Proxy User; these users can log into the portal to enroll a device, procure a device, and generate reports. MW portals have workflows that are specific to each client.
Responsibilities:
● Created pipelines using Azure Data Factory.
● Created an Azure Data Factory pipeline that incrementally copies data from an Azure SQL database to Azure Blob Storage.
● Created Databricks notebooks.
● Worked with Azure Key Vault to secure credentials used in linked services.
● Managed credentials such as passwords, access keys, and SAS tokens by storing them in Key Vault as secrets (see the sketch after this list).
● Created event-based triggers in Data Factory to capture data arriving in ADLS.
● Provided support and fixed issues during the test phases.
● Worked on Azure Synapse Analytics dedicated SQL pools.
● Created linked services for source and target connectivity based on requirements.
● Created pipelines with extensive usage of activities like C
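For the Key Vault bullets above, a minimal Python sketch of retrieving a stored secret with the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are placeholders.

  # Retrieve a secret (e.g., a SAS token) stored in Azure Key Vault.
  # The vault URL and secret name below are hypothetical.
  from azure.identity import DefaultAzureCredential
  from azure.keyvault.secrets import SecretClient

  credential = DefaultAzureCredential()
  client = SecretClient(
      vault_url="https://<vault-name>.vault.azure.net",
      credential=credential,
  )
  sas_token = client.get_secret("blob-sas-token").value

In ADF itself, the equivalent is a linked service whose connection secrets reference Key Vault rather than being stored inline.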
