Technical Skills:
Languages/Technologies: Informatica PowerCenter, SQL, Unix Shell Scripting, Python, AWS
Databases: SQL Server, AWS database services, Snowflake, Redshift
Tools: Rally, Jira, Notepad++, Git, Jupyter
Operating Systems: Windows (98/2000/XP/7)
CORE COMPETENCIES:
· Work as an ETL/BI Developer within Agile methodology.
· Played a vital role as a developer in the analysis, design, coding, and testing of applications.
· Major work on ETL/BI using Informatica PowerCenter, AWS ETL services, and Snowflake procedures.
· Created automation scripts in Unix shell and Python.
Responsibilities:
As an ETL/BI Developer working in Agile, I handle sprint planning, estimation, development, execution, reviews, testing, and retrospectives for my allocated work under the supervision of the Team Lead and Scrum Master.
Key Achievements:
· Awarded the Applause Award for efforts on a regulatory project for one of the leading U.S. asset management companies.
· Awarded the Spot Award for hard work and commitment.
· Received appreciation from the manager for exhibiting team spirit.
· Received an award for excellent performance from Infosys, India.
· Received the Insta Award for zero-defect code delivery to production during 2016 at a previous organization.
· Received appreciation from the manager at a previous organization for leading the offshore team in effective code delivery.
Professional Projects:
Working with Delta Cubes Technologies as a Data Engineer from January 2020 to present.
Project Title: Hastings IFRS17 | Team Size: 10
Technologies and tools: AWS S3, Snowflake, SSIS, SQL Server
Role: Data Engineer
Responsibilities:
· Created an external stage in Snowflake to read data from S3.
· Created Snowflake procedures to validate and transform data and load it into target tables (see the sketch below).
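For illustration, a minimal Python sketch of this pattern using the snowflake-connector-python library. The connection details, stage name, S3 URI, storage integration, table, and procedure names are hypothetical placeholders, not the project's actual objects:

    import snowflake.connector  # pip install snowflake-connector-python

    # Hypothetical connection details; real values come from project configuration.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="IFRS17_DB", schema="STAGING",
    )
    cur = conn.cursor()

    # External stage pointing at the S3 landing bucket (all names are placeholders).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS ifrs17_s3_stage
          URL = 's3://example-ifrs17-landing/'
          STORAGE_INTEGRATION = s3_int
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # Load raw files, then call a procedure that validates, transforms, and loads.
    cur.execute("COPY INTO STAGING.POLICY_RAW FROM @ifrs17_s3_stage/policies/")
    cur.execute("CALL STAGING.LOAD_POLICY_DATA()")

    cur.close()
    conn.close()

Keeping the validation and transformation logic inside Snowflake procedures keeps the ingestion step a thin COPY INTO plus CALL, which is easy to rerun and schedule.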
Project Title: HR Analytics | Team Size: 10
Technologies and tools: AWS Glue, S3, RDS, Redshift, SQL Server, Lambda, DMS, Lake Formation, Snowflake
Role: Data Engineer
Responsibilities:
· Created workflows and jobs in AWS Glue to transform and load data.
· Built data pipelines from on-premises data sources to the landing zone (an AWS S3 bucket) and on to cloud databases such as Amazon Redshift and Snowflake (a Glue job sketch follows below).
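As a sketch of what such a Glue job can look like, here is a minimal PySpark Glue script; the job name, S3 paths, and column mappings are illustrative assumptions, and loads into Redshift or Snowflake follow the same read-map-write shape:

    from awsglue.transforms import ApplyMapping
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init("hr_analytics_load")  # hypothetical job name

    # Read raw CSV files from the S3 landing zone (placeholder path).
    raw = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-hr-landing/employees/"]},
        format="csv",
        format_options={"withHeader": True},
    )

    # Rename and cast columns to the target schema (illustrative mappings).
    mapped = ApplyMapping.apply(
        frame=raw,
        mappings=[
            ("emp_id", "string", "employee_id", "int"),
            ("dept", "string", "department", "string"),
        ],
    )

    # Write curated Parquet back to S3 for downstream warehouse loads.
    glue_context.write_dynamic_frame.from_options(
        frame=mapped,
        connection_type="s3",
        connection_options={"path": "s3://example-hr-curated/employees/"},
        format="parquet",
    )

    job.commit()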
Project Title: Asset Management | Team Size: 45
Technologies and tools: Informatica PowerCenter, Unix shell scripting, Tivoli Workload Scheduler, Python, AWS, Snowflake
Role: ETL Developer
Responsibilities:
· Created mappings and workflows with XML and flat-file sources and targets.
· Created Python scripts to preprocess S3 files using AWS Lambda (a minimal handler sketch follows below).
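A minimal sketch of such a Lambda handler, assuming the standard S3 put-event trigger; the preprocessing step shown (dropping a header row) and the output bucket name are illustrative:

    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Locate the object that triggered the event.
        record = event["Records"][0]["s3"]
        bucket = record["bucket"]["name"]
        key = record["object"]["key"]

        # Read the file and drop its header line (example preprocessing).
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        cleaned = "\n".join(body.splitlines()[1:])

        # Write the cleaned file under a processed prefix (placeholder bucket).
        s3.put_object(Bucket="example-processed-bucket",
                      Key=f"clean/{key}", Body=cleaned.encode("utf-8"))
        return {"status": "ok", "key": key}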
Project Title: BI2020 | Team Size: 30
Technologies and tools: Azure SQL Data Warehouse, Hive
Role: ETL Developer
Responsibilities:
· Gathered and analyzed user-story requirements to meet stakeholder needs.
· Analyzed existing ETL processes and converted them into stored procedures and mapping specifications (see the sketch below).
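A sketch of invoking one such converted stored procedure from Python via pyodbc; the server, database, credentials, and procedure name are placeholders rather than the project's real objects:

    import pyodbc  # pip install pyodbc

    # Placeholder connection string; real values come from project configuration.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example.database.windows.net;DATABASE=bi2020;"
        "UID=etl_user;PWD=***"
    )
    cur = conn.cursor()

    # Each converted ETL step lives in a stored procedure: it reads staging
    # tables, applies the mapping specification, and loads the target table.
    cur.execute("EXEC dbo.load_sales_fact @run_date = ?", "2020-01-31")
    conn.commit()

    cur.close()
    conn.close()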