SANTOSHKUMAR
Key Expertise
Overall 6.4 years of experience in MS Power BI, SQL Server, Azure SQL, and SSIS, with working knowledge of Tableau Desktop, Looker, and Snowflake.
Career Summary
- Worked as a Lead Developer at Confidential, Bangalore, from May 2023 to Oct 2023.
- Previously worked as a Consultant at Confidential, Bangalore, from April 2021 to Sept 2022.
- Previously worked as an Executive at Confidential, Mumbai, from Aug 2015 to Feb 2020.
Experience Summary
- Quick learner with the ability to meet tight deadlines and work under pressure.
- Good experience with Power Query, Power Pivot, and Power View, and in creating effective reports and dashboards in Power BI Desktop.
- Worked on DAX functions such as time-intelligence, information, aggregate, filter, value, logical, and text functions.
- Worked with gateways, scheduled refreshes, and created reports and dashboards.
- Created workspaces and apps, managed access permissions (Admin, Member, Contributor, and Viewer), and shared reports and apps.
- Worked with different sources: MS SQL Server, Azure SQL, SharePoint, SSAS Tabular, and Excel.
- Wrote SQL queries using subqueries and joins.
- Wrote SQL queries involving views and indexes.
- Implemented row-level security (RLS); see the brief sketch after this list.
- Implemented incremental refresh.
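As a brief illustration of the RLS point above, a minimal sketch of a role filter written in DAX; the table name, column name, and role name are hypothetical placeholders rather than details from any specific engagement.

-- Hypothetical RLS filter (DAX) assigned to a role such as "Regional Manager"
-- on a DimSalesRegion table: each signed-in user sees only their own rows.
[ManagerEmail] = USERPRINCIPALNAME()

In the Power BI service, users or security groups are then mapped to that role on the dataset so the filter applies at query time.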
Tools
Power BI, SQL Server, Azure SQL, Excel; working knowledge of Google Looker, Tableau Desktop, and Snowflake
Operating System
Windows, Virtual Machine
Educational Qualification & Certifications
- MCA in Computer Applications with 72% from KNS Institute of Technology, Bangalore (2010 - 2013).
- Six months of extensive training on Snowflake DB at Udemy Academy, Bangalore.
Project Details
Project 1:
Project Name: Factsheet
Duration: Dec '22 – Oct '23
Team Size: 8
Description: Factsheet report showing the index's performance as a time series.
Role & Contribution
Role: Development
Contribution:
- Involved in data gathering
- Coordinated with the Data Engineering team to create views
- Imported the views into the Power Query editor
- Cleansed and prepared data in Power Query
- Involved in data modeling (creating relationships in Power Pivot)
- Wrote DAX measures using time-intelligence, calendar, filter, and aggregate functions (see the DAX sketch after this list)
- Implemented Incremental Refresh
- Used default and custom visuals
- Created bookmarks, a menu bar, and similar navigation elements
- Worked with a Premium Per User (PPU) license, which supports up to 48 scheduled refreshes per day
- Interacted with the team manager and DB team members
- Prepared technical documents such as the user manual
- Used Excel as the source while building the POC
- Pulled data from REST APIs, the DWH, and MS SQL Server using a CI/CD pipeline, with an Azure Logic App as the central trigger point
- Loaded the data into Azure Data Lake Gen2
- Used Azure SQL DB as the source for Power BI
- Used Report Server for pixel-perfect reporting
- Used Dataflow as the source for Report Server
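A minimal DAX sketch of the time-intelligence measures mentioned above, assuming a hypothetical FactIndex table related to a marked 'Date' table; the table and column names are placeholders, not the actual Factsheet model.

-- Hypothetical measures over FactIndex[IndexValue] and a marked 'Date' table
Total Index Value = SUM ( FactIndex[IndexValue] )
Index Value YTD = TOTALYTD ( [Total Index Value], 'Date'[Date] )
Index Value PY = CALCULATE ( [Total Index Value], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )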
Tools
Power BI, Report Server, Azure SQL DB, Data Lake Gen2, Dataflow, REST API, DWH
Project 2:
Project Name: Ticketing Data Quality (client: HUL)
Duration: Aug '21 – Aug '22
Team Size: 1
Description: Data Quality Playbook defining the data quality guidelines and standards for all development; the purpose of the document is to define the data quality standards and provide guidelines for implementing them across analytics solutions.
Role & Contribution
Role: Development