Talent's Information
-
Location
Hyderabad, India
-
Rate
$12.00 per Hour
-
Experience
4.1 Years
-
Languages Known
English, Hindi
Available for
About Saichand
Total 4.1 years of experience in IT.
2.3+ years of experience in Microsoft Azure Cloud technologies.
1.9+ years of experience in SQL.
Diverse experience in Development and Maintenance projects
Hands-on experience in Azure Analytics services – Azure Data Lake Store (ADLS), Azure Data Lake Analytics (ADLA), Azure SQL DW, Azure Data Factory (ADF), Mapping Data Flows, Azure Databricks (ADB), etc.
Excellent knowledge of ADF building components – Integration Runtime, Linked Services, Data Sets, Pipelines, Activities.
Designed and developed data ingestion pipelines from on-premise to different layers into the ADLS using Azure Data Factory (ADF V2).
Good knowledge of PolyBase external tables in SQL DW.
Good knowledge of Azure Stream Analytics.
Experience in building Azure Databricks (ADB) Spark-Scala notebooks to perform data transformations.
Good exposure to DataFrames and Spark SQL.
Designed and developed audit/error logging framework in ADF.
Orchestrated data integration pipelines in ADF using various activities like GetMetadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, Until, etc.
Implemented a dynamic pipeline that extracts multiple files into multiple targets using a single pipeline.
Automated execution of ADF pipelines using Triggers.
Knowledge of basic admin activities related to ADF, such as granting access to ADLS using a service principal, installing Integration Runtimes, and creating services like ADLS and Logic Apps.
Extensively worked with data source types – SQL Server, Teradata, flat files, JSON, CSV, GZIP, etc.
Experience in SQL – joins, correlated queries, sub-queries, etc.
Good knowledge of stored procedures, functions, triggers, views, etc.
Good knowledge of code configuration tools like Gitflow, VSTS, etc.
Strong documentation skills.
Worked extensively on Source Entry Utility, Data File Utility, Screen Design Aid, and Programming Development Manager.
Performed analysis of complex functional requirements and conducted impact analysis independently with minimal SME input.
Experience in conducting walkthroughs with onsite peers.
Involved in peer code review as a senior member to ensure delivery of zero-defect projects.
Played the roles of both developer and tester for a few project releases.
Experience in documenting unit test plans and test results for the functionality under development.
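The SQL skills listed above (joins, correlated queries, sub-queries) can be illustrated with a minimal, self-contained sketch. It uses an in-memory SQLite database and hypothetical customers/orders tables for illustration only, not any actual project schema:

```python
import sqlite3

# Minimal sketch: a join and a correlated subquery, run against an
# in-memory SQLite database with hypothetical sample data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (1, 1, 250.0), (2, 1, 90.0), (3, 2, 120.0);
""")

# Join: total order amount per customer.
totals = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

# Correlated subquery: orders whose amount exceeds the average
# for that order's own customer (the inner query is re-evaluated
# per outer row, referencing o.customer_id).
above_avg = conn.execute("""
    SELECT id FROM orders o
    WHERE amount > (SELECT AVG(amount) FROM orders i
                    WHERE i.customer_id = o.customer_id)
""").fetchall()

print(totals)     # [('Asha', 340.0), ('Ravi', 120.0)]
print(above_avg)  # [(1,)]
```

The correlated subquery is what distinguishes it from a plain sub-query: it cannot run on its own because it references the outer row's `customer_id`.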
Tech Stack Expertise
-
Azure
Azure Data Factory V2
4 Years
Work Experience
Azure Data Engineer
- January 2019 - November 2022 - 3 Years
- India
Projects
Development And Support
- September 2018 - November 2022 - 51 Months
-
Involved in creating Azure Data Factory pipelines that move, transform, and analyze data from a wide variety of sources
Transforming data to Parquet format and performing file-based incremental loads per the vendor refresh schedule
Creating Triggers to run pipelines as per schedule
Configuring ADF pipeline parameters and variables
Working with different Mapping Data Flow transformations, such as Derived Column and Select, and a variety of sources and sinks
Creating pipelines in a parent-child pattern
Creating Triggers to execute pipelines sequentially
Monitoring the Dremio Data Lake Engine to deliver data to customers per business needs
Creating Email Alerts on Failure of pipelines.
Deployed code to multiple environments via the CI/CD process, worked on code defects during SIT and UAT testing, and supported data loads for testing
Implemented reusable components to reduce manual interventions
Working on Azure Databricks to run Spark-Python Notebooks through ADF pipelines.
Using the Databricks widgets utility to pass parameters at run time from ADF to Databricks.
Extensively worked on data extraction, transformation, and loading from source to target.
Involved in the analysis, design, and testing phases.
Used the Formats and Data Stores Designer to import source and target database schemas, and the designer to map source to target.
Extracted data from sources such as SQL Server and Oracle to load into the SQL database
Ensuring proper dependencies and proper running of loads (incremental and complete) via Workflow Monitor.
Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
Monitoring jobs every day using Workflow Monitor.
Involved in preparation and execution of unit, integration, and end-to-end test cases.
Successfully delivered given tasks on time.
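The file-based incremental load described above can be sketched as a watermark filter. This is a hedged illustration in plain Python with hypothetical file names, timestamps, and a hypothetical `incremental_batch` helper; in the actual pipeline the file listing would come from an ADF GetMetadata activity and the watermark from a control table:

```python
from datetime import datetime

# Hypothetical source listing: (file name, last-modified timestamp).
# In the real pipeline this comes from GetMetadata / storage listing.
source_files = [
    ("sales_2022-10-01.csv", datetime(2022, 10, 1)),
    ("sales_2022-11-01.csv", datetime(2022, 11, 1)),
    ("sales_2022-11-15.csv", datetime(2022, 11, 15)),
]

def incremental_batch(files, watermark):
    """Return only the files modified after the last successful load."""
    return [name for name, modified in files if modified > watermark]

# Hypothetical control-table value: last successful load ran 2022-10-20,
# so only the two November files qualify for this run.
to_load = incremental_batch(source_files, datetime(2022, 10, 20))
print(to_load)  # ['sales_2022-11-01.csv', 'sales_2022-11-15.csv']
```

After a successful run, the watermark would be advanced to the newest loaded timestamp so the next scheduled trigger picks up only newer vendor files.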
Soft Skills
Industry Expertise
Education
B.Sc.
Satavahana University, Karimnagar - June 2016 - June 2018