Talent's Information
-
Location
Pune, India
-
Rate
$11.0 per Hour
-
Experience
6 Years
-
Languages Known
English, Hindi
Available for
About Apoorva
A performance-driven, ambitious software developer with strong technical knowledge. Able to communicate with and motivate team members in support of strategic goals and bottom-line objectives. Brings creative problem-solving and troubleshooting skills, complemented by meticulous attention to detail, to developing new applications and improving existing ones.
Tech Stack Expertise
-
Kotlin, Spark
- 1 Year
MySQL
- 0 Years
Work Experience
Data Engineer
- January 2017 - January 2023 - 6 Years
- India
Projects
Yahoo Marketing Communications
- January 2017 - January 2018 - 13 Months
Processed and analyzed Yahoo user data for various business requirements, for example identifying which Yahoo newsletters each user had subscribed to or unsubscribed from, so that another team could send bulk marketing emails accordingly.
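Below is a minimal, illustrative PySpark sketch of this kind of subscription analysis; the dataset layout, column names, and paths are assumptions for illustration, not the project's actual schema.

```python
# Minimal PySpark sketch of the newsletter subscription analysis described above;
# dataset, column, and path names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("newsletter-analysis").getOrCreate()

# One row per user, newsletter, and subscribe/unsubscribe event.
events = spark.read.parquet("/data/newsletter_events")  # placeholder path

# Keep each user's most recent action per newsletter.
w = Window.partitionBy("user_id", "newsletter_id").orderBy(F.col("event_time").desc())
latest = events.withColumn("rn", F.row_number().over(w)).filter("rn = 1")

# Currently subscribed users per newsletter, handed off for bulk email sends.
subscribed = latest.filter(F.col("action") == "subscribe")
subscribed.groupBy("newsletter_id").count().show()
subscribed.select("newsletter_id", "user_id", "email") \
    .write.mode("overwrite").csv("/data/outbound/send_lists")
```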
Hara Development
- January 2018 - June 2019 - 18 Months
Wrote Bash (shell) scripts using Hive Beeline queries to extract data from Hive tables; transformed it across multiple HQL files using joins, Hive functions, dynamic and static partitions, and Hive performance tuning; loaded the transformed data into target Hive tables; and dumped those tables to flat files that were sent to another team for processing in Ab Initio. Created JIL files to schedule these scripts in Autosys so that the data transformation and flat-file creation runs automatically every month. Also began re-developing this project in PySpark for better performance.
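Below is a minimal PySpark sketch of the monthly extract-transform-load flow described above, along the lines of the PySpark re-development mentioned; all database, table, column, and path names are hypothetical placeholders rather than the project's actual objects.

```python
# Minimal PySpark sketch of the monthly Hive ETL described above.
# Table, column, and path names are hypothetical; the original used Bash + Beeline HQL.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("monthly-etl")
    .enableHiveSupport()  # read/write Hive tables
    .getOrCreate()
)

# Extract: read the source Hive tables (placeholder names).
accounts = spark.table("src_db.accounts")
usage = spark.table("src_db.monthly_usage")

# Transform: joins and aggregations standing in for the HQL logic.
report = (
    accounts.join(usage, on="account_id", how="left")
    .groupBy("account_id", "region", "report_month")
    .agg(F.sum("usage_amount").alias("total_usage"))
)

# Load: write into a partitioned target Hive table.
(report.write
    .mode("overwrite")
    .partitionBy("report_month")
    .saveAsTable("tgt_db.monthly_report"))

# Dump to delimited flat files for the downstream Ab Initio team.
(report.coalesce(1)
    .write.mode("overwrite")
    .option("delimiter", "|")
    .csv("/data/outbound/monthly_report"))
```

Scheduling the equivalent spark-submit command from Autosys via a JIL job definition would mirror the monthly automation described above.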
Clinical Trial Software
- June 2019 - December 2019 - 7 Months
Processed patient-condition data recorded at several different time intervals over a couple of decades. Imported and exported data between an RDBMS, HDFS, and Hive using Sqoop. Performed partitioning, bucketing, loading data via queries and files, joins, query optimization, subqueries, configuring Hive properties, and working with SerDe libraries. Transformed and analyzed the data with Pig Latin to study user behavior, covering data transformation, filtering and cleansing, parallelism, joins, HCatalog, optimization, and Piggybank/UDFs. Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.
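Below is a minimal PySpark sketch illustrating the RDBMS-to-Hive import and partitioned, bucketed loading described above; the original project used Sqoop, HiveQL, and Pig Latin, and the connection details, table names, and columns here are hypothetical.

```python
# Minimal PySpark sketch of an RDBMS-to-Hive flow like the one described above;
# the original pipeline used Sqoop, HiveQL, and Pig Latin. All connection details,
# tables, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("clinical-trial-import")
    .enableHiveSupport()
    .getOrCreate()
)

# Import: read patient condition records from the RDBMS over JDBC
# (the role Sqoop played in the original pipeline; requires the JDBC driver on the classpath).
conditions = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://rdbms-host:3306/trials")  # placeholder URL
    .option("dbtable", "patient_conditions")
    .option("user", "etl_user")
    .option("password", "etl_password")
    .load()
)

# Load into Hive, partitioned by visit year and bucketed by patient for join/query performance.
(conditions.write
    .mode("overwrite")
    .partitionBy("visit_year")
    .bucketBy(16, "patient_id")
    .sortBy("patient_id")
    .saveAsTable("trials_db.patient_conditions"))

# Analysis standing in for the Pig Latin / Hive queries: condition counts per interval.
summary = spark.sql("""
    SELECT visit_year, condition_code, COUNT(*) AS n_patients
    FROM trials_db.patient_conditions
    GROUP BY visit_year, condition_code
""")
summary.show()
```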
Soft Skills
Industry Expertise
Education
Bachelor of Engineering
Maharasta University - June 2010 - June 2013