Lalit Shimpi

Pune, Maharashtra, India

Phone: xxx-xxx-xxxx

Email: xxx@xxxx.xxx



  • Looking For: Google Cloud Data Engineer, Data Engineer

  • Occupation: IT and Math

  • Degree: Master's Degree

  • Career Level: Experienced

  • Languages: English

Career Information:


Highlights:

Skills: GCP, BigQuery, SQL, ETL, Data Engineering, PySpark


Experiences:

Lead 12/2024 - current
UST Global, Pune, Maharashtra, India
Industry: Consulting Services
Accomplished IT professional with around 8 years of experience specializing in Oracle, Snowflake, Informatica, IICS, IIB migration, Big Data, and Google Cloud Platform.
• Strong hands-on experience in BigQuery, Airflow, Dataproc, Dataflow, Hadoop, Spark, PySpark, and SQL. Adept at leading teams in agile environments and ensuring adherence to best practices for cloud architecture.
• Expertise in designing scalable, secure, and resilient cloud architectures on Google Cloud Platform (GCP) to meet client needs.
• Deep understanding of and experience with core GCP services, including BigQuery, Cloud Functions, Cloud Storage, Cloud SQL, Compute Engine, and Cloud Composer, along with Big Data technologies.
• Experience with GCP’s data services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow for building data pipelines, analytics solutions, and big data systems.
• Knowledge of DevOps practices and CI/CD pipelines using GCP services.
• Hands-on experience in planning and executing cloud migrations, including moving on-prem workloads to GCP and hybrid cloud architectures.
• Familiarity with tools like Stackdriver (Cloud Monitoring & Logging) for monitoring, logging, and optimizing cloud infrastructure performance.
• Ability to lead cross-functional teams, collaborate with developers, and work with stakeholders to implement architecture that aligns with business goals.
• Experience in automating operational tasks using Python, Bash, or-- (see the illustrative sketch after this entry).
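The profile itself contains no code samples; purely to illustrate the kind of Python-based operational automation mentioned in the last bullet, here is a minimal sketch that removes Cloud Storage objects older than a retention window. The bucket name and retention period are hypothetical placeholders, not details taken from the candidate's work.

```python
# Illustrative sketch only: not code from the profile.
# Cleans up stale objects in a Cloud Storage bucket, a typical operational
# task of the kind described above. Bucket name and retention window are
# hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from google.cloud import storage

BUCKET_NAME = "example-staging-bucket"   # hypothetical bucket
RETENTION_DAYS = 30                      # hypothetical retention window


def delete_stale_objects(bucket_name: str, retention_days: int) -> int:
    """Delete objects older than the retention window; return the count removed."""
    client = storage.Client()
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    deleted = 0
    for blob in client.list_blobs(bucket_name):
        if blob.time_created < cutoff:
            blob.delete()
            deleted += 1
    return deleted


if __name__ == "__main__":
    removed = delete_stale_objects(BUCKET_NAME, RETENTION_DAYS)
    print(f"Removed {removed} stale objects from {BUCKET_NAME}")
```

A script like this would typically be scheduled from Cloud Composer/Airflow or a Cloud Function rather than run by hand.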
Senior Engineer 02/2022 - 11/2023
Datametica Solutions, Pune, Maharashtra, India
Industry: Product Based
Retail: This project focuses on migrating and modernizing the current data warehouse and analytics workloads from Snowflake and Informatica to Google Cloud Platform (GCP), including the transition of Azure RDS stored procedures (SPs), IBM Integration Bus (IIB) ESQL, and Snowflake stored procedures into BigQuery. The migration will ensure a smooth transfer of data and ETL processes.
• Migrated and modernized Azure RDS stored procedures, IIB ESQL, and Snowflake stored procedures to BigQuery.
• Led migration projects from Netezza to BigQuery using Informatica Power Center as the ETL tool for data mapping.
• Executed migration projects from Oracle to BigQuery utilizing BigQuery as the ETL tool.
• Designed and implemented code migration from Informatica Power Center to BigQuery using SQL.
• Optimized existing queries in BigQuery for improved performance and resource efficiency.
• Designed and implemented code for dimensional modeling, including facts and dimensions.
• Created tables with effective partitioning and clustering strategies in BigQuery to manage costs (see the sketch after this entry).
• Designed and developed data pipelines for historical data transfer using IICS, Informatica Power Center, and Google Cloud Platform (GCP).
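As a hedged illustration of the partitioning and clustering approach mentioned in the bullets above (not the candidate's actual code), the sketch below creates a date-partitioned, clustered BigQuery table with the Python client library; the project, dataset, table, and column names are hypothetical.

```python
# Illustrative sketch only: creating a date-partitioned, clustered BigQuery
# table to limit the bytes scanned (and hence cost) per query.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "example-project.sales_dw.fact_orders"  # hypothetical table
schema = [
    bigquery.SchemaField("order_id", "STRING"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("order_date", "DATE"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table(table_id, schema=schema)
# Partition on the date column so queries touch only the partitions they need.
table.time_partitioning = bigquery.TimePartitioning(field="order_date")
# Cluster on a frequently filtered column to prune data within each partition.
table.clustering_fields = ["customer_id"]

table = client.create_table(table)
print(f"Created {table.full_table_id}")
```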
Senior BPA 01/2019 - 01/2022
TCS, Mumbai, Maharashtra, India
Industry: Consulting
Telecom: This project was for an Australian next-generation telecommunications company that provides a broad range of Australian and international movies, entertainment, lifestyle, documentaries, news, and sport through broadcast and streaming services. The company had a wide range of data flowing from multiple sources into an Oracle data warehouse and a Hadoop data lake, which we migrated to Google Cloud Platform with high-performance solutions.
• Developed and optimized ETL pipelines using PySpark to process large datasets efficiently (see the sketch after this entry).
• Conducted performance tuning of PySpark applications, achieving a reduction in job runtime.
• Collaborated with cross-functional teams to define data quality requirements, ensuring data integrity.
• Implemented data validation and error handling within ETL workflows, improving data reliability.
• Created and scheduled ETL jobs using Apache Airflow, ensuring timely delivery of data to stakeholders.
• Participated in table creation, selecting partitioning and clustering strategies in BigQuery for cost-effectiveness.
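To make the PySpark work above concrete, here is a minimal, hypothetical ETL sketch in the spirit of those bullets (read raw files, deduplicate and type-cast, write partitioned output); the paths and column names are placeholders, not details from the project.

```python
# Illustrative sketch only: a minimal PySpark ETL job in the spirit of the
# bullets above. Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV events from a hypothetical landing area.
raw = spark.read.option("header", "true").csv("gs://example-landing/events/")

# Transform: deduplicate, standardize the timestamp, derive a partition column,
# and drop rows whose timestamp cannot be parsed.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Load: write partitioned Parquet for downstream consumption (e.g. a BigQuery load).
clean.write.mode("overwrite").partitionBy("event_date").parquet("gs://example-curated/events/")

spark.stop()
```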
Senior Analyst 05/2017 - 01/2019
Capgemini India, Mumbai, Maharashtra, India
Industry: Consulting
Retail: Carnival Cruise Line (CCL) specializes in family-oriented cruise products, offering a diverse range of activities and entertainment to enhance guest experiences. Princess Cruises (PCL) focuses on premium cruise products, featuring refined services, gourmet dining, and cultural enrichment programs.
• Maintenance and support of Oracle production, development, and test database environments.
• Creation and maintenance of tablespaces and database objects; data refreshes using the Data Pump utilities EXPDP and IMPDP; and creating users and granting the required roles and privileges.
• Configuring Oracle networking and managing listeners using the LSNRCTL utility; checking CPU utilization, blocking sessions, and invalid objects; and performance monitoring, including extracting ASH, ADDM, and AWR reports.

Education:

RCPIT Shirpur 08/2011 - 08/2015
Dhule, Maharashtra, India
Degree: Bachelor's Degree
Major: Electronics and Telecommunication
Passed B.E. in Electronics and Telecommunication with distinction in 2015.


APJ Abdul Kalam University Indore 08/2018 - 08/2020
Indore, Madhya Pradesh, India
Degree: Master's Degree
Major: Computer Technology and Application
Passed M.Tech in Computer Technology and Application in 2020 with distinction.



