Job Description:

Key Responsibilities:

As a GCP Data Engineer, you will be responsible for the following key tasks:

ETL Development: Create and maintain ETL processes for data extraction, transformation, and loading from various source systems.

Data Integration: Collaborate with data analysts, data scientists, and business stakeholders to ensure the data warehouse meets their requirements.

Design, Develop, and Deploy: Build data dictionaries, systems, and pipelines. Design, develop, and deploy code across cloud, ETL, visualization, data modeling, SQL, and related tools, languages, and technologies.

GCP Expertise: Apply hands-on knowledge of GCP services such as BigQuery, Dataproc, Airflow DAGs, and Cloud Composer.

Migration Expertise: Experience with GCP migrations, particularly on-premises-to-cloud migrations, is a significant advantage.

Data Modeling and Optimization: Develop and maintain data models to support data analysis and reporting. Monitor and optimize data pipelines for performance and scalability.

Qualifications:

Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

Minimum of 3 years of experience as a Data Engineer.

Strong proficiency in SQL and Python.

In-depth knowledge of GCP services (BigQuery, Dataproc, Airflow DAGs, Cloud Composer, etc.).

Familiarity with data migration, especially from on-premises systems to GCP, including code refactoring.

Strong problem-solving skills and the ability to translate complex business requirements into efficient data solutions.

Client: Healthcare
