Job Description:

Our client is looking for a GCP Data Engineer. This is a hybrid role located in Dearborn, Michigan, so candidates must be onsite from day 1 and already living in Michigan.

This is a senior-level role, and in-depth GCP experience is required. Please read the full job description.

Please see the highlighted skills in the role description for the main skills needed.

A general HackerRank test must be completed in order to submit to the client.

Please send a photo ID and LinkedIn profile.

Must be on our W2 (Fastrek)

We can pay $75/hour with either single health benefits OR paid time off (we cannot pay for both in the first year, so pay is bumped to the hourly rate mentioned).

Automobile Client

GCP Data Engineer

Location: Dearborn, MI - hybrid role, 2-3 days in the office per week

Additional Information:

GCP Certification preferred

Job Description:

We're seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices.

You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics in Google Cloud Platform (GCP).

You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications in GCP.

Experience with large-scale solutions and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must.

We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

You will:

Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers

Work on a small agile team to deliver working, tested software

Work effectively with fellow data engineers, product owners, data champions and other technical experts

Demonstrate technical knowledge/leadership skills and advocate for technical excellence

Develop exceptional analytics data products using streaming and batch ingestion patterns in Google Cloud Platform with solid data warehouse principles

Be the Subject Matter Expert in Data Engineering and GCP tool technologies

Skills Required:

Experience in working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.

Implement methods for automation of all parts of the pipeline to minimize labor in development and production

Experience in analyzing complex data, organizing raw data and integrating massive datasets from multiple data sources to build subject areas and reusable data products

Experience in working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting

Experience in working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management. This includes designing and deploying a pipeline with automated data lineage.

Identify, develop, evaluate and summarize Proof of Concepts to prove out solutions.

Test and compare competing solutions and report a point of view on the best solution, including integration between GCP Data Catalog and Informatica EDC.

Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal sketch of one such pattern follows this list).
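
As an illustration of one such pipeline pattern, the following is a minimal sketch of a streaming ingestion pipeline in Python with Apache Beam, reading messages from Pub/Sub and appending them to BigQuery. The project, subscription, and table names are placeholders rather than resources named in this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode keeps the pipeline running continuously on the Pub/Sub source.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")  # placeholder
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

The same pipeline can run locally on the DirectRunner for testing, or on Dataflow by supplying the DataflowRunner along with project and region options.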

Experience Required:

In-depth understanding of Google's product technology and underlying architectures

5+ years of analytics application development experience required

5+ years of SQL development experience

3+ years of GCP Cloud experience with solutions designed and implemented at production scale

Experience working in GCP-based big data deployments (batch/real-time) leveraging Terraform, BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Cloud Build, Airflow, Cloud Composer, etc. (a minimal Composer/Airflow DAG sketch follows this list)

2+ years of professional development experience in Java or Python, and Apache Beam

Experience developing with microservice architecture on a container orchestration framework; extracting, loading, transforming, cleaning, and validating data

Designing pipelines and architectures for data processing

1+ year of designing and building Tekton pipelines
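
As a small illustration of the Cloud Composer/Airflow orchestration mentioned above, the following is a minimal sketch of a daily DAG that loads newline-delimited JSON files from Cloud Storage into BigQuery. The bucket, dataset, and table names are illustrative placeholders only.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_gcs_to_bigquery_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the current run date's landed files (templated path) into BigQuery.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="my-landing-bucket",                  # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],   # {{ ds }} is the run date
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="my-project.analytics.events",  # placeholder
        write_disposition="WRITE_APPEND",
    )

In Cloud Composer, this file would be dropped into the environment's DAGs bucket and would use the environment's default Google Cloud connection.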

Experience Preferred:

Experience in building solution architectures, provisioning infrastructure, and delivering secure and reliable data-centric services and applications in GCP

Experience with Dataplex or Informatica EDC is preferred

Experience with development ecosystems such as Git, Jenkins, and CI/CD

Exceptional problem solving and communication skills

Experience in working with DBT/Dataform

Experience in working with Agile and Lean methodologies

Team player with attention to detail

Performance tuning experience

Strong drive for results and ability to multi-task and work independently

Self-starter with proven innovation skills

Ability to communicate and work with cross-functional teams and all levels of management

Demonstrated commitment to quality and project timing

Demonstrated ability to document complex systems

Experience in creating and executing detailed test plans

Education Required:

Bachelor's degree in computer science, IT, or a related scientific field

Education Preferred:

GCP Professional Data Engineer Certified

Master's degree in computer science or related field

2+ years mentoring engineers

In-depth software engineering knowledge

