Job Description:

Hi All,

Please work on the role below.

Role: AWS Data Engineer
Location: Scottsdale, AZ (Day 1 onsite)
Hire type: Contractor

Indent: SF_OP_167467-1-1

Must have:

IAM, AWS Glue, S3, Redshift, Kinesis, Python/Java/Scala, RDS SQL Server

AWS Certified Data Analytics Specialty or AWS Certified Solutions Architect.

Needs someone who can understand and work across the AWS ecosystem (Glue, Kinesis, DynamoDB, etc.), including access permissions and, especially, Terraform infrastructure as code.

The role combines AWS data engineering skills with some hands-on knowledge of AWS infrastructure.
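For a concrete sense of that combination, the sketch below shows the kind of Python (boto3) interaction with Glue and Kinesis the role implies; the stream and job names are hypothetical, and credentials are assumed to come from an IAM role or local AWS profile.

```python
# Illustrative sketch only: hypothetical stream/job names; region and IAM access assumed.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-west-2")
glue = boto3.client("glue", region_name="us-west-2")

# Send one streaming record to a (hypothetical) Kinesis data stream.
kinesis.put_record(
    StreamName="orders-stream",
    Data=json.dumps({"order_id": 123, "amount": 42.50}).encode("utf-8"),
    PartitionKey="123",
)

# Start a run of a (hypothetical) Glue ETL job.
run = glue.start_job_run(JobName="daily-orders-etl")
print("Started Glue job run:", run["JobRunId"])
```

In this role, the underlying streams, Glue jobs, and IAM permissions would themselves be provisioned through Terraform rather than created by hand.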

Position Summary:
We are seeking a highly skilled AWS Data Engineer with extensive experience in AWS technologies to join our team as a contractor.

The ideal candidate will have a strong background in designing, building, and maintaining data pipelines, and must be capable of contributing immediately to our ongoing projects.
Key Responsibilities:
- Utilize AWS services such as Kinesis, S3, Glue, Redshift, and RDS SQL Server for data processing and storage.
- Implement data ingestion processes to handle streaming and batch data.
- Ensure data quality and integrity through robust ETL processes.
- Collaborate with other data engineers and the Cloud engineering team to develop and deploy data pipelines in AWS.
- Optimize and tune data processing workflows for performance and cost efficiency.
- Monitor and troubleshoot data pipeline issues to ensure continuous data flow and reliability.
- Document data architecture, processes, and workflows.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of experience in data engineering, with a focus on AWS technologies.
- Proven experience with AWS services including Kinesis, S3, Glue, Redshift, and RDS SQL Server.
- Strong proficiency in SQL and experience with database design and optimization.
- Expertise in ETL/ELT processes and tools.
- Familiarity with data warehousing concepts and best practices.
- Experience with data modeling and schema design.
- Proficiency in programming languages such as Python, Java, or Scala.
- Knowledge of data governance and security best practices in a cloud environment.
- Excellent problem-solving skills and the ability to work independently with minimal supervision.
- Strong communication and collaboration skills.
Preferred Qualifications:
- AWS Certified Data Analytics Specialty or AWS Certified Solutions Architect.
- Experience with other AWS services such as Lambda, CloudWatch, Kinesis, Firehose, EventBridge, Redshift, DynamoDB, IAM, and RDS SQL Server.
- Familiarity with big data technologies like Apache Spark or Hadoop.
- Experience with reporting and visualization tools such as Tableau.
- Knowledge of DevOps practices and tools for CI/CD such as Jira and Harness.

 

Please use the template below for submission.

Submission Matrix:

Candidate's Full Name (as per Passport / Visa copy):
PTalent Applicant ID:
Indent ID:
Current Location, State:
Contact Number:
Email ID:
LinkedIn Profile (link):
US work authorization (with validity and extensions):
First entry to the USA (year) and visa status during entry (mandatory; exempt for USC/GC):
Willingness to relocate:
Educational qualification (with year of passing and university details):
Video / in-person interview availability (Yes/No):
Currently on a project (Yes/No):
Certifications, if any:
Interviewed with Persistent in the past 3 months (if yes, please provide details):
Rate / Salary:
Joining time / availability to join:

Skill Matrix:

Skill | Years of experience
