Title: Data Engineer
Location: 100% Remote
Duration: 12 Months
Key Skills: Expertise in Python, Snowflake, and AWS
Responsibilities:
Build and Manage Data Pipelines:
Create and maintain data pipelines to move and transform data using Python and pandas.
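For illustration only, a minimal sketch of the kind of pandas transformation step this involves (file and column names are hypothetical, not project specifics):

import pandas as pd

def transform_orders(input_path: str, output_path: str) -> None:
    """Load raw order data, clean it, and write a transformed output for the next stage."""
    df = pd.read_csv(input_path, parse_dates=["order_date"])   # hypothetical source extract
    df = df.dropna(subset=["order_id"])                        # drop rows missing the key
    df["total"] = df["quantity"] * df["unit_price"]            # derive a new column
    daily = df.groupby(df["order_date"].dt.date)["total"].sum().reset_index()
    daily.to_csv(output_path, index=False)                     # hand off to the next stage

if __name__ == "__main__":
    transform_orders("raw_orders.csv", "daily_totals.csv")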
Cloud Services:
Use AWS services such as S3 for storage, CloudFormation for infrastructure provisioning, CloudWatch for monitoring, API Gateway for APIs, and Kinesis Data Streams for real-time data ingestion.
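As a hedged illustration of how these services are commonly touched from Python with boto3 (bucket, key, and stream names below are placeholders, not project specifics):

import json

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")
kinesis = boto3.client("kinesis")

# Store a processed file in S3.
s3.upload_file("daily_totals.csv", "example-data-bucket", "curated/daily_totals.csv")

# Push a record onto a Kinesis Data Stream for downstream real-time consumers.
kinesis.put_record(
    StreamName="example-events-stream",
    Data=json.dumps({"event": "daily_totals_ready"}).encode("utf-8"),
    PartitionKey="daily_totals",
)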
Work with Snowflake:
Manage and optimize data in Snowflake, ensuring efficient storage and access.
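A minimal sketch of bulk-loading a DataFrame into Snowflake using the snowflake-connector-python package (connection details and the table name are placeholders; real credentials would come from a secrets store):

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="EXAMPLE_WH",
    database="EXAMPLE_DB",
    schema="PUBLIC",
)

df = pd.read_csv("daily_totals.csv")
write_pandas(conn, df, table_name="DAILY_TOTALS")  # bulk-load into an existing table
conn.close()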
Write SQL Queries:
Use SQL to pull, update, and analyze data.
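For example, a simple aggregation run through the Snowflake Python connector (the table and columns are illustrative):

import snowflake.connector

# Placeholder connection details, as above.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="EXAMPLE_WH",
    database="EXAMPLE_DB",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    cur.execute("""
        SELECT order_date, SUM(total) AS daily_total
        FROM DAILY_TOTALS
        GROUP BY order_date
        ORDER BY order_date
    """)
    for order_date, daily_total in cur:   # the cursor is iterable over result rows
        print(order_date, daily_total)
finally:
    cur.close()
    conn.close()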
Version Control and Deployment:
Use GitHub for version control and Azure DevOps for automating builds and deployments.
Monitor Systems:
Track and address issues with data systems using AWS CloudWatch and other tools.
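One hedged example of publishing a custom pipeline metric to CloudWatch with boto3, so failed or slow runs can be alarmed on (the namespace and metric name are made up for illustration):

import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_data(
    Namespace="ExampleDataPipelines",       # hypothetical namespace
    MetricData=[
        {
            "MetricName": "RowsProcessed",  # hypothetical metric
            "Value": 12345,
            "Unit": "Count",
            "Dimensions": [{"Name": "Pipeline", "Value": "daily_totals"}],
        }
    ],
)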
Collaborate and Document:
Work with other teams to understand data needs and document processes.
Ensure Data Quality:
Implement practices to keep data accurate and secure.
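As one illustrative style of validation check in pandas (column names and the threshold are hypothetical):

import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems found in the frame."""
    problems = []
    if df["order_id"].duplicated().any():       # uniqueness check
        problems.append("duplicate order_id values")
    if df["quantity"].lt(0).any():              # range check
        problems.append("negative quantities")
    if df["order_date"].isna().mean() > 0.01:   # completeness threshold (1%)
        problems.append("more than 1% missing order dates")
    return problems

df = pd.read_csv("raw_orders.csv")
issues = check_quality(df)
if issues:
    raise ValueError("Data quality checks failed: " + "; ".join(issues))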
Required Skills:
Python: Experience with Python and pandas for data tasks.
AWS: Familiarity with AWS services such as S3, CloudFormation, CloudWatch, API Gateway, and Kinesis Data Streams.
Snowflake: Experience with Snowflake for managing data.
SQL and NoSQL: Ability to write and optimize both SQL and NoSQL queries.
Azure DevOps: Experience using Azure DevOps for continuous integration and deployment (CI/CD).
GitHub: Experience managing code versions and collaborating using GitHub.