Job Description:

- Design, develop, and maintain scalable data architectures to manage large and complex data sets.

- Implement and optimize ETL pipelines to extract, transform, and load data from various sources, ensuring high data availability and integrity (a minimal sketch follows this list).

- Work with Snowflake to manage and optimize data warehouses, ensuring reliable data access for analysis and reporting.

- Utilize Snowplow to track web and mobile events, ensuring data collection pipelines are accurate and scalable.

- Collaborate with business teams to translate requirements into actionable data solutions and workflows.

- Write optimized SQL queries to extract, transform, and analyze data from different sources.

- Use Python for building automation scripts, manipulating data, and integrating with other services.

- Integrate AWS services such as S3, Lambda, and Kinesis to optimize data storage, processing, and analytics across the platform (see the Lambda sketch below).

- Ensure data quality, security, and compliance across all data pipelines and architectures.

- Contribute to knowledge-sharing and a collaborative, learning-focused environment.

- Create documentation for data architectures, workflows, and business mappings.
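To give a concrete flavor of the ETL and warehousing work above, here is a minimal sketch of one possible pipeline shape: pulling a CSV extract from S3 with boto3, applying a small transform, and loading the result into Snowflake via the snowflake-connector-python package. All names (bucket, key, table, warehouse, environment variables) are hypothetical placeholders, not the actual stack.

```python
# Minimal ETL sketch: S3 -> transform -> Snowflake.
# All resource names and credentials below are hypothetical placeholders.
import csv
import io
import os

import boto3
import snowflake.connector


def extract(bucket: str, key: str) -> list[dict]:
    """Pull a CSV object from S3 and parse it into rows."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))


def transform(rows: list[dict]) -> list[tuple]:
    """Drop invalid rows and normalize fields (a stand-in for real business logic)."""
    out = []
    for row in rows:
        if not row.get("user_id"):  # basic data-quality gate
            continue
        out.append((row["user_id"], row.get("event", "").strip().lower()))
    return out


def load(rows: list[tuple]) -> None:
    """Insert transformed rows into a Snowflake table using parameter binding."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],  # hypothetical env vars
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # placeholder warehouse/database/schema
        database="RAW",
        schema="EVENTS",
    )
    try:
        conn.cursor().executemany(
            "INSERT INTO web_events (user_id, event) VALUES (%s, %s)", rows
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load(transform(extract("example-data-bucket", "exports/events.csv")))
```

In practice the transform and data-quality rules would be far richer, but the extract/transform/load separation shown here is the basic structure the role works within.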

 
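For the serverless side of the AWS integration, a second sketch: a Lambda handler that decodes incoming Kinesis records and stages them in S3 as newline-delimited JSON for downstream loading. The bucket name and key layout are hypothetical; this only illustrates how the S3/Lambda/Kinesis pieces named above can fit together.

```python
# Hypothetical Lambda handler for a Kinesis-triggered function.
# Bucket and key layout are placeholders, not a real deployment.
import base64
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Decode each Kinesis record and stage it in S3 as newline-delimited JSON."""
    lines = []
    for record in event["Records"]:
        # Kinesis payloads arrive base64-encoded in the Lambda event.
        payload = base64.b64decode(record["kinesis"]["data"])
        lines.append(json.loads(payload))
    s3.put_object(
        Bucket="example-staging-bucket",               # placeholder bucket
        Key=f"kinesis/{context.aws_request_id}.json",  # one object per invocation
        Body="\n".join(json.dumps(line) for line in lines).encode("utf-8"),
    )
    return {"records_staged": len(lines)}
```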

We are an equal opportunity employer.

 

All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.

             
