Job Description:
Hi,

Hope you are doing well. Please find the job description below and let me know your interest.

Position: Data Engineer with strong PostgreSQL and Tableau Server skills
Location: 100% Remote
Duration: 1 Year (Contract)

Responsibilities:
We are looking for a contractor to help us build dashboards. We need someone who is skilled at using PostgreSQL to create and modify complex views and who has advanced Tableau Server skills. Ideally, this person will have experience working with data center infrastructure data (racks, devices, GPUs, power utilization) and incident data (MTTR, availability). Much of our data comes from Nautobot and Jira. They should be comfortable working directly with end users to clarify requirements and mock up dashboards.

Dashboard Development: Design and develop interactive dashboards and visualizations using Tableau, ensuring that data is presented in a clear, insightful, and impactful way for various business units.
Data Management: Extract, transform, and load (ETL) data from multiple sources, primarily using PostgreSQL databases, ensuring data accuracy, performance, and scalability.
PostgreSQL Expertise: Write complex SQL queries, optimize database performance, and ensure data consistency and reliability within PostgreSQL databases.
Nautobot Integration: Leverage experience with Nautobot to automate and integrate network data into dashboard solutions, enabling visibility into data center and network infrastructure.
Jira Analytics: Build Jira dashboards and reports to track project progress, issue tracking, and team performance metrics.
Data Analysis: Conduct data analysis to identify trends, outliers, and business opportunities by querying, manipulating, and interpreting large data sets.
Collaboration: Work closely with stakeholders, including IT, network engineers, data center managers, and project management teams, to understand business requirements and translate them into effective data solutions.
Automation & Reporting: Automate the collection, storage, and reporting of key metrics and operational data for consistent, up-to-date access by key decision-makers.
Performance Tuning: Optimize database and dashboard performance, ensuring minimal latency and real-time data access.
Documentation: Develop and maintain technical documentation for all solutions, queries, and processes, ensuring that the data pipeline is well understood and can be managed effectively by other team members.
Data Center Operations Support: Apply knowledge of data center operations to build metrics, track infrastructure performance, and provide insights on racks, devices, GPUs, power utilization, and incident data.

Qualifications:
Experience: 3+ years of experience in data engineering, business intelligence, or related roles with a focus on PostgreSQL and Tableau development.
Database Skills: Strong proficiency in PostgreSQL, with a demonstrated ability to write complex SQL queries, perform data modeling, and optimize database performance.
Tableau Expertise: Proven experience in designing and building Tableau dashboards, with a strong understanding of data visualization best practices.
Nautobot & Network Data: Experience working with Nautobot or similar network automation tools, with the ability to integrate network data into dashboards and reports.
Jira Integration: Experience in creating Jira reports and dashboards, with a solid understanding of issue tracking and project management workflows.
Technical Skills: Proficiency in data management, ETL processes, and working with large datasets. Familiarity with Python or other scripting languages is a plus.
Data Center Knowledge: Understanding of data center operations, network infrastructure, and performance metrics related to servers, storage, and networks.
Communication: Excellent written and verbal communication skills, with the ability to present technical concepts to both technical and non-technical stakeholders.
Problem Solving: Strong analytical and problem-solving skills, with a keen ability to troubleshoot and optimize both data pipelines and visualizations.

If you are interested, please share your updated resume and suggest the best number and time to connect with you.