Position: Senior Data Engineer II
Location: Hybrid (1-2 days onsite), downtown Chicago
Duration: 12
Interview Information:
- 3 rounds: a 30-minute interview with the hiring manager (HM), a second interview with a team of 2 Data Engineers, and a final interview with the HM and his boss
Descriptive Summary:
At our client, we are united in our ambition and drive to move forward. We share core values that help us achieve excellence: collaboration, talent empowerment, service, inclusion, respect and gratitude. Our people are our greatest asset, and we invest in the brightest talent and encourage a diversity of perspectives and strengths to create dynamic teams that operate at the pinnacle of their field. Our talented professionals show up every day knowing they will engage in meaningful work, continuous learning and professional development.
As one of the world’s leading law firms, we serve a broad range of clients with market-leading practices in private equity, M&A and other complex corporate transactions; investment fund formation and alternative asset management; restructurings; high-stakes commercial and intellectual property litigation; and government, regulatory and internal investigations. We handle the most complicated and sophisticated legal matters because we don’t just meet industry standards, we create them. We bring innovation and entrepreneurialism to every engagement and, as a result, have long-standing client relationships with leading global corporations and financial sponsors. With 6,500 employees (including 3,500 lawyers) operating from 20 offices across the United States, Europe, the Middle East and Asia, we are one of the largest law firms in the world and a top financial performer.
Essential Job Functions:
- Owns and drives client data integration solutions across design, build, deployment, and DevOps, with best-in-class data models, data quality, and data architecture standards
- Possesses strong data capabilities in data analysis and data modeling, and hands-on expertise in crafting and deploying data pipelines using the Azure data platform and tools, as well as the enterprise ETL tool Talend, leveraging its DQ, DI, and Data Catalog features.
ESSENTIAL FUNCTIONS (This list is not exhaustive and may be supplemented and changed as necessary.)
- Accountable for technical leadership of data integration solutions and delivery, ensuring a sound, best-in-class design and that enterprise implementation, deployment, and operations meet technical quality standards
- Responsible for planning and coordinating the provisioning of needed dev/test environments, as well as defining and managing code branching/configuration strategies that support concurrent releases
- Works with the Data and Enterprise Architecture team to define the data integration design/coding/deployment/operational standards and technology stack
- Responsible for data operations, including scheduling, successful execution, and reconciliation of data pipelines in production
- Works collaboratively with other dev teams to guide and review their deliverables against the set standards
- Works collaboratively with Data Analytics, applications, DBA, and cloud operations teams to ensure end-to-end integrity and usage of data assets.
- Provides input in shaping K&E DevOps and DataOps practices
- Partners with internal and external data experts to infuse innovation, with a focus on cross-training and upskilling the existing teams.
Qualifications & Requirements (Education, Work Experience, Skills):
- Bachelor’s degree in data, computer science, or a relevant discipline.
- 8+ years of experience in ETL, ELT, and data engineering
- 3+ years of working experience on Azure data platforms
- Experience working in agile delivery, including Jira usage and other agile delivery best practices
- Data architecture, data modeling, and data visualization experience is a plus
- Ability to interact with business and other teams to create data mapping documents and ETL architecture/design artifacts, drive performance improvements, and improve delivery and operational excellence
Technologies/Software:
- 8+ years of end-to-end implementation experience deploying enterprise data warehouse, data mart, and data lake solutions
- 3+ years of working experience with Azure data solutions, including but not limited to ADLS, Databricks, ADF, Synapse, etc.
- Azure ADLS/Databricks administration experience
- 5+ years of implementation and maintenance experience with Talend DI and DQ capabilities
- Demonstrable understanding of data governance and the technical tools and technologies that enable it
Certificates, Licensures, Registrations:
- Certification in the Azure cloud stack and/or Talend Data Integration Certified Administrator/Developer certification is a plus