Job Description:
Hi friends, I hope you are all doing well. If you have a suitable profile, please let me know.

Implementation Partner: TCS
Location: Cary, NC (Day One Onsite)
Role: Azure Data Engineer

Responsibilities:
- Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
- Develop reusable frameworks to reduce development effort and ensure cost savings for projects.
- Develop quality code with well-thought-out performance optimizations in place from the development stage.
- Show appetite to learn new technologies and be ready to work on cutting-edge cloud technologies.
- Work with a team spread across the globe to drive project delivery and recommend development and performance improvements.
- Build and implement data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Databricks, Delta Lake, Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc. (see the Spark ETL sketch after this listing).
- Monitor performance and advise on any necessary infrastructure changes.

Required skills:
- Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, ADF, Databricks, Cosmos DB, and CDP 7.x.
- Hands-on expertise in ingesting huge volumes of data from various platforms for analytics needs and writing high-performance, reliable, and maintainable ETL code.
- Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
- Proficiency and extensive experience with Spark and Scala, Python, and performance tuning is a MUST.
- Hive database management and performance tuning (partitioning/bucketing) is a MUST (see the partitioning/bucketing sketch after this listing).
- Strong analytical skills for working with unstructured datasets.
- Performance tuning and problem-solving skills are a must.
- Code versioning experience using Bitbucket/AzDo; working knowledge of AzDo pipelines would be a big plus.
- Strong experience in designing and building data warehouses and data stores for analytics consumption (real-time as well as batch use cases).
- Eagerness to learn new technologies on the fly and ship to production.
- Expert in technical program delivery across cross-functional/LOB teams.
- Expert in driving delivery through collaboration in a highly complex, matrixed environment.
- Strong leadership and negotiation skills.
- Excellent communication skills, both written and verbal.
- Ability to interact with senior leadership teams in IT and business.

Preferred skills:
- Expertise in Python and experience writing Azure Functions using Python/Node.js (see the Azure Function sketch after this listing).
- Experience using Event Hub for data integrations.
- Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API).
- Experience ingesting data using Azure Data Factory and building complex ETL using Databricks.
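The Spark-based ingestion and curation work referenced above might look like the following minimal PySpark sketch. The storage paths, table layout, and column names are illustrative assumptions, not details from the posting; it assumes a Databricks/Delta Lake runtime with access to ADLS Gen2.

```python
# Minimal PySpark ETL sketch for ingestion/curation into Delta Lake.
# Storage account, container paths, and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw files landed in ADLS Gen2 (path is a placeholder)
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/"))

# Transform: deduplicate, normalize types, and filter invalid rows
curated = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))

# Load: write a Delta table partitioned by date (assumes the Delta format
# is available, e.g. on Databricks)
(curated.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("order_date")
 .save("abfss://curated@examplestorage.dfs.core.windows.net/orders/"))
```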
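For the Hive partitioning/bucketing requirement, a sketch of the idea via Spark SQL is shown below. The database, table, and column names are assumptions made for illustration; partitioning on a low-cardinality date column enables partition pruning, while bucketing on a frequent join key reduces shuffle during joins.

```python
# Partitioning/bucketing sketch via Spark SQL with Hive support enabled.
# Database, table, and column names are illustrative assumptions.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive_tuning_demo")
         .enableHiveSupport()
         .getOrCreate())

spark.sql("CREATE DATABASE IF NOT EXISTS sales")

# Partition by a low-cardinality date column and bucket by a join key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.orders_tuned (
        order_id    STRING,
        customer_id STRING,
        amount      DOUBLE,
        order_date  DATE
    )
    USING PARQUET
    PARTITIONED BY (order_date)
    CLUSTERED BY (customer_id) INTO 32 BUCKETS
""")

# A filter on the partition column scans only the matching partitions.
spark.sql("""
    SELECT customer_id, SUM(amount) AS total_amount
    FROM sales.orders_tuned
    WHERE order_date = DATE '2024-01-01'
    GROUP BY customer_id
""").show()
```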
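The preferred Azure Functions plus Event Hub experience could be exercised along the lines of the sketch below, using the Python v2 programming model. The event hub name ("telemetry"), the connection setting name, and the message fields are placeholder assumptions.

```python
# Azure Functions (Python v2 programming model) sketch for Event Hub ingestion.
# Event hub name, connection setting, and message fields are placeholders.
import json
import logging

import azure.functions as func

app = func.FunctionApp()

@app.event_hub_message_trigger(arg_name="event",
                               event_hub_name="telemetry",
                               connection="EVENTHUB_CONNECTION")
def ingest_event(event: func.EventHubEvent):
    # Decode the message body and log a couple of fields; in a real pipeline
    # this is where validation and a write to ADLS/Cosmos DB would happen.
    body = json.loads(event.get_body().decode("utf-8"))
    logging.info("Received device=%s value=%s",
                 body.get("device_id"), body.get("value"))
```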
             
