- Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in a specialty in lieu of every year of education.
- At least 4 years of experience in Information Technology.
- At least 3 years of PySpark experience.
- At least 2 years of experience in Hadoop, Spark, Python, and PySpark.
- Good experience in end-to-end implementation of data warehouses and data marts.
- Strong knowledge of and hands-on experience with SQL and Unix shell scripting.
Preferred Qualifications:
- Good understanding of data integration, data quality, and data architecture.
- Experience in relational modeling, dimensional modeling, and modeling of unstructured data.
- Good understanding of Agile software development frameworks.
- Experience in the banking domain.
- Strong communication and analytical skills.
- Ability to work in teams in a diverse, multi-stakeholder environment comprising business and technology teams.
- Experience in, and desire to work in, a global delivery environment.