Apply proven expertise and build high-performance, scalable data warehouse applications.
Intelligently design data models for optimal storage and retrieval
Deploy comprehensive data quality checks to ensure high-quality data
Experience designing and developing data models, integrating data from multiple sources, building ETL/ELT pipelines, and supporting all aspects of the software development lifecycle (SDLC)
Securely source external data from numerous global partners.
Optimize existing pipelines, implement new ones, and maintain all domain-related data pipelines
Own the end-to-end data engineering component of the solution
Collaborate with the program’s SMEs and data scientists
Take on-call shifts as needed to support the team.
5+ years’ experience in data engineering, with proven expertise applying ETL/ELT/DWH best practices
Proficiency in LAMP and Big Data stack environments (Hadoop, Hive, Presto, and Spark)
Competence with relational databases (Oracle, MySQL, Teradata)
Experience working with enterprise DE tools and the ability to learn in-house DE tools quickly
Coding and scripting experience with Python, Java, PHP, SQL, and CLI tools
Nice to Have
Exposure to Learning Management Systems (Cornerstone OnDemand, SAP Litmos, or any other LMS)
Experience extracting data from an LMS such as Cornerstone OnDemand
Experience analyzing LMS data