Responsibilities:
• Design, develop, own, and maintain ETL/ELT data flows across a constellation of data sources and systems
• Aggregate and store high-quality data efficiently and transparently for reporting, analytics, and data science use cases
• Implement tools for monitoring and ensuring data quality and consistency
• Build, support, and improve custom tools necessary for data and analytics self-serve initiatives
• Work closely with Security and Operations teams to develop and enforce proper data security and privacy practices
• Work cross-functionally and communicate effectively
• Investigate, advocate for, and proactively obtain new data sets
Requirements:
• Proven experience creating and maintaining fault-tolerant data pipelines using relational, non-relational, and cloud-based data warehouse systems
• Data modelling and data architecture experience
• Ability to initiate and drive projects to completion with minimal guidance in a fast-paced, dynamic environment
• Detail-oriented and inquisitive, with a can-do attitude, a passion for quality results, and an interest in learning new technologies
• Strong coding skills and experience with SQL, Python, and Java
• Proven experience securing database services in SQL and NoSQL environments (MySQL, Kibana)