Senior Data Architect
Ref No.: 22-00679
Location: Wayne, New Jersey
The Senior Data Architect will be responsible for:
• Leading cross-functional teams in the design, build, and operation of a cloud-based data operations platform.
• Creating and driving a strategy for a robust, fault-tolerant, near-real-time, and scalable data operations capability that supplies the Finance community with operational, analytics-ready, and reporting-ready data.
• Building, leading, and inspiring a high-performance, cross-functional team to deliver high-impact outcomes.

Responsibilities include, but are not limited to:
• Secure, scalable and resilient data operations platform in AWS
• DevOps orchestration for infrastructure and data pipeline
• Data pipeline development and operation
• Data lake and data warehouse
• Data catalog and quality metrics
• Platform security, threat modeling, and role-based access control
• Data extraction, integration, and ingestion solutions
• Re-platform legacy data stores and ETL workflows
• Analytics platform and tools as a service
• Take strong ownership of understanding business needs and driving business-valued outcomes
• Collaborate with other business, data, governance and IT stakeholders to drive solutions
• Concisely articulate user stories and what success looks like to the development team
• Define and prioritize the backlog of work for the development team
• Maintain clear and transparent communication with stakeholders
• Lead teams in solving problems and making decisions under uncertainty
• Research and evaluate emerging technology for business value
• Actively contribute to team ceremonies: sprint planning, daily stand-up, sprint review and retrospectives

Required qualifications:
• 10+ years leading the design and implementation of big data solutions
• 5+ years in cloud data engineering, cloud engineering, or DevOps engineering
• 5+ years architecting data integration or data extraction solutions
• 5+ years designing data pipelines (streaming) for optimal delivery of analytics ready data
• Experience with streaming data services (Kafka, Kinesis, MSK, DMS)
• Experience with data warehousing solutions (e.g., Redshift, Snowflake)
• Experience with Spark and EMR
• Expert in Python and SQL
• Strong problem-solving leader who can coach and facilitate others to solve problems
• Strong understanding of Scrum, Agile and DevOps
• Familiar with networking fundamentals, OSI model, and networking security practices
• Comfortable with microservices architecture principles
• Fluent in English

Preferred qualifications:
• AWS Certification: Solutions Architect Professional, SysOps Administrator, Developer, DevOps Engineer, Security, Advanced Networking
• Experience building secure systems compliant with ISO 27001 and NIST 800-53
• Experience with Docker and Kubernetes
• Experience with change data capture (CDC) extraction tools (Qlik Data Integration, Attunity Replicate, AWS DMS)
• Experience with Apache Airflow
• Experience with threat modeling