Job Description
Position Overview:
This is a critical leadership and hands-on role within our Information Technology Enterprise Data Services Group, focused on accelerating the company's multi-year journey to build a next-generation data platform on Snowflake that enables advanced AI and GenAI capabilities. You will serve as a Staff Engineer-level technical leader responsible for the architecture, design, and implementation of robust, scalable enterprise data solutions.
Key Requirements & Core Technical Skills (Must-Haves)
To be effective from day one, candidates must have proven, hands-on experience with the following four core technologies. This experience will be the primary focus of the technical evaluation:
- Snowflake (Database/Data Warehouse): Deep expertise in modern cloud data warehousing architecture and performance.
- dbt Core/Cloud (Data Transformation): Expert-level proficiency in modeling and transforming data within Snowflake.
- Python (Programming Language): Core skill for data pipeline development, scripting, and dbt integrations.
- AWS (Cloud Computing): Practical experience with relevant AWS services (e.g., EC2, S3, Managed Workflows for Apache Airflow).
What You'll Do
Technical Leadership & Strategy
- Define and Drive Strategy: Shape the technical roadmap, data engineering strategy, and best practices for the Enterprise Data Services group.
- Solution Architecture: Lead the development of both high-level (working with Enterprise Architecture) and low-level solution designs for enterprise-scale data ecosystems, ensuring alignment with business goals.
- Mentorship & Excellence: Provide technical guidance, code review, and mentorship to Data Engineers and project teams, fostering a culture of engineering excellence and delivery focus.
Hands-On Engineering & Delivery
- Pipeline Development: Lead the design and implementation of highly scalable, high-performance data pipelines, primarily using dbt for transformation and AWS Managed Workflows for Apache Airflow (MWAA) and Zena for orchestration.
- Advanced Coding: Write and maintain clean, reusable, and high-quality code in SQL, Python, Shell, and Terraform, with a focus on performance and maintainability.
- Data Modeling: Design and review conceptual, logical, and physical data models to support new and evolving business requirements.
- Quality & Governance: Champion data quality, governance, and cataloging practices across the platform.
Collaboration & Project Management
- Agile Leadership: Lead agile ceremonies, ensure a delivery-focused mindset, and drive the timely execution of data initiatives.
- Cross-Functional Collaboration: Work closely with Architects, Data Designers, QA Engineers, and Business stakeholders to deliver cohesive, customer-centric data products.
- Issue Resolution: Perform root cause analysis and implement effective solutions for complex, high-priority data issues.
What You'll Bring
- Extensive Experience: Proven track record of architecting and delivering complex, high-impact data projects from inception to production.
- Core Technical Stack: Mandatory expertise in Snowflake, dbt, Python, and AWS.
- Advanced Expertise: Deep knowledge of relational databases (e.g., PostgreSQL, Aurora) and modern coding practices in SQL and Python.
- Resilience & Communication: Exceptional communication and presentation skills (technical and business), with the ability to thrive in fast-paced, high-pressure environments.
- Domain Knowledge (Asset): Familiarity with insurance industry processes and systems.
- AI/ML Exposure (Asset): Experience in operationalizing AI/ML and GenAI models.
- Certifications (Asset): Two or more certifications such as SnowPro Advanced Data Engineer, dbt Developer, or AWS Certified Cloud Practitioner are a strong advantage; candidates who do not yet hold them must be willing to attain them within 3-6 months.