Job Description
We're hiring on behalf of an international client for a contract-to-hire position. This is a 100% remote role.
Role Summary
As a Data Engineer, you'll help build scalable, cloud-native data solutions. You'll architect high-performance pipelines, select the right tools for each scenario, and collaborate across teams to drive data-driven decisions.
Key Responsibilities
Design and implement ETL/ELT pipelines in Snowflake and Airflow, tailored to business needs.
Build backend services in Golang to support real-time and batch data processing (see the sketch after this list).
Choose the right tools (e.g., Kafka vs. Pub/Sub, dbt vs. hand-written SQL) based on performance, scalability, and cost.
Work closely with analysts, data scientists, and product teams to translate requirements into technical solutions.
Optimize Snowflake performance through advanced SQL and data modeling.
Ensure data governance, security, and compliance across systems.
Mentor junior engineers and promote engineering best practices.
Monitor pipeline health and proactively resolve issues.
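
To give a flavor of the day-to-day work, here is a minimal sketch of the kind of batch step this role involves: a small Go program pushing a set-based aggregation down to Snowflake. The account, credentials, and table names are hypothetical placeholders; the driver import is the community-standard gosnowflake package.

// Minimal batch ETL step in Go against Snowflake. All connection
// details and table names below are illustrative placeholders.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/snowflakedb/gosnowflake" // registers the "snowflake" driver
)

func main() {
	// DSN format: user:password@account/database/schema (placeholder values).
	db, err := sql.Open("snowflake", "user:password@my_account/analytics/staging")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Aggregate raw events into a daily rollup in one set-based statement,
	// letting Snowflake do the heavy lifting instead of processing row by row.
	const stmt = `
		INSERT INTO daily_event_counts (event_date, event_type, event_count)
		SELECT DATE(event_ts), event_type, COUNT(*)
		FROM events
		GROUP BY 1, 2`

	res, err := db.Exec(stmt)
	if err != nil {
		log.Fatal(err)
	}
	rows, err := res.RowsAffected()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("loaded %d rows\n", rows)
}

Running the aggregation inside Snowflake as a single set-based statement, rather than streaming rows through the service, is exactly the kind of performance and tool-selection judgment this role calls for.
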
Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Minimum 4 years of experience in data engineering.
Strong proficiency in Golang and Snowflake.
Hands-on experience with Airflow and GCP services (BigQuery, Cloud Functions, etc.).
Deep understanding of SQL, data modeling, and pipeline orchestration.
Ability to assess and recommend tools based on scenario-specific needs.
Familiarity with Kafka, REST APIs, dbt, or Terraform is a plus.
Excellent communication and leadership skills.
Core Technologies
Snowflake
Golang
Airflow
GCP
SQL & Data Modeling
ETL/ELT Architecture
Tool Selection & Optimization