GCP Data Architect - Remote

Software International

Toronto, Canada

$110 - $140 CDN/hr | Posted: September 6, 2025

Job Description

Software International (SI) supplies technical talent to a variety of clients, ranging from Fortune 100/500/1000 companies to small and mid-sized organizations, across Canada, the US, and Europe.

We currently have an open contract role as a GCP Data Architect with our global consulting client, working remotely. The contract is for 6 months initially, with potential for extension.

Role: GCP Data Architect

Type: Contract

Duration: 6 months to start + potential extension

Location: Toronto, ON - remote with occasional office visits

Rate: $110 - $140 CDN/hr (C2C), depending on overall experience

GCP Data Architect - Role Overview

We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate combines deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.

Key Responsibilities

1. Data Strategy, Security & Governance

  1. Define and implement an enterprise-wide data strategy aligned with business goals.
  2. Establish data governance frameworks, data classification, retention, and privacy policies.
  3. Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).

2. Data Architecture & Modeling

  1. Design conceptual, logical, and physical data models to support analytics and operational workloads.
  2. Implement star, snowflake, and data vault models for analytical systems.
  3. Implement S/4HANA CDS views in Google BigQuery.

3. Google Cloud Platform Expertise

  1. Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
  2. Implement cost optimization strategies for GCP workloads.

4. Data Pipelines & Integration

  1. Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
  2. Integrate data from multiple systems, including SAP BW, SAP HANA, and SAP BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
  3. Leverage integration tools such as Boomi for system interoperability.

5. Programming & Analytics

  1. Develop complex SQL queries for analytics, transformations, and performance tuning.
  2. Build automation scripts and utilities in Python.
  3. Demonstrate a good understanding of CDS views and the ABAP language.

6. System Migration

  1. Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
  2. Manage the migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.

7. DevOps for Data

  1. Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
  2. Apply infrastructure-as-code principles for reproducible and scalable deployments.

Preferred Skills

  1. Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
  2. Strong SQL and Python programming skills.
  3. Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
  4. Knowledge of data governance frameworks and data security best practices.
  5. Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
  6. Experience with the Google Cortex Framework for SAP-to-GCP integrations.
