
Data Engineer

Robert Half

Toronto, Canada

Posted: 2 hours ago

Job Description

A client of Robert Half is looking for a highly skilled Data Engineer to design, build, and optimize data pipelines and infrastructure that enable advanced analytics and business intelligence. The ideal candidate will have strong experience in big data technologies, cloud platforms, and hands-on expertise with Databricks.

Key Responsibilities

Data Pipeline Development:
- Design, develop, and maintain scalable ETL/ELT pipelines using Databricks and other tools.
- Integrate data from multiple sources into data lakes and warehouses.

Data Architecture & Modeling:
- Implement robust data models for analytics and reporting.
- Ensure data quality, consistency, and governance across systems.

Performance Optimization:
- Optimize Spark jobs and Databricks workflows for efficiency and cost-effectiveness.
- Monitor and troubleshoot data pipeline performance issues.

Collaboration & Support:
- Work closely with data scientists, BI engineers, and business stakeholders to deliver data solutions.
- Provide technical guidance on best practices for data engineering and cloud architecture.

Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, or related field.
- 5+ years of experience in data engineering and big data technologies.
- Strong proficiency in Databricks and Apache Spark.
- Expertise in SQL and relational databases (e.g., SQL Server, PostgreSQL).
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services.
- Hands-on experience with data lake and data warehouse architectures.
- Proficiency in Python or Scala for data processing.
- Solid understanding of ETL/ELT processes and data governance principles.

Preferred Skills
- Experience with Delta Lake and Lakehouse architecture.
- Familiarity with CI/CD pipelines for data workflows.
- Knowledge of big data tools (Kafka, Hadoop).
- Exposure to machine learning or advanced analytics.
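Purely for illustration (not part of the posting): a minimal sketch of the kind of ETL/ELT pipeline the responsibilities above describe, assuming PySpark on a Databricks cluster with Delta Lake available. The orders dataset, lake paths, and column names are hypothetical placeholders.

```python
# Minimal ETL sketch: ingest raw CSV files, clean them, and write a Delta table.
# The "orders" dataset, the /mnt/... paths, and the column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders_etl_sketch")
    .getOrCreate()  # on Databricks, a SparkSession already exists; this reuses it
)

# Extract: read raw files landed in the data lake (placeholder path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Transform: deduplicate, type the columns, and drop unusable rows.
cleaned = (
    raw
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
)

# Load: write a partitioned Delta table for downstream analytics and BI.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/mnt/curated/orders/")
)
```

Writing to Delta rather than plain Parquet adds ACID transactions and time travel, which lines up with the Delta Lake / Lakehouse experience listed under preferred skills.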
