Job Title: Data Engineer
Location: Remote
Contract Duration: 12 Months (Strong Potential for Extension)
Hiring For: A Leading Global SaaS Company in Link Management & Analytics
Tech Stack: GCP, Python, SQL, Airflow, Stitch, Census, Golang
About the Role
We’re hiring a Data Engineer to join a highly technical and product-focused team supporting a well-known SaaS company that powers link management and analytics at global scale. This platform is used by millions to shorten, track, and analyze URLs, driving insights for brands and digital marketers worldwide.
As a Data Engineer, you’ll build and maintain cloud-native data pipelines, ensure data accuracy and reliability, and support initiatives around data privacy and compliance. You’ll be working with modern tools in the Google Cloud Platform (GCP) ecosystem and collaborating closely with data architects, software engineers, and analysts.
This is a remote, 12-month contract with high potential for extension.
Key Responsibilities
- Develop and maintain scalable, production-grade ETL/ELT pipelines using Python, SQL, and Airflow on GCP.
- Implement and monitor data workflows to ensure high reliability, performance, and data quality.
- Work with data integration tools like Stitch and Census to streamline third-party data ingestion and syncing.
- Collaborate with analytics and product teams to deliver clean, validated, and timely data for reporting and insights.
- Ensure adherence to data governance and privacy requirements, including GDPR compliance and the implementation of data anonymization strategies.
- Participate in data modeling discussions and contribute to schema design for analytical and operational use cases.
- Write and maintain documentation for pipelines, data sources, and operational procedures.
- Monitor and troubleshoot pipeline failures, data quality issues, and system performance problems.
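As a flavor of the anonymization work above, the sketch below shows one common approach: replacing PII fields with salted hashes before data lands in analytics tables. It is a minimal illustration only; the function name, field names, and salt handling are hypothetical, and salted hashing is strictly pseudonymization under GDPR rather than full anonymization.

```python
import hashlib


def pseudonymize(record: dict, pii_fields: set, salt: str) -> dict:
    """Return a copy of record with PII field values replaced by
    salted SHA-256 digests (hypothetical helper for illustration)."""
    out = {}
    for key, value in record.items():
        if key in pii_fields:
            # Deterministic: the same input always maps to the same digest,
            # so joins on the pseudonymized key still work downstream.
            out[key] = hashlib.sha256((salt + str(value)).encode()).hexdigest()
        else:
            out[key] = value
    return out


# Example: the email is hashed, the metric passes through untouched.
row = pseudonymize({"email": "user@example.com", "clicks": 3},
                   {"email"}, salt="s3cret")
```

In a production GCP pipeline this kind of transform would more likely be handled by a managed service or a dedicated Dataflow step, with the salt stored in a secret manager rather than in code.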
Required Experience
- 8+ years of experience as a Data Engineer or in a similar role.
- Strong hands-on experience with Python, SQL, and Airflow.
- Proven experience working in Google Cloud Platform (GCP), especially with tools like BigQuery, Pub/Sub, and Dataflow.
- Solid understanding of data architecture, data quality, and governance best practices.
- Experience with data anonymization and GDPR compliance.
- Strong communication skills and the ability to collaborate with cross-functional teams.
- Bonus points for experience with Golang, financial data, slowly changing dimensions, or snapshot-based data modeling.
Why Join This Project?
- Work with a widely recognized tech product with real-world impact
- Be part of a forward-thinking, data-driven engineering culture
- 100% remote flexibility
- Competitive contract rates with long-term potential
- Work on complex, high-scale data systems supporting millions of users