Data Engineer Opportunity

NEXT Ventures

Job Title: Data Engineer (Remote - South & Central America, Contract)


Location: Remote (open to candidates in South & Central America); must be able to work Pacific Time Zone hours

Job Type: Contract - 1 Year

Compensation: $3,000 USD per month

Language: Professional written and spoken English required


About the Role:

Our e-commerce fashion client is seeking a Data Engineer to join their team on a fully remote, long-term contract. In this role, you will support the development and optimization of ETL pipelines, data processing workflows, and cloud-based data infrastructure. You will work with Snowflake and implement RBAC (Role-Based Access Control) while collaborating with senior engineers to enhance data reliability and security.
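
For context on the RBAC work mentioned above, here is a minimal sketch of the kind of role-and-grant setup the position involves. It assumes the snowflake-connector-python package, and every identifier in it (account, user, database, schema, role) is a hypothetical placeholder rather than the client's actual environment:

```python
# Minimal Snowflake RBAC sketch using snowflake-connector-python.
# All identifiers below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",   # hypothetical account identifier
    user="etl_admin",    # hypothetical admin user
    password="...",      # in practice, prefer key-pair auth or a secrets manager
)
cur = conn.cursor()

# Create a read-only role and grant it least-privilege access to one schema.
cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_RO")
cur.execute("GRANT USAGE ON DATABASE ECOM_DB TO ROLE ANALYST_RO")
cur.execute("GRANT USAGE ON SCHEMA ECOM_DB.SALES TO ROLE ANALYST_RO")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ECOM_DB.SALES TO ROLE ANALYST_RO")

# Users receive access only through roles, never via direct object grants.
cur.execute("GRANT ROLE ANALYST_RO TO USER JANE_ANALYST")
```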


This is a great opportunity for an aspiring data engineer to gain hands-on experience with modern cloud data technologies in a fast-paced environment.


Key Responsibilities:

  • Develop and maintain ETL pipelines to process structured and semi-structured data efficiently.
  • Work with Snowflake to manage data ingestion, transformation, and storage.
  • Implement RBAC policies to ensure proper data security and access control.
  • Assist in migrating and optimizing data pipelines across AWS, GCP, and Azure platforms.
  • Support Snowflake staging, Snowpipe, and data sharing processes for cost-efficient data storage and retrieval (a short Snowpipe sketch follows this list).
  • Collaborate with cross-functional teams to ensure data integrity, security, and performance optimization.
  • Troubleshoot data ingestion, performance bottlenecks, and integration issues in a cloud environment.
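
As a concrete taste of the Snowpipe item above, the sketch below sets up a stage and pipe that auto-ingest JSON files. It reuses the snowflake-connector-python cursor from the earlier RBAC sketch; the bucket, table, stage, and pipe names are hypothetical, and a real external stage would also need a storage integration or credentials:

```python
# Hypothetical stage + Snowpipe for auto-ingesting JSON order events.
# Assumes `cur` is a snowflake-connector-python cursor (see the RBAC sketch).
cur.execute("CREATE TABLE IF NOT EXISTS RAW.ORDERS_RAW (payload VARIANT)")

cur.execute("""
    CREATE STAGE IF NOT EXISTS RAW.ORDERS_STAGE
      URL = 's3://example-bucket/orders/'   -- hypothetical bucket
      FILE_FORMAT = (TYPE = JSON)
""")

# AUTO_INGEST = TRUE lets bucket event notifications trigger loads as files
# land, rather than a warehouse polling on a schedule (the cost-efficiency angle).
cur.execute("""
    CREATE PIPE IF NOT EXISTS RAW.ORDERS_PIPE AUTO_INGEST = TRUE AS
      COPY INTO RAW.ORDERS_RAW
      FROM @RAW.ORDERS_STAGE
""")
```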


Required Skills & Experience:

  • 1+ years of experience in data engineering, ETL development, or cloud-based data processing.
  • Hands-on experience with Snowflake for data warehousing, Snowpipe, RBAC, and SQL transformations.
  • Professional command of written and spoken English.
  • Strong SQL skills, including experience with T-SQL, SnowSQL, and query optimization.
  • Exposure to cloud platforms such as AWS, GCP, or Azure in a data engineering context.
  • Basic knowledge of data encryption, masking, and security best practices (see the masking-policy sketch after this list).
  • Some experience with data ingestion from APIs and third-party sources (e.g., social media or marketing platforms).
  • Strong problem-solving skills and the ability to work independently in a remote-first, collaborative team environment.
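
For the masking item in the list above, here is a minimal sketch of a Snowflake column masking policy, again issued through a snowflake-connector-python cursor; the policy, table, and role names are hypothetical:

```python
# Hypothetical masking policy: hide the local part of customer emails from
# every role except a privileged one. All identifiers are placeholders.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS ECOM_DB.SALES.EMAIL_MASK
      AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
        ELSE REGEXP_REPLACE(val, '.+@', '*****@')
      END
""")

# Bind the policy to a column; masking is then enforced on every query path.
cur.execute("""
    ALTER TABLE ECOM_DB.SALES.CUSTOMERS
      MODIFY COLUMN EMAIL SET MASKING POLICY ECOM_DB.SALES.EMAIL_MASK
""")
```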


Nice to Have:

  • Familiarity with Apache Airflow for workflow orchestration (a minimal DAG sketch follows this list).
  • Basic experience with Python, PySpark, or dbt for data transformation and pipeline automation.
  • Exposure to Fivetran connectors for data ingestion from various sources.
  • Understanding of CI/CD pipelines, Docker, and Kubernetes for deployment automation.
  • Experience with BI tools like Tableau or SSRS for reporting and data visualization.
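
For the Airflow item above, here is a minimal DAG sketch with a single daily ingestion task. The DAG id, task, and callable are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`):

```python
# Minimal Airflow DAG sketch: one daily task that pulls from a marketing API
# and lands files for Snowflake ingestion. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_land():
    """Placeholder: call a marketing/social API and write the payload to the
    bucket prefix watched by the Snowpipe sketched earlier."""
    pass


with DAG(
    dag_id="daily_marketing_ingest",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_land", python_callable=extract_and_land)
```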

Apply now
