Brillio Romania is a dynamic, rapidly growing company with over 160 employees and offices in Cluj, Oradea, and Bucharest. Throughout this fast-paced growth, the company has maintained an unwavering commitment to client satisfaction.
At Brillio Romania, we know our success comes from the innovative contributions and brilliant work of our people, so we make fostering a positive work environment our top priority. Employees at Brillio Romania not only thrive, they also have the chance to build long, fulfilling careers, creating a sense of stability and dedication that ultimately benefits both the company and our valued clients.
Role: Data Engineer - Based in Romania (Fully Remote) - Collaboration type: CIM
We are seeking a highly skilled Data Engineer with strong cloud experience to design, build, and maintain scalable data solutions. The ideal candidate will have expertise in big data technologies, cloud platforms (AWS, Azure, or GCP), and data pipeline orchestration to support our growing data infrastructure needs.
Key Responsibilities:
- Design, develop, and optimize scalable ETL/ELT data pipelines on cloud platforms.
- Implement and maintain data lake, data warehouse, and database solutions.
- Work with structured and unstructured data to ensure efficient data processing and storage.
- Collaborate with data scientists, analysts, and software engineers to deliver high-quality data solutions.
- Ensure data integrity, security, and compliance with industry standards.
- Automate data workflows and optimize system performance.
- Monitor and troubleshoot data infrastructure to ensure high availability and reliability.
Required Qualifications:
- 3+ years of experience in data engineering or a similar role.
- Strong experience with cloud platforms (AWS, Azure, or GCP), particularly with data-related services (e.g., AWS Glue, Redshift, BigQuery, or Azure Synapse).
- Proficiency in SQL and Python for data processing and transformation.
- Experience with big data technologies (Spark, Hadoop, Kafka, etc.).
- Knowledge of orchestration tools like Apache Airflow or AWS Step Functions.
- Familiarity with containerization and DevOps practices (Docker, Kubernetes, Terraform, CI/CD pipelines).
- Strong problem-solving skills and the ability to work in a fast-paced environment.
Preferred Qualifications:
- Experience with streaming data processing (e.g., Apache Flink, Kafka Streams).
- Exposure to machine learning pipelines and MLOps.
- Understanding of data governance and compliance frameworks.
- Data engineering certifications for AWS, Azure, or GCP.
If you are passionate about working with cutting-edge data technologies and want to make a significant impact, we’d love to hear from you!