Data Engineer (relocation to Brussels, Belgium) Opportunity

Hexa Consulting



Visa sponsorship & relocation

We are seeking a talented Data Engineer willing to relocate to or currently located in Brussels, Belgium.


In this role, you will enhance our Observability Platform and maximize its business value. You will collaborate with cross-functional teams and individuals worldwide to understand their observability needs and implement innovative solutions.


As a Data Engineer, you will be pivotal in delivering a smooth and exceptional service observability experience for our internal users. You will apply your technical expertise in data engineering, the ELK Stack (Elasticsearch, Logstash, Kibana), and DevOps tools such as Jenkins, Git, and Ansible to develop robust and efficient solutions.


Key Responsibilities


  • Work closely with global teams to identify and implement observability solutions that address their specific needs.
  • Design, develop, and maintain data pipelines, ETL processes, and data models for our Observability Platform.
  • Ensure smooth integration of various data sources into Elasticsearch, using Logstash and other pipeline technologies.
  • Collaborate with DevOps and engineering teams to automate deployment pipelines and integrate observability tools using Jenkins, Git, and Ansible.
  • Troubleshoot complex technical issues across the platform and provide timely solutions to ensure consistent observability service.
  • Optimize the Observability Platform's performance to enhance its efficiency and business value for internal users.
  • Ensure data models and JSON schemas are well designed, validated, and aligned with industry best practices.
  • Contribute to platform scaling, monitoring, and real-time data processing using Kafka and other relevant technologies (see the sketch after this list).
  • Provide expert-level support for RHEL (Red Hat Enterprise Linux) and ensure it integrates seamlessly with the Observability Platform.
  • Engage in cross-functional communication and collaboration to meet the organization's evolving needs.
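
For illustration only (not part of the original posting): a minimal Python sketch of the kind of pipeline these responsibilities describe, consuming JSON events from Kafka, validating them against a JSON schema, and indexing them into Elasticsearch. The library choices (kafka-python, jsonschema, the Elasticsearch 8.x client) and all names (topic, index, hosts, schema fields) are assumptions, not details from the posting.

```python
# Illustrative sketch only -- library choices, hosts, topic and index names are assumptions.
import json

from kafka import KafkaConsumer             # kafka-python
from jsonschema import validate, ValidationError
from elasticsearch import Elasticsearch     # elasticsearch 8.x client

# Hypothetical JSON schema for an observability event.
EVENT_SCHEMA = {
    "type": "object",
    "properties": {
        "service": {"type": "string"},
        "timestamp": {"type": "string", "format": "date-time"},
        "level": {"type": "string", "enum": ["DEBUG", "INFO", "WARN", "ERROR"]},
        "message": {"type": "string"},
    },
    "required": ["service", "timestamp", "level", "message"],
}

def run_pipeline() -> None:
    # Assumed endpoints; in practice these would come from configuration.
    consumer = KafkaConsumer(
        "observability-events",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )
    es = Elasticsearch("http://localhost:9200")

    for record in consumer:
        event = record.value
        try:
            validate(instance=event, schema=EVENT_SCHEMA)   # JSON schema validation
        except ValidationError as err:
            print(f"Dropping invalid event: {err.message}")
            continue
        # Index the validated event so it becomes searchable in Kibana.
        es.index(index="observability-logs", document=event)

if __name__ == "__main__":
    run_pipeline()
```

In practice, Logstash or Kafka Connect could handle this ingestion path directly; the sketch simply makes the validate-then-index flow behind these responsibilities explicit.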


Required Skills & Qualifications


  • Bachelor’s degree in Computer Science, Information Systems, or a related field (equivalent experience may be considered).
  • Extensive hands-on experience in data engineering, with a focus on designing and implementing data pipelines and ETL processes.
  • Proficiency in the ELK Stack (Elasticsearch, Logstash, Kibana). ELK Data Engineer certification is a plus.
  • Strong knowledge of DevOps tools such as Jenkins, Git, and Ansible.
  • Expertise in RHEL (Red Hat Enterprise Linux) and Kafka for real-time data processing.
  • Solid understanding of data modeling concepts and expertise in JSON schema design and validation.
  • Experience in Java development is preferred.
  • Proven ability to troubleshoot and solve complex technical issues.
  • Excellent cross-functional communication and collaboration skills, with the ability to work effectively with global teams.


Additional Considerations


  • Strong problem-solving skills and ability to work independently and in a team setting.
  • Experience in supporting and maintaining large-scale data engineering platforms.
  • Familiarity with observability, monitoring, and logging best practices is desirable.


Why Join Us?


  • Official full-time employment with an indefinite contract from the start
  • Great relocation package
  • Health insurance provided
  • Hybrid work model: 2 days on-site, 3 days remote
  • Access to Udemy Business subscription with thousands of workshops and courses
  • Collaborative and innovative work environment
  • Engagement in diverse and challenging projects to keep you motivated and enhance your skills
Apply now
