Big Data Engineer - Trading/Fintech, $120K-$140K, Remote (Portugal)

Posted 2 months ago
Our clients work in the blockchain space, so please apply only if this is an area of interest to you.

About Our Client

Our client is the fastest Telegram bot on Solana, with over $10 billion in traded volume. We empower traders with advanced on-chain trading tools such as DCA orders, limit orders, and wallet copy-trading, offering a seamless, innovative experience.

Why Join Us?

Our Client is synonymous with speed, innovation, and cutting-edge trading solutions. This is a unique opportunity to lead and build the data infrastructure for our project, collaborating with an elite team to shape a product that directly impacts thousands of active users in a fast-growing ecosystem.

Role Overview

We are looking for a Big Data Engineer to take ownership of data architecture, ensuring scalability, low latency, and reliability. The ideal candidate will lead the design and implementation of data pipelines, real-time processing systems, and analytics platforms that support trading decisions and insights.

Key Responsibilities

Data Architecture Design: Design and maintain a scalable, high-performance data architecture tailored for real-time market data, trading events, and analytics.

Tool Selection: Identify and integrate the most effective big data tools and frameworks to handle the ingestion, processing, and storage of Solana-based blockchain data.

Real-Time Data Processing: Build and maintain stream-processing systems using tools like Apache Kafka, Spark Streaming, or Flink for real-time price feeds and trading events.

Data Storage Optimization: Design and optimize storage solutions using a combination of in-memory databases (e.g., Redis) for active trading data and scalable databases (e.g., Cassandra, ClickHouse) for analytics.

Performance Monitoring: Monitor, troubleshoot, and optimize the performance of the data pipeline to handle high-throughput scenarios, such as trading spikes.

Scalability: Implement caching strategies and horizontal scaling solutions to maintain low latency and high availability.

Observability: Deploy monitoring systems (e.g., Prometheus, ELK Stack) to oversee system health, data flow, and anomalies.

Collaboration: Work closely with engineering, product, and analytics teams to align data solutions with business goals.

Troubleshooting: Resolve issues in the big data ecosystem and ensure high availability and reliability.

Requirements

Technical Expertise:

Proficiency in distributed computing principles and large-scale data management for financial or trading systems.

Proficiency in big data tools such as Kafka, Spark, and Flink.

Strong expertise in stream-processing frameworks like Spark Streaming, Apache Flink, or Storm.

Hands-on experience with the Hadoop ecosystem, including HDFS, MapReduce, and YARN.

Proficiency in TypeScript, with 5+ years of experience.

Proficiency in ETL tools and frameworks, such as Apache NiFi, Airflow, or Flume.

Expertise in messaging systems such as Kafka, RabbitMQ, or Pulsar.

Strategic Insight:

Proven ability to design and implement scalable data solutions for high-performance systems.

Strong understanding of data lifecycle management and governance principles.

Team Collaboration:

Strong communication skills to collaborate across technical and business teams.

Self-starter with the ability to work independently and solve complex problems.

Preferred Experience

Familiarity with the Solana ecosystem and crypto trading platforms.

Experience implementing P&L analytics, user behavior insights, and predictive trading models using tools like Spark MLlib or TensorFlow.

What We Offer

Remote Flexibility: Work from anywhere while contributing to a high-impact role.

Growth Opportunities: Be a key player in defining data infrastructure.

Challenging Projects: Work with cutting-edge technologies and tackle complex data challenges.

Collaborative Culture: Join a team that values innovation, expertise, and efficiency.

Interview Process

1. Recruiter/HR call

2. Technical interview

3. Founder/CEO interview

Apply now
