Data Scientist (Remote) Opportunity

Company: Empiric


Data Scientist (Remote) in the European Union


Role: Data Scientist (2 Openings)

Rate: €400 per day

Contract Type: 3-month rolling contract (project expected to run for 18–24 months)

Work Arrangement: Fully remote

Interview Rounds: Two stages


Project Overview:

You'll be contributing to a large-scale energy trading and risk management (ETRM) initiative, focused on developing and deploying advanced machine learning solutions to support forecasting, analytics, and automation.


Essential Skills & Experience:

  • Strong background in time-series forecasting using models like Prophet, ARIMA, SARIMA, XGBoost, Random Forest, ElasticNet, Ridge, Lasso, and Linear Regression.
  • Hands-on experience with Python machine learning libraries including scikit-learn, sktime, and darts.
  • In-depth understanding of time-series feature engineering, including creation of lag variables, rolling window metrics, Fourier transforms, and approaches to seasonality.
  • Capable of optimizing and fine-tuning deployed predictive models for enhanced accuracy and stability.
  • Practical experience using Azure Machine Learning SDK (both v1 and v2), particularly for:
      • Managing assets (data, models, environments)
      • Creating and debugging ML pipelines for feature engineering, training, and deployment
      • Scheduling jobs and deploying endpoints
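As an illustration of the feature-engineering techniques listed above (lag variables, rolling-window metrics, and Fourier terms for seasonality), here is a minimal pandas sketch; the function name, parameters, and sample data are illustrative only, not part of the project's actual codebase:

```python
import numpy as np
import pandas as pd

def make_ts_features(series, lags=(1, 7), window=7, fourier_order=2, period=7):
    """Build a feature table from a time series: lag variables,
    rolling-window statistics, and Fourier seasonality terms."""
    df = pd.DataFrame({"y": series})
    # Lag variables: the series shifted back by each lag
    for lag in lags:
        df[f"lag_{lag}"] = series.shift(lag)
    # Rolling-window metrics, shifted by 1 so features use only past values
    df[f"roll_mean_{window}"] = series.shift(1).rolling(window).mean()
    df[f"roll_std_{window}"] = series.shift(1).rolling(window).std()
    # Fourier terms encoding seasonality with the given period
    t = np.arange(len(series))
    for k in range(1, fourier_order + 1):
        df[f"sin_{k}"] = np.sin(2 * np.pi * k * t / period)
        df[f"cos_{k}"] = np.cos(2 * np.pi * k * t / period)
    # Drop rows made incomplete by lagging/rolling
    return df.dropna()

# Example: 60 days of synthetic data with weekly seasonality plus trend
idx = pd.date_range("2024-01-01", periods=60, freq="D")
y = pd.Series(np.sin(2 * np.pi * np.arange(60) / 7) + 0.1 * np.arange(60), index=idx)
feats = make_ts_features(y)
print(feats.columns.tolist())
```

The resulting table can be fed directly to scikit-learn regressors (e.g. Ridge or XGBoost) as a tabular forecasting setup; shifting before rolling avoids leaking the current target value into its own features.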


Additional Skills (Nice to Have):

  • Familiarity with unsupervised learning techniques such as K-Means clustering.
  • Experience designing and building scalable data solutions using cloud-native tools like Azure Data Factory, Azure Databricks, Azure Data Lake, and Azure Key Vault.
  • Ability to create and maintain resilient data pipelines for ETL processes, logging, and transformation using Azure Data Factory.
  • Skilled in large-scale data manipulation and analysis using PySpark and Python.
  • Ability to collaborate effectively with product and engineering teams to translate business goals into data-driven solutions.
  • Experience setting up CI/CD pipelines within Azure DevOps for machine learning and data infrastructure.
  • Commitment to data engineering best practices, including governance, security, and performance tuning.
  • Up to date with modern data technologies and methodologies, with a drive to evolve existing frameworks.


Apply now or reach out to Odin Lawton.

