We have an upcoming project based in Europe, expected to last at least 6 months.
Requirements
- Solid experience in Databricks, including hands-on data engineering and analytics.
- Strong proficiency in SQL and Python/PySpark.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Previous experience on similar projects (e.g., Abbott or Bayer) is a plus.
Responsibilities
- Design, develop, and maintain data pipelines and workflows in Databricks.
- Work with large datasets, performing ETL and data transformation tasks using SQL and Python/PySpark.
- Collaborate with the team to implement cloud-based data solutions and optimize performance.
- Support data analytics and reporting requirements as needed.

Languages
English level B2+ (Upper-Intermediate or higher)

Time zone
Europe (CET)

Benefits
- Monthly salary in USD
- 100% remote work opportunity
- 15 working days of paid vacation per year (available after 6 months)
- Up to 10 national holidays (based on the project team location or the client's country)
- 5 paid sick leave days
- Health insurance reimbursement (up to $1,000 gross per year)