Data Engineer AWS @ apreel Sp. z o.o.

apreel Sp. z o.o.
Remote, Czech Republic

Job description

Data Engineer AWS

We are seeking a skilled and proactive Data Engineer (Mid or Senior) to join our growing data team. In this role, you will design, build, and maintain modern data pipelines and architecture in a cloud-native environment using AWS, Databricks, Python, and dbt. You’ll work closely with analysts, data scientists, and business stakeholders to ensure clean, scalable, and reliable data delivery across the organization.

Offer:

  • Location: Poland / REMOTE
  • Employment: B2B contract with apreel
  • Rate: up to 170 zł/h

Required Skills & Experience:

  • Solid experience as a Data Engineer (2+ years for mid-level, 4+ years for senior)
  • Strong hands-on experience with AWS data services (e.g., S3, Glue, Redshift, Lambda)
  • Proficiency in Python for data engineering tasks
  • Practical knowledge of Databricks and the Spark ecosystem
  • Experience with dbt (Data Build Tool) for data transformation and modeling
  • Familiarity with modern version control and CI/CD workflows (e.g., Git, GitHub Actions, Jenkins)

Nice to Have:

  • Knowledge of data orchestration tools (e.g., Airflow, Prefect)
  • Experience with data quality tools and observability frameworks
  • Exposure to infrastructure as code (e.g., Terraform, CloudFormation)
  • Experience working in Agile/Scrum environments

Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Databricks and dbt
  • Build and manage data models, data lakes, and data warehouses on AWS
  • Write efficient and scalable code in Python to process large datasets
  • Collaborate with cross-functional teams to understand data needs and deliver solutions
  • Ensure data quality, observability, and performance across the entire pipeline
  • Support continuous integration and deployment of data workflows
  • Implement data governance, security, and compliance best practices
