Senior Data Engineer (Healthcare domain)

Bulgaria · Europe (remote) · Poland · Romania · Big Data · Engineering · Python

Required skills

Python / expert
Databricks / expert
Azure / strong
SQL / expert
Kubernetes / strong

Are you passionate about building large-scale cloud data infrastructure that makes a real difference? We are looking for a Senior Data Engineer to join our team and work on an impactful healthcare technology project. This role offers a remote work format with the flexibility to collaborate across international teams.

At Sigma Software, we deliver innovative IT solutions to global clients in multiple industries, and we take pride in projects that improve lives. Joining us means working with cutting-edge technologies, contributing to meaningful initiatives, and growing in a supportive environment.

Client

Our client is a leading medical technology company. Its portfolio of products, services, and solutions is at the center of clinical decision-making and treatment pathways. Patient-centered innovation has always been, and will always be, at the core of the company. The client is committed to improving patient outcomes and experiences, regardless of where patients live or what challenges they face, and innovates sustainably to provide healthcare for everyone, everywhere.

Project

The project focuses on building and maintaining large-scale cloud-based data infrastructure for healthcare applications. It involves designing efficient data pipelines, creating self-service tools, and implementing microservices to simplify complex processes. The work will directly impact how healthcare providers access, process, and analyze critical medical data, ultimately improving patient care.

Requirements

  • Hands-on experience with cloud computing services in data and analytics
  • Experience with data modeling, reporting tools, data governance, and data warehousing
  • Proficiency in Python and PySpark for distributed data processing
  • Experience with Azure, Snowflake, and Databricks
  • Experience with Docker and Kubernetes
  • Knowledge of infrastructure as code (Terraform)
  • Advanced SQL skills and familiarity with big data warehouses such as Snowflake and Redshift
  • Experience with stream processing technologies such as Kafka and Spark Structured Streaming
  • At least an Upper-Intermediate level of English

Responsibilities

  • Collaborate with the Product Owner and team leads to define and design efficient pipelines and data schemas
  • Build and maintain infrastructure using Terraform for cloud platforms
  • Design and implement large-scale cloud data infrastructure, self-service tooling, and microservices
  • Work with large datasets to optimize performance and ensure seamless data integration
  • Develop and maintain squad-specific data architectures and pipelines following ETL and Data Lake principles
  • Discover, analyze, and organize disparate data sources into clean, understandable schemas

Why us

  • Diversity of domains and businesses
  • A wide choice of technologies
  • Medical and legal support
  • A great, vibrant community of professionals
  • Continuous education and opportunities for growth
  • Flexible schedule
  • Remote work
  • A stylish and comfortable office (you choose where you work from)
  • Sports activities and communities

REF3813F
