Senior Data Engineer (Integrations / Data Platform)

Senior · Remote
Europe, Albania, Poland, Romania, Ukraine, Latin America
Engineering, Data Engineering, Software Development, Java

Required skills

Java / expert
Scala / expert
SQL / strong
Data Modelling / strong
GCP Dataflow / good

Are you passionate about building scalable, reliable data systems that power real-world solutions? We are looking for a Senior Data Engineer to join our team and work remotely from Europe, Ukraine, or LATAM.

At Sigma Software, we partner with innovative companies to deliver cutting-edge technology solutions. This role offers the opportunity to design and operate a canonical data platform that integrates diverse external data sources, ensuring high quality and flexibility.

Why join us? You’ll work with modern technologies, collaborate with talented professionals across multiple regions, and have the freedom to experiment and innovate while contributing to impactful projects in the AdTech domain.

Customer

Our client is a fast-growing AdTech company focused on solving real-world challenges in digital advertising and data-driven marketing. They specialize in building highly resilient and scalable systems that encourage experimentation and innovation. Their data platform is designed to be both powerful and flexible, enabling engineers to introduce new ideas and leverage the latest technologies to optimize advertising performance and audience insights.

Project

The project involves developing and maintaining a canonical data platform that integrates multiple external data sources. It supports both full data loads and incremental updates, ensures high data quality, and leverages AI-assisted tools for monitoring and issue detection.

Key Technologies: Java, Scala, SQL, GCP Dataflow, REST APIs, AI-assisted tools, event-driven architectures

Requirements

  • At least 5 years of experience as a Data Engineer or Backend/Data Engineer
  • Strong production experience with Java or Scala
  • Proven experience integrating external systems via REST APIs (pagination, rate limiting, token-based authentication)
  • Strong data modelling skills with structured, relational data
  • Solid SQL knowledge
  • Experience with GCP Dataflow
  • Experience operating data pipelines in production environments
  • Strong communication skills and ability to collaborate across teams
  • At least an Upper Intermediate level of English
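The REST-integration skills listed above (pagination, rate limiting, token-based authentication) boil down to following a page token until the source is exhausted. Below is a minimal, hypothetical sketch: the `Page` record, the fetcher interface, and the token names are placeholders for illustration, not the client's actual API. A real integration would issue HTTP calls (e.g. with `java.net.http.HttpClient`), send a bearer token, and back off on 429 responses where the sleep comment sits.

```java
import java.util.*;
import java.util.function.Function;

// Illustrative sketch of token-paginated ingestion from a REST-style source.
public class PagedIngest {

    // One page of results plus the token for the next page (null = last page).
    public record Page(List<String> records, String nextToken) {}

    // Pull every page, following nextToken until the source is exhausted.
    public static List<String> ingestAll(Function<String, Page> fetchPage) {
        List<String> all = new ArrayList<>();
        String token = null;              // null token requests the first page
        do {
            Page page = fetchPage.apply(token);
            all.addAll(page.records());
            token = page.nextToken();
            // Production code would sleep / back off here to respect rate limits.
        } while (token != null);
        return all;
    }

    public static void main(String[] args) {
        // Fake two-page source standing in for a real REST endpoint.
        Map<String, Page> pages = Map.of(
                "start", new Page(List.of("a", "b"), "p2"),
                "p2", new Page(List.of("c"), null));
        Function<String, Page> fake = t -> pages.get(t == null ? "start" : t);
        System.out.println(ingestAll(fake)); // prints [a, b, c]
    }
}
```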

WILL BE A PLUS:

  • Experience with event-driven architectures or messaging systems
  • Designing canonical schemas across multiple data sources
  • Experience with LLMs (Large Language Models)

Personal Profile

  • Proactive and detail-oriented
  • Thrives in collaborative environments
  • Enjoys solving complex integration challenges
  • Comfortable experimenting with new technologies
  • Committed to improving data systems

Responsibilities

  • Design, build, and maintain reliable data pipelines to ingest and process data from external systems
  • Implement integrations using Java or Scala, ensuring scalability and resilience
  • Handle both full data loads and incremental updates efficiently
  • Collaborate with engineering and product teams to align on integration and data needs
  • Develop and maintain a unified internal data model
  • Ensure data quality by detecting and resolving incorrect or incomplete data
  • Utilize AI-assisted tools to improve data quality and operational efficiency
  • Monitor and optimize data pipelines for production readiness
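The full-load vs. incremental-update responsibility above can be sketched as a keyed upsert: a full load replaces the store with a fresh snapshot, while an increment only overwrites records that are newer than what is already held. This is an illustrative sketch under assumed names (`Rec`, `updatedAt`); the client's canonical model and storage layer will differ.

```java
import java.util.*;

// Illustrative sketch of full loads vs. incremental upserts into a
// canonical store keyed by record id.
public class Loader {

    public record Rec(String id, long updatedAt, String payload) {}

    // Full load: replace everything with the new snapshot.
    public static Map<String, Rec> fullLoad(List<Rec> snapshot) {
        Map<String, Rec> store = new HashMap<>();
        for (Rec r : snapshot) store.put(r.id(), r);
        return store;
    }

    // Incremental: upsert only records newer than the held version,
    // so late or duplicate deliveries cannot clobber fresher data.
    public static void applyIncrement(Map<String, Rec> store, List<Rec> delta) {
        for (Rec r : delta) {
            Rec cur = store.get(r.id());
            if (cur == null || r.updatedAt() > cur.updatedAt()) {
                store.put(r.id(), r);
            }
        }
    }

    public static void main(String[] args) {
        Map<String, Rec> store = fullLoad(List.of(new Rec("1", 1L, "old")));
        applyIncrement(store, List.of(new Rec("1", 2L, "new"),
                                      new Rec("2", 1L, "b")));
        System.out.println(store.get("1").payload()); // prints new
    }
}
```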

WHY US

  • Diversity of Domains & Businesses
  • Variety of technology
  • Health & Legal support
  • Active professional community
  • Continuous education and growth
  • Flexible schedule
  • Remote work
  • Outstanding offices (if you choose to work from one)
  • Sports and community activities

REF3987P
