Senior/Principal Data Engineer

Brazil · Latin America · Data Engineering · Python

Required skills

Data Engineering / expert
Python / expert
Data Modeling / strong
ETL/ELT / strong
Data Quality & Monitoring / strong

We are looking for a Senior/Principal Data Engineer to join our team and design a scalable, future‑proof data infrastructure.

Customer

Our customer is a rapidly scaling AI‑driven SaaS platform that helps finance and accounting teams automate critical workflows — from billing and collections to revenue recognition and reporting. Their technology eliminates manual work, accelerates cash flow, and ensures compliance for high‑growth businesses.

Project

This greenfield project focuses on building a robust data platform that will power analytics, business intelligence, and AI‑driven features. You will design scalable data lakes or lakehouses, develop ingestion pipelines, and ensure data quality and observability. The platform will integrate with multiple internal and external systems, enabling self‑service analytics for both business and technical teams.

Requirements

  • 6+ years of experience in data engineering
  • Strong programming skills in Python
  • Proven track record of designing and delivering early‑stage data platforms from concept (v1) to production
  • Strong expertise with modern data tooling (e.g., Snowflake/BigQuery/Redshift, dbt, Airflow/Dagster/Prefect, Fivetran/Airbyte, etc.)
  • Solid understanding of data modeling, ETL/ELT, and pipeline optimization
  • Strong knowledge of data quality, testing, and monitoring best practices
  • Upper‑Intermediate English or higher

WILL BE A PLUS

  • Experience with AI/ML data pipelines
  • Familiarity with finance/accounting datasets
  • Knowledge of compliance frameworks such as SOX or GDPR
  • Experience mentoring junior engineers

Personal Profile

  • Proactive problem‑solver with a hands‑on approach
  • Adaptable to fast‑moving environments
  • Strong communication skills for cross‑team collaboration
  • Ability to take ownership and drive initiatives to completion

Responsibilities

  • Design and implement a scalable data warehouse or data lakehouse to support analytics, reporting, and business KPIs
  • Develop and maintain reliable batch and/or streaming data pipelines from internal databases and external systems
  • Collaborate with stakeholders to translate business requirements into efficient data models and schemas
  • Establish and maintain data modeling standards and best practices
  • Implement monitoring, data quality controls, and observability for all data workflows
  • Provide well‑structured datasets to enable self‑service analytics for BI and data teams
  • Document the data platform, including lineage, definitions, and contracts, to create a shared source of truth for metrics

WHY US

  • Diversity of Domains & Businesses
  • Variety of technology
  • Health & Legal support
  • Active professional community
  • Continuous education and growth
  • Flexible schedule
  • Remote work
  • Outstanding offices (if you choose to work from one)
  • Sports and community activities

REF3887A
