Senior Big Data Engineer (AdTech)

  • Remote (Ukraine only)
  • Big Data

Are you thrilled about designing the right architecture for a new client and kicking it off at full scale in production? Do you enjoy putting a strong team together and leading it toward a project's success? Are you keen on sharing your knowledge and educating the people around you? Then you are very likely to enjoy this role!

Being our team member means being open-minded, proactive, friendly, and supportive: ready to accept a challenge, take responsibility, and never stop learning.


We are working with a rapidly growing US AdTech company. Founded by three ex-Googlers, it has a highly technical team and an excellent engineering culture.


The project involves building the next generation of real-time bidding software that lets sophisticated marketers break free from the constraints of opaque, one-size-fits-all programmatic buying platforms. Most of the ETL is written in Python and managed by Airflow DAGs.
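
To give a concrete flavor of that setup, here is a minimal sketch of an Airflow-managed ETL step. The DAG name, schedule, and task bodies are hypothetical; it only illustrates the Python-plus-Airflow pattern described above.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_events():
        # Placeholder: pull a day's raw events from the upstream source.
        ...

    def load_to_snowflake():
        # Placeholder: stage the transformed batch into Snowflake.
        ...

    with DAG(
        dag_id="daily_event_etl",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
        load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
        extract >> load                # run the extract step, then the load step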

The Kinesis application is written in Java and handles a variety of scaling and configuration challenges. An excellent working knowledge of SQL is critical: a number of the ETL steps run in Snowflake, where a poorly written query can have a significant performance and cost impact.
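
As a hedged illustration of that cost point (table and column names are invented): Snowflake stores tables as columnar micro-partitions and bills for warehouse runtime, so selecting only the needed columns and filtering on a date column can turn a full table scan into a much smaller read.

    # A wasteful query shape vs. a targeted one; all names are hypothetical.
    FULL_SCAN = """
        SELECT *                -- reads every column of every micro-partition
        FROM events
    """

    PRUNED = """
        SELECT event_id, campaign_id, event_ts
        FROM events
        WHERE event_ts >= DATEADD(day, -1, CURRENT_DATE)  -- enables micro-partition pruning
    """

On a table holding terabytes of events, the difference between these two shapes shows up directly on the warehouse bill.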

You will cooperate closely with the Big Data team on the customer's side to improve the platform, both by adding new features and by making it more efficient performance-wise. The team currently works on several tools and applications. By joining our team, you would work with some or all of them:

  • Manage and build on a high-scale event-parsing and recording system. We use Kinesis to handle billions of events and ship them to S3, Snowflake, databases, and a variety of other logs, both internal and external (see the sketch after this list)
  • Manage and build on a set of ETL pipelines that move terabytes of data through Snowflake
  • Operate multiple services that provide real-time data flows to our internal systems (both the UI and the optimization engines)
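
For a flavor of the first item, here is a minimal Python sketch of the Kinesis read path using boto3. The production consumers are Java applications, and the stream name and handle() function below are hypothetical.

    import time

    import boto3

    kinesis = boto3.client("kinesis")
    stream = "ad-events"  # hypothetical stream name

    # Read from the first shard; a real consumer fans out across all shards.
    shard_id = kinesis.describe_stream(StreamName=stream)["StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream,
        ShardId=shard_id,
        ShardIteratorType="LATEST",  # start at the tip of the shard
    )["ShardIterator"]

    while iterator:
        resp = kinesis.get_records(ShardIterator=iterator, Limit=1000)
        for record in resp["Records"]:
            handle(record["Data"])  # hypothetical: parse and route the raw event bytes
        time.sleep(1)  # stay under the per-shard GetRecords rate limit
        iterator = resp.get("NextShardIterator")
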
  • Responsibilities

    • Contributing to investigations of new technologies and the design of complex solutions
    • Coming up with well-designed technical solutions and robust code
    • Working and professionally communicating with the customer’s team
    • Taking responsibility for delivering major solution features
    • Participating in requirements gathering and clarification process
    • Developing core modules and functions
    • Performing code reviews and writing unit and integration tests
  • Requirements

    • Solid experience with Java or Python
    • Solid knowledge of algorithms and data structures
    • Experience with AWS technologies (Glue, Lambda, Step Functions, S3, etc.)
    • Experience developing data pipelines based on a mainstream framework such as Spark, Flink, or Presto
    • Experience developing data lakes and data warehouses based on mainstream OLAP technologies such as Hive, Snowflake, or Hudi
    • Knowledge of SQL and solid experience with NoSQL or RDBMS technologies
    • Experience with data integrations and different data formats (CSV, Protobuf, Parquet, Avro, ORC, etc.)
    • Solid understanding of technical excellence and hands-on experience with code reviews, test development, and CI/CD

    Would be a plus:

    • Experience with developing Snowflake-driven data warehouses
    • Experience with developing Kinesis-driven streaming data pipelines