Middle/Senior Big Data Engineer (Big Data Competence Center)

Locations: Argentina, Brazil, Colombia, Europe, Europe (remote), Latin America, Mexico, Poland, Remote (Ukraine), Ukraine · Big Data Engineering

Required skills

Python / good
AWS / good
English / good

Are you looking for a team of professionals working with cutting-edge technologies? Are you on the lookout for a place to boost your Big Data career? We invite you to join our Big Data Competence Center, a part of Sigma Software's broader organizational structure that brings together various clients, interesting projects, and activities to grow your professional skills.


The Big Data Competence Center is a place where we collect the best engineering practices and unite them into a single knowledge base so that we can provide top-quality technical services to our clients.

We are not just a team gathered to write code. We are all eager to contribute to the field, whether by participating in the life of the Big Data Unit or by constantly growing our own skills.

In our unit, you can be a Big Data engineer, team lead, or architect; you can become a mentor, the person behind all the new technologies in the team, or simply an active listener if you prefer. Whatever you decide, be sure: you're not alone. Whether it's a challenging task, an unusual request from a client, or choosing your next project, you will always have a mentor and like-minded teammates to help come up with the best solution.

We take an unusual approach to hiring: we hire people not for a specific project, but for our team first. This gives us a chance to get to know you better and to ensure a perfect match between the client's needs and your professional interests.

We work across various business domains with a top range of clients (see the ones without an NDA at sigma.software/case-studies).

Our team strongly supports employees' freedom, independence in decision-making, and the desire to deeply understand a client's request and get to the root of the problem. We believe that people who strive for this mindset are bound to succeed as acclaimed professionals and a driving force of Big Data development in Ukraine.

If you have the courage to become part of the top-notch Big Data community in Ukraine, grab your chance and let's make some history together!


Requirements:

  • 3+ years of experience with Python and SQL
  • Experience with AWS, specifically API Gateway, Kinesis, Athena, RDS, and Aurora 
  • Experience building ETL pipelines for analytics and internal operations 
  • Experience building internal APIs and integrating with external APIs 
  • Experience working with the Linux operating system
  • Effective communication skills, especially for explaining technical concepts to non-technical business leaders
  • Desire to work on a dynamic, research-oriented team 
  • Experience with distributed application concepts and DevOps tooling 
  • Excellent writing and communication skills 
  • Troubleshooting and debugging ability


Would be a plus:

  • 2+ years of experience with Hadoop, Spark, and Airflow 
  • Experience with DAGs and orchestration tools 
  • Experience with developing Snowflake-driven data warehouses 
  • Experience with developing event-driven data pipelines 


Responsibilities:

  • Contributing to investigations of modern technologies and to complex solution design, supporting a culture of innovation with attention to security, scalability, and reliability
  • Working with modern data stack, producing well-designed technical solutions and robust code, implementing data governance processes 
  • Working with and professionally communicating with the customer's team
  • Taking responsibility for delivering major solution features
  • Participating in requirements gathering & clarification process, proposing optimal architecture strategies, leading the data architecture implementation 
  • Developing core modules and functions, designing scalable and cost-effective solutions 
  • Performing code reviews, writing unit and integration tests 
  • Scaling the distributed system and infrastructure to the next level 
  • Building data platforms using the power of modern cloud providers (AWS/GCP/Azure)


Extra responsibilities:

  • Developing micro-batch and real-time streaming pipelines (Lambda architecture)
  • Working on POCs for validating proposed solutions and migrations 
  • Leading the migration to a modern technology platform, providing technical guidance
  • Adhering to CI/CD methods, helping implement best practices in the team 
  • Contributing to unit growth, mentoring other members in the team (optional) 
  • Owning the whole pipeline and optimizing the engineering processes 
  • Designing complex ETL processes for analytics and data management, and driving their large-scale implementation


We offer:

  • Diversity of domains & businesses
  • Variety of technology
  • Health & Legal support
  • Active professional community
  • Continuous education and growth
  • Flexible schedule
  • Remote work
  • Outstanding offices (if you choose to work from one)
  • Sports and community activities
