Senior Data Engineer (Scala)

NIX, a global supplier of software engineering and IT outsourcing services, is looking for a Senior Data Engineer (Scala) for its office in Budapest (Váci Greens, 13th district). You’ll join a team of professionals ready to craft tailor-made IT solutions for multinational clients across a range of industries and to solve complex problems.


Responsibilities:

  • Collaborate with cross-functional teams to identify, design, and implement real-time data processing features to meet dynamic business requirements.
  • Architect and maintain scalable systems for extracting, transforming, and loading data from diverse sources such as external APIs, data streams and databases.
  • Implement robust solutions adhering to data privacy and security standards, ensuring compliance and data integrity.
  • Monitor industry trends and advancements in real-time data processing, proposing and implementing changes aligned with organizational goals.
  • Share expertise and collaborate with teams on various data engineering and project-related topics.
  • Evaluate, select, and implement tools and strategies for effective real-time data integration scenarios.


What we expect from you:

  • 5+ years of commercial experience in data engineering, emphasizing real-time processing.
  • Proficient programming skills in Scala.
  • In-depth understanding and practical experience with distributed computing approaches, patterns, and technologies, especially those related to real-time processing (e.g., Spark Streaming, Akka Streams).
  • Hands-on experience with any cloud platform (GCP, AWS, Azure) and its data-oriented components.
  • Strong proficiency in SQL and query tuning.
  • Expertise in data warehousing principles and modeling concepts.
  • Competence in using relational databases (PostgreSQL, MSSQL or MySQL).
  • Experience orchestrating data flows using tools like Apache Flink, Apache Kafka or similar technologies.
  • A collaborative team player with excellent communication skills.
  • Minimum English level B2.


Will be a plus:

  • Expertise in stream processing using industry standards (e.g., AWS Kinesis, Kafka Streams, Spark).
  • Deep knowledge of data storage design principles and understanding of SQL/NoSQL solutions and their configurations.
  • Experience in modern data warehouse building using platforms like Snowflake, AWS Redshift or BigQuery.
  • Proficiency in Spark internals, including tuning and query optimization for real-time processing.
  • Background in data integration and business intelligence architecture with a real-time perspective.
  • Experience with data lakes and lake-houses (e.g., Azure Data Lake, Apache Hudi, Apache Iceberg, Delta Lake).
  • Familiarity with containerized (Docker, ECS, Kubernetes) or serverless (Lambda) deployment.
  • Knowledge of popular data standards and formats (e.g., JSON, Avro, ProtoBuf).
  • Exposure to data science and machine learning.


What we offer:

  • Competitive compensation packages.
  • Stable employment, based on a full-time employment contract.
  • Private health insurance (Medicover Clinic).
  • AYCM sport pass, providing discounts at various sports facilities in Hungary.
  • Interesting tasks and diverse opportunities for developing your skills.
  • Free training courses, including English.
  • Participation in internal and external thematic events, technical conferences.
  • A spacious office in the heart of Budapest (13th district).
  • All necessary devices and tools for your work.
  • Friendly, motivating atmosphere.
  • Active corporate life.

If you feel you’re ready to join the team, apply for this job now! We’re already looking forward to meeting you!

Place of work

Budapest
