Big Data Engineer with Scala (Relocate to Poland), Kazakhstan


Luxoft Kazakhstan

2 months ago

🔔 Are you ready to relocate to Poland? If YES, this is the project for YOU!

Our benefits:

👩‍⚕️ Private Medical Care in Luxmed and Life Insurance

🏋️‍♀️ Multisport Card

👨‍👧‍👦 Paid referrals

📚 Self-learning libraries

🛫 Relocation package and assistance throughout the whole process... and MORE!

Project Description:

A cloud-based reference data platform, a new Finance, Risk and Data Analytics capability, that will provide data mastering and distribution for various reference data domains, including Instruments, Ratings, Book Data, Product Taxonomy, Legal Entity, Industries, Countries and Currencies. As a service, we cleanse, enrich and assure data quality in one centralized place, offering the data to different systems, applications and users, irrespective of where they sit in the organization or on the network.

As a Big Data Engineer you will build big data solutions that solve some of the organisation's toughest problems and deliver significant business value. This is a really exciting time to join, as you will help shape the Reference Data Mastering and Distribution architecture and technology stack within our new cloud-based data lakehouse.

We're a truly global, collaborative and friendly group of people. Having a diverse, inclusive and respectful workplace is important to us. And we support your career development, internal mobility and work-life balance.


Responsibilities:

• Shape the portfolio of business problems to solve by building detailed knowledge of internal data sources

• Model data landscape, obtain data extracts and define secure data exchange approaches

• Acquire, ingest, and process data from multiple sources and systems into Cloud Data Lake

• Operate in a fast-paced, iterative environment while remaining compliant with the bank's Information Security policies and standards

• Collaborate with others to map data fields to hypotheses and curate, wrangle, and prepare data for use in their advanced analytical models

• Help architect the strategic advanced analytics technology landscape

• Build reusable code and data assets

• Codify best practices and methodology, and share knowledge with other engineers across the bank

Must have skills:

• Experience in software development, including a clear understanding of data structures, algorithms, software design and core programming concepts

• Experience in Distributed Processing using Databricks (preferred) or Apache Spark

• Meaningful experience with Scala

• Ability to debug using tools such as the Ganglia UI; expertise in optimizing Spark jobs

• Experience and interest in Cloud platforms such as Azure (preferred) or AWS

• The ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets

• Expertise in creating data structures optimized for storage and various query patterns, e.g. Parquet and Delta Lake

• Meaningful experience in at least one database technology such as:

- Traditional RDBMS (MS SQL Server, Oracle, PostgreSQL)

- NoSQL (MongoDB, Cassandra, Neo4J, CosmosDB, Gremlin)

• Understanding of Information Security principles to ensure compliant handling and management of data

• Experience with traditional data warehousing / ETL tools (Azure Data Factory, Informatica)

• Ability to clearly communicate complex solutions

• Proficiency in working with large and complex code bases (GitHub, Gitflow, Fork/Pull model) and development tools such as IntelliJ

• Working experience with Agile methodologies (Scrum, XP, Kanban)

• Comfortable multi-tasking, managing multiple stakeholders and working as part of an agile team

• Excellent communication skills including experience speaking to technical and business audiences and working globally

• Strong problem solving and analytical skills

• Keen to learn and share new concepts

Nice to have:

• Relevant certifications


English: B2 (Upper-Intermediate)
