We are continuing to grow and need to bolster our team with more exceptional Java/Scala developers. In this role you will build big data services and pipelines in Scala at one of our most innovative clients. You will develop algorithms to match, conflate, and identify anomalies, and improve the simplicity, scale, and efficiency of the systems so they can handle ever-growing volumes of data while shortening turnaround time to keep the data as fresh as possible. You are also expected to identify and propose improvements, such as automating parts of the current process and formulating requirements and design approaches for new functionality.
This role will suit you if you are quick to identify solutions and bring a great deal of creativity to the problem-solving process. Your colleagues will be among the best and brightest the region has to offer; they will support you in learning more and challenge your thoughts and ideas in order to arrive at the best solution. Despite your substantial experience and knowledge as a developer, it comes naturally to you to keep growing in your role, to benefit from your new colleagues and their wisdom, and to share your own experience and help others apply it in their work.
- Able to wear multiple hats, “do what it takes” – ability and attitude;
- Strong programming and design skills;
- Deep knowledge of and extensive experience with the JVM, preferably Scala;
- Excellent analytical and problem-solving skills;
- Excellent oral and written communication skills in English.
Extra merit qualifications
- Event streaming, e.g. Kafka;
- Ability to design and implement APIs and REST services;
- Experience with one or more application frameworks such as Spark, Akka, or Play; big data ETL and data streaming;
- NoSQL Databases;
- Experience with, or knowledge of, applying Machine Learning or AI to large amounts of data.
Location: Sweden (on-site)