Luxoft is a global leader in high-end software development.
Luxoft is looking for talented people with a passion for technology who are ready to create original solutions. Once on board, you are invited to expand your knowledge and skills through a continuous learning experience that helps you stretch your potential.
So if you're excited by the idea of working with cutting-edge technology and innovation to make an impact, why not join us?
The successful candidate will help design and implement an ETL (Extract-Transform-Load) engine as well as a post-ETL analytics pipeline, developing new ETL jobs and migrating existing ones from the Pivotal platform to the new Cloudera platform. Beyond the core components, the candidate will also help build and operate a logging, monitoring and alerting infrastructure.
- Bachelor's degree in Computer Science, Information Technology, or a similar discipline
- At least 5 years' experience as a developer in large and complex application development projects, preferably in the banking/finance industry.
- Fluent in written and spoken English
- Strong problem solving skills
- Strong attention to detail
- Fast learner
- Passionate and self-motivated
- Proactive, driven and capable of handling high pressure situations
Must-have technical experience, knowledge and skills:
- .NET (C#) (4 years minimum)
- Python, Bash, Java
- Knowledge of the Big Data ecosystem
Nice to have:
- Experience with both Linux and Windows environments
- SQL (MS SQL Server, PostgreSQL, Oracle)
- Apache Hadoop - HDFS, MapReduce, Spark, Pig/Hive, Parquet, Flume
- Cloudera Impala
- Messaging systems - RabbitMQ
Luxoft is seeking a Big Data Application Developer for the Enterprise Services - Data, Reporting & Analytics group.
Enterprise Services delivers organization-wide enterprise tools, service management, centralized reporting, environment management (including Virtual Banks) and enabling practices such as monitoring, systems management, service continuity and Acceptance into Service, ensuring consistency in products and tools.
The Big Data Application Developer will join the team assigned to develop a new version of the Big Data ETL (Extract-Transform-Load) engine for the Apache Hadoop-based Data Lake platform and to migrate existing ETL jobs from a cluster built on the Pivotal platform to a new cluster built on the Cloudera platform.
