Luxoft is a global leader in high-end software development.
Luxoft is looking for talented people with a passion for technology who are ready to create original solutions. Once on board, you are invited to expand your knowledge and skills through a continuous learning experience that helps you stretch your potential.
So if you’re enthusiastic about the idea of using cutting-edge technology and innovation to make an impact, why not join us?
The successful candidate will be asked to put in place automation and tooling that will help manage and grow the clusters with very little manual work. The Hadoop team takes a development-centric approach to managing the Hadoop infrastructure and uses modern tooling and processes. The team subscribes to DevOps principles in its culture, process, and tooling.
As a successful candidate for this role you will have:
- Strong ability to code with Bash and Python
- Experience working on Hadoop.
- Experience rolling out Hadoop in production.
- Understanding of the Hadoop ecosystem, including MapReduce, Spark, Impala, HBase, Storm, HDFS, and YARN.
- Good understanding of different application design patterns for consuming Hadoop.
Nice to have:
- Bachelor's degree in Computer Science, Information Technology, or a similar discipline.
- Strong Linux background
- 2+ years of experience using infrastructure management tools such as Puppet, Chef, Salt, or Ansible.
- Experience designing compute, storage, and network layers for Hadoop.
- Good understanding of ETL engines and Hadoop-native tooling such as Sqoop, Flume, RabbitMQ, and Pig.
- Good understanding of traditional data warehouse concepts.
- Experience with DevOps models and strong support for the movement.
- Experience with CI/CD tooling such as Jenkins.
- Loves to code in Python, Ruby, or any other programming language.
- Good understanding of the Java Virtual Machine (JVM).
- Experience working in Scrum or Kanban Teams.
- Good understanding of source control systems such as SVN.
- Loves working on interesting and challenging problems.
Enterprise Services provides services that span the organization, delivering enterprise tools, service management, centralized reporting, environment management (including Virtual Banks), and enabling practices such as monitoring, systems management, service continuity, and Acceptance into Service, to ensure consistency in products and tools.
The Big Data System Engineer will be part of the Big Data Platform team. You will join a culture that pushes innovation and automation and promotes creativity among engineers. You will focus on Hadoop but also work across all layers of the stack. You will work with physical clusters while exploring innovative technologies in this space that will allow more agility and control for our Hadoop AppDev teams.
