
Big Data Engineer with Kafka knowledge

Posted on: 04.06.2019
Named to Fortune’s 2019 list of the 100 Fastest-Growing Companies, EPAM is committed to providing our global team of 36,700+ EPAMers with inspiring careers from day one. EPAMers lead with passion and honesty, and think creatively. Our people are the source of our success: we value collaboration, always seek to understand our customers’ business, and strive for the highest standards of excellence. No matter where you are located, you’ll join a dedicated, diverse community that will help you discover your fullest potential.




DESCRIPTION

We are currently looking for a Big Data Engineer with Kafka knowledge for our Zurich office to make the team even stronger.



Do you want to work on the cutting edge? Do you want to expand your knowledge and abilities? Are you passionate about data? Have you worked in similar positions before? We’re looking for someone like you who can help us:



- Engineer and integrate the platform from a technology point of view;

- Engineer core Big Data platform capabilities.




YOU ARE

- A passionate Big Data Engineer looking for new challenges;

- Proactive, looking for creative and innovative solutions;

- A flexible, open-minded and cooperative person;

- Interested in working in a fast-paced environment as part of an international team.




YOUR TEAM

You’ll be working in close association with the Advanced Analytics (Big Data Analytics) team in Zurich. We offer you a highly motivated and experienced team with in-depth knowledge of Advanced Analytics, building models and algorithms. Our goal is to enable the business for a digital and analytical future! You’ll operate and maintain the whole platform end to end and work closely with the development teams. This role includes significant responsibilities and development possibilities.

REQUIREMENTS



  • Bachelor's degree in computer science, computer engineering, management information systems, or a related discipline, or equivalent experience

  • At least 3 years of experience in designing and operating Kafka clusters (Confluent and Apache Kafka) on-premises

  • At least 5 years of experience in designing, sizing, implementing, and maintaining Hortonworks-based Hadoop clusters

  • At least 3 years of experience in securing and protecting Hadoop clusters (Ranger, Atlas, Kerberos, Knox, SSL)

  • At least 5 years of experience in designing Big Data architectures

  • At least 5 years of demonstrated experience in gathering and understanding customer business requirements to introduce Big Data technologies

  • At least 5 years of experience in configuring tools from the Hadoop ecosystem, such as Hadoop, Hive, Spark, Kafka, Solr, and NiFi

  • Experience with IBM Watson Studio Local integration

  • Experience with IBM DB2 is a plus

  • Experience with IBM Power Systems is a plus

  • Experience in implementing complex security requirements in the financial industry

  • Good abstraction and conceptual skills combined with a self-reliant, team-minded, communicative, and proactive personality


WE OFFER



  • Knowledge exchange with colleagues around the world

  • Competitive compensation depending on experience and skills

  • Regular assessments and salary reviews

  • Opportunities to develop integration modules for interacting with new systems and applications

  • Opportunities for self-realization

  • Friendly team and enjoyable working environment

  • Corporate and social events

  • Please note that any offers will be subject to appropriate background checks

  • We do not accept CVs from recruiting or staffing agencies

  • Due to Swiss labour legislation, we can only accept EU candidates and applicants who hold an open work permit for Switzerland
