Job Detail

Big Data Developer

Posted on: 03.01.2020

Job Description


We are looking for a Big Data Developer to support our client's team in the Zurich office as soon as possible.

The candidate will join the team responsible for developing a Data Lake based on the Cloudera technology stack. The project aims to build a scalable platform for sourcing data belonging to the ITS division of Global Markets, enriching it according to business requirements, and making it available to business users through in-house and/or off-the-shelf dashboard tools.

We require a senior software engineer capable of taking on all tasks across the entire product development lifecycle: liaising with stakeholders, gathering and analysing requirements, designing the architecture, implementing, designing and writing different types of tests, packaging, deploying, and supporting and maintaining software in production. The candidate should also know and understand Agile methodology and its tools.

• Development of a data analytics software product
• Contribute to modernization of the software solution and drive adoption of newer technologies with a focus on scalability
• Automate the build, packaging and deployment process for production deployments
Additionally: work closely with stakeholders on the analysis, design and development of new features.

Key Responsibilities
• Develop software components responsible for ingesting end-of-day and intraday data, as well as data flows involving joins between these data sets
• Actively participate in the design and technology review of the software components developed in the team
• Evolve the overall architecture of the solution using the latest technologies available in the bank
• Work to streamline the development process and to improve software performance
• Contribute to integration testing (automated and manual) efforts as required
• Collaborate with platform management and other team members on requirements, preparing releases and delivering the applications to production
• Assist in resolving incidents involving the production system (3rd-level support)

The main challenges will be:
• To develop software components that meet business requirements
• To provide robust solutions for the volumes of data the software is expected to process
• To develop appropriate software solutions despite challenging timeframes
• To comply with the policies and standards typical of a large organization
• To contribute to the re-architecture of the software landscape in the division

Essential Skills and Qualifications:
1. Experience building data warehousing and analytics solutions using one of the major Hadoop distributions and various ecosystem components (e.g. HDFS, Impala, Spark, Flink, Flume, Kafka)
2. 4+ years of experience in the Python and/or Scala programming languages
3. Experience with data modelling and the SQL query language
4. Experience building production data pipelines using Spark, Spark Streaming and Flink
5. Experience with security in a Hadoop environment
6. Bash scripting experience

Desired Skills and Qualifications:
1. Practical experience with one of the following Big Data platforms: Cloudera or Hortonworks (min. 1 year)
2. Experience with the Elastic technology stack (Elasticsearch / Logstash / Kibana)
3. Experience working with agile methodology and some basic project management skills

For this position, we can only consider candidates who are eligible to work in Switzerland.

Feel free to send me your complete, up-to-date application documents, including a daytime phone number where you can be reached.
