Description
Dear Sir or Madam, we are currently looking for a Data Engineer for Data Integration and Big Data (m/f)
Time frame: starting immediately
If this project appeals to you, please feel free to contact me at any time!
Project details:
• Designs, develops, maintains and tests Data Ingestion solutions “bottom-up” from source (e.g. legacy / mainframe / near real time) to target (files / interfaces / data marts) using Talend Data Fabric, NiFi, etc.
• Implements Change Data Capture pipelines
• Supports the team in building up large-scale data processing systems and complex big data projects
• Builds data processing systems in Hadoop, Spark and Hive (Cloudera Data Platform)
• Understands how to apply technologies to solve big data problems and contributes to the design of Enterprise Data Use Cases
• Focuses on collecting, parsing, managing, analyzing and visualizing large data sets from heterogeneous domains
• Contributes to Data Governance in terms of enabling Data Lineage, Data Cataloging and Data Modeling
• Works in a highly motivated interdisciplinary team with different Business stakeholders, Architects, Data Engineers, and Data Scientists from both Business and IT
Qualifications
• A minimum of 3–5 years of Java development in enterprise-grade environments as a must
• Focus on cutting-edge Data Integration & ETL tools (e.g. Talend Data Fabric, NiFi), as well as Data Replication and Message Broker tools (e.g. Kafka), in Big Data Hadoop/Spark/Hive/HBase/Impala/Kudu ecosystems such as Cloudera / Hortonworks as a must
• FinTech / Insurance know-how as a plus
• Knowledge of various programming and scripting languages such as SQL (ANSI and dialects), as well as Bash know-how
• Experience with the Software Development Lifecycle and with releasing applications via Git workflows and automation
• Excellent oral and written communication skills in German and English