The Big Data Technology & Innovation cluster will provide the Data Lake as the single data hub for distributional and operational use and as a high-performance analytics hub:
- Establish a stable, state-of-the-art technology base
- Set up the data lake as the single data and analytics hub and effectively ingest the most important data sources
- Establish data quality and metadata management
- Provide Data Marts and sandboxes for segments and functions with the most important combinations of data sources
- Provide new BI technology and maintain/integrate existing BI technology within the lake infrastructure
- Ensure the preconditions for decommissioning secondary databases
The team and roles:
At Commerzbank we are building a team of developers - Big Data & Advanced Analytics Engineers - who will be responsible for the continuous development of Commerzbank's central Hadoop platform, which is one of the foundations of the Commerzbank 4.0 transformation into a digital technology company. From juniors and mid-level specialists to seniors, we are open to welcoming five Big Data specialists.
The new, innovation-driven team will provide the key technology, based on the modern Hadoop distribution from Hortonworks: Hortonworks Data Platform and Hortonworks Data Flow. Together they form the base stack for our Data Lake and Analytics Platform. The team will:
- Prepare upgrade plans and carry out upgrades and engineering of the Hadoop distribution and third-party components
- Integrate those changes into Ansible Playbooks
- Solve challenges such as embedding modern technology into an existing infrastructure
- Support data ingestion into the Hadoop platform with tools such as Attunity, Apache Sqoop or Apache NiFi
- Support Hadoop users and enable them for self-service
- Investigate future-proof technologies to make sure we keep pace with competitors
Main skills:
- UNIX/Linux (Red Hat Enterprise Linux 7.x) to adapt the Hadoop environment to existing Commerzbank authentication techniques
- Hortonworks Data Platform (HDP) 2.6.x or higher to develop and maintain the on-premise Hadoop cluster and third-party products
- Hortonworks Data Flow (HDF) 3.x or higher; Apache NiFi, Apache Kafka
Alternative skills:
- Python or Scala
- Spark
- Hive
Foreign language skills:
- B2 English (mandatory)
- B1/B2 German (optional)
Workplace & environment conditions:
- Open to business travel to Germany for training
- Self-driven quick learner
- Good problem-solving ability and target orientation
- Customer orientation and team working attitude
- Ready to work in cross-locational and international teams in an open space office environment
We offer:
- Friendly multicultural environment, agile teams
- The possibility of professional training and growth (English and German courses; technical training and workshops)
- Stable employment
- Benefits package including medical care, Multisport Card, group insurance and discounts in the modern canteen
- Relax corner where you can recharge your batteries