Description
Data Lake Cloud DevOps (100% remote)
Your tasks
* Design and implement the cloud Data Lake platform infrastructure that serves as the basis for data engineers to work with their data.
* Optimize existing infrastructure and data transformation pipelines for scalability and self-service use.
* Optimize the data provisioning workflow to allow real-time replication of data from on-premises systems into the Data Lake (AWS).
* Implement new features such as provider or consumer connections as Infrastructure as Code (e.g. a Power BI connection to AWS).
* Implement cost optimizations as part of the platform.
* Implement test cases as part of the development process.
* Support solution architects in defining the software architecture.
* Take end-to-end responsibility for changes throughout the delivery pipeline.
Your qualification
* 5+ years of professional experience in developing and operating software solutions.
* 3+ years of professional experience in handling large sets of data in the area of big data or data warehousing.
* 2+ years of professional experience in building scalable big data solutions in the area of cloud or Hadoop.
* 2+ years of experience with AWS and hands-on experience building infrastructure as code and services on top of it.
* Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency and workload management.
* Professional experience in designing and developing data pipelines in Python/Spark/Scala.
* Practical experience with Terraform and Airflow is preferred.
Soft skills
* Proactivity; Curiosity; Responsibility; Ideas & Confidence.
* Structured working approach and problem-solving skills.
* Fluent English; German or another CEE language is appreciated, but not mandatory.
* 100% remote, company in Vienna
* Scope of work: 5 days a week (full time)
* Start: beginning of May 2021
* Duration: 6 months
Michael Bailey International is acting as an Employment Business in relation to this vacancy.