We develop tailor-made IT systems that boost the businesses of our clients.
In our rich and constantly growing portfolio you can find:
- systems supporting very complex business processes – our own solutions, which have revolutionized our partners' businesses and increased their profits
- our own pioneering products in terms of concept and technology, created by our Research & Development Department
- last but not least: a huge backlog of innovative ideas and concepts which we will soon turn into working prototypes, and ultimately into profitable start-ups
Currently we are looking for:
Senior Big Data DevOps Engineer
Place of work: Warszawa
to significantly strengthen our team and broaden our expertise.
If you want to:
- take part in the development of advanced IT systems
- gain valuable experience in designing and implementing complex IT infrastructure used for processing large amounts of data
- work with cutting-edge Big Data (and other) technologies
- develop systems that process a huge number of requests per second
- work with the world's top IT professionals
- carry out projects which address real business challenges
- have a real impact on the projects you work on and the environment you work in
- have a chance to propose innovative solutions and initiatives
it’s probably a good match.
Moreover, if you like:
- flexible working hours
- casual working environment and no corporate bureaucracy
- having access to benefits such as Multisport and Medicover
- working in a modern office in the centre of Warsaw with good transport links
- a relaxed atmosphere at work where your passions and commitment are appreciated
- challenges and many opportunities for development
it’s certainly a good match!
If you join us, your responsibilities will include:
- building, developing and maintaining distributed data processing and data analysis systems
- designing and developing systems' architecture in cooperation with software developers
- cooperating with data processing security experts
- monitoring, analyzing and optimizing systems for efficiency
- supervising and monitoring services
- automating infrastructure and incident management processes
- network and server troubleshooting
We expect:
- at least 5 years of experience working as a DevOps Engineer, ideally in the Big Data area
- excellent knowledge of Linux (mainly Ubuntu)
- experience in working with networks, servers and services monitoring tools (e.g. Icinga, Check_MK, Zabbix)
- advanced knowledge of virtualization and containers (LXC, Docker, KVM, Proxmox)
- advanced knowledge of continuous configuration automation tools (e.g. Ansible)
- skills in programming using scripting languages (bash, Python)
- skills in practical use of version control systems (Git, GitLab)
- working knowledge of administering Hadoop (as well as Cassandra and Kafka) clusters
- experience with Data Warehouse and Big Data tools (Hadoop, JupyterHub, Spark, Kafka, Hive, ZooKeeper)
- experience with Kubernetes
- knowledge of Linux, Java VM and MySQL tuning
- good skills in administering servers, load balancers and HTTP reverse proxies (Apache, Nginx, HAProxy)
- knowledge of network protocols (IPv4, IPv6, TCP, UDP)
- knowledge of any network file system (e.g. MooseFS, Ceph, HDFS)
- knowledge of VPN (OpenVPN, IPsec) and TLS protocols
- knowledge of best practices in server and network administration
- teamwork and communication skills
- a computer science degree