Spyrosoft is an authentic and cutting-edge software engineering company, established in 2016. We bring our clients' ideas to life by recognising their problems and formulating effective and thorough solutions for the automotive, geospatial, Industry 4.0, employee education and financial sectors.
Shaping technology, together
We’ve come to Spyrosoft from different places, and we all share enthusiasm for all things software engineering. Nothing is impossible to us – we work on new solutions until we’re confident that they’re perfectly designed and even more perfectly implemented. That’s how we learn, and that’s how we turn problems into prospects. We trust each other to the point where we know we can rest after work, and we start each day with fresh ideas and great coffee.
There’s more to life than work
Technology meetups, training sessions, internal workshops, and our Community of Interest and Community of Practice are all special initiatives, created by the experts and for the experts. We make sure that everyone gets a chance to learn new things from their coworkers, and to share their knowledge as a mentor or a speaker. We also organise a monthly catch-up for the whole company, so that every employee knows what’s happening.
You can find out more about Spyrosoft at www.spyro-soft.com
- Big Data
- Hadoop
- Apache Spark
- Python
- Scala
- Kafka
- Informatica
- Cassandra
- MongoDB
- CI/CD Tools
- Scrum
- Kanban
We are looking for top-notch, technology-savvy specialists willing to move our projects onto a new track! You will use the most advanced technology stack and have the opportunity to implement new solutions while working with top leaders in their industries. As part of our global team you will participate in international projects based on Java and the MS Azure platform.
- 6+ years of commercial experience in Big Data
- Experience working with Hadoop distributions (Cloudera/Hortonworks/MapR)
- Strong understanding of and familiarity with the Hadoop ecosystem
- Hands-on programming experience in Apache Spark using Spark SQL and Spark Streaming, or in Apache Storm
- Proficiency in Python or Scala programming languages
- Experience with data integration tools: Kafka, Sqoop, NiFi, Flink, Talend, Informatica, DataStage
- Experience with data processing and management: HDFS, HBase, Hive, Impala, Cassandra, MongoDB, Spark, Storm, MPP, MapReduce, Hortonworks/Cloudera stack
- Knowledge of data modelling tools (PowerDesigner, Erwin, or another accepted standard tool) and CI/CD tools (Jenkins, Marvin, Git)
- Experience in using Agile methodologies (Scrum, Kanban, etc.)
- Experience in converting business problems and challenges into technical solutions, taking security, performance, scalability, etc. into account
- Flexible and proactive/self-motivated working style, with strong personal ownership of problem resolution and excellent interpersonal skills
- Very good verbal and written communication skills in English; German or another European language would be an advantage but is not mandatory
- Luxmed/Medicover
- Multisport
- Training budget