Big Data Solution Architect
Location: Praga
Ref.No BDSA/MK
Role profile:
- a self-starter, relationship builder and charismatic motivator,
- main point of customer contact for technical engagement and guiding multiple teams,
- experienced in successfully developing enterprise-grade projects and providing sustainable solutions,
- responsible for providing architecture design & development, domain expertise and thought leadership,
- focused on creating higher business value propositions and providing out of the box solutions,
- responsible for collaboration with global Centers of Excellence in EU, India & USA,
- responsible for program sustainability and solutions development throughout the project's full lifecycle.
Tasks:
- design and develop high-performance, robust and secure solutions based on the Hortonworks ecosystem,
- advise clients on best solution designs and technologies to meet their business requirements,
- successfully drive collaborative “solutions development” projects and ensure IT environment sustainability,
- contribute hands-on experience through all phases of system design lifecycles,
- evaluate platform capabilities and recommend best HDFS approaches, focusing on client goals,
- analyze functional & non-functional requirements and their severity of impact on solution designs,
- create network clusters and enhance monitoring capabilities for live production environments,
- innovate constantly, maintain the technical edge and contribute to thought leadership,
- evaluate new technologies, execute proof-of-concepts and develop specialized algorithms,
- provide technical support and advise on HDFS cluster designs and environment management,
- lead project teams, guide on specific technical aspects and formulate best practices,
- support operational teams and clearly articulate “pros and cons” of various Hadoop technologies,
- provide MS Azure (IaaS/PaaS) expertise on the Hortonworks stack,
- provide expertise on batch & stream analytics with HDFS, Kafka, Spark, WebHDFS and the Hortonworks stack,
- provide support on digital data ingestion tools, e.g. Splunk.
Requirements:
- Master's degree in Information Technology & Management,
- fluency in English, written and verbal, at a professional business level,
- experience in both general solution architecture and big data architecture,
- strong, comprehensive experience in requirements gathering, engineering and designing solution architectures,
- ability to perform detailed business problem/solution analysis with customer centricity as the top priority,
- ability to evaluate solution designs, recommend tools and perform cost/benefit analysis,
- hands-on experience in batch & stream analytics, clusters and parallel architectures and MS Azure stack,
- good experience with streaming data processing concepts and tools, e.g. Storm, Impala, Oozie, Mahout, Flume and ZooKeeper,
- good understanding of large enterprise DWH and data ingestion techniques and tools,
- good knowledge of data governance and compliance regulations, e.g. GDPR,
- good knowledge of data integrity, security, access rights and audit trails implementations,
- good knowledge of Data Ingestion and Partitioning work cycles and tools,
- proven hands-on experience in data harmonization, data flow management and graph data processing,
- good knowledge of software design lifecycles and of developing enterprise-grade solutions,
- proven track record in designing and developing robust, high-performance and secure solutions,
- demonstrated experience in global enterprise IT environments and working with multi-geo teams,
- strong relationship builder engaging technical and business stakeholders on creating solutions,
- proven team-leader experience in managing proof-of-concepts, addressing problems and developing solutions.