Ref. no.: dkw/dataexp/11
ABOUT CLIENT
We are a leading international bank focused on helping people and companies prosper across Asia, Africa and the Middle East.
To us, good performance is about much more than turning a profit. It's about showing how you embody our valued behaviours - do the right thing, better together and never settle - as well as our brand promise, Here for good.
We're committed to promoting equality in the workplace and creating an inclusive and flexible culture - one where everyone can realise their full potential and make a positive contribution to our organisation. This in turn helps us to provide better support to our broad client base.
This role sits within the Group Performance to Plan (P2P) team, distributed across Asia and Europe, which is part of the Digital Centre of Excellence for the Finance Department. The team collects and analyses data on the Bank's performance and the market environment to drive strategic decision-making and to support communication of the Bank's performance to external stakeholders.
The role suits an individual able to conceptualise, problem-solve and think laterally to manage dynamic and unique issues. Creativity and the ability to communicate clearly are essential skills for the role.
RESPONSIBILITIES
- Participate in projects covering the full data lifecycle end to end, from design, implementation and testing to documentation, delivery, support and maintenance.
- Gather business and functional requirements and translate them into robust, scalable and operable solutions that work well within the overall data architecture.
- Develop data-ingestion processes, using various programming languages, techniques and tools, from source systems built on Oracle, Teradata, SAP and the Hadoop technology stack.
- Evaluate and make decisions around dataset implementations designed and proposed by peer data engineers.
- Build large database models for FP&A, including the Balance Sheet, PnL, cost analytics and related ratios.
- Use Dataiku to develop ETL, real-time and batch data processes feeding into the in-memory data infrastructure.
- Perform and document data analysis, data validation, and data mapping/design.
- Optimise the Exasol data infrastructure using partitions, sorting and virtualised views to accelerate query times by reducing query planning and read times (a minimal sketch follows this list).
- Conduct unit tests and develop database queries to analyse the effects and troubleshoot any issues.
- Use query-job analytics to analyse BI query plans and optimise execution times, irrespective of the size of the underlying data set.
- Develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions.
- Conduct research via focus groups, usability tests or other related tools.
- Create tools to store data within the organisation.
- Evaluate and make decisions around the use of new or existing software products and tools.
- Participate in the data-driven user-experience design process.
- Mentor junior data engineers.
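
For context, the Exasol tuning and query-plan analysis duties above can be pictured with a minimal sketch. It assumes the open-source pyexasol client; the table, column, connection details and date filter are hypothetical, and the exact distribution/partition clauses and statistics views should be checked against the Exasol version in use.

# Minimal sketch of the Exasol tuning tasks described above (illustrative only).
import pyexasol

conn = pyexasol.connect(
    dsn="exasol-cluster:8563",   # hypothetical host:port
    user="etl_user",             # hypothetical credentials
    password="***",
    schema="FINANCE",
)

# Distribute and partition a large fact table so joins and date-range scans
# touch fewer nodes and partitions, reducing planning and read time.
conn.execute("ALTER TABLE PNL_FACT DISTRIBUTE BY COST_CENTRE_ID")
conn.execute("ALTER TABLE PNL_FACT PARTITION BY POSTING_DATE")

# Expose a virtualised, pre-filtered view for BI dashboards.
conn.execute("""
    CREATE OR REPLACE VIEW V_PNL_CURRENT_YEAR AS
    SELECT cost_centre_id, account_id, posting_date, amount
    FROM PNL_FACT
    WHERE posting_date >= DATE '2024-01-01'
""")

# Profile a representative BI query and inspect its execution plan
# via the Exasol statistics tables.
conn.execute("ALTER SESSION SET PROFILE = 'ON'")
conn.execute(
    "SELECT cost_centre_id, SUM(amount) "
    "FROM V_PNL_CURRENT_YEAR GROUP BY cost_centre_id"
)
conn.execute("FLUSH STATISTICS")
for row in conn.execute(
    "SELECT part_name, object_name, duration FROM EXA_USER_PROFILE_LAST_DAY "
    "ORDER BY session_id DESC, stmt_id DESC, part_id"
):
    print(row)

conn.close()
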
REQUIRED SKILLS & QUALIFICATIONS
- Fluent English (C1/C2 level both written and oral).
- Technical background (Computer Science | Information Technology | Mathematics | Physics | Statistics | Electronics | Electrical Engineering | Software Engineering | other related technical disciplines).
- Passion for data, visualisations, technologies and discovering new exciting solutions to the company’s data needs.
- Excellent communication, interpersonal, analytical and problem-solving skills with a strong appetite for finding facts and offering solutions.
- Comfortable communicating and working with colleagues from different time zones and cultures.
- 5+ years of commercial project implementation experience (Database | Data warehousing | ETL | BI | Big Data | Data Science).
- Commercial experience with programming languages.
- Proficient in SQL (DML | DDL | DCL | TCL).
- Commercial experience working on database/data warehouse projects.
- Commercial experience creating ETL processes using data integration tools.
- Commercial experience in Business Intelligence.
- Commercial experience working with Big Data.
- Familiarity with Data Science concepts.
- Ability to interpret written business requirements and technical specification documents.
- Experience with the Agile delivery model.
DESIRED SKILLS & QUALIFICATIONS
- Experience with PL/SQL, Lua, etc.
- Experience with Oracle, Exasol, etc.
- Experience with Dataiku, Informatica Power Center, etc.
- Experience with SAP BO, SAP Analytics Cloud, Tableau, etc.
- Experience with Hadoop, Spark, etc.
- Familiarity with Python, R, etc.
- Experience in handling large data sets (multiple TBs) and working with structured, unstructured and geographical datasets.
- Performance monitoring, fine-tuning and optimisation of ETL processes and dashboards.
- Experience with UX/UI design, awareness of best practices.
- Experience in Financial Sector / FP&A.