We’re looking for an enthusiastic team player to join the Digitalization Group, working on a microservices architecture. You will collaborate with business stakeholders, developers and users globally, fostering communication across teams. You will be a key figure in developing the data engineering capability: ingesting, transforming, storing and modelling data, and building the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of sources. We are seeking a self-starter who can work under pressure to challenging deadlines while communicating with a diverse, global group of stakeholders.
- Translate business requirements into functional and technical data designs, and develop data and analytical solutions in collaboration with stakeholders.
- Set up ETL processes for data from multiple sources using Python/Logstash. Create solutions that solve business challenges and meet user needs using Kibana, Grafana, React, Python and SQL.
- Identify and recommend opportunities for process improvement and liaise with business and technical teams to implement these changes.
- Automate repetitive tasks, developing innovative solutions for the team and wider business.
- Lead projects focused on driving incremental value with business teams, partnering with the business effectively to understand areas for optimization and growth opportunities.
- Maintain and evolve monitoring and alerting capabilities - includes infrastructure performance as well as key business metrics.
- Engage with stakeholders to understand and document business requirements.
- Communicate with stakeholders at all levels, advising, training and supporting clients where necessary.
Skills & Experience
- At least 5 years' commercial experience working as a Data Engineer or in a similar role, with full-stack abilities.
- Experience coding with the ELK stack, React, Python and SQL.
- Experience designing and building visualizations.
- Deep understanding of business analysis, data modelling, data infrastructure and ETL design.
- Experience in working as part of Agile and multi-disciplinary teams.
- Strong problem solving, analytical and logical skills.
- Strong experience working with Python (and its packages) for data cleaning, data quality assessment, data analysis and data visualization.
- Good understanding of Kafka, Docker, Kubernetes, Microservices.
- Ability to articulate ideas and proposals to stakeholders across the business, both technical and non-technical audiences.
- Ability to create documentation for technical and non-technical users.
- Willingness to learn new skills and a can-do, will-do attitude.
- Experience with rapid prototyping.
- Ability to set KPIs and metrics to evaluate analytics solutions for a given use case.