About FWD Group
FWD Group is a pan-Asian life and health insurance business with more than 12 million customers across 10 markets, including some of the fastest-growing insurance markets in the world. The company was established in 2013 and is focused on changing the way people feel about insurance. FWD’s customer-led and digitally enabled approach aims to deliver innovative propositions, easy-to-understand products and a simpler insurance experience.
For more information, please visit www.fwd.com
For more information about FWD Hong Kong, please visit www.fwd.com.hk/.
PURPOSE:
- Responsible for managing and supporting the infrastructure and data architecture of the Group Data Platform across local markets.
KEY ACCOUNTABILITIES:
- Work with Group Technology to manage the data platform infrastructure in areas such as security, performance, fault tolerance and elasticity.
- Manage end-to-end data pipelines and data integration processes, both batch and real-time.
- Monitor, recommend, develop and implement ways to improve data quality, including reliability, efficiency and cleanliness, and to optimize and fine-tune ETL/ELT processes.
- Recommend, execute and deliver best practices in data management and data lifecycle processes, including management of ETL/ELT processes, coding and configuration standards, error-handling and notification standards, auditing standards, and data archival standards.
- Collaborate with Data Architects, Data Modelers, IT team members, SMEs, vendors and internal business stakeholders to understand data needs, gather requirements and implement data solutions that deliver business goals.
- Provide BAU support for data issues and change requests, documenting all investigations, findings, recommendations and resolutions.
QUALIFICATIONS / EXPERIENCE:
- Bachelor's degree in IT, Computer Science or Engineering.
- 3-5 years of experience with Big Data technologies such as Azure, AWS or Hadoop; candidates with Azure Big Data experience are strongly preferred.
- Minimum 5 years of professional experience in data warehouse, operational data store and large-scale data architecture management in Unix and/or Windows environments.
- At least 5 years of solid hands-on support experience with streaming and batch ETL processes.
- 3-5 years of hands-on experience with Azure AKS and AWS EKS.
- Hands-on experience with real-time data streaming using Kafka/Confluent.
- Hands-on data API support experience using Python/Java would be a big plus.
KNOWLEDGE & TECHNICAL SKILLS:
- Hands-on experience with Azure Big Data solutions such as Data Factory, Databricks, Data Lake Storage Gen2, Synapse and Power BI.
- Support experience with various ETL/ELT frameworks, data warehousing concepts, data management frameworks and data lifecycle processes.
- Strong knowledge of various database technologies (RDBMS, NoSQL and columnar).
- Good understanding of data analytics and data visualization preferred, ideally with Power BI.
- Experience in the insurance industry will be an added advantage.
- Ability to communicate and present technical information in a clear and unambiguous manner.
- Strong ability to work independently and to cooperate with diverse teams in a multi-stakeholder environment.
- Strong sense of ownership, a high affinity for all things data, and a desire for constant improvement.
Interested parties, please click Apply Now to apply for this job.
All applications submitted through our system will be delivered directly to the advertiser, and the privacy of applicants' personal data will be kept secure.