Data Architect

NTUC INCOME INSURANCE CO-OPERATIVE LTD
  • Job category
    Information Technology
  • Job level
    Manager
  • Contract type
    Permanent, Full Time
  • Location
    East
  • Salary
    S$7000 - S$13500

Job Description

As Singapore’s leading composite insurer, NTUC Income knows that a data-driven culture is key to doing good business and ultimately, helping our customers reach their financial goals. With data collection already part of the way we work, our data team is pivotal in making data meaningful and actionable, so that we can all work smarter, make better decisions and create the best experiences for our customers, whether that’s in-person with Income advisors or on our online touchpoints.

This omnichannel customer experience is made possible by several artificial intelligence (AI) solutions that provide customized offerings and increased efficiency for our customers. Our customers can rest with complete peace of mind as we walk this journey with them. From developing innovative, data-driven mindsets to promoting continuous skills development, we’re also proud to say that our employees are with us on our transformational data journey. We know it will take everyone working together to build even further on our rich legacy and create value for both our customers and ourselves.


This role will be part of the Data team within our Information Technology department, focusing on transforming the existing data architecture and building the next-generation data platform across Income to support the Digital Transformation. The successful candidate will bring a vision to modernise the data architecture and develop a multi-year roadmap to bring that vision to life, collaborating with business and technology leads, application architects, and data science and data engineering teams across the organisation to enable large-scale machine learning and analytics use cases. He/She will act as a thought leader for the organisation, defining data platform processes and best practices – engaging senior stakeholders across the group to ensure their continued buy-in on strategic data initiatives.

  • Design, build and implement end-to-end data-driven solutions
  • Define a roadmap to transform the data architecture, focusing on scalability, performance and flexibility throughout the entire data life cycle (ingestion, storage and consumption)
  • Maintain data architecture framework, standards and principles including modelling, metadata, security, master and reference data
  • Define reference architecture as a set of patterns that can be leveraged by diverse parts of the organisation to create and improve data systems
  • Lead architectural designs, solution context diagrams and conceptual data models to optimize security, information leverage and reuse, integration, performance and availability, and ensure solutions developed adhere and align to the delivered architecture
  • Consult and influence digital application teams regarding solutions. Collaborate with other staff to design and implement effective technology solutions, while using innovative business and technology processes to identify and implement improvement initiatives, eliminate redundancies and maximize the reuse of data
  • Work closely with Solutioning, Infrastructure and project teams to understand their needs and ensure the best data architecture is implemented
  • Provide training and share best practices across teams regarding data architecture design and solution implementation including review and quality assurance
  • Develop and apply industry best practice technology, design and methodology approaches to design data architecture. Research and recommend new emerging technologies, techniques and tools that will add value to the organization
  • Work with project teams to ensure architecture requirements are captured and addressed by designs that uphold the enterprise principles and standards
  • Design and deliver data architecture on a cutting-edge greenfield Data Lake
  • Design data integration architecture based on solution requirements, e.g. real-time, near real-time, batch
  • Architect conceptual, logical and physical architecture and data models for operational enterprise data and analytics solutions using recognised data modelling approaches (e.g. 3NF relational, Kimball dimensional) in standard modelling tools using UML and the ArchiMate modelling language
  • Own the definition of data architecture components for information management solutions to facilitate storage, integration, usage, access, and delivery of data assets across the enterprise in both the operational and analytics spaces
  • Accelerate the wider Income business by developing and maintaining long-lasting relationships with senior stakeholders, both during and outside of project delivery
  • Engage talent and help us diversify our talent force by leading the data architecture capability and driving further development of the team
  • Provide direction on linkages between different systems to ensure the information flow is aligned with the information architecture
  • Provide guidance on the implementation of information architecture
  • Recommend innovative solutions to increase efficiencies around the integration of complex systems
  • Assess existing systems to evaluate their usability, usefulness, visual design and content
  • Communicate the design and recommendations to stakeholders
  • Develop strategies for seamless and low-risk migration of data between systems
  • Guide alignment of information management standards with the enterprise architectural plan and information security standards
  • Identify the desired state of a coordinated information flow through the organization

Qualifications

  • Bachelor's Degree in Computer Science or related discipline
  • Minimum 5 years of experience designing and developing high-performance, resilient, data-driven applications, with hands-on experience in data analytics, data integration and data modelling
  • Strong experience with traditional data technologies (ODS, Data Lake, Data Warehouse)
  • Experience in designing data patterns to support micro-service based application architecture
  • Track record of successfully building container-based big data architectures on top of Kubernetes
  • Experience in designing systems to efficiently handle real-time and batch use-cases
  • Exposure to and understanding of the latest open-source technologies across the big data ecosystem, such as distributed storage, real-time event processing and large-scale distributed OLAP engines
  • Exposure to Data as a Service concepts and data virtualization
  • Working knowledge of data science processes and best practices, with experience in building scalable architectures through the use of data science workflow orchestrators
  • Hands-on experience with DevOps / DataOps / MLOps concepts
  • Strong data architecture and data modelling skills
  • Understanding of how data architecture fits with other architecture disciplines
  • Familiarity with working in agile and waterfall approaches
  • In-depth understanding and knowledge of business and technology trends
  • Good understanding of end-to-end application development - data integration for backend API development and front-end UI development
  • Expertise in RDBMS (Teradata, Oracle, PostgreSQL, Vertica, etc.), PL/SQL, data modelling and ETL/ELT, and platforms such as Snowflake and SingleStore
  • Expertise in Big Data Technologies like Hive, Pig, Impala, Spark
  • Working knowledge of BI tools such as Tableau or Power BI
  • Experience in performance tuning, capacity planning and sizing
  • Good understanding of cloud methodologies and experience with AWS or GCP tool suites and services
  • Proven track record in strategic planning, individual/team development and service delivery
  • Embodies a collaborative approach in bringing both business and technology stakeholders together to deliver technology solutions that enable tangible business benefits
  • TOGAF/DAMA certification would be good to have
  • At least 2 years of experience in implementing Data Governance
  • Data Integration & Federation: e.g. Informatica BDE, Python and other scripting and integration technologies
  • Data Repositories: e.g. SQL Server, Google BigQuery, Amazon Redshift, cloud storage
  • Experience in implementing RESTful APIs and services using frameworks such as Spring Boot or Flask

Closing on 10 Nov 2021
