Senior Executive (Smart Infrastructure)

HOME TEAM ACADEMY
  • Job Category
    Information Technology, Public / Civil Service
  • Contract type
    Permanent

Job Description

Senior Executive (Smart Infrastructure), Centre for Human Capital and Smart Campus Transformation

  1. Centre for Human Capital and Smart Campus Transformation

The Centre for Human Capital and Smart Campus Transformation (CHCT) focuses on HTA’s transformation initiatives. The Centre oversees the implementation of transformational projects under HTA’s Technology-Enabled and Digitalised HTA (TED@HTA) Masterplan 2025, a key enabler for the realisation of an HTA Smart Campus. It is also responsible for driving organisational transformation through people development.

  2. The Job

The incumbent will build, expand and optimise Home Team training and learning data and data pipeline architecture, and will optimise data flow and collection for cross-functional departments. As the main data engineer in the team, the officer will support data analysts on data initiatives and data visualisation.

  3. Reporting Structure

The incumbent will report to the Assistant Director (Smart Campus Transformation), Centre for Human Capital and Smart Campus Transformation.

  4. Main responsibilities
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional/non-functional business requirements
  • Identify, design and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources
  • Work with external and internal stakeholders to assist with data-related technical issues and support their data infrastructure needs
  • Keep data separated and secure across departments
  • Work with data and analytics experts to strive for greater functionality in our data systems


  5. Requirements
  • Have at least 2 years’ experience in a similar role
  • Possess a degree in Computer Science, Information Systems or equivalent
  • Have advanced working knowledge of SQL, including query authoring, and experience working with a variety of relational databases
  • Experienced in building and optimising ‘big data’ data pipelines, architectures and data sets
  • Experienced in performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Experienced in supporting cross-functional teams in a dynamic working environment
  • Experienced in using software/tools:
    • Big data tools: Hadoop, Spark, Kafka, etc.
    • Relational SQL and NoSQL databases, including Postgres and Cassandra
    • Data pipeline and workflow management tools
    • Object-oriented/object function scripting languages: Python, Java, C++, etc.
  • Working knowledge of stream processing systems: Spark Streaming, Apache Flink, Apache Kafka, etc.
  • Familiar with building processes supporting data transformation, data structures, metadata, dependency and workload management
  • Strong analytical skills for working with unstructured datasets
  • Strong project management and organisational skills


Closing on 02 Jul 2021