#SGUnitedJobs Infrastructure Engineer, Infrastructure Data & Platform

  • Job Category
    Information Technology, Public / Civil Service
  • Job level
    Middle Management
  • Contract type
    Full Time

Job Description

The Government Infrastructure Group (GIG) is responsible for providing the ICT infrastructure that the whole of government relies on for its digitalization efforts and Smart Nation initiatives. The infrastructure layer covers the data centres, networks, cloud and endpoints. Modernization of the infrastructure and related services is underway, and further effort will be required to drive the adoption of advanced technologies to bring greater value to the country.

As an Infrastructure Engineer, you will be responsible for designing, coordinating and maintaining data infrastructure. If you have a strong infrastructure background, are passionate about technology, and are looking for opportunities to work with a team of practitioners and leading industry experts, we welcome you to join GIG.

What to Expect:

  • Key contributor to the building and management of a new data platform for Whole-of-Government services

  • Responsible for the development of real-time and batch data processing pipelines to clean, supply and scale data for the Data Analyst team

  • Collaborate with stakeholders to understand problem statements and business requirements, and implement innovative and engaging big data solutions to be consumed across the Whole-of-Government

How to Succeed:

  • Degree in Computer Science, Computer or Electronics Engineering, Information Technology or a related discipline

  • Minimum of 3 years of relevant IT experience deploying and managing complex IT infrastructure systems and services, or in related roles

  • Familiar with distributed processing frameworks such as Spark

  • Good understanding of programming languages such as Python, Java and Scala

  • Strong command of SQL and NoSQL database concepts

  • Experience with data warehousing

  • The following are added advantages:

    • Building data pipelines, either programmatically or through the use of ETL tools such as Informatica, NiFi and Flink

    • Ability to deploy, manage, and administer Hadoop distributions such as Hortonworks and Cloudera

    • Experience with data science libraries in Python such as pandas, NumPy and Seaborn

    • Understanding of and experience in ICT domains such as enterprise networking, messaging infrastructure and Active Directory (AD)

    • PowerShell / Bash scripting

  • Good team player with strong analytical and problem-solving abilities

Closing on 24 May 2021