NINJA LOGISTICS PTE. LTD.
Permanent, Full Time
S$4,500 - S$9,000
Roles & Responsibilities
- Design, develop and maintain Ninja Van’s infrastructure for streaming, processing and storage of data. Build tools for effective maintenance and monitoring of the data infrastructure.
- Contribute to key data pipeline architecture decisions and lead the implementation of major initiatives.
- Work closely with stakeholders to develop scalable and performant solutions for their data requirements, including extraction, transformation and loading of data from a range of data sources.
- Develop the team’s data capabilities: share knowledge, enforce best practices and encourage data-driven decisions.
- Develop Ninja Van’s data retention policies and backup strategies, and ensure that the firm’s data is stored redundantly and securely.
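The extraction, transformation and loading work described above reduces, at its simplest, to a small batch job. A minimal sketch in Python using only the standard library; the table, field names and sample data here are hypothetical, purely for illustration:

```python
import csv
import io
import sqlite3

# Hypothetical raw export from an upstream source (names are illustrative only).
RAW_CSV = """order_id,parcel_weight_kg,status
1001,2.50,DELIVERED
1002,,IN_TRANSIT
1003,0.75,DELIVERED
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV export into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop incomplete records and normalise types."""
    out = []
    for row in rows:
        if not row["parcel_weight_kg"]:
            continue  # skip records with a missing weight
        out.append((int(row["order_id"]),
                    float(row["parcel_weight_kg"]),
                    row["status"]))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into a relational store."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, weight_kg REAL, status TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 2 rows survive the transform step
```

In production the extract step would pull from the live data sources and the load step would target the warehouse, but the three-stage shape is the same.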
Requirements
- Solid Computer Science fundamentals, excellent problem-solving skills and a strong understanding of distributed computing principles.
- At least 3 years of experience in a similar role, with a proven track record of building scalable and performant data infrastructure.
- Expert SQL knowledge and deep experience working with relational and NoSQL databases.
- Advanced knowledge of Apache Kafka and demonstrated proficiency in Hadoop v2, HDFS, and MapReduce.
- Experience with stream-processing systems (e.g. Storm, Spark Streaming), big data querying tools (e.g. Pig, Hive, Spark) and data serialization frameworks (e.g. Protobuf, Thrift, Avro).
- Bachelor’s or Master’s degree in Computer Science or related field from a top university.
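The stream-processing systems named above all come down to applying stateful operators over an unbounded stream of events. A minimal pure-Python sketch of one such operator, a tumbling-window count (event shape and key names are hypothetical, and late-arriving data is not handled):

```python
from collections import Counter
from typing import Iterable, Iterator, Tuple

# Each event is (epoch_seconds, key); in practice these would arrive from a
# broker such as Kafka rather than an in-memory list.
Event = Tuple[int, str]

def tumbling_window_counts(
    events: Iterable[Event], window_s: int
) -> Iterator[Tuple[int, Counter]]:
    """Group events into fixed, non-overlapping windows of window_s seconds
    and count keys per window. Assumes events arrive in timestamp order."""
    current_window = None
    counts: Counter = Counter()
    for ts, key in events:
        window = ts - ts % window_s  # start of the window this event falls in
        if current_window is None:
            current_window = window
        if window != current_window:
            yield current_window, counts  # emit the window that just closed
            current_window, counts = window, Counter()
        counts[key] += 1
    if current_window is not None:
        yield current_window, counts  # flush the final open window

events = [(0, "SG"), (3, "MY"), (7, "SG"), (12, "SG")]
windows = list(tumbling_window_counts(events, window_s=10))
print(windows)  # [(0, Counter({'SG': 2, 'MY': 1})), (10, Counter({'SG': 1}))]
```

Engines like Spark Streaming or Storm provide the same windowed aggregation as a built-in, plus the distribution, fault tolerance and late-data handling this sketch omits.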
Tech Stack
- Data storage: Percona XtraDB, Elasticsearch, Delta Lake
- Data pipelines: Apache Kafka, Spark Streaming, Maxwell
- Workflow manager: Apache Airflow
- Query engines: Apache Spark, PrestoSQL
- Infrastructure monitoring: Prometheus, Grafana
- Web backend: Play (Java 8+), Golang, Node.js
- Web frontend: AngularJS, ReactJS
- Mobile: Android, Flutter
- Deployment: Docker, Kubernetes
Closing on 05 Feb 2021