K2 PARTNERING SOLUTIONS PTE. LTD.
S$7500 - S$11000
● Review and tune Spark application code to increase throughput.
● Establish best practices and guidelines to be followed by tech teams when writing code in Spark.
● Create a framework to continuously monitor performance and identify resource-intensive jobs.
● Write code, services, and components using Java, Apache Spark, and Hadoop.
● Responsible for systems analysis, design, coding, unit testing, CI/CD, and other SDLC activities.
● Proven experience working with Apache Spark streaming and batch framework.
● Proven experience in performance tuning of Java and Spark-based applications is a must.
● Knowledge of working with different file formats such as Parquet, JSON, and Avro.
● Data warehouse experience working with RDBMS and Hadoop; well versed in change data capture (CDC) and slowly changing dimension (SCD) implementation.
● Hands-on experience in writing code to efficiently handle files on Hadoop.
● Ability to work on projects following a microservices-based architecture.
● Knowledge of Amazon S3 is a plus.
● Ability to work proactively, independently, and with global teams.
● Strong communication skills: able to communicate effectively with stakeholders and present the outcomes of analysis.
● Working experience in projects following Agile methodology is preferred.
● 3 to 9 years of working experience.
● Knowledge of technologies such as Hadoop, Hive, Presto, Spark, Java, RDBMS (e.g. Teradata), and file storage (e.g. S3) is necessary.
If you are interested in this job, please apply using the APPLY button in the top right corner. If you are not interested or unavailable, feel free to refer a friend; you could earn a cash payout through our Share & Earn program.
Recruitment license number: 16S8116
Closing on 11 Feb 2021