NTUC INCOME INSURANCE CO-OPERATIVE LTD
Permanent, Full Time
S$5000 - S$8000
As Singapore’s leading composite insurer, NTUC Income knows that a data-driven culture is key to doing good business and ultimately, helping our customers reach their financial goals. With data collection already part of the way we work, our data team is pivotal in making data meaningful and actionable, so that we can all work smarter, make better decisions and create the best experiences for our customers, whether that’s in-person with Income advisors or on our online touchpoints.
This omnichannel customer experience is made possible by several artificial intelligence (AI) solutions that provide customized offerings and increased efficiency for our customers. Our customers enjoy complete peace of mind as we walk this journey with them. From developing innovative, data-driven mindsets to promoting continuous skills development, we’re also proud to say that our employees are with us on our transformational data journey. We know it will take everyone working together to build even further on our rich legacy and create value for both our customers and ourselves.
As a Data Modeller, you will help drive innovation, profitability, revenue, and business growth by applying advanced database modelling skills and real-time/batch data architecture design principles to support metrics, analytics, and self-service business intelligence across the enterprise. You will bring a deep understanding of data organization, database design, data governance strategy, ER data modelling, metadata, data lineage, data classification and taxonomy, table design, data security, semantic layer view design, data catalogues, and data linkage design. This expertise supports a unified data model used across the organization for analytics, operational reporting, executive dashboards, scorecards, regulatory reporting, AI/ML, and data insights on the big data Hadoop technology stack.
You will support the Data Lake at the core of driving data insights and analytics, marketing effectiveness, channel optimization, improved customer experience, customer acquisition, and the metrics used to measure compliance and improve operational efficiency. Working on a cross-functional team of talented ETL developers, business clients, software designers, and data engineers, you will play a key role in requirements, design, and development, delivering the next-generation solutions the business needs for insights on new product offerings and services, tactical data store and big data integration, and self-service metrics and business intelligence reporting.
You will gather detailed business requirements, work jointly with enterprise architects on data designs to support applications and metrics, design data interfaces and stores as required, create data models, and contribute to development and QA activities supporting data governance, operational reporting, insights, and analytics.
You will also need good knowledge of Data Management streams such as Metadata Management, Data Quality and Profiling, and Data Security to support Data Governance initiatives.
- Master's or Bachelor's degree in Engineering, Computer Applications, or an equivalent field
- Minimum 6 years of relevant data architecture, data strategy, and business analysis experience
- Experience as part of a data architecture team, driving projects from design to delivery
- Good knowledge in Data Warehousing and Data Lake architectures
- Familiar with DAMA principles, with Data Catalog experience
- Experience creating data architecture diagrams/flows, presentation slideware, and other architecture artifacts from requirements and business-user input, using ERwin Data Modeler or similar toolsets
- Able to write Hive and Impala queries for analysis purposes
- Able to use Informatica data governance tools such as Informatica Analyst, Axon, and EDC
- Data Modelling Tools: ERwin, ER Studio, or Embarcadero
- Data Governance Tools: Informatica DEQ, Informatica Analyst, EDC, Axon
- Big Data Technologies: Apache Hadoop (Cloudera), Spark, Kafka, HBase, NoSQL DBs, Hive, Impala, Sqoop, Flume, PySpark
- Data Integration Tools: Informatica BDM 10.4.1, or any other ETL tool
- Strong understanding of ETL Development best practices, Database Concepts, Performance Tuning in SQL and Informatica
- Strong knowledge of technology platforms and environments
- Proven ability to work independently in a dynamic environment with multiple assigned projects and tasks
- Ability to develop complex mappings and workflows in accordance with requirements
Closing on 10 Nov 2021