
Data Engineer, Singapore Global Network

Careers@Gov · Singapore · Full-time

2-5 years · Posted 6 days ago

Quick Summary

  • Design and implement data pipelines using Databricks workflows and notebooks
  • Build and maintain medallion architecture for data processing and transformation
  • Implement data governance practices using Unity Catalog for data discovery and access control

Full Description

[What the role is]

The Singapore Global Network (SGN) is a division of the Economic Development Board tasked with re-energising Singapore's efforts to build a strong and extensive ecosystem of overseas Family, Friends and Fans (3F) for Singapore. SGN leads the Whole-of-Government effort to coordinate across agencies and strengthen our collective networks, whose skills, affinity and influence can help build linkages to global business and talent communities and enhance Singapore's international mindshare as a leading, globally connected city.

[What you will be working on]

As a member of SGN's digital products team, you will develop and maintain data infrastructure and pipelines that enable SGN's analytics and reporting capabilities. You will work primarily with Databricks to implement robust data architectures that support our network analysis and business intelligence needs.

We are looking for a Data Engineer who is comfortable building end-to-end data solutions using Databricks and AWS services. This role focuses on creating scalable data pipelines, implementing data governance practices, and preparing clean, reliable datasets for reporting and analytics. Responsibilities will include, but are not limited to:

  • Design and implement data pipelines using Databricks workflows and notebooks

  • Build and maintain medallion architecture (Bronze, Silver, Gold layers) for data processing and transformation (a brief illustrative sketch follows this list)

  • Develop ETL processes to integrate data from various SaaS platforms and internal systems

  • Implement data governance practices using Unity Catalog for data discovery, lineage, and access control

  • Create and maintain Jupyter notebooks for data exploration and collaborative analytics

  • Design scalable data storage solutions using AWS services (S3, RDS, Athena)

  • Build data models and prepare datasets optimised for reporting and business intelligence

  • Implement ML pipelines for basic predictive analytics and data insights

  • Ensure data quality, validation, and monitoring across all data workflows

  • Collaborate with stakeholders to understand data requirements and deliver analytics-ready datasets
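
For illustration only, here is a rough sketch of the Bronze-to-Silver portion of the medallion pattern referenced above, written in PySpark as it might appear in a Databricks notebook. It assumes the SparkSession is already available as the variable spark (as in Databricks notebooks); the S3 path and Unity Catalog table names are hypothetical and are not drawn from this role description.

    # Minimal Bronze-to-Silver sketch (PySpark on Databricks).
    # Assumes `spark` is the active SparkSession; the S3 path and
    # Unity Catalog table names below are hypothetical examples.
    from pyspark.sql import functions as F

    # Bronze: land raw CSV exports as-is, stamped with an ingestion time.
    bronze = (
        spark.read.option("header", True)
        .csv("s3://example-bucket/raw/contacts/")
        .withColumn("_ingested_at", F.current_timestamp())
    )
    bronze.write.mode("append").saveAsTable("main.bronze.contacts")

    # Silver: normalise and deduplicate into an analytics-ready table.
    silver = (
        spark.table("main.bronze.contacts")
        .withColumn("email", F.lower(F.trim("email")))
        .filter(F.col("email").isNotNull())
        .dropDuplicates(["email"])
    )
    silver.write.mode("overwrite").saveAsTable("main.silver.contacts")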

[What we are looking for]

Requirements

Resourcefulness, integrity, drive, and strong collaboration skills are essential. We're looking for candidates with:

  • 2-5 years of data engineering experience, with demonstrated ability to work independently on data pipeline projects

  • Diploma or degree in Computer Science, Engineering, Information Technology, Data Science or related disciplines

  • Proficiency with Databricks platform including workflows, notebooks, and cluster management

  • Experience implementing medallion architecture for data lake solutions

  • Hands-on experience with Unity Catalog for data governance and cataloguing

  • Strong Python and SQL skills for data manipulation and transformation

  • Experience with Jupyter notebooks and collaborative data science environments

  • Working knowledge of AWS data services (S3, Athena, RDS)

  • Understanding of ETL/ELT principles and data pipeline design patterns

  • Basic knowledge of machine learning concepts and ML pipeline implementation

  • Singapore Citizen or Permanent Resident

Preferred Qualifications

  • Databricks Certified Data Engineer Associate or Professional

  • Experience with AWS data services and Government Commercial Cloud (GCC)

  • Previous work integrating CRM systems like Salesforce with data platforms

  • Experience with data visualisation tools and business intelligence platforms such as Power BI or Tableau

  • Knowledge of data governance frameworks and compliance requirements

  • Understanding of event-driven data architectures and real-time processing

Ready to apply?

This role is still accepting applications

Apply on the company's site