Senior Data Engineer

Singtel

Job Summary


Salary
S$6,734 - S$12,452 / Monthly (Estimated)

Job Type
Permanent

Seniority
Senior

Years of Experience
At least 8 years

Tech Stacks
Oracle, Analytics, Fabric, Snowflake, Rust, Databricks, Presto, Spark, Flink, Airflow, Kafka, SQL, PostgreSQL, Scala, MySQL, Go, Python, Java

Job Description

Responsibilities

  • Design, develop and automate large-scale, high-performance distributed data processing systems (batch and/or real-time streaming) that meet both functional and non-functional requirements (a minimal example of such a pipeline is sketched after this list)
  • Deliver high-level and detailed designs to ensure that the solution meets business requirements and aligns with the data architecture principles and technology stacks
  • Partner with business domain experts, data scientists and solution designers to identify relevant data assets, domain data models and data solutions. Collaborate with product data engineers to coordinate backlog feature development of data pipeline patterns and capabilities
  • Own and lead data engineering projects and data pipeline delivery with reliable, efficient, testable and maintainable artifacts, including ingesting and processing data from a large number and variety of data sources
  • Build, optimize and contribute to shared data engineering frameworks, tooling, data products and standards to improve the productivity and quality of output for data engineers
  • Design and build scalable Data APIs to host operational data and data lake assets in a Data Mesh / Data Fabric architecture
  • Drive modern data platform operations using DataOps, ensuring data quality and monitoring of the data systems; also support the data science MLOps platform
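
For illustration, a minimal PySpark Structured Streaming sketch of the kind of batch/streaming pipeline described above, reading events from Kafka and landing them in a data lake table. The broker address, topic name, schema fields and storage paths are placeholders, and the job assumes the Spark Kafka connector and Delta Lake libraries are available on the cluster.

    # Minimal sketch: stream events from Kafka into a Delta table.
    # All names (topic, broker, schema fields, paths) are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructType, TimestampType

    spark = SparkSession.builder.appName("usage-events-stream").getOrCreate()

    # Expected shape of each Kafka message value (JSON).
    schema = (
        StructType()
        .add("subscriber_id", StringType())
        .add("event_type", StringType())
        .add("event_time", TimestampType())
    )

    # Read the raw stream from Kafka.
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")
        .option("subscribe", "usage-events")
        .load()
    )

    # Parse the JSON payload into typed columns.
    events = raw.select(
        from_json(col("value").cast("string"), schema).alias("e")
    ).select("e.*")

    # Append to a Delta table, with checkpointing for fault tolerance.
    query = (
        events.writeStream.format("delta")
        .option("checkpointLocation", "/lake/_checkpoints/usage_events")
        .outputMode("append")
        .start("/lake/bronze/usage_events")
    )
    query.awaitTermination()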

We are committed to a safe and healthy environment for our employees & customers and will require all prospective employees to be fully vaccinated.

The Ideal Candidate Should Possess

  • Bachelor’s degree in IT, Computer Science, Software Engineering, Business Analytics or equivalent
  • Minimum of 8 years of experience in Data Engineering, Data Lake Infrastructure, Data Warehousing, Data Analytics tools or related areas, designing and developing end-to-end scalable data pipelines and data products
  • Experience in building and operating large, robust distributed data lakes (multiple PBs) and deploying high-performance, reliable systems with monitoring and logging practices
  • Experience in designing and building data products and pipelines using some of the most scalable and resilient open-source big data technologies: Spark, Delta Lake, Kafka, Flink, Airflow, Presto and related distributed data processing frameworks (an orchestration sketch follows this list)
  • Excellent experience using ANSI SQL with relational databases such as Postgres, MySQL and Oracle, and knowledge of advanced SQL on distributed analytics engines
  • Experience working with telco data warehouse and/or data lake engines such as Databricks SQL, Snowflake, etc.
  • Proficiency in programming languages such as Scala, Python, Java, Go or Rust, or scripting languages such as Bash
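
As a companion to the orchestration experience listed above, a minimal Airflow DAG sketch that chains a hypothetical ingest task and an aggregation task into a daily batch pipeline. The DAG, task and function names are placeholders, and the snippet assumes Airflow 2.4+ (which accepts the schedule argument).

    # Minimal Airflow DAG sketch: a daily ingest-then-aggregate pipeline.
    # All identifiers here (dag_id, task names, callables) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_raw_events():
        # Placeholder: pull raw event files from an upstream source into the lake.
        print("ingesting raw events")


    def build_daily_aggregates():
        # Placeholder: trigger a Spark/SQL job that builds per-subscriber aggregates.
        print("building daily aggregates")


    with DAG(
        dag_id="daily_usage_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(
            task_id="ingest_raw_events",
            python_callable=ingest_raw_events,
        )
        aggregate = PythonOperator(
            task_id="build_daily_aggregates",
            python_callable=build_daily_aggregates,
        )

        # Run the aggregation only after ingestion succeeds.
        ingest >> aggregate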
