Job Summary

$6,800 - $11,900 SGD / Monthly EST

Job Type


Years of Experience
At least 6 years

Tech Stacks
REST API
Data Extraction
Shell Script

Job Description

You need to be an experienced Informatica BDM technology leader who understands distributed computing, ETL/ELT processes and the Cloudera Hadoop ecosystem, in order to design, develop and guide the team in creating and managing data ingestion pipelines from various source systems.


  • Experience in designing, creating, reviewing and performance-tuning ETL/ELT Informatica BDM jobs
  • Establish, maintain and enforce ETL architecture design principles, techniques, standards and best practices
  • Drive the technical design of ETL reference architecture to ensure high data quality, data integration performance and error recovery/handling
  • Review and assess existing ETL applications to support new features, performance improvements, upgrades, and ongoing sustainability
  • Ability to communicate and summarize workstream status effectively
  • Ability to maintain traceability from user stories to design, and from design to code
  • Conduct design reviews, code reviews, performance tuning
  • Perform an active, leading role in shaping and enhancing overall Informatica architecture, including standards, patterns and best practices
  • Research and recommend future improvements in the Informatica operational environments
  • Deliver a best-practice document (Pre-Install Checklist, Architecture, Job Scheduling, Multi-tenancy Setup, Fit for Use, Comparative Evaluation, Strengths/Weaknesses, Upgrade Strategy, Code Deployment Strategy) for each tool
  • Develop in-house knowledge repository for best practices, solution documentations, manuals, and procedures
  • Prepare a cheat sheet of each tool's features for the user guide
  • Experience in data governance tools such as Axon, EDC and DEQ
  • Use DES for streaming, with good knowledge of Kafka and Informatica BDM PowerExchange for CDC implementations
  • Experience in Hive, HBase, Kudu and Impala technologies
  • Python and Pyspark knowledge is good to have
  • Guide a team of ETL/DEQ/Axon developers or analysts to create data ingestion or governance workflows or pipelines
  • Deliver big data solutions based on on-premise Hadoop or cloud-based systems such as AWS; manage the Hadoop cluster and participate in scale-out planning and implementation
  • Design the ingestion layer for structured and unstructured data (text, voice, XML, etc.) and implement an insurance-specific data model for business and analytics use
  • Deliver ELT solutions covering data extraction, transformation, cleansing, integration and management; implement batch and near-real-time data ingestion pipelines based on reference architectures such as Lambda
  • Ability to augment existing data with new internal and external untapped sources, and contribute to the establishment and maintenance of the cloud computing platform and big data services
Job Qualifications

  • Master's or Bachelor's degree in Engineering/Computer Science or equivalent
  • Minimum 6 years of ETL/ELT experience, including Informatica BDM implementation experience on Hadoop
  • Familiar with Hadoop, Big Data, CCDAs, Data Governance – Axon & DEQ
  • Experience in shell scripting using HDFS, Hive and/or Pig
  • Experience in PWX, CDC and MySQL
  • 5+ years of experience with Apache Spark, Hive, Impala, Kudu and HBase
  • 5+ years of experience creating and designing REST APIs
  • Big data technologies: Apache Hadoop (Cloudera), Spark, Kafka, HBase, NoSQL DBs, Hive, Impala, Sqoop, Flume, PySpark
  • Data integration tools: Informatica BDM 10.4.1, Informatica Axon 7.0, any other ETL tool
  • Business acumen: Insurance or banking domain knowledge preferred.
  • Ability to manage a team and lead by example
  • Ability to create architectures and solutions that help the team meet growing data and cloud demands across data pipelines, AI and machine learning solutions, advanced analytics, open source and other emerging digital technologies
  • Strong analytical and problem solving skills
  • Strong understanding of ETL development best practices, database concepts, and performance tuning in SQL and Informatica
  • Strong knowledge of technology platforms and environments
  • Proven ability to work independently in a dynamic environment with multiple assigned projects and tasks
  • Ability to develop complex mappings and workflows in accordance with requirements