I. KEY RESPONSIBILITIES
- Design, build, and maintain scalable data pipelines and ETL workflows for analytics and business intelligence
- Process and transform data from multiple sources into reliable and queryable formats
- Develop and maintain data models, data flows, and related documentation
- Build reports, dashboards, and data solutions that deliver clear and actionable insights
- Work with stakeholders and cross-functional teams to understand business needs and translate them into technical solutions
- Support the design and improvement of data architecture, data integration, and data warehouse solutions
- Monitor, maintain, and optimize deployed data products to ensure quality, performance, and reliability
- Identify trends, patterns, and opportunities from large and complex datasets
- Stay current with modern data engineering, cloud, and analytics technologies
II. JOB QUALIFICATIONS
MUST HAVE
- Bachelor’s degree in Computer Science, Engineering, or a related field
- At least 3 years of hands-on experience as a Data Engineer or in a similar data-focused role
- Strong experience building and operationalizing data pipelines
- Strong SQL skills and solid understanding of data warehousing and ETL concepts
- Experience with Python or other programming languages such as Java
- Experience working with different data sources such as SQL databases, flat files, APIs, or cloud data sources
- Good understanding of data modeling, data integration, and data quality best practices
- Strong analytical and problem-solving skills
- Good written and verbal English communication skills
- Ability to work independently and collaborate effectively in a fast-paced environment
CLOUD-SPECIFIC EXPERIENCE
For AWS-focused candidates
- Experience with AWS or related cloud services
- Experience with big data technologies, especially Apache Spark
- Experience with ETL and workflow tools such as Airflow, SSIS, Pentaho, or Informatica
- Experience with databases such as MSSQL, Oracle, MySQL, PostgreSQL, or cloud data platforms such as Snowflake
- Experience designing data warehouse solutions from scratch is a strong plus
For GCP-focused candidates
- Strong experience with GCP, especially BigQuery
- Familiarity with related services such as Cloud Storage, Dataflow, and Pub/Sub
- Experience with BI/reporting tools such as Power BI, Tableau, or similar platforms
- Familiarity with Git, CI/CD, and modern DevOps practices
- Understanding of storage technologies such as data lakes, relational databases, NoSQL, and graph databases
NICE TO HAVE
- Experience with dashboarding and data visualization
- Understanding of Star Schema and other data warehouse design approaches
- Experience working in Agile environments
- Experience in cloud-based analytics ecosystems and modern data platforms
III. BENEFITS
- “FPT care” health insurance is provided by Petrolimex (PJICO) exclusively for FPT employees.
- Annual summer vacation follows company policy and starts in May each year.
- Salary review once per year.
- International, dynamic, and friendly working environment.
- Annual leave and working conditions follow Vietnam labor laws.
- Other benefits include sponsorship for studying and taking international certification exams, as well as a loan interest support policy for Fsofters.
- Shuttle bus for employees.
- Onsite allowance (if applicable).