Role Name: Data Engineer / ETL & DWH Engineer
Role Overview
The Data Engineer / ETL & DWH Engineer is responsible for the full-lifecycle development and implementation of end-to-end data pipelines and for building the Snowflake data warehouse that supports business and analytics needs.
Key Responsibilities
- Provide day-to-day oversight and technical guidance to a team of Data Engineers
- Design and implement scalable data pipelines using Azure Data Factory
- Develop and optimize complex SQL queries for data extraction and transformation
- Build and maintain the Snowflake data warehouse environment
- Create and refine relational data models to support business intelligence and analytics
- Collaborate with cross-functional teams to understand data requirements and deliver solutions
- Establish best practices for data engineering processes and documentation
- Ensure data quality, security, and compliance across all data engineering initiatives
- Stay current with emerging technologies and industry trends in data engineering
Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, or related field
- 3-15 years of experience in data engineering roles
- Strong expertise in SQL, including complex query optimization
- Extensive experience with Snowflake, including architecture and performance tuning
- Proficiency in Azure Data Factory for building and managing ETL/ELT pipelines
- In-depth knowledge of relational data modeling techniques
- Ability to manage one's own workload and work independently to accomplish goals
- Excellent problem-solving and analytical skills
- Strong communication skills and ability to work with both technical and non-technical stakeholders
Preferred Qualifications
- Azure Developer Certification
- Master's degree in a relevant field
- Experience with Python
- Knowledge of data governance and compliance regulations
- Experience with version control systems (Git, Azure DevOps)
- Knowledge of Agile methodology
- Familiarity with data visualization tools (e.g., Tableau, Power BI)
- Experience with big data technologies (e.g., Hadoop, Spark)