Responsibilities
- 6-10 years of experience developing and architecting Snowflake and AWS data engineering solutions.
- Architect enterprise-level Snowflake solutions. Design and implement complex data warehousing solutions. Optimize warehouse configurations and resource utilization.
- Design, build, and operationalize large-scale enterprise data solutions and applications using AWS services such as S3, Glue, Lambda, SNS, EC2, IAM, and KMS, and Snowflake features (Snowpipe, streams, tasks, stored procedures).
- Design complex SQL-based data transformations with query performance and best practices in mind. Develop and maintain stored procedures and functions. Design efficient data models and schema structures.
- Architect Python-based data processing solutions. Design reusable Python libraries and frameworks.
- Implement automated testing strategies. Develop complex ETL/ELT processes using Python. Integrate Python with Snowpark.
- Collaborate with stakeholders to gather requirements, propose the best-fit cloud solutions, and implement them effectively.
- Follow established project execution processes, coding standards, and industry best practices.
- Stay current on AWS, Snowflake, and data security principles.
Qualifications
Must have Skills:
- Extensive experience with Snowflake native features: streams, tasks, stored procedures, Snowpipe, and data sharing.
- Extensive experience with AWS services: S3, Glue, Glue Crawler, Lambda, SNS, EC2, IAM, and KMS.
- Hands-on Python experience and expertise in SQL.
- Proficiency with CI/CD pipelines in GitHub.
- Proven track record of leading teams of 2-5 members.
- Experience collaborating with business stakeholders.
Good to Have Skills:
- Hands-on experience with or familiarity with one or more of the following: databases, data visualization (Power BI), scripting, ETL (Informatica PowerCenter), and data mesh architectures.
- Cloud certifications in AWS or Snowflake are nice to have.