Role – Data Engineer (Snowflake)
Location – Remote
Experience – 2-4 years
Notice Period – Immediate to 30 days
About Fluidata
We build practical, high-performance data systems for fast-growing companies. Our team works across the full data stack (data engineering, analytics, automation, and AI) to turn messy operational data into insights leaders can rely on.
Our team is made up of people who learn fast, build with intention, and work directly on challenges that shape how organizations operate and make decisions. We’re growing quickly and are looking for people who want to grow with us.
Key Responsibilities
- Design, develop, and optimize scalable data pipelines and data architectures using Snowflake
- Write efficient, complex SQL queries for data transformation, validation, and analysis
- Build and manage ELT/ETL pipelines using tools such as Airflow or dbt
- Develop and maintain data models (star/snowflake schemas) to support analytics and reporting use cases
- Work directly with clients to gather requirements, understand business problems, and translate them into technical solutions
- Optimize Snowflake performance, including query tuning, clustering, and cost optimization
- Integrate data from multiple sources (APIs, databases, third-party systems) into Snowflake
- Ensure data quality, governance, and security best practices are followed
- Collaborate with cross-functional teams including analysts, data scientists, and business stakeholders
- Create documentation and present solutions, insights, and progress updates to clients
- Troubleshoot data issues and provide timely resolutions in a client-facing environment
- Stay updated with evolving data engineering tools, Snowflake features, and industry best practices
Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or related field
- 2+ years of experience in data engineering or related roles
- Strong hands-on experience with Snowflake (data modeling, performance tuning, data loading)
- Advanced proficiency in SQL (joins, CTEs, window functions, query optimization)
- Experience with ETL/ELT tools such as Airflow or dbt
- Understanding of data warehouse design and data modeling concepts
- Experience working with cloud platforms (AWS/GCP/Azure preferred)
- Familiarity with Python for data processing is a plus
- Experience integrating APIs and handling semi-structured data (JSON, Parquet, etc.)
- Strong problem-solving, communication, and stakeholder management skills
- Prior client-facing or consulting experience is highly preferred
What We Offer
- Fully remote work setup.
- Daily collaboration and guidance from the management team.
- Competitive compensation packages that reward high performance.
- Clear career growth paths toward management and leadership, with a supportive culture of continuous learning.
- Collaborative, team-based environment.
- A prominent client base and existing high-value relationships.
- A best-in-class team of consultants to work alongside.