Experience Required: 3–7 years
What You’ll Do
• Design, build, and maintain scalable data pipelines and ETL/ELT workflows to ingest, transform, and process large volumes of structured and semi-structured data.
• Develop and optimize data models, tables, and transformations to support analytics, reporting, and downstream data consumption.
• Work with large datasets using SQL, PySpark, and modern data platforms such as Snowflake and Databricks to ensure efficient data processing.
• Build and manage data workflows using orchestration tools such as Apache Airflow, ensuring reliable and timely data delivery.
• Develop automation scripts using Shell Scripting and Python to support data pipeline execution, monitoring, and operational efficiency.
• Monitor, troubleshoot, and optimize data pipelines to improve performance, scalability, and reliability across the data ecosystem.
• Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and enable data-driven insights.
• Ensure adherence to data engineering best practices, including data quality checks, documentation, and pipeline governance.
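As a small, hypothetical illustration of the kind of pipeline work described above — ingesting raw records, transforming them with SQL, and enforcing a data-quality check before publishing — here is a minimal sketch using only Python's standard-library sqlite3. The table names, columns, and sample data are all invented for the example; in practice these steps would run on platforms such as Snowflake or Databricks, orchestrated by Airflow.

```python
import sqlite3

# Hypothetical raw feed: order records as they might arrive from an upstream
# source, including one bad record with a missing amount.
raw_orders = [
    ("o1", "2024-01-05", "149.90"),
    ("o2", "2024-01-06", "89.50"),
    ("o3", "2024-01-06", None),  # bad record: missing amount
]

conn = sqlite3.connect(":memory:")

# Ingest: land raw records in a staging table as-is.
conn.execute("CREATE TABLE staging_orders (id TEXT, order_date TEXT, amount TEXT)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)", raw_orders)

# Transform: cast amount to numeric, dropping rows with a missing amount.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id, order_date, CAST(amount AS REAL) AS amount
    FROM staging_orders
    WHERE amount IS NOT NULL
""")

# Data-quality check: fail fast if any published row has a non-positive amount.
bad = conn.execute("SELECT COUNT(*) FROM orders WHERE amount <= 0").fetchone()[0]
assert bad == 0, f"{bad} rows failed the quality check"

rows = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(rows)  # 2 valid rows published
```

The same staging → transform → validate shape scales up directly: swap sqlite3 for a warehouse connection, the list for a real ingest, and the assert for an Airflow task that halts the DAG on failure.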
What We’re Looking For
• Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
• 3–8 years of experience in data engineering or data platform development within large-scale data environments.
• Strong proficiency in SQL, Python, and distributed data processing frameworks such as PySpark.
• Hands-on experience with modern data platforms such as Snowflake and Databricks.
• Experience building and managing workflow orchestration pipelines using Apache Airflow.
• Exposure to Shell Scripting for automation and operational workflow management.
• Strong understanding of data modeling, ETL/ELT processes, and scalable data pipeline architecture.
• Ability to collaborate effectively with cross-functional teams and global stakeholders.
Nice to Have
• Experience working with cloud platforms such as AWS, GCP, or Azure.
• Familiarity with data lakehouse architectures, data governance, and modern data platform practices.
• Proven ability to work with global stakeholders in cross-functional, matrixed environments.