At Axelerant, as a Senior Data Engineer, you will work on meaningful data challenges using modern tools and platforms. You'll help architect and build data ingestion, processing, and analytical pipelines that support large-scale digital experiences for global clients.
In this senior role, you will guide a small team of data engineers (2-3 developers), lead technical discussions, and work closely with customers to understand their requirements and platform concerns. This position gives you the opportunity to deepen your skills across cloud data engineering, collaborate with cross-functional teams and client stakeholders, and contribute to building reliable and scalable data systems that drive business insights.
Your Job Responsibilities
- Design, build, and maintain scalable and reliable data pipelines (ETL/ELT processes) for structured and unstructured data, ensuring data accuracy, performance, and availability.
- Lead and mentor a small team of data engineers, providing technical guidance, conducting code reviews, and ensuring high-quality deliverables.
- Contribute to the design and optimization of data architectures, storage layers, and transformation workflows for performance and scalability.
- Work closely with customers and other stakeholders to understand their data requirements and platform concerns, translating these insights into solutions that support their analytics and reporting objectives.
- Lead technical discussions, architecture reviews, and design sessions to ensure alignment on data engineering best practices and solution approaches.
- Monitor, troubleshoot, and optimize data workflows and pipelines to ensure reliability, efficiency, and timely data availability.
- Apply and enforce best practices for coding, testing, documentation, and version control in all data engineering projects.
- Collaborate with data scientists, analysts, backend engineers, and product teams to enable analytics and machine learning workflows.
- Support and implement data governance, data quality, and security guidelines throughout the data pipeline and storage solutions.
- Stay informed about emerging data engineering tools and technologies, and propose improvements or innovative solutions when relevant to enhance performance or reliability.
Skills, Knowledge, and Expertise
- 5+ years of experience in data engineering or a related backend engineering role, with a proven track record of delivering data solutions.
- Proven ability to lead small teams or technical projects, mentor junior engineers, and guide work to successful completion.
- Strong programming skills in Python, Java, or Scala for building data workflows and pipelines.
- Solid understanding of data modeling, database design, and query optimization for both relational and NoSQL systems.
- Extensive experience with cloud platforms such as AWS, Azure, or Google Cloud, including their managed data services.
- Experience with data warehousing technologies (e.g., Redshift, Snowflake, Databricks) and understanding of lakehouse architectures.
- Hands-on experience with distributed processing and streaming tools such as Apache Spark, Apache Kafka, or Apache Flink.
- Exposure to data visualization or analytics tools (e.g., Apache Superset, Tableau) and understanding of how data is consumed for insights.
- Experience with workflow orchestration platforms like Apache Airflow or Prefect for scheduling and managing data pipelines.
- Knowledge of containerization tools such as Docker and a basic understanding of Kubernetes for deploying data services.
- Strong understanding of data governance, data quality principles, and security best practices in data engineering.
- Excellent communication and collaboration skills for working with both technical and non-technical teams, including direct engagement with customer stakeholders to translate requirements into technical solutions.
Good to Have
- Experience with real-time streaming systems and event-driven architectures.
- Familiarity with CI/CD pipelines and DevOps concepts as they relate to data engineering projects.
- Understanding of machine learning model deployment and operational workflows (MLOps).
- Exposure to multi-cloud or hybrid cloud environments.
- Certifications on relevant platforms (e.g., AWS, Azure, GCP, Snowflake) that demonstrate your expertise.