Technical lead - Data Scientist
Key skills: Python, Machine Learning, Data Science, SQL, Pandas, ML Frameworks, Snowflake, ETL/ELT
Location: Chennai or Pune
Notice: Immediate to 30 days
Model: Hybrid (monthly 8 days in office)
Send your resume to [email protected]
A Data Scientist specializing in Python and structured-data machine learning analyzes structured datasets and builds predictive models using Python-based ML frameworks. The role centers on extracting insights from structured data, building machine learning models, and optimizing data-driven decision-making.
Key Responsibilities
- Develop ML models for structured data analysis.
- Implement data preprocessing pipelines using Python (Pandas, NumPy).
- Optimize feature engineering for structured datasets.
- Work with SQL databases and data warehouses (Snowflake, BigQuery).
- Train and evaluate models using Scikit-learn, TensorFlow, or PyTorch.
- Deploy ML models using MLOps frameworks (MLflow, Kubeflow).
- Collaborate with data engineers and analysts to ensure data quality.
- Perform hyperparameter tuning and model optimization.
- Ensure data security, governance, and compliance.
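As a rough illustration of the preprocessing, training, evaluation, and hyperparameter-tuning responsibilities above, here is a minimal scikit-learn sketch on synthetic structured data (the column names, target, and parameter grid are all hypothetical, not taken from the role description):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic structured dataset (hypothetical columns)
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "age": rng.integers(18, 70, 500),
    "income": rng.normal(50_000, 15_000, 500),
    "tenure_months": rng.integers(1, 120, 500),
})
df["churned"] = (df["tenure_months"] < 24).astype(int)  # toy target

X, y = df.drop(columns="churned"), df["churned"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# Preprocessing and model combined into one pipeline
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(random_state=0)),
])

# Hyperparameter tuning via cross-validated grid search
grid = GridSearchCV(
    pipe,
    param_grid={"clf__n_estimators": [50, 100], "clf__max_depth": [3, None]},
    cv=3,
)
grid.fit(X_train, y_train)
acc = accuracy_score(y_test, grid.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

In practice the same pipeline shape extends to categorical encoders and custom feature-engineering steps, which keeps preprocessing and tuning reproducible in a single object.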
Required Skills
- Python programming for ML and data manipulation.
- SQL proficiency for structured data querying.
- Experience with ML frameworks (Scikit-learn, TensorFlow, PyTorch).
- Knowledge of ETL processes for structured data.
- Understanding of data warehousing concepts.
- Familiarity with cloud platforms (AWS, Azure, GCP).
Preferred Qualifications
- Experience with data visualization tools (Tableau, Power BI).
- Knowledge of data governance frameworks (GDPR, HIPAA).
- Familiarity with automated ML workflows.
A Snowflake and Python Developer focuses on cloud-based data solutions, leveraging Snowflake for data warehousing and Python for scripting, automation, and data processing.
A Snowflake and Python Developer is responsible for designing, developing, and optimizing data solutions using Snowflake’s cloud data platform and Python-based ETL processes.
Key Responsibilities
- Develop ETL/ELT pipelines using Python and Snowflake.
- Design and implement data models and schemas in Snowflake.
- Write and optimize SQL queries for data transformation and reporting.
- Integrate data from various sources into Snowflake.
- Implement Snowflake Tasks and Streams for real-time data processing.
- Ensure data security, governance, and compliance.
- Collaborate with data engineers and analysts to build scalable solutions.
- Perform performance tuning and query optimization in Snowflake.
- Automate workflows using Python scripts.
Required Skills
- Strong SQL proficiency for Snowflake.
- Python programming for data manipulation and automation.
- Experience with ETL tools (e.g., Apache NiFi, Talend, Fivetran).
- Knowledge of cloud platforms (AWS, Azure, GCP).
- Understanding of data warehousing concepts.
- Familiarity with Snowflake features like Snowpipe, Streams, and Tasks.
Preferred Qualifications
- Experience with data visualization tools (Tableau, Power BI).
- Knowledge of data governance frameworks (GDPR, HIPAA).
- Familiarity with machine learning workflows in Snowflake.
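A minimal sketch of what an ELT step like those described above might look like: the transformation runs locally in pandas, and the load into Snowflake via the official `snowflake-connector-python` package is shown as commented-out code, since it needs real account credentials (the table, columns, and connection parameters here are hypothetical):

```python
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and aggregate raw order rows before loading into Snowflake."""
    out = raw.dropna(subset=["order_id", "amount"]).copy()
    out["order_date"] = pd.to_datetime(out["order_date"])
    return (
        out.groupby("customer_id", as_index=False)
           .agg(total_amount=("amount", "sum"),
                order_count=("order_id", "count"))
    )

# Hypothetical raw extract; the last row is dropped for missing values
raw = pd.DataFrame({
    "order_id": [1, 2, 3, None],
    "customer_id": ["a", "a", "b", "b"],
    "order_date": ["2024-01-01", "2024-01-05", "2024-02-01", "2024-02-02"],
    "amount": [10.0, 20.0, 5.0, None],
})
summary = transform_orders(raw)
print(summary)

# Loading the result into Snowflake would use the official connector, e.g.:
# import snowflake.connector
# from snowflake.connector.pandas_tools import write_pandas
# conn = snowflake.connector.connect(account="...", user="...", password="...")
# write_pandas(conn, summary, "CUSTOMER_ORDER_SUMMARY")
```

Continuous ingestion and incremental processing of such tables would then typically be handled with Snowpipe, Streams, and Tasks on the Snowflake side.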