Experience: 6 to 10 years
Roles & responsibilities
- Design, develop, and maintain ETL/ELT data pipelines using Airflow, Python, and dbt (see the sketch after this list)
- Build and optimize data models in Snowflake to support analytics and reporting needs
- Implement best practices for data quality, validation, and governance
- Collaborate with analysts, data scientists, and business stakeholders to understand data requirements
- Monitor, debug, and optimize existing data workflows for performance and cost efficiency
- Manage integrations between various data sources (APIs, databases, third-party systems)
- Contribute to the automation of data infrastructure deployment and management
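
As a rough illustration of the day-to-day pipeline work above, the sketch below shows a minimal Airflow DAG that runs a dbt project after an extraction step. The dag_id, schedule, script path, and dbt project directory are hypothetical placeholders, not a prescribed setup.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: extract from a source system, then run dbt
# models against the warehouse. All ids and paths are placeholders.
with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_source_data",
        bash_command="python /opt/pipelines/extract_sales.py",  # placeholder script
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/sales",  # placeholder project dir
    )
    extract >> transform  # run the dbt transformation only after extraction succeeds
```

A team might instead use a dedicated dbt integration for Airflow; the plain BashOperator version is shown only because it has the fewest dependencies.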
Preferred candidate profile
- 6-10 years of experience in data engineering with Snowflake or similar ETL/ELT platforms
- Proficiency in SQL and in building analytical data models
- Experience with Python or a comparable scripting language
- Hands-on experience with Snowflake data warehouse development and performance tuning (see the validation sketch after this list)
- Experience in building and orchestrating data workflows using Apache Airflow
- Solid understanding of dbt (data build tool) for modular data transformation
- Nice to have: experience with Git, CI/CD pipelines, and modern data stack tooling
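
To illustrate the kind of SQL-plus-Python validation work this profile implies, here is a minimal sketch using the snowflake-connector-python client to run a row-count data quality check. The warehouse, database, schema, and table names are assumptions for the example, not part of any actual environment.

```python
import os

import snowflake.connector

# Connection parameters are placeholders; in practice these would come from
# a secrets manager or an Airflow connection rather than environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="ANALYTICS",      # hypothetical database
    schema="MARTS",            # hypothetical schema
)

try:
    cur = conn.cursor()
    # Simple validation: fail if the fact table received no rows for yesterday.
    cur.execute(
        """
        SELECT COUNT(*)
        FROM fct_orders  -- hypothetical dbt-built fact table
        WHERE order_date = DATEADD(day, -1, CURRENT_DATE())
        """
    )
    (row_count,) = cur.fetchone()
    if row_count == 0:
        raise ValueError("Data quality check failed: no orders loaded for yesterday")
    print(f"Check passed: {row_count} rows loaded")
finally:
    conn.close()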