Job Description
We are looking for a skilled Data Warehouse Engineer with strong expertise in Snowflake, dbt, and SQL to join our Data Platform team. The ideal candidate will own end-to-end development of data pipelines, spanning ingestion, transformation, and output layers within Snowflake.
You will work closely with cross-functional teams to design scalable data models, ensure data quality, and support analytics and business intelligence initiatives.
Key Responsibilities
- Design, develop, and maintain end-to-end data pipelines in Snowflake
- Build and manage data ingestion frameworks from multiple sources (APIs, databases, flat files, etc.)
- Develop data transformation logic using dbt and SQL
- Create and maintain data models optimized for reporting and analytics
- Implement data quality checks, validations, and monitoring
- Optimize Snowflake performance and cost, including query tuning and warehouse usage
- Collaborate with analysts, data scientists, and business stakeholders
- Ensure proper documentation of data flows, models, and processes
- Support data governance and security best practices
- Troubleshoot and resolve data pipeline and performance issues
Required Skills & Qualifications
- Strong experience with Snowflake (end-to-end development)
- Advanced proficiency in SQL
- Hands-on experience with dbt (data build tool)
- Experience in data warehousing concepts (star schema, dimensional modeling)
- Knowledge of ETL/ELT processes and data pipeline architecture
- Familiarity with cloud platforms (AWS / Azure / GCP) is a plus
- Experience with version control (Git)
- Strong problem-solving and analytical skills
