Job Summary
- 5+ years of experience in data warehouse, ETL/ELT, and cloud projects.
- Design and implement data pipelines using Snowflake/DBT, ensuring scalability, reliability, and performance.
- Develop ELT processes to extract, load, and transform data from various sources into Snowflake.
- Develop Python scripts to process, load, and transform data from various sources into Snowflake.
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Utilize DBT (Data Build Tool) and Airflow for orchestration, scheduling, and monitoring of data workflows.
- Optimize and tune existing data pipelines for improved efficiency and performance.
- Implement best practices for data modeling, ensuring data consistency and accuracy within Snowflake.
- Troubleshoot issues related to data pipelines, identifying and resolving bottlenecks or discrepancies.
- Stay updated with the latest advancements in Snowflake, DBT, Airflow, and other relevant technologies.
- Document processes, workflows, and configurations for reference and future improvements.
Must have:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience (3+ years) working as a Data Engineer or similar role.
- Proficiency in Snowflake, including designing data warehouses and optimizing queries.
- Proficiency in Python scripting and handling job flows.
- Proficiency in SQL and scripting languages for data manipulation and automation.
- Hands-on experience with DBT (Data Build Tool) and Airflow for data orchestration and workflow management.
- Experience with ETL/ELT processes and building scalable data pipelines.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills and ability to collaborate effectively within a team.
Key Responsibilities
1. Design, build, and optimize data models using Snowflake and dbt for efficient data processing.
2. Develop and optimize SQL queries for data extraction, transformation, and loading processes.
3. Utilize Python for scripting and automation of data processes and workflows.
4. Collaborate with cross-functional teams to understand data requirements and ensure the successful delivery of technical solutions.
5. Conduct code reviews, provide technical guidance, and mentor team members to enhance overall technical capabilities.
6. Troubleshoot technical issues and provide timely resolutions to ensure smooth project execution.
Skill Requirements
1. Experience with dbt (data build tool) for data transformation and modeling.
2. Strong command of SQL for querying and manipulating data in databases.
3. Proficient in Python programming for scripting, automation, and data manipulation.
4. Excellent problem-solving skills and the ability to communicate effectively with technical and non-technical stakeholders.
5. Strong leadership skills to effectively lead and motivate technical teams toward project success.