Job Summary
The Technical Lead will be responsible for leading and managing the implementation, maintenance, and support of Snowflake, Azure Data Factory (ADF), and Databricks solutions. The role involves overseeing the technical aspects of projects related to data engineering, data integration, and data processing on these platforms.
Key Responsibilities
Role: Data Engineer – Azure Databricks
JD:
• 4–7 years of experience in data engineering, with at least 2+ years in Databricks.
• Deep hands-on expertise in Azure Databricks, PySpark, and Delta Lake.
• Strong experience with Azure Data Factory, Data Lake Storage Gen2, and Azure Synapse.
• Proficiency in SQL, Python, and distributed data processing.
• Experience with Unity Catalog, RBAC, and managing multi-tenant data platforms.
• Strong understanding of data modeling (dimensional and relational), performance tuning, and partitioning strategies.
• Familiarity with event-driven architectures, including Event Hubs, Kafka, or IoT data pipelines.
• Experience with Git, CI/CD tools, and Infrastructure as Code (Terraform, Bicep, or ARM templates).

Preferred Qualifications:
• Azure certifications (e.g., DP-203: Azure Data Engineer Associate).
• Exposure to streaming frameworks, ML model integration, and Power BI.
• Experience working in Agile delivery environments with strong documentation and stakeholder engagement skills.
Skill Requirements
1. Hands-on experience with Azure Data Factory (ADF), including data pipeline development, integration, and monitoring.
2. Strong knowledge of Databricks for big data processing, data engineering, and machine learning workflows.
3. In-depth understanding of data warehousing concepts, ETL processes, and data modeling techniques.
4. Excellent problem-solving skills and the ability to work in a fast-paced environment.
5. Strong communication and leadership skills to collaborate effectively with team members and stakeholders.