Job Summary
Key Responsibilities
2. To conduct comprehensive code reviews; establish and oversee quality assurance processes; and drive performance optimization and the implementation of best practices and coding standards to ensure successful delivery of complex projects.
3. To ensure process compliance in the assigned module and participate in technical discussions/reviews as a technical consultant for feasibility studies (technical alternatives, best packages, supporting architecture best practices, technical risks, breakdown into components, estimations).
4. To collaborate with stakeholders to define project scope, objectives, and deliverables, and to prepare and submit status reports that minimize exposure and support the closure of escalations.
Skill Requirements
Essential Skills and Experience
- Strong backend development experience in .NET Core or Python.
- Hands-on experience with Azure Data Factory, Databricks, and Azure-based data ecosystems.
- Proficiency in SQL, data modeling, and data warehousing concepts.
- Experience working with Delta Lake and Azure Data Lake Storage (ADLS).
- Solid understanding of ETL/ELT frameworks, data governance, and data security.
- Ability to communicate complex technical concepts to non-technical stakeholders.
- Strong analytical, problem-solving, and organizational skills.
Desired Qualifications
- Certifications such as Azure Data Engineer, AWS Data Analytics, Google Professional Data Engineer, or DAMA.
- Experience with data visualization tools (Power BI, Tableau).
- Experience with additional programming languages (Java, Scala).
- Knowledge of regulatory requirements and industry-standard data models.
Other Requirements
Position Overview
We are seeking a highly skilled Data Engineer & Backend Developer to design, build, and optimize data pipelines, data models, and backend APIs supporting our Retail portfolio. This hybrid role combines strong data engineering expertise with backend development skills in .NET Core or Python, enabling efficient data movement to and from our Data Hub Architecture (DHA). The ideal candidate is collaborative, analytical, and passionate about building scalable data systems that deliver business value.
Key Accountabilities
Backend Development
- Design, develop, and maintain RESTful APIs using .NET Core or Python to read/write data from/to the Data Hub Architecture (DHA).
- Integrate APIs with Azure Data Factory, Databricks, and downstream applications.
- Implement authentication, authorization, error handling, and logging within backend services.
Data Engineering
- Develop ETL/ELT pipelines in Azure Data Factory and Databricks.
- Create conceptual, logical, and physical data models to support business requirements.
- Build and maintain Delta Lake tables in ADLS for staging, reference, and historical use cases.
- Integrate usage history, weather data, pricing curves, and product terms into standardized data models.
- Define and enforce data architecture standards, data governance, quality, and archival strategies.
- Support data migration from legacy systems to modern cloud-based platforms.
- Enable automated workflows using Azure Notebooks and Databricks jobs.
- Collaborate closely with cross-functional teams to align engineering practices with organizational goals.