£58K/yr to £65K/yr
Bristol, England
Permanent, Variable

Lead Data Engineer

Posted by Artis Recruitment.

Lead Data Engineer required by our market-leading, award-winning, retail sector client based in Avonmouth. Our client has recently migrated to Microsoft Dynamics Business Central and requires a Lead Data Engineer to lead the development, maintenance, support, and integration of an increasing array of data sources into an Azure-hosted SQL data warehouse and the Power BI platform.

This is a hybrid role with 2-3 days a week onsite, although there is flexibility.

The Lead Data Engineer will oversee the technical aspects of Reporting & BI solutions within the Technology team, providing robust reporting capabilities to support Commercial, Operational, and Finance functions.

Main Responsibilities

  • Analyse, optimise, and refactor existing stored procedures, ETL jobs, and data workflows to improve performance and reduce load times on the current data warehouse.
  • Clean up and enhance the efficiency of the current Power BI Premium (P1) capacity, ensuring optimal resource allocation and utilisation as data volumes and complexity grow.
  • Review and maintain the scheduling of data jobs to ensure timeliness and reliability, improving operational processes to meet evolving business needs.
  • Implement data quality checks, governance, and performance monitoring for both ETL processes and reporting outputs.
  • Lead and actively execute the long-term migration of the on-premises data warehouse to Microsoft Fabric, including hands-on development and implementation of a robust strategy for transitioning datasets, ETL pipelines, and reporting solutions to the cloud.
  • Build scalable ETL processes using Azure Data Factory, Azure Synapse, and other Fabric-native tools, ensuring a smooth integration between the on-prem and future cloud environments.
  • Design and implement Azure-based data architecture, leveraging tools like Azure Data Lake, Azure Synapse Analytics, and Fabric's data capabilities to future-proof the platform.
  • Utilise Azure tools such as Synapse Analytics, Logic Apps, and Azure Functions to streamline data processing and orchestration across the business.
  • Empower stakeholders to build their own reports and dashboards by enhancing data accessibility and developing self-service capabilities.

Required Background

  • Extensive experience in maintaining and optimising on-premises SQL Server data warehouses, focusing on stored procedures, indexing, partitioning, and load performance.
  • Deep understanding of Azure cloud tools such as Azure Synapse Analytics, Azure Data Lake, Azure Data Factory, and Logic Apps.
  • Hands-on experience with Microsoft Fabric or similar modern cloud platforms, with a strong focus on data migration strategies and ETL development.
  • Proficiency in writing advanced SQL queries, stored procedures, and Python scripts for data manipulation, pipeline development, and automation.
  • Experience with ETL tools (both on-prem, like SSIS, and cloud-based, like Azure Data Factory) for creating, optimising, and maintaining data pipelines.
  • Skilled in Power BI Premium, including data modelling, DAX, performance tuning, and advanced reporting features like composite models and incremental refresh.
  • Strong understanding of data governance, security best practices, and compliance requirements (GDPR, etc.).
  • Experience working with Agile methodologies.
  • Excellent communication skills, capable of collaborating with both technical and non-technical stakeholders to translate business needs into effective data solutions.

The 'Nice to Haves'

  • Experience with Microsoft Dynamics 365 Business Central.
  • Experience defining and operating within data management best-practice frameworks.
  • Experience working in a retail environment.

This fantastic opportunity comes with a competitive starting salary, accompanied by a range of benefits: a 7-10% annual bonus, a Health Cash Plan, Life Assurance, 25 days' annual leave, and a generous staff discount, to name but a few.