£45K/yr to £54K/yr
London, England
Permanent, Variable

Data Engineer

Posted by Sanderson.

Job Title: PySpark Data Engineer - Azure

Salary: up to £54,000

Mostly remote

About Us: Join our innovative team where PySpark expertise meets Azure ingenuity. We're passionate about leveraging data to drive business success and are looking for a skilled Data Engineer to join us in delivering high-impact solutions.

Key Responsibilities:

  • Develop secure, efficient data pipelines using PySpark for ingestion, transformation, and consumption within Azure.
  • Ensure data quality and adherence to best practices throughout the pipeline life cycle.
  • Design and optimise physical data models to meet business needs and storage requirements.
  • Collaborate with cross-functional teams to deliver BI solutions and reporting structures using Power BI.

Experience & Qualifications:

  • 2-5 years of experience in designing and implementing PySpark-based data solutions.
  • Proficiency in SQL and Azure technologies (Data Factory, Synapse).
  • Strong understanding of data life cycle management and CI/CD principles.
  • Experience working with large, event-based data sets in enterprise environments.
  • Excellent communication skills with a passion for leveraging data to drive business value.

Skills & Attributes:

  • Creative problem solver with a proactive and optimistic mindset.
  • Strong time management and organisational skills, capable of handling multiple priorities.
  • Enthusiastic team player committed to delivering shared outcomes.
  • Ability to mentor and support less experienced engineers.
  • Power BI knowledge is a bonus.

What We Offer:

  • Competitive salary and benefits package.
  • Opportunities for professional growth and advancement.
  • A collaborative and supportive work environment where your ideas are valued.

Join Us: If you're ready to harness the power of PySpark and Azure to shape the future of data analytics, apply now! Let's make it happen together.

NO SPONSORSHIP
