Job Purpose

As a Data Engineer, you'll be at the forefront of enhancing our client's data capabilities, ensuring every bit of data is optimised for success. You will collaborate closely with their Data Scientists, crafting data models and visualisations that drive strategic decision-making across a range of impactful use cases. Your expertise in programming languages and data analysis tools, together with strong analytical and communication skills, will be the key to unlocking their data's full potential.

Responsibilities

- Support the design and maintenance of data integrity checks across our client's various systems.
- Collaborate with the Data Scientists to understand data needs, representing key data insights in a meaningful way.
- Build data pipelines that bring together information from different source systems, integrating, consolidating, and cleansing data and structuring it for use in analytics applications.
- Collaborate with Developers to design and implement scalable data pipelines for efficient data processing and analysis.
- Conduct exploratory data analysis to identify patterns, trends, and anomalies in financial data, and provide recommendations for actionable insights to the Data Scientist.
- Focus on collecting and preparing data for use by Data Scientists and analysts.
- Solve challenging data integration problems using optimal ETL patterns, frameworks, and query techniques, sourcing from structured and unstructured data sources.
- Support the design, build, and launch of collections of sophisticated data models and visualisations that support multiple use cases across different products or domains.
- Optimise dashboards, frameworks, and systems to facilitate easier development of data artifacts.
- Undertake technical or any other relevant training as and when required.
- Be proactive in developing your knowledge of the industry and the market.
- Undertake and record relevant Continuous Professional Development (CPD) to develop knowledge and skills.

Competencies

Technical/Qualifications
- Degree, ideally in computer science, IT, statistics, analytics, mathematics, or another related field.
- Proven experience in data engineering, data analysis/processing, machine learning techniques, data warehouses, and data pipelines, preferably in a Financial Services or Wealth Management environment.

Systems/Internal Processes
- Familiarity with programming languages such as Python, R, or Java, and with data analysis libraries (e.g. Pandas, NumPy, scikit-learn).
- Understanding of database technologies and ETL processes, with SQL proficiency for data manipulation, data mining, and querying.
- Knowledge of Big Data tools (Spark or Hadoop a plus).
- Power BI and dashboard design/development.

Regulatory Awareness/Compliance
- Uphold the Regulatory/Compliance requirements relevant to your role, escalating areas of concern or issues in a timely manner.

Core Competencies/Skills
- Excellent analytical, statistical, and problem-solving capabilities.
- Ability to work independently and proactively.
- Ability to analyse large amounts of data and draw from it deductions and decisions that will add to the growth of the business.
- Strong vision for developing enhancements to systems.
- Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
- Highly motivated and adaptable, with a passion for leveraging data-driven insights to solve business challenges and drive strategic decision-making.

Morgan McKinley is acting as an Employment Agency and references to pay rates are indicative.

BY APPLYING FOR THIS ROLE YOU ARE AGREEING TO OUR TERMS OF SERVICE WHICH TOGETHER WITH OUR PRIVACY STATEMENT GOVERN YOUR USE OF MORGAN MCKINLEY SERVICES.
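The pipeline responsibility described above (bringing together data from different source systems, then integrating, consolidating, and cleansing it for analytics) can be sketched in a few lines of Pandas. This is purely illustrative: the source systems, column names, and values below are hypothetical, not taken from the role.

```python
import pandas as pd

# Hypothetical extracts from two source systems with overlapping client records.
crm = pd.DataFrame({"client_id": [1, 2, 3], "name": ["Ann", "Bob", "Cara"]})
billing = pd.DataFrame({"client_id": [2, 3, 4], "balance": ["150.0", None, "80.5"]})

# Integrate: outer-join on the shared key so no record from either system is lost.
merged = crm.merge(billing, on="client_id", how="outer")

# Cleanse: coerce string amounts to numbers and fill missing balances.
merged["balance"] = pd.to_numeric(merged["balance"]).fillna(0.0)

# Structure for analytics: one tidy, consistently ordered row per client.
tidy = merged.sort_values("client_id").reset_index(drop=True)
print(tidy)
```

In a production pipeline the same integrate/cleanse/structure steps would typically run in an orchestrated ETL framework rather than a script, but the shape of the work is the same.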
Data Engineer | Python | PySpark | SQL | ETL | Azure | Databricks | AWS

Data Engineer - Up to £60,000 + Bonus & Benefits
Sports Analytics Agency
London - Hybrid working (2 days in office)

Are you a Data Engineer with experience in developing bespoke ETL pipelines and data infrastructure? Are you passionate about the world of sports? If so, this may be the perfect opportunity for you!

Method Resourcing have partnered with an industry-leading Sports Analytics agency who are looking for a mid-level Data Engineer to join their team and work with some of the biggest names in Sports. The Data Engineer will be responsible for gathering requirements from end clients and designing data infrastructure to integrate data sources, including bespoke ETL pipelines and the creation of storage solutions.

We are looking for a Data Engineer with hands-on experience in most of the following areas:
- Python/PySpark coding and development, as well as SQL
- Strong experience with cloud data platforms - Azure, AWS, GCP
- Hands-on API development - REST and SOAP principles
- Agile working methodologies
- Previous experience in a Sports and/or Consultancy setting

The role pays up to £60,000 DOE + Bonus & Benefits. We are looking for a candidate in or around London who is happy to work a 2-day in-office hybrid model. Please note that we are unable to provide visa sponsorship for this role.

Please apply now for immediate consideration!