Data Engineer - London - AWS - Up to £70,000 + Benefits
Exciting opportunity to work with a forward-thinking company who offer their employees the chance to work with cutting-edge tech, grow their skills through fully-funded training programmes, and maintain a healthy work-life balance with modern working arrangements - up to 4 days a week from home - as well as a comprehensive benefits package.
You will be working within a global company that boasts offices across the world - from London to Melbourne, Singapore to the US, and many more in between! Within this multinational sits a multitude of high-level businesses that are all world leaders in their respective fields. The organisation celebrates success with a positive and supportive working culture - and you will see this first hand, as it takes the well-being of its employees very seriously, understanding that work-life balance is critical to performance and morale.
As an AWS Data Engineer, you will design, develop, and maintain data pipelines and architectures. You will be utilising AWS services such as Redshift, S3, and Glue, alongside tools such as Apache Spark and Kafka, to process large datasets. Your role will involve optimising data workflows, ensuring data quality, and implementing robust data security measures. You'll collaborate with data scientists and analysts to support data-driven decision-making, using technologies such as Python, SQL, and ETL frameworks. This position offers the opportunity to work with cutting-edge cloud technologies and contribute to innovative data solutions while growing your skills on the job!
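To give a flavour of the day-to-day work, here is a minimal, hedged sketch of the kind of pipeline task described above: a small PySpark job that reads raw data from S3, applies a basic quality check, and writes curated output back to S3. The bucket paths and column names are purely illustrative placeholders, not details of the client's actual stack.

```python
# Illustrative only: a minimal PySpark ETL step of the kind described above.
# Bucket names and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw JSON events landed in S3 (placeholder path).
raw = spark.read.json("s3://example-raw-bucket/events/")

# Basic data-quality step: drop records missing an id or timestamp.
clean = raw.filter(F.col("event_id").isNotNull() & F.col("event_ts").isNotNull())

# Write curated data back to S3 as Parquet, partitioned by event date.
(clean
    .withColumn("event_date", F.to_date("event_ts"))
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/events/"))
```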
Requirements:
- Solid commercial experience as a Data Engineer, with a focus on AWS
- Proficiency in AWS services like Redshift, S3, Glue, and Lambda
- Strong programming skills in Python or PySpark
Nice to have:
- AWS Certifications
Interviews are already underway with limited slots remaining - don't miss out on your opportunity to secure this amazing role!
Get in touch ASAP by contacting me at or on !
Data Engineer, Senior Data Engineer, Developer, AWS, Apache, Python, PySpark