£315/day
England, United Kingdom
Contract, Variable

Data Engineer - Google Cloud Platform, GCP

Posted by Peopleworks.

Contract: 12 Months - 56200

Rate: £315 per Day (Inside IR35)

Location: Essex offices (near Basildon) and Hybrid

One of the world's most advanced engineering companies and a household name is seeking a Data Engineer to join its Global Data Analytics and Insights (GDIA) division as part of an innovative journey to transform how data is managed and utilized across the organization. The team focuses on comprehensive data ingestion, ensuring regulatory compliance, and democratizing access to enhanced insights.

The successful candidate will be responsible for building scalable data products in a cloud-native environment. You will lead both inbound and outbound data integrations, support global data and analytics initiatives, and develop always-on solutions. Your work will be pivotal in ensuring our data infrastructure is robust, efficient, and adaptable to evolving business requirements.

Main Tasks & Responsibilities:

> Collaborate with GDIA product lines and business partners to understand data requirements and opportunities.

> Build & maintain data products in accordance with Data Factory standards, ensuring adherence to data quality, governance, and control guidelines.

> Develop and automate scalable cloud solutions using Google Cloud Platform native tools (e.g., Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, dbt, BigQuery) and Apache Airflow.

> Operationalize and automate data best practices: quality, auditability, timeliness, and completeness.

> Monitor and enhance the performance and scalability of data processing systems to meet organizational needs.

> Participate in design reviews to accelerate the business and ensure scalability.

> Advise and direct team members and business partners on Ford standards and processes.

Essential Skills & Experience:

# Develop custom cloud solutions and pipelines with Google Cloud Platform (GCP) native tools: Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, dbt, and BigQuery.

# Proficiency in SQL, Python, and PySpark.

# Expertise in GCP and open-source tools such as Terraform.

# Experience with CI/CD practices and tools such as Tekton.

# Knowledge of workflow management platforms like Apache Airflow and Astronomer.

# Proficiency in using GitHub for version control and collaboration.

# Ability to design, maintain, and optimize efficient data pipelines and architectures for data processing.

# Familiarity with data security, governance, and compliance best practices in the cloud.

# Strong problem-solving, communication, and collaboration skills.

# Ability to work autonomously and in a collaborative environment.

# An understanding of current architecture standards and digital platform services strategy.

# A meticulous approach to data accuracy and quality.

Desirable (not essential) Skills & Experience:

~ Excellent communication, collaboration and influence skills; ability to energize a team.

~ Knowledge of data, software, and architecture operations, plus data engineering and data management standards, governance, and quality.

~ Hands-on experience in Python using libraries such as NumPy and Pandas.

~ Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub / Kafka, Looker Studio, Vertex AI.

~ Experience with recoding, redeveloping, and optimizing data operations, data science, and analytical workflows and products.

~ Understanding of data governance concepts, including GDPR (General Data Protection Regulation), and how these can impact technical architecture.
