£30K/yr to £40K/yr
London, England
Permanent, Variable

Junior Data Engineer

Posted by Sagacity.

The Junior Data Engineer will support the development, maintenance, and optimization of the data pipelines and data architecture behind our client products and services. This role involves working closely with data analysts, data scientists, and other stakeholders to ensure the efficient and reliable flow of data across the organization.

Key Responsibilities:

Data Pipeline Development:

  • Assist in designing, building, and maintaining scalable data pipelines.
  • Implement ETL (Extract, Transform, Load) processes to ingest data from various sources.
  • Ensure data quality, integrity, and reliability through thorough testing and validation.

Database Management:

  • Support the management and optimization of data storage solutions (e.g., SQL, NoSQL databases, data lakes).
  • Monitor and maintain database performance and security.

Data Integration:

  • Collaborate with data analysts and scientists to integrate and consolidate data from multiple sources.
  • Develop APIs and other interfaces for data access and manipulation.

Documentation and Reporting:

  • Document data processes, pipelines, and architectures.
  • Generate reports and visualizations to communicate data insights and pipeline performance.

Technical Support:

  • Provide technical support for data-related issues and troubleshoot problems as they arise.
  • Assist in the implementation of data governance and compliance policies.

Continuous Improvement:

  • Stay up to date with emerging trends and technologies in data engineering.
  • Participate in code reviews and contribute to the continuous improvement of data engineering practices.

Qualifications:

Educational Background:

  • Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
  • Relevant certifications in data engineering or cloud platforms (e.g., AWS, Azure, Google Cloud, Databricks) are a plus.

Technical Skills:

  • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL).
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow).
  • Familiarity with programming languages such as Python, Java, or Scala.
  • Basic understanding of cloud platforms and services (e.g., AWS, Azure, Google Cloud).
  • Knowledge of big data technologies (e.g., Apache Spark) is a plus.

Analytical Skills:

  • Strong problem-solving skills and attention to detail.
  • Ability to analyze complex data sets and identify patterns or anomalies.

Soft Skills:

  • Excellent communication and teamwork skills.
  • Eagerness to learn and adapt to new technologies and methodologies.
  • Ability to manage multiple tasks and meet deadlines.

Experience:

  • 0-2 years of experience in data engineering, data analysis, or a related field.
  • Internship or project experience in data engineering is highly desirable.