As a Data Engineer, you will play a crucial role in designing, building, and maintaining robust data infrastructure. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver efficient, scalable data pipelines that drive actionable insights.
Key Responsibilities:
- Design and implement data models, ETL processes, and data pipelines tailored to business needs (see the illustrative sketch after this list)
- Optimize data flow and collection for cross-functional teams
- Enhance the reliability, efficiency, and quality of data systems
- Build analytics tools that leverage the data pipeline for actionable insights
- Collaborate with data scientists to deploy machine learning models into production
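To give a concrete sense of the day-to-day pipeline work, below is a minimal illustrative sketch of a batch ETL job using PySpark (one of the technologies listed under Required Skills). The dataset, column names, and storage paths are hypothetical and stand in for whatever sources the team actually uses; this is an example of the kind of work involved, not a take-home assignment.

```python
# Minimal batch ETL sketch, assuming PySpark and hypothetical paths/columns.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def run_daily_orders_etl(input_path: str, output_path: str) -> None:
    """Extract raw order events, clean them, and load a daily aggregate."""
    spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

    # Extract: read raw JSON events (schema inferred here for brevity).
    raw = spark.read.json(input_path)

    # Transform: drop malformed rows and aggregate revenue and order counts per day.
    daily = (
        raw.dropna(subset=["order_id", "amount", "event_date"])
           .withColumn("event_date", F.to_date("event_date"))
           .groupBy("event_date")
           .agg(
               F.sum("amount").alias("daily_revenue"),
               F.countDistinct("order_id").alias("order_count"),
           )
    )

    # Load: write partitioned Parquet for downstream analytics tools.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(output_path)

    spark.stop()


if __name__ == "__main__":
    # Hypothetical locations, shown only to make the sketch self-contained.
    run_daily_orders_etl(
        "s3://example-bucket/raw/orders/",
        "s3://example-bucket/marts/daily_orders/",
    )
```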
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of experience in data engineering roles
- Proficiency in Python, SQL, and big data technologies (e.g., Hadoop, Spark)
- Experience with cloud platforms (AWS, GCP, or Azure)
- Strong understanding of data warehousing and ETL processes
- Excellent problem-solving abilities and communication skills
Nice to Have:
- Experience with streaming technologies (Kafka, Kinesis)
- Familiarity with machine learning deployment
- Knowledge of data visualization tools (Tableau, Power BI)