Description
Key Responsibilities
- Design, build, and maintain scalable data pipelines (ETL/ELT); see the illustrative sketch after this list
- Develop and optimize data warehouses and data models
- Ensure data accuracy, consistency, and reliability
- Work with structured and unstructured data from multiple sources
- Collaborate with Data Analysts, Data Scientists, and Product teams
- Monitor data workflows and troubleshoot performance issues
- Implement best practices for data security and governance
- Document data processes and technical solutions clearly
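To make the pipeline work above concrete, here is a minimal, illustrative extract-transform-load step in Python using pandas and SQLAlchemy. This is a sketch, not a prescribed implementation: the source URL, column names, staging table, and warehouse DSN are all hypothetical placeholders, not part of this role's actual stack.

    # Minimal ETL sketch; every name here is a hypothetical placeholder.
    import pandas as pd
    from sqlalchemy import create_engine

    def run_daily_orders_etl(source_url: str, warehouse_dsn: str) -> None:
        # Extract: read a raw CSV export from an upstream system
        raw = pd.read_csv(source_url)

        # Transform: enforce types and drop rows that fail basic checks
        raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
        clean = raw.dropna(subset=["order_id", "order_date"])

        # Load: append the cleaned rows into a warehouse staging table
        engine = create_engine(warehouse_dsn)
        clean.to_sql("stg_orders", engine, if_exists="append", index=False)

In practice a step like this would run under an orchestrator with retries, alerting, and data-quality checks rather than as a standalone script.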
Required Skills & Qualifications
- 2–4 years of experience as a Data Engineer
- Strong proficiency in SQL and Python
- Experience with data warehouses (Snowflake, BigQuery, Redshift, or similar)
- Familiarity with ETL tools and orchestration frameworks (Airflow, dbt, etc.); see the example DAG after this list
- Experience with cloud platforms (AWS, GCP, or Azure)
- Solid understanding of data modeling and database concepts
- Good verbal and written communication skills
- Ability to explain technical concepts to non-technical stakeholders
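As one hedged example of the orchestration work referenced above, the sketch below wires two placeholder tasks into a daily Airflow DAG. The DAG id, schedule, and task bodies are assumptions made for illustration; note that the schedule argument is the Airflow 2.4+ spelling, while older versions use schedule_interval.

    # Minimal Airflow DAG sketch; task logic and names are placeholders.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull raw data from a source system
        pass

    def load():
        # Placeholder: write transformed data to the warehouse
        pass

    with DAG(
        dag_id="daily_orders",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # run extract before load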