Responsibilities
- Design, build, and maintain scalable data pipelines for archiving and retrieval systems
- Optimize ETL processes to ensure efficient data ingestion and processing
- Collaborate with cross-functional teams to define data requirements and schema standards
- Implement robust error handling and monitoring for data integrity
- Develop and maintain documentation for data architecture and workflows
- Ensure compliance with data governance and security policies
Requirements
- 5+ years of experience in data engineering or related roles
- Proficiency in SQL, Python, and ETL frameworks
- Experience with Apache Spark, cloud storage solutions, and data warehousing
- Strong understanding of data modeling and schema design
- Familiarity with data governance and security best practices