Data Engineer Fabric
EU, 76
7 days ago
Overview
EDV Werke is looking for a Data Engineer Fabric.
Working Model: Remote
Form of cooperation: B2B Contract
Responsibilities
- Fabric Data Platform Development: Building and maintaining data platforms using lakehouse and data warehouse workloads while creating scalable pipelines and structured data models.
- ETL and Data Pipeline Engineering: Developing and optimizing ETL and ELT pipelines and dataflows to ensure reliable data ingestion, transformation, and processing.
- Reporting Data Model Enablement: Supporting reporting capabilities by preparing curated datasets, semantic models, and optimized data structures for business analytics.
- Cross-Platform Data Integration: Implementing and maintaining data integration and federation between cloud data environments to ensure secure and consistent data sharing.
- Platform Operations and Maintenance: Handling operational support, including debugging, incident resolution, performance tuning, and maintenance of existing data solutions.
- Data Architecture Improvement: Refactoring legacy components and improving platform architecture, workflows, and automation practices.
- Security and Quality Compliance: Ensuring data solutions comply with internal governance standards, security policies, and quality processes.
- Platform Evolution and Feature Adoption: Implementing new platform capabilities and aligning data engineering solutions with evolving platform roadmaps.
Must-Have Skills
- Enterprise Data Platform Experience: Strong experience designing and operating modern cloud-based data platforms using integrated analytics and storage services.
- Lakehouse and Data Warehouse Workload Expertise: Practical experience implementing and managing lakehouse and warehouse data workloads within modern data platforms.
- Data Lake Architecture Knowledge: Understanding of centralized data lake architectures, unified storage concepts, and scalable data management principles.
- Data Pipeline Engineering Skills: Strong experience building and managing ETL or ELT pipelines using modern data integration and orchestration tools.
- Reporting Data Model Optimization: Experience preparing optimized datasets and semantic models for reporting and analytics platforms.
- Advanced SQL Development Skills: Strong SQL expertise for designing efficient schemas, managing data transformations, and optimizing queries.
- Python Data Engineering Skills: Strong programming experience using Python or similar languages for data processing and automation, with test-driven development practices.
- Data Modeling Expertise: Strong experience designing scalable and maintainable data models supporting analytics and operational reporting.
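As an illustration only, the Python and test-driven development skills listed above might look like the following minimal sketch: a small, pure data-cleaning function checked by an assertion-first test. The record shape and field names here are hypothetical, not taken from the posting.

```python
# Hypothetical example of a test-driven Python data transformation:
# deduplicate records by id and cast amounts to float, dropping rows
# with a missing amount. Field names ("id", "amount") are assumptions.

def clean_orders(rows):
    """Return deduplicated rows with amounts cast to float."""
    seen = set()
    cleaned = []
    for row in rows:
        if row["id"] in seen or row.get("amount") is None:
            continue  # skip duplicates and rows missing an amount
        seen.add(row["id"])
        cleaned.append({"id": row["id"], "amount": float(row["amount"])})
    return cleaned

# A test written alongside the function, in the TDD spirit the posting mentions:
raw = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "10.5"},  # duplicate id, dropped
    {"id": 2, "amount": None},    # missing amount, dropped
]
assert clean_orders(raw) == [{"id": 1, "amount": 10.5}]
```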
Nice-to-Have Skills
- Software Architecture Knowledge: Understanding of software architecture principles and common design patterns used in scalable data systems.
- Pipeline Orchestration Experience: Familiarity with workflow orchestration tools used for scheduling and coordinating data pipelines.
- Data Quality Framework Knowledge: Experience implementing data validation and quality frameworks to ensure the reliability of data pipelines.
- Additional Cloud Platform Experience: Hands-on experience with services from other major cloud ecosystems for analytics, serverless processing, and machine learning.
Benefits
- Competitive salary with performance-based bonuses.
- Opportunities for professional development and advancement.
- Dynamic and collaborative work environment.
Company
Edvwerke
Publication platform
WHATJOBS