Description:
For an international organization in Europe, we are urgently looking for a fully remote Senior Data Engineer.
Candidates must be willing to work standard Central European office hours. This position is long-term.
Tasks and responsibilities:
- Architect, design, and implement robust data pipelines using Azure Data Factory for data extraction, transformation, and loading (ETL) processes;
- Utilize Azure Databricks for data processing and orchestration, leveraging its scalability and integration capabilities;
- Design and implement efficient physical data models in SQL Server, optimizing performance and storage requirements;
- Perform data profiling and analysis to identify opportunities for data model enhancements and optimization;
- Manage Azure resources and services, including Azure Data Lake Storage, Azure SQL Database, and Azure Blob Storage, to support data engineering activities;
- Monitor and optimize Azure infrastructure and services to ensure scalability, performance, and cost-efficiency;
- Integrate data from diverse sources into a unified data mart or data lake environment, ensuring consistency, accuracy, and integrity;
- Implement data consolidation strategies to streamline data management and reduce redundancy;
- Develop automation scripts using Python to automate data processing tasks, such as data ingestion, cleansing, and enrichment;
- Implement error handling and logging mechanisms to ensure data pipeline reliability and resilience;
- Identify and resolve performance bottlenecks in data pipelines and SQL queries, optimizing throughput and latency;
- Implement data quality checks and validation rules to ensure data accuracy, completeness, and consistency;
- Monitor data quality metrics and implement corrective actions as needed to maintain high data quality standards;
- Collaborate with cross-functional teams, including project managers, data analysts, data scientists, and business stakeholders, to understand data requirements and deliver solutions;
- Share knowledge and best practices with team members, providing guidance and mentoring on data engineering techniques and tools;
- Document data pipelines, data models, and integration processes, maintaining clear and comprehensive documentation for future reference and troubleshooting;
- Support the AMS team with knowledge transfer (KT) and handover activities.
Profile:
- Bachelor's or Master's degree;
- 9+ years of development experience with a focus on data integration for data warehousing and data marts;
- Proficient in Azure, with hands-on experience implementing data solutions within the Azure environment (Azure Data Factory, Databricks, etc.);
- Experience working with a range of architectures to implement effective data solutions, demonstrating a deep understanding of data engineering principles;
- Solid experience with SQL Server is a must;
- Proficiency in Python coding, particularly in the context of data processing and data application development, is strongly preferred;
- Experience with CI/CD practices in the context of data engineering, coordinating deployments, and ensuring data pipeline reliability;
- Knowledge of and experience with Snowflake is an added advantage;
- Excellent problem-solving and analytical skills;
- Fluent in English.