
Lead Data Engineer
- Thessaloniki
- Permanent
- Full-Time
- Lead the design and development of comprehensive data engineering frameworks and patterns
- Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across Chubb Overseas General (COG)
- Drive innovation and build highly scalable, real-time data pipelines and data platforms to support business needs
- Act as a mentor and leader for a data engineering organization that is business-focused, proactive, and resilient
- Promote data governance and master/reference data management as a strategic discipline
- Implement strategies to monitor the effectiveness of data management
- Act as an engineering leader, coach data engineers, and be an active member of the data leadership team
- Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform
- Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements
- Collaborate with Data Modelers to create data models (conceptual, logical, and physical)
- Architect metadata management processes to ensure data lineage, data definitions, and ownership are well-documented and understood
- Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals
- Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders
- Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability
- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field; Master's degree preferred
- Minimum of 10 years' experience in data architecture or data engineering roles; significant experience in the P&C insurance domain preferred
- Proven track record of successful implementation of data architecture within large-scale transformation programs or projects
- Comprehensive knowledge of data modeling techniques and methodologies, including data normalization and denormalization practices
- Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos DB), data transformation (Informatica IICS, Databricks), and change data capture and data streaming (Apache Kafka, Apache Flink) technologies
- Proven expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache NiFi)
- Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database)
- Expertise in data security patterns (e.g., tokenization, encryption, obfuscation)
- Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines
- Familiarity with Agile methodologies and experience working in Agile project environments
- Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture