Role: Team Lead - Data Engineering (Cloud & Databricks)
Location: Dublin
Sector: Global Financial Services / Alternative Investments
Technical Stack: Databricks, AWS, Spark, Python, Terraform
The Opportunity
We are seeking a high-caliber Data Engineering Team Lead to spearhead the development and evolution of our enterprise-scale Cloud Data Lakehouse. You will lead a high-performing engineering team building a centralized data ecosystem that powers global operations for a premier financial services firm.
This is a "player-coach" role requiring deep technical mastery of Databricks internals, Spark optimization, and cloud infrastructure.
Core Responsibilities
- Technical Leadership: Mentor a distributed team of engineers, driving excellence through rigorous code reviews, architectural design sessions, and continuous knowledge sharing.
- Architecture & Design: Design and implement scalable Lakehouse solutions for complex enterprise data processing and advanced analytics.
- Pipeline Engineering: Build and optimize high-throughput ETL/ELT workflows using Delta Live Tables (DLT) and Structured Streaming for real-time data needs.
- Performance Tuning: Tune Spark job configurations and cluster sizing to maximize efficiency and cost-effectiveness.
- Governance & Security: Implement robust data governance frameworks using Unity Catalog and manage secure access through cloud-native IAM best practices.
- Modern DevOps: Manage infrastructure as code (IaC) using Terraform and maintain CI/CD integrity via Git-based workflows.
- Agile Delivery: Champion Scrum methodologies to ensure timely delivery of complex data products.
Required Qualifications
- Experience: 8+ years in Data Engineering, including at least 3 years of specialized experience on the Databricks platform.
- Leadership: Proven track record of managing technical teams and leading complex projects from conception to production.
- Programming: Expert-level proficiency in Python and Spark.
- Cloud Ecosystem: Strong hands-on experience with AWS (specifically S3, Lambda, and Glue) or equivalent cloud providers.
- Data Modeling: Advanced SQL skills and a deep understanding of relational and non-relational data modeling.
- Frameworks: Mastery of Delta Lake architecture and modern ETL/ELT design patterns.
- Innovative Mindset: Practical experience using AI-assisted development tools to enhance team productivity and code quality.
Preferred Skills (Nice-to-Have)
- Experience within the Financial Services or Fintech sectors.
- Knowledge of AI/ML deployment patterns (MLOps).
- Experience building and consuming RESTful APIs for data distribution.
- Background in real-time data streaming and event-driven architectures.
Why Join This Team?
You will be at the forefront of data innovation in a culture that prioritizes professional growth and technical excellence. We offer a flexible working environment, comprehensive benefits, and the opportunity to work on one of the most sophisticated data platforms in the industry.
