
    Data Engineer (Databricks) - Hybrid, Limerick

    Limerick | Contract | €300 - €350 per day
    Job reference: JN -052025-1982187


    About the job

    A leading client is seeking a Data Engineer with strong Databricks expertise to support a key data transformation initiative. This is a contract position running until the end of the year, with hybrid working (2-3 days onsite per week in Limerick). The role sits within a collaborative data team focused on enabling next-generation data solutions and analytics capabilities.

    Key Responsibilities:

    • Develop and maintain scalable data pipelines using Databricks and Apache Spark

    • Collaborate with analysts and data scientists to deliver fit-for-purpose data solutions

    • Enhance existing ETL processes for improved performance and reliability

    • Implement robust data validation and quality checks

    • Work with large-scale datasets and Big Data technologies

    • Support data modeling and data warehouse design for analytics and reporting

    • Monitor and resolve performance issues in data pipelines

    • Contribute to code reviews and promote engineering best practices

    • Maintain documentation of data workflows, architecture, and processes

    • Keep up with emerging trends in data engineering and cloud technologies

    Required Skills & Experience:

    • 3+ years in data engineering or a similar role

    • Strong hands-on experience with Databricks and Apache Spark

    • Proficiency in SQL and scripting languages such as Python or Scala

    • Solid understanding of ETL concepts, data modeling, and data warehousing

    • Experience with cloud platforms, ideally Azure (AWS also considered)

    • Knowledge of data quality frameworks and governance principles

    • Strong problem-solving skills and attention to detail

    • Excellent communication and teamwork skills

    Desirable:

    • Exposure to machine learning workflows and data science environments

    • Familiarity with modern data architectures (e.g., data lakes)

    • Experience with version control (Git) and CI/CD practices