- Dublin South
- JN-092023-1947325
- Sep 15, 2023
Looking to recruit a Senior Data Engineer for an international financial services company based in South Dublin. Contract role for an initial 12 months with a view to extend; 2 days a week onsite required.
Looking to hire a Senior Data Engineer to help fight crypto fraud, prevent money laundering, and use state-of-the-art tools to build models for AI and Machine Learning. You will be responsible for expanding and optimising data pipeline architecture, as well as optimising data flow and collection for cross-functional teams.
Ideally, you will be a data pipeline builder who enjoys optimising data systems and building them from the ground up. You will have worked with Data Science teams and understand the importance of quality and integrity in the end data, and you will have worked in an Agile environment with familiarity with SAFe/Scrum methodologies. A passion for data, high motivation, and technical ownership of solutions, with an emphasis on engineering efficiency and on-time delivery, are critical. You should be a self-directed self-starter, comfortable supporting the needs of multiple teams, systems and products.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using ETL processes and modern cloud technologies.
- Take ownership of requirements clarification and solution proposals before implementation.
- Advanced working knowledge of SQL, including query authoring, with experience of relational databases and working familiarity with a variety of database technologies.
- Experience building and optimising data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytical skills working with unstructured datasets.
- Experience building processes that support data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Strong project management and organisational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Hadoop, Spark, Databricks, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres, Oracle and CosmosDB.
- Experience with data pipeline and workflow management tools.
- Experience with cloud services.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Java, Python, etc.
Morgan McKinley is acting as an Employment Agency and references to pay rates are indicative.