- Nov 24, 2022
- ¥5.5M to ¥8M
Japanese: Basic level
Looking for a talented individual with at least 2 years' experience in System Administration.
- Build and maintain large-scale batch and real-time data pipelines using processing frameworks like Spark, Flink, Kafka, Druid, etc.
- Use your skills to build analytics tools that utilize the data pipeline to provide insights into customer acquisition, efficiency, and other key business performance metrics.
- Improve data quality through testing and tooling.
- Work together with data and analytics experts to strive for greater functionality.
- At least 2-3 years' experience in system administration/warehousing.
- At least 2 years' DevOps experience with Hadoop distributions (Hortonworks, Cloudera, EMR, Dataproc, or HDInsight).
- Comfortable setting up and maintaining CentOS systems on premises or in the cloud, and familiar with bash/Python.
- Infrastructure-as-code experience with tools such as Chef, Ansible, Puppet, or OpsWorks.
- Experience implementing, using, monitoring, and maintaining Apache Druid at scale.
- Some knowledge of Google Analytics/Adobe Analytics.
About our client
A fantastic web company with a global mindset.
Morgan McKinley Asia Pac is acting as an Employment Business in relation to this vacancy.