One of Asia’s leading software companies is expanding its development team in Taiwan and is looking for data talent with strong coding skills to join it for the next step of its business.
Responsibilities
Build and maintain optimal data pipeline architecture
Assemble large, complex data sets that meet functional / non-functional business requirements
Identify, design, and implement internal process improvements: automate manual processes, optimise data delivery, advance the infrastructure for greater scale, etc.
Create analytics tools that use the data pipeline to offer actionable insights into customer acquisition, operational efficiency and other key business performance metrics
Requirements
Bachelor’s or Master’s degree in Computer Science is a plus
More than three years of experience in a similar role
Familiar with big data tools: Hadoop, Spark, Kafka, etc.
Familiar with relational SQL and NoSQL databases, including MySQL and HBase
Familiar with stream-processing systems: Spark Streaming, Storm, etc.
Familiar with at least one of these languages: Go, Java, Python, Scala, etc.
Familiar with Linux
About the organisation
The company offers its engineers a great working environment and flexible work arrangements. Its people act with speed, innovation and creativity.
The company gives you the autonomy to drive your career forward, all of which adds up to a great place to work.