Hadoop Specialist / Hadoop Technical Support
You will work as an administrator and engineer on our World Technical Operations team, contributing to many complex and strategic projects in the Hadoop/Big Data space.
You will be expected to understand business goals and needs, use Hadoop technology to ingest and analyze large amounts of data, and turn findings into actionable business solutions. Job responsibilities include defining and documenting technical requirements (including data mappings), building SQL/Hive databases, developing custom MapReduce processes, automating jobs, using analytical tools to identify trends and predictive characteristics, developing and automating reports, and exploring new analytical and big data tools.
Mission:
- Design, implement, and deploy custom applications on Hadoop
- Troubleshoot issues within the Hadoop environment
- Tune the performance of Hadoop processes and applications
- Deliver high-quality work on time and with little supervision
- Spend roughly 20% of time on Linux administration
- Provide follow-the-sun support for Hadoop and Linux
Qualifications
- Experience building and coding applications using Hadoop components such as HDFS, HBase, Hive, Sqoop, and Flume
- Experience coding Java MapReduce, Python, Pig, Hadoop Streaming, and HiveQL
- At least 1 year of experience with traditional ETL tools and data warehousing architecture
- Experience with Teradata and other RDBMSs is a plus
- Experience with Cloudera
- Hands-on Linux/Unix expertise and scripting skills are required
- Advanced Linux expertise is a plus
About the Company
Since Ubisoft Chengdu opened in 2008, it has become one of the key game studios in western China with over 275 talents contributing to some of the biggest brands in the Ubisoft line-up. The studio has been working closely with other Ubisoft studios on AAA brands such as Assassin’s Creed®, Tom Clancy’s Rainbow Six® Siege, Tom Clancy’s The Division®, Skull and Bones™ and For Honor®.