ADVANTAGES
Work from home, financial sector, located downtown.
37.5 hours a week contract.

RESPONSIBILITIES
Day-to-Day Responsibilities:
- In a multi-cultural team: install, upgrade, configure, and maintain a Hadoop cluster.
- Because security is key for us: set up security (Kerberos, AD, IDMP, SSL, etc.).
- As you don't like to repeat the same task over and over: automate operations, installation, and monitoring of the Hadoop framework.
- Participate in the continuous evaluation of Hadoop infrastructure requirements and design/deploy solutions (high availability, big data clusters, Hadoop in the public cloud, etc.).
- Because you always question yourself for the better: cluster monitoring, troubleshooting, and configuration.
- As you know that evidence and traces are important in case of an incident: manage and review Hadoop log files.
- As part of a Big Data feature team and a key element of Follow-the-Sun support, you will work with Paris and Bangalore, deliver tasks in Agile sprints, and act upon incidents to ensure 24/7 production.

QUALIFICATIONS
Profile Experience Needed:
- 4+ years of relevant professional experience with Unix and Linux systems, server hardware, virtualization, and Red Hat
- At least 3 years of experience with Hadoop (Hortonworks or Cloudera)
- Proven experience with automation
- Knowledge of Ansible
- Excellent troubleshooting techniques
- Good communication skills over the phone, by e-mail, and in documentation

Desired / Plus:
- Java and/or Python experience
- Knowledge of Hadoop in the public cloud (e.g. Azure HDInsight)
- Understanding and usage of REST APIs

Educational Requirements:
- Degree in Computer Science or related experience

Languages: English, French

Technical Skills: HDFS, Knox, Kafka, HBase, Hive, Spark, Ranger, Kerberos, SSL, Linux, shell programming