Hadoop Developers x 2 - 12-month contracts - Telecommunications - London

Our client in the GTA is looking to bring on two Hadoop Developers on 12-month contracts for a major telecommunications provider. The ideal candidate will have a strong Hadoop background in Scala, Spark, and Kafka. Experience with Kubernetes and a cloud platform (AWS, Azure, or GCP) is a strong asset.

Advantages
Fast-paced team, the latest technologies, and a forward-thinking Agile/Scrum environment.

Responsibilities
- Design, develop, and implement application systems; design and code programs, test the code, find and correct errors, and deliver quality code
- Interface with the technical team to design and implement application systems
- Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
- Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis
- Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality)
- Create formal written deliverables and other documentation, and ensure designs, code, and documentation are aligned with enterprise direction, principles, and standards
- Train and mentor teams in the use of the fundamental components of the Hadoop stack
- Assist in the development of comprehensive, strategic business cases used at management and executive levels for funding and scoping decisions on Big Data solutions
- Troubleshoot production issues within the Hadoop environment
- Performance-tune Hadoop processes and applications

Proven experience as a Hadoop Developer/Analyst in a Business Intelligence and data management production support environment is required.

Qualifications
- Minimum of 2 years building and coding applications using Hadoop components: HDFS, Kafka, Flume, HBase, Hive, Sqoop
- Minimum of 2 years coding in Java, Scala/Spark, Python, Hadoop Streaming, and HiveQL
- Minimum of 4 years' experience with traditional ETL tools and data warehousing design
- Strong personal leadership and collaboration skills, combined with comprehensive, practical experience and knowledge in end-to-end delivery of Big Data solutions
- Experience with system administration, Exadata, and other RDBMSs is a plus
- Proficiency in SQL/HiveQL
- Hands-on Linux/Unix expertise and scripting skills
- Strong communication skills, technology awareness, and the ability to interact and work with senior technology leaders
- Good knowledge of Agile methodology and the Scrum process
- Delivery of high-quality work, on time and with little supervision
- Critical thinking and analytic abilities

Summary
The hiring decision is to be made by the end of June 2021. Interested applicants are encouraged to apply; those qualified will be contacted by a recruiter for the next steps. Don't delay, apply today!

Randstad Canada is committed to building a diverse workforce reflective of the diversity of Canada.
As a result, we promote employment equity and encourage candidates to apply, especially those who identify as a woman, an Aboriginal person, a person with a disability, or a member of a visible minority group, and anyone else who may contribute to the diversification of our workforce.

Randstad Canada is also committed to developing inclusive, barrier-free selection processes and work environments. If contacted in relation to a job opportunity, please advise your Randstad representative or your local Randstad branch in a timely fashion of any accommodation measures required to enable you to be assessed in a fair and equitable manner. Information received relating to accommodation measures will be addressed confidentially.

For all feedback on equity and accommodation needs, please contact your local Randstad Canada branch.