The Hadoop Developer will be responsible for developing and managing Hadoop applications, working with a team of developers to create MapReduce programs and load data into the Hadoop Distributed File System (HDFS). The role also covers setting up and configuring Hadoop clusters.

Hadoop Developer Job Responsibilities

  • Design and implement big data solutions using Hadoop and related technologies.
  • Work with stakeholders to understand business requirements and design technical solutions to meet those requirements.
  • Write efficient code to process large amounts of data according to the solution design.
  • Perform unit testing of the code to ensure it meets all functional and nonfunctional requirements.
  • Deploy and configure Hadoop clusters as per the solution design.
  • Monitor the performance of Hadoop clusters and tune them for optimal performance.
  • Troubleshoot issues with Hadoop clusters and investigate the root cause of problems identified by users or monitoring tools.
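The data-processing work described above typically centers on MapReduce programs. As a rough, framework-free sketch of the pattern (pure Python with no Hadoop dependency; function names are illustrative, and in practice the Hadoop framework performs the shuffle step), the classic word count looks like:

```python
from collections import defaultdict
from typing import Dict, Iterable, Iterator, List, Tuple

def map_phase(lines: Iterable[str]) -> Iterator[Tuple[str, int]]:
    # Mapper: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs: Iterable[Tuple[str, int]]) -> Dict[str, List[int]]:
    # Shuffle/sort: group all emitted values by key,
    # as the Hadoop framework would between map and reduce.
    groups: Dict[str, List[int]] = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups: Dict[str, List[int]]) -> Dict[str, int]:
    # Reducer: sum the counts emitted for each word.
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big clusters"])))
# counts["big"] == 2
```

A real Hadoop job would express the mapper and reducer as classes against the Hadoop API (commonly in Java) and let the cluster distribute the phases, but the data flow is the same.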

Objectives

  • To develop and manage Hadoop clusters to process big data sets.
  • To design and implement new applications to process big data sets using Hadoop.
  • To troubleshoot and optimize Hadoop clusters for performance.
  • To monitor the status of Hadoop clusters and jobs running on them.
  • To work with other teams in the organization to integrate their data with Hadoop for processing.

Hadoop Developer Job Skills & Qualifications Needed

  • A Hadoop developer should have strong skills in Java, MapReduce and HDFS. They should also be familiar with other big data technologies such as Hive, Pig and Spark.
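Higher-level tools like Hive and Pig let the same grouping-and-aggregation logic be written declaratively instead of as hand-coded MapReduce. As a toy sketch of what a query such as `SELECT dept, SUM(salary) FROM staff GROUP BY dept` computes (the table, column names, and sample rows here are hypothetical):

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

# Hypothetical rows standing in for records in a Hive table.
staff = [
    ("eng", 100),
    ("eng", 120),
    ("ops", 90),
]

def group_by_sum(rows: Iterable[Tuple[str, int]]) -> Dict[str, int]:
    # Equivalent of: SELECT dept, SUM(salary) FROM staff GROUP BY dept
    totals: Dict[str, int] = defaultdict(int)
    for dept, salary in rows:
        totals[dept] += salary
    return dict(totals)

# group_by_sum(staff) -> {"eng": 220, "ops": 90}
```

On a cluster, Hive compiles this kind of query into distributed jobs (MapReduce, Tez, or Spark) over data stored in HDFS, which is why familiarity with both the low-level and high-level tools is expected.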