Hadoop Job Description
Hadoop Duties & Responsibilities
To write an effective Hadoop job description, begin by listing detailed duties, responsibilities, and expectations. We have included Hadoop job description templates that you can modify and use.
Sample responsibilities for this position include:
Hadoop Qualifications
Qualifications for a job description may include education, certification, and experience.
Licensing or Certifications for Hadoop
List any licenses or certifications required by the position: SSL, CE, TLS/SSL, AWS, SAP, PEGA, V3, ITIL, GCP, HDP
Education for Hadoop
Typically a job would require a certain level of education.
Employers hiring for the Hadoop job most commonly prefer their future employee to have a relevant degree, such as a Bachelor's or Master's Degree in Computer Science, Engineering, Technical, Education, Information Technology, Business, Information Systems, Science, Software Engineering, or Mathematics
Skills for Hadoop
Desired skills for Hadoop include:
Desired experience for Hadoop includes:
Hadoop Examples
Hadoop Job Description
- Hadoop development and implementation (Environment - HDFS, HBase, Spark, Kafka, Oozie, Sqoop, Flume, Kerberos, Oracle ASO, MySQL, geospatial data)
- Write complex backend systems using Python and Node.js
- Serve as the subject matter expert for Hadoop ecosystem
- Installation of Hadoop ecosystem
- Applying security (Kerberos/OpenLDAP), linking with Active Directory and/or LDAP
- Onboarding users onto Hadoop - configuration, access control, disk quotas, permissions
- Work with vendor (Hortonworks) on all issues, apply upgrades and security patches
- Commission/decommission nodes; backup and restore
- Monitor the cluster (jobs, performance) and fine-tune when necessary
- Maintain, support, and upgrade Hadoop clusters
- Minimum of 3 years of managing and supporting applications using Hadoop or Hadoop-based components - HDFS, HBase, Hive, Impala, Sqoop, Flume
- Should have a minimum of 3+ years of experience and working knowledge of software delivery methodologies
- Experience working on Marketing and Sales development projects
- Expertise with Hadoop ecosystem and experience with Hive, Oozie, Flume and Sqoop
- Experience using the Hadoop tool set, such as Spark, Oozie, Pig, Hive, MapReduce, Java (3+ years)
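Several of the administration duties above (onboarding users, access control, disk quotas, Kerberos) can be sketched as a few shell commands. This is a minimal illustrative sketch, not a runbook: the username, quota size, Kerberos realm, and home-directory path are all assumptions, and the `DRY_RUN` prefix prints each command instead of executing it.

```shell
#!/bin/sh
# Hypothetical sketch of onboarding a new Hadoop user.
# DRY_RUN=echo prints the commands; set it to "" to actually run them.
DRY_RUN="echo"

onboard_user() {
  # create the user's HDFS home directory and hand over ownership
  $DRY_RUN hdfs dfs -mkdir -p "/user/$1"
  $DRY_RUN hdfs dfs -chown "$1:$1" "/user/$1"
  # cap raw disk usage at 500 GB (quota size is an assumption)
  $DRY_RUN hdfs dfsadmin -setSpaceQuota 500g "/user/$1"
  # create a Kerberos principal (MIT KDC; EXAMPLE.COM realm is an assumption)
  $DRY_RUN kadmin -q "addprinc -randkey $1@EXAMPLE.COM"
}

onboard_user jdoe
```

Real clusters vary by distribution and security setup (e.g. Ranger or Sentry policies instead of plain HDFS permissions), so the per-user steps are usually wrapped in exactly this kind of script to keep onboarding repeatable.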
Hadoop Job Description
- Monitor jobs and other hardware/internal aspects of the cluster
- Back up key databases and configuration
- Partner with Hadoop developers and architects to enable the business
- Perform common Linux administration tasks
- Work closely with key IT and business roles to determine the best solutions
- Work closely with the existing BI team to bridge current ecosystem with next generation ecosystem
- Understand the overall product roadmap as articulated by the senior story author/product owner, and work closely with the scrum master and the dev and testing teams to translate the roadmap into a team-specific release plan and sprint plan
- Identify upstream and downstream interdependencies as uncovered during planning sessions
- Develop, test and document applications
- Integration - code integration of different modules between teams
- 1 year of experience in ETL tools (Syncsort DMX-h, Ab Initio, IBM InfoSphere Data Replication), mainframe skills, JCL
- Excellent written and verbal communication skills
- Prior consulting experience in the above areas preferred
- Exposure to development in an on-site/off-shore engagement model
- 4 to 8 years' work experience related to BI tools and platforms
- Experience integrating analytics tools, specifically Spotfire (7.5) and SAS Visual Analytics, is a plus
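The "back up key databases and configuration" duty above can be illustrated with a short dry-run sketch. The backup destination, configuration path, and use of `hdfs dfsadmin -fetchImage` for NameNode metadata are assumptions for illustration; the `DRY_RUN` prefix prints each command rather than running it.

```shell
#!/bin/sh
# Hypothetical sketch of backing up key Hadoop metadata and configuration.
DRY_RUN="echo"

backup_cluster() {
  # date-stamped backup directory (location is an assumption)
  backup_dir="/backup/hadoop/$(date +%Y%m%d)"
  $DRY_RUN mkdir -p "$backup_dir"
  # pull a checkpointed copy of the NameNode fsimage (HDFS metadata)
  $DRY_RUN hdfs dfsadmin -fetchImage "$backup_dir"
  # archive the cluster configuration directory
  $DRY_RUN tar czf "$backup_dir/hadoop-conf.tar.gz" /etc/hadoop/conf
}

backup_cluster
```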
Hadoop Job Description
- Learn to document the code, prepare deployment and release notes documentation
- Learn new technologies like Hadoop to perform tasks
- Work under software and company security guidelines
- Prioritize time-sensitive assignments
- Learn company security practices related to software development, data, and access permissions, and how to incorporate them into the development life cycle
- Recommend security management best practices, including the ongoing promotion of awareness of current threats, auditing of server logs, and other security management processes, following established security standards
- Partners with Infrastructure to identify server hardware, software and configurations necessary for optimally running big data workloads
- Support and implement DevOps methodology
- Working with data delivery teams to set up new Hadoop users
- Administer Hadoop cluster providing support for all administrative tasks such as user management, security planning and implementation, backups, disk management and cluster maintenance
- Experience with DXP files generated by Spotfire is a plus
- Extensive experience with Java, and the willingness to learn new technologies
- Linux/Unix and scripting languages like Bash, Python
- Java, C, C++, C#, Perl, PHP, Python, UNIX shell, SQL, HTML
- 1+ years of experience in Hadoop Data Ingestion tools and utilities
- 3-5 years of experience with scripting languages (Linux shell, SQL, Python)
Hadoop Job Description
- Manage environment and tools for a large development shop
- Work with project teams to integrate Hadoop access points
- Proactively manages Hadoop system resources to assure maximum system performance and appropriate additional capacity for peak periods and growth
- Recommends security management best practices, including the ongoing promotion of awareness of current threats, auditing of server logs, and other security management processes, following established security standards
- Designing and developing automation code for deploying and managing Hadoop solutions
- Managing, upgrading and troubleshooting Hadoop clusters
- Installing and configuring software
- Aligning with the engineering team to deploy new hardware and software environments required for Hadoop and to expand existing environments
- Cluster maintenance, including creation and removal of nodes using tools like Cloudera Manager and Mammoth
- Administration and protection of Cloudera Navigator Key Trustee servers and HDFS encryption zones
- 3-5 years of experience in administrative activities like –
- Product knowledge of Hadoop distributions such as Cloudera, Hortonworks, Pivotal Greenplum, or MapR
- Development or administration experience with NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo
- Development or administration experience with web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift
- Web/Application Server & SOA administration (Tomcat, JBoss)
- Exposure to data acquisition, transformation, and integration tools like Talend and Informatica, and BI tools like Tableau and Pentaho
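The node creation/removal duty above is often handled through a distribution's UI (Cloudera Manager), but the underlying HDFS mechanism can be sketched directly. This is an illustrative dry-run sketch: the excludes-file path and hostname are assumptions, and it presumes the NameNode's `dfs.hosts.exclude` property already points at that file.

```shell
#!/bin/sh
# Hypothetical sketch of decommissioning an HDFS DataNode.
# DRY_RUN=echo prints the commands instead of executing them.
DRY_RUN="echo"
EXCLUDES_FILE="/etc/hadoop/conf/dfs.exclude"   # path is an assumption

decommission_node() {
  # add the node to the excludes file the NameNode watches
  $DRY_RUN sh -c "echo $1 >> $EXCLUDES_FILE"
  # tell the NameNode to re-read its host lists and start draining blocks
  $DRY_RUN hdfs dfsadmin -refreshNodes
}

decommission_node worker-07.example.com
```

After `-refreshNodes`, the node shows as "Decommission in progress" until its block replicas are copied elsewhere; only then is it safe to power it off.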
Hadoop Job Description
- Monitoring and management of Cloudera Backup and Disaster Recovery processes
- Collaborating with development teams to install operating system and Hadoop updates, patches, version upgrades when required
- Develop advanced analytics solutions to achieve our deep learning and predictive analytics objectives
- Design, develop, and implement custom big data solutions for atypically complex problems
- Provide technical direction and architecture advice for initiatives
- Installing and designing monitoring tools that are critical for Hadoop systems and services
- Providing day-to-day support for development, support, and business analyst teams
- Coordinating Root Cause Analysis (RCA) efforts to help minimize future system issues
- Creating and publishing production metrics which includes system performance and reliability information to system owners and management teams
- Partnering with the Linux Server Administration team in the administering of server hardware and operating systems
- A minimum of 2 years of experience with Oozie
- Entrepreneurship, Solution Orientation and Client Intimacy
- Teamwork, Social Skills, Self-Awareness and Good Communication
- Deep knowledge and strong deployment experience in the Hadoop and Big Data ecosystem - Hadoop, Flume, Hive, HBase, Pig, HDFS, MapReduce, Linux
- Sound product knowledge of Hadoop distributions such as Cloudera, Hortonworks, Pivotal Greenplum, or MapR
- High degree of initiative and the ability to work independently
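The monitoring and production-metrics duties in the last template can be sketched as a poll of the NameNode's JMX endpoint, which exposes cluster health as JSON beans. The hostname is an assumption, port 9870 is the Hadoop 3.x NameNode web UI default (2.x used 50070), and the `DRY_RUN` prefix prints the command instead of issuing it.

```shell
#!/bin/sh
# Illustrative sketch of pulling NameNode health metrics for a dashboard.
DRY_RUN="echo"
NN_HOST="namenode.example.com"   # hostname is an assumption

fetch_fsnamesystem_metrics() {
  # the /jmx endpoint returns JSON beans with fields such as
  # CapacityTotal, NumLiveDataNodes, NumDeadDataNodes
  $DRY_RUN curl -s "http://$NN_HOST:9870/jmx?qry=Hadoop:service=NameNode,name=FSNamesystemState"
}

fetch_fsnamesystem_metrics
```

A scheduler (cron, or an agent like Prometheus' JMX exporter) would run this on an interval and feed the parsed values into the published reliability metrics the duty describes.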