Senior Hadoop Job Description
Senior Hadoop Duties & Responsibilities
To write an effective senior hadoop job description, begin by listing detailed duties, responsibilities and expectations. We have included senior hadoop job description templates that you can modify and use.
Sample responsibilities for this position are included in the examples below.
Senior Hadoop Qualifications
Qualifications for a job description may include education, certification, and experience.
Licensing or Certifications for Senior Hadoop
List any licenses or certifications required by the position: MS, PMI, CDH, PMP, HDP, V3, ITIL
Education for Senior Hadoop
Typically a job would require a certain level of education.
Employers hiring for the senior hadoop job most commonly prefer their future employee to have a relevant degree, such as a Bachelor's or University Degree in Computer Science, Technical, Engineering, Information Systems, Business, Education, Computer Engineering, Information Technology, Technology, or Design.
Skills for Senior Hadoop
Desired skills and experience for a senior hadoop role are reflected in the qualification lists included with each example below.
Senior Hadoop Examples
Senior Hadoop Job Description
- Collaborate with the architecture lead to develop plans for building application solutions and environments that address the company's business and technological strategies
- Participate as a team leader on projects, which includes training, coaching, and sharing technical knowledge with less experienced staff
- Practice Agile software development using the Scrum framework
- Act as the lead technical resource responsible for directing the overall technical progress of projects or application initiatives targeting the Hadoop platform
- Communicate progress across organizations and levels from individual contributor to senior executive
- Be part of CMO-driven development projects and ensure the design, build, installation, configuration, and support of Hadoop installations
- Design and implement data interfaces to streaming and CEP (complex event processing) applications
- Translate complex functional and technical requirements into detailed designs that address security and data privacy requirements
- Collaborate closely with architects, developers, product owners, and other stakeholders to enable modern application development (continuous integration and deployment, DevOps)
- Understand the competitive environment the business operates in and know what delivers advantage from a digital business perspective
- Have a BS/MS/PhD in Computer Science or related field
- Knowledge of or working experience with different Hadoop and MapReduce frameworks, such as Cascading
- Experience with enhancing and maintaining mission-critical software in a fast-paced environment is a plus
- Experience with various input/output file formats, such as Parquet
- As a Senior Developer you will be working with a wide range of technologies
- A genuine passion for technology and software development, thorough knowledge of all aspects of software development and its best practices, and the drive to stay informed on new technologies and upcoming trends
Senior Hadoop Job Description
- Mentor junior Hadoop engineers and ensure knowledge transfer in building, monitoring, and administering complex Hadoop clusters based on best practices and standards
- The Senior Data Science Consultant will be responsible for forming and maintaining strategic relationships within and between business units through proactive engagement, consultation, and project leadership in the area of advanced analytics
- Building new functionality into Presto to increase enterprise adoption
- Applying strong familiarity with algorithms and complexity analysis, database systems, and distributed systems concepts
- Writing unit, integration, and system tests that run in our continuous integration environment
- Collaborating with team members to solve engineering problems
- Mentoring and training new engineers of all types
- Interviewing prospective engineering hires and relaying the technological and business value the company delivers
- Defining the best practices that the company’s engineers should use to be efficient and effective at creating maintainable code
- Develop, code, and document programs of average to high complexity using multiple languages
- Proven experience designing critical components/modules in larger solutions
- Experience working within Agile and classic SDLC processes
- Strong knowledge of data processing systems, preferably using Java, Hadoop and Informatica
- Proven ability to develop in-depth knowledge of a business domain
- Thorough understanding of Java language and Unix Scripting, wide knowledge of design patterns and different architectural styles
- The ability to write well documented, easily maintainable, and testable code
Senior Hadoop Job Description
- Define technical requirements and provide performance tuning, analytical support, and business user support
- Provide performance tuning related to indexing, stored procedures, triggers, and database/server configuration
- Capture/specify detailed technical requirements with absolute precision and completeness
- Establish clear success measures and criteria, and technically evaluate approaches and solutions to guarantee that the business requirements are fulfilled exactly as expected
- Engage and work with teams and individuals with diverse business and technical backgrounds
- Ensure the requirements are communicated correctly when expectations are set with stakeholders
- Drive business requirements and ensure the success of pertinent projects and initiatives
- Support the planning and execution of Proof of Concept or Proof of Technology projects
- Resolve minor conflicts/issues and escalate others as appropriate
- Monitor and ensure stability of a complex data and information architecture environment in order to meet the defined SLA
- Love to work with open source technologies like Hadoop and Spark
- Proven experience in designing and coding (or leading a team of developers) on critical components of a larger system
- The ability to tackle difficult issues and propose solutions under tight timelines
- Troubleshoot MapReduce jobs, Pig scripts, and Hive queries
- Ability to provide technical guidance and mentoring to junior team members as needed
- Ability to write automated unit test cases for MapReduce code
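The unit-testing expectation above can be sketched without a running cluster by writing the map and reduce steps as pure functions. This is an illustrative, Hadoop-Streaming-style word count with a local stand-in for the shuffle phase; all function names here are hypothetical, not code from any particular posting:

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Emit (word, 1) pairs, as a Streaming mapper would per input line."""
    return [(word.lower(), 1) for word in line.split()]

def reducer(key, values):
    """Sum counts for one key, mirroring the reduce phase."""
    return (key, sum(values))

def run_job(lines):
    """Simulate the shuffle/sort between the map and reduce phases locally."""
    pairs = sorted(pair for line in lines for pair in mapper(line))
    return [reducer(key, (v for _, v in group))
            for key, group in groupby(pairs, key=itemgetter(0))]

# A unit-test-style check that runs without any Hadoop installation.
result = dict(run_job(["the quick fox", "the lazy dog"]))
assert result["the"] == 2 and result["fox"] == 1
```

Keeping the map and reduce logic in pure functions like this is what makes the "automated unit test cases" requirement practical, since the same functions can later be wired into a real Streaming or MRUnit harness.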
Senior Hadoop Job Description
- Establish automated monitoring mechanisms for space and database availability conditions
- Provide application support for staff within and outside the division, answering questions and resolving problems related to the DBMS technology and computing platforms such as z/OS, UNIX, Linux, and Windows
- Contact vendor support when necessary to facilitate application development and client problem resolution with optimum speed and efficiency
- Perform cluster maintenance, including creation and removal of nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage
- Excellent knowledge of Hadoop security concepts such as Kerberos
- Design, develop and manage data on the Hadoop Cluster
- Manage resources on Hadoop cluster
- Conduct Hadoop cluster Training
- Participate in an on-call rotation for 24x7 support of Hadoop clusters
- Troubleshoot Hadoop issues with other developers
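The space-monitoring duty above can be sketched as a minimal filesystem check. The `/` path and 90% threshold are illustrative assumptions, and a real monitor would report into Nagios or Ganglia rather than print:

```python
import shutil

def check_space(path, threshold=0.90):
    """Return (usage_fraction, ok) for a filesystem path.
    threshold is the usage fraction above which the check should warn."""
    usage = shutil.disk_usage(path)
    frac = usage.used / usage.total
    return frac, frac < threshold

frac, ok = check_space("/")
print(f"usage {frac:.0%}, {'OK' if ok else 'WARN: above threshold'}")
```

In practice a script like this would run from cron or a monitoring agent against each data-node mount, which is the kind of "automated monitor mechanism" the duty describes.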
- 3+ years of experience using the Hadoop tool set, including Pig, Hive, MapReduce, Java, and Spark
- Strong knowledge of OO and functional programming paradigms (Scala, Clojure, Haskell)
- Experience in Hadoop, Big Data, Data Warehousing, Web Services, Distributed Computing
- Experience with relational design and data modeling
- Experience in vendor management, including implementing and integrating solutions
- A Computer Science related degree would be preferable
Senior Hadoop Job Description
- Assist IT management in prioritizing database administration team projects
- Perform daily monitoring of database storage allocation, usage, and other resource consumption
- Assist with backup standards and schedules and recovery procedures
- The intent of this security plan is to establish a means of providing complete accountability for any and all use of the databases
- Provide Technical guidance to teammates
- Export/import data and manage data replication
- Develop data expertise, be a data steward evangelist, and own data ingestions and transformations
- Design and develop extremely efficient and reliable data pipelines to move terabytes of data into the Data Lake and other landing zones
- Assist in construction of data lake infrastructure
- Mentor and teach others
- 5+ years of total experience in development, systems engineering or operations
- 2+ years of strong, proven Hadoop experience
- 2+ years of DevOps experience with a public cloud like AWS or Google Cloud
- Chef, Ansible, Puppet or related scripting experience
- Good written and verbal communication skills with the ability and desire to effectively communicate with stakeholders
- Basic Unix OS skills and Shell scripting
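The data-pipeline responsibilities in the last example (landing data reliably in a data lake) can be sketched as a minimal batch-ingestion step. The partition key, validation rule, and in-memory landing zone below are assumptions for illustration only:

```python
from collections import defaultdict

def ingest(records, landing_zone):
    """Validate records and land them in date-based partitions.
    Re-running for the same partition overwrites it, keeping the step idempotent."""
    partitions = defaultdict(list)
    for rec in records:
        if "event_date" not in rec or "id" not in rec:
            continue  # drop malformed records; a real pipeline would quarantine them
        partitions[rec["event_date"]].append(rec)
    for day, recs in partitions.items():
        landing_zone[f"dt={day}"] = recs  # stand-in for e.g. an HDFS partition path
    return len(partitions)

lake = {}
n = ingest([{"id": 1, "event_date": "2024-01-01"},
            {"id": 2, "event_date": "2024-01-01"},
            {"id": 3}], lake)
assert n == 1 and len(lake["dt=2024-01-01"]) == 2
```

Overwriting whole partitions on re-run is one common way such pipelines stay safe to retry, which matters for the "reliable data pipelines" expectation in the duties above.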