Hadoop Architect Job Description
Hadoop Architect Duties & Responsibilities
To write an effective hadoop architect job description, begin by listing detailed duties, responsibilities and expectations. We have included hadoop architect job description templates that you can modify and use.
Sample responsibilities for this position include:
Hadoop Architect Qualifications
Qualifications for a job description may include education, certification, and experience.
Licensing or Certifications for Hadoop Architect
List any licenses or certifications required by the position: DOD, ERP, TOGAF
Education for Hadoop Architect
Typically a job would require a certain level of education.
Employers hiring for the hadoop architect job most commonly would prefer their future employee to have a relevant degree such as a Bachelor's or Master's Degree in Computer Science, Education, Technical, Engineering, Information Systems, Math, Information Technology, Business, Technology, or Computer Engineering
Skills for Hadoop Architect
Desired skills for hadoop architect include:
Desired experience for hadoop architect includes:
Hadoop Architect Examples
Hadoop Architect Job Description
- The Security Architect role will be responsible for overseeing the methods and techniques we use to guarantee our Hadoop installations meet the security compliance standards our clients have chosen
- Assist client support team with Hadoop security related issue resolution
- The Hadoop Performance Engineer will be responsible for analyzing existing Hadoop systems/applications, making recommendations to improve performance, and possibly implementing those recommendations
- Analyze planned usage patterns for a Hadoop cluster and develop a set of hardware specifications to support the usage
- The Hadoop Solution Architect will be responsible for developing Hadoop based solutions to support typical use cases such as data ingestion (ETL), machine learning analytics, data archiving, text analytics, Internet of Things (IOT) and reporting
- Gather requirements, design a solution, determine the effort to implement the solution, and possibly lead the team to implement it
- The Hadoop Data Ingestion Engineer will be responsible for gathering requirements to load data into Hadoop, designing a solution to meet these requirements, determining the effort to implement the solution, and leading a team to implement it
- Gathering requirements, designing a solution, determining the effort to implement the solution, and possibly leading the team to implement the solution
- Define BI COE Big Data/Hadoop technology strategies and roadmaps
- Architect and design solutions utilizing Hadoop and associated technologies, with a focus on the Cloudera distribution of Hadoop
- Well versed in installing & managing the Cloudera distribution of Hadoop (CDH5, Cloudera Manager)
- Hands-on experience with Visualization and BI tools and reporting software
- Previous experience with NoSQL or distributed RDBMS would be an asset (Cassandra, MongoDB)
- Proficiency in Java and writing software for distributed systems
- Experience developing code for large clusters with huge volumes of data, including streaming and batch code
- Conceptual/working knowledge of basic data management concepts like ETL, Data quality, RDBMS
Hadoop Architect Job Description
- Serve as architect and technical lead on medium scale to large scale projects as required
- Accountable for software design and code quality
- Utilizes development languages and tools to create and implement design
- Demonstrates continuous professional and technical development
- Task management – ensures on-time delivery
- Creates high level technical design for use by Software Engineer
- Participates in review meetings as necessary
- Actively participates in the continuous improvement of software development process
- Makes recommendations for improvement
- Follows the defined development / implementation process
- Hands-on administration-level experience with the Hadoop stack
- 8+ years' experience in software programming using Java, JavaScript, Spring, SQL
- Advanced knowledge of ETL/Data Routing and understanding of tools such as NiFi, Kinesis
- Experience in configuration and tuning of Hadoop products
- Experience working with multiple Hadoop distributions as well as Hadoop trunk
- Preferably a Master's degree or above in Computer Science/Mathematics
Hadoop Architect Job Description
- Develops application and custom integration solutions, generally for one business segment
- Perform detailed systems and data analysis in order to provide high-quality, detailed architecture designs for open source tools and the Hadoop big data platform
- Designs, codes, and tests tracks/modules in projects under limited supervision
- Performs long-term evaluations of systems, databases, and solutions for security risks
- Follow QuintilesIMS System Life-Cycle (SLC) and Computer System Validation (CSV) procedures, as appropriate
- Troubleshoot and resolve various process or data related issues
- End-to-end Incident and Problem resolution for the Hadoop multi-tenancy platform and respective applications
- A sense of personal accountability for all areas directly or indirectly supporting the business/service area for which they are responsible
- Willingness to drive people on all sides of an issue to a common understanding and then drive them toward resolution
- Should be able to clearly communicate ideas in technical or business terms with their peers across IT
- Experience building software or algorithms from the ground up and the ability to participate in developing new solutions
- Extensive Core Java experience
- Well versed in Java multithreading, Java data structures, algorithms, distributed architecture
- Knowledgeable on Hadoop infrastructure aspects such as high availability, big data clusters, elastic load, capacity, high performance compute
- Good understanding of NoSQL databases, distributed computing, PaaS and other modern technologies
- Good understanding of key agile engineering practices, CI/CD, Test Driven Development and DevOps
Hadoop Architect Job Description
- Design high-throughput and streaming data pipelines using the client's architecture
- Designing, building, installing, configuring, and supporting Hadoop infrastructure
- Drives processes for defining and documenting architectures
- Manipulating, aggregating, and deriving useful information from data stores
- Provide post-delivery support, assisting with root-cause analysis and problem resolution
- Coaches architects and assists in developing staff to expand the capabilities of individual teams and the overall organization
- To be the link between the needs of the organization, the data scientists/analysts, and the developers
- To be a self-starter, able to work analytically in a problem-solving environment
- Have experience in setting up data security and data privacy
- Architect and design data patterns, compute and design strategies for enterprise applications to support client business processes and functional requirements as they relate to complex data-driven solutions
- Must be able to interact professionally with diverse groups such as development teams, executives, and SMEs
- Detail-oriented person with big-picture thinking
- 10+ years' experience in data-related positions and responsibilities
- Bachelor's degree or equivalent in Computer Science, Information Systems, or a related field
- Experience in architecting, designing, developing and implementing project work within highly-visible data-driven applications in very large data warehousing / data repository environments with complex processing requirements
- A proven track record in system design and performance
Hadoop Architect Job Description
- Research and analyze business requirements and recommend optimal solutions within Information management technology architecture
- Support the development of Data Management roadmaps and associated plans, and apply industry and technical knowledge to provide solutions that increase measurable business results and minimize risk
- Create system and application design
- Providing work breakdown structures and estimates
- Leading the software development effort (coding/configuration/maintenance/installation, testing, debugging), managing timelines, and technical documentation
- Identifies, defines, and designs non-functional requirements into the solution design and ensures implementation
- Proactively communicates development status, issues and concerns to management with mitigation recommendations
- Lead research and fact-finding efforts needed to develop or modify basic to high complexity information systems
- Troubleshoot basic to high complexity coding/configuration/installation issues encountered in the development or production environments, working with software vendors as needed
- Ability to advocate ideas and to objectively participate in design critique
- 3 years of Architect-level contributions with Hadoop distributions (Hortonworks, Cloudera, Pivotal)
- Experience with architecting & developing applications on Hortonworks (Hive, Spark), Apache NiFi, Talend
- Experience in building and maintaining Hadoop/Spark clusters
- Programming languages and platforms – Spark, Scala, Java/J2EE, Linux, Wakari, Anaconda, R on Hadoop, PHP, Python, Hadoop, Hive, HBase, Pig, MapReduce, and other Hadoop ecosystem components
- Good knowledge of Unix/Linux, Ruby, Python, Perl, or other scripting languages