Job Description
Cloudera is seeking a Senior Solutions Consultant to join its APAC Professional Services team in Singapore. In this role you’ll have the opportunity to develop massively scalable solutions to complex data problems using Hadoop, NiFi, Spark, and related Big Data technologies. This is a client-facing role that combines consulting skills with deep technical design and development in the Big Data space. The successful candidate will have the opportunity to travel across Asia Pacific, working with large customer organizations across multiple industries.
Responsibilities
- Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera DataFlow
- Design and implement Hadoop and NiFi platform architectures and configurations for customers
- Perform platform installations and upgrades for advanced, secured cluster configurations
- Analyze complex distributed production deployments, and make recommendations to optimize performance
- Document and present complex architectures to customers’ technical teams
- Work closely with Cloudera teams at all levels to help ensure the success of consulting engagements with customers
- Drive projects with customers to successful completion
- Write and produce technical documentation, blogs, and knowledge base articles
- Participate in the pre- and post-sales process, helping both the sales and product teams interpret customers’ requirements
- Keep current with Hadoop and Big Data ecosystem technologies
- Attend speaking engagements when needed
- Post-COVID-19, potential travel of up to 50%
Qualifications
- 10+ years of experience in Information Technology and systems architecture
- 5+ years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
- 5+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
- Ability to understand Big Data use cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, and data integration
- Ability to understand and translate customer requirements into technical requirements
- Experience implementing data transformation and processing solutions
- Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
- Experience setting up multi-node Hadoop clusters
- Experience configuring cluster security (LDAP/AD, Kerberos/SPNEGO)
- Experience with Cloudera software and/or HDP certification (HDPCA/HDPCD) is a plus
- Strong experience implementing software and/or solutions in the enterprise Linux environment
- Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
- Strong understanding of network configuration, devices, protocols, speeds and optimizations
- Strong understanding of the Java ecosystem including debugging, logging, monitoring and profiling tools
- Familiarity with scripting in Bash, Python, and/or Perl, and with automation tools such as Ansible, Chef, and Puppet
- Solid background in database administration or design
- Excellent verbal and written communication skills
- Experience architecting data center solutions, including selecting server and storage hardware based on performance, availability, and ROI requirements