Cloudera Administrator Opportunity

TalentBurst, an Inc 5000 company


Cloudera Administrator in SAN FRANCISCO BAY AREA


Job Title: Big Data Administrator

Location: Open for 100% remote

Duration: 12+ months contract (with possibility of extension)

Job Description:

The Big Data Design Engineer is responsible for architecture design and implementation of the Big Data platform, Extract/Transform/Load (ETL) pipelines, and analytic applications.


Primary Responsibilities

  • Oversees implementation and ongoing administration of Hadoop infrastructure and systems
  • Manages Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.
  • Analyzes the latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings
  • Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and expand existing environments
  • Handles cluster maintenance and creation/removal of nodes using tools such as Ganglia, Nagios, and Cloudera Manager Enterprise
  • Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Monitors Hadoop cluster job performance and handles capacity planning
  • Monitors Hadoop cluster connectivity and security
  • Manages and reviews Hadoop log files
  • Handles HDFS and file system management, maintenance, and monitoring
  • Partners with infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
  • Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
  • Acts as point of contact for vendor escalation


Requirements

  • Bachelor's degree in a related field
  • Seven (7) years of experience in architecture and implementation of large and highly complex projects

Skills and Competencies

  • Experience with Airflow, Argo, Luigi, or a similar orchestration tool
  • Experience with DevOps principles and CI/CD
  • Experience with Docker and Kubernetes
  • Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
  • Experience with streaming technologies such as Kafka, Flink, or Spark Streaming
  • Experience working with the Hadoop ecosystem, building data assets at enterprise scale
  • Strong written and oral communication and presentation skills
  • Comments for suppliers: Experienced Hadoop Admin with an understanding of Cloudera

Apply now
