Hadoop Administrator

job details:

  • location: San Antonio, TX

  • salary: $50 - $65 per hour

  • date posted: Tuesday, December 18, 2018

  • job type: Contract

  • industry: Mining

  • reference: 661752

job description

job summary:

A client of ours in San Antonio, Texas, is looking for a Hadoop Administrator for a 3-month contract opportunity.

location: San Antonio, Texas

job type: Contract

salary: $50 - $65 per hour

work hours: 8am to 5pm

education: Bachelor's

responsibilities:

The Hadoop Administrator is responsible for the care, maintenance, administration, and reliability of the Hadoop ecosystem.

The role includes ensuring system security, stability, capacity planning, recoverability (protecting business data), and performance, as well as delivering new system and data management solutions to meet the growing and evolving data demands of the enterprise.

qualifications:

  • Hadoop administration experience using the Hortonworks distribution.

  • Administers Hadoop technology and systems; responsible for backup, recovery, architecture, performance tuning, security, auditing, metadata management, optimization, statistics, capacity planning, connectivity, and other data solutions for Hadoop systems.

  • Responsible for installation and ongoing administration of Hadoop infrastructure.

  • Propose and deploy new hardware and software environments required for Hadoop, and expand existing environments as the need arises. Provision new users, groups, and roles, and set up Kerberos principals (see the provisioning sketch after this list).

  • Add and delete nodes from Hadoop clusters.

  • Monitor the Hadoop ecosystem.

  • Manage log files and file systems; handle HDFS storage management and capacity planning.

  • Troubleshoot application errors and ensure the high availability of Hadoop clusters.

  • Manage data movement in and out of Hadoop clusters. Perform backup and recovery for metastore databases and configuration files (see the metastore backup sketch after this list).

  • Data modeling, design, and implementation of Hive, Druid, and NoSQL tables.

  • Install patches and upgrade software as needed; automate manual tasks.

  • Data lake and data warehouse design and development.

  • Perform health checks of all Hadoop clusters (see the health-check sketch after this list).

  • Thorough knowledge of the overall Hadoop architecture and its key components, such as HDFS, YARN, HBase, Sqoop, Hive, LLAP, Druid, and Ambari.

  • A solid background in Linux shell scripting and system administration.
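
Much of the routine health checking described above can be scripted against the standard Hadoop command-line tools. A minimal sketch, assuming the Hadoop client binaries are on the PATH and the cluster runs NameNode HA; the service IDs nn1/nn2 are hypothetical and cluster-specific:

    #!/usr/bin/env bash
    # Minimal daily health check for a Hadoop cluster (sketch).
    set -e

    # Summarize HDFS capacity and live/dead DataNodes.
    hdfs dfsadmin -report | head -n 20

    # Check HDFS integrity; the status summary appears at the end of the output.
    hdfs fsck / | tail -n 10

    # List all NodeManagers and their states.
    yarn node -list -all

    # With NameNode HA enabled, confirm which NameNode is active
    # ("nn1"/"nn2" are hypothetical service IDs; check hdfs-site.xml).
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2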
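
The user and principal provisioning mentioned above typically pairs kadmin with HDFS home-directory setup. A hedged sketch, run on a KDC host with admin rights; the user name, group, and realm below are hypothetical:

    #!/usr/bin/env bash
    # Provision a new user on a Kerberized Hadoop cluster (sketch).
    set -eu

    NEW_USER=alice        # hypothetical user
    NEW_GROUP=analysts    # hypothetical group
    REALM=EXAMPLE.COM     # hypothetical Kerberos realm

    # Create the Kerberos principal with a randomized key.
    kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"

    # Create and hand over the user's HDFS home directory.
    hdfs dfs -mkdir -p "/user/${NEW_USER}"
    hdfs dfs -chown "${NEW_USER}:${NEW_GROUP}" "/user/${NEW_USER}"
    hdfs dfs -chmod 750 "/user/${NEW_USER}"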
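
Metastore and configuration backup, also listed above, is commonly a scheduled database dump plus a copy of the config directory. A sketch assuming a MySQL-backed Hive metastore with credentials in ~/.my.cnf; the database name "metastore" is a common default but must be verified, and the backup path is hypothetical:

    #!/usr/bin/env bash
    # Back up the Hive metastore database and Hadoop configs (sketch).
    set -eu

    STAMP=$(date +%Y%m%d)
    BACKUP_DIR="/backup/hadoop/${STAMP}"   # hypothetical backup location
    mkdir -p "${BACKUP_DIR}"

    # Consistent dump of the metastore database.
    mysqldump --single-transaction metastore > "${BACKUP_DIR}/metastore.sql"

    # Snapshot the active Hadoop client configuration.
    tar -czf "${BACKUP_DIR}/hadoop-conf.tar.gz" /etc/hadoop/conf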

skills:

  • Ability to collaborate effectively with data analysts, data scientists, system administrators, storage administrators, the network team, vendors, etc.

  • Good communication skills.

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.