Implementation Bachelor Degree In Computer Science Hadoop NoSQL Kafka Hive AWS Stro
 
Requirement id: 107556
Job title: Administrator
Job location: Maryland Heights, MO
Skills required: Big Data, Implementation, Cloud Experience, Bachelor Degree In Computer Science Hado
Open Date: 17-Jun-2020
Close Date:
Job type: Contract
Duration: 6 Months
Compensation: DOE
Status requirement: ---
Job interview type: ---
   Email Recruiter: coolsoft
Job Description Administrator: Big Data, Implementation, Cloud Experience, Bachelor Degree In Computer Science Hado

Candidate must be our W2 Employee.

Job Description:

Team:
Big Data Team, building out data lakes.
No development work.
When teams within Charter want to move data to the Hadoop platform, they reach out to this team to have an environment implemented for them.

-Build backend data infrastructure (environment)

-Build a solution around the environment the team is currently in and determine how best to move to the new one (lift-and-shift the environment without altering functionality)

Requirements:

-3-4 years of experience in Big Data

-Experience implementing solutions

-Experience managing clusters and adding/removing nodes

-Cloud knowledge is a plus

JOB SUMMARY
Design, develop, automate, implement, and support big data clusters using components including Hadoop (HDFS), YARN, Spark, Kafka, HBase, NoSQL, authorization, and Kerberos authentication to enhance data analytics capabilities.
MAJOR DUTIES AND RESPONSIBILITIES

Design, develop, automate, and implement big data clusters using Hadoop, YARN, Hive, ZooKeeper, Kafka, and NoSQL components.

Perform platform administration and automation for Hadoop and Kafka, including installation, maintenance, and configuration.

Perform troubleshooting and resolution management, and provide support to the customer, users, and technical teams.

Resolve issues related to development, operations, implementations, and system status.

Research and recommend options for department direction on Big Data management. Manage and maintain all production and non-production Hadoop and Kafka clusters and their infrastructure.

Develop runbooks using Ansible, shell scripts, and Python for automation.
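Runbooks like these usually boil down to scripted, repeatable command sequences. A minimal Python sketch (hostnames and service name are hypothetical) that generates the ordered plan for a rolling Kafka broker restart, one broker at a time with a safety check between restarts:

```python
def rolling_restart_plan(brokers, service="kafka-broker"):
    """Build an ordered command plan for a rolling broker restart.

    Restarts one broker at a time and waits for under-replicated
    partitions to clear before touching the next one -- the usual
    safety check that keeps the cluster available throughout.
    """
    plan = []
    for host in brokers:
        plan.append(f"ssh {host} systemctl restart {service}")
        # Placeholder step: a real runbook would poll the broker's
        # UnderReplicatedPartitions metric until it reads zero.
        plan.append("wait_for_zero_under_replicated_partitions")
    return plan
```

The ordering matters: restarting a second broker while the first is still catching up risks taking partitions fully offline.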

Review, develop, and walk through Java and Scala code to implement best practices and tuning.

Support multiple clusters of medium to large complexity with multiple concurrent users, ensuring control, integrity, and accessibility of data.

Create and maintain standard operating procedures and templates for cluster user access.

Design and implement a toolset that simplifies provisioning and support of a large cluster environment.

Enable and configure Kerberos for Hadoop components and implement enterprise security for Hadoop and Kafka.
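Enabling Kerberos means creating a service principal and keytab per service per host. A sketch of the command generation (the realm, hosts, and keytab directory are placeholder values, not from this posting):

```python
def kerberos_setup_commands(hosts, services=("hdfs", "yarn", "kafka"),
                            realm="EXAMPLE.COM",
                            keytab_dir="/etc/security/keytabs"):
    """Generate kadmin commands that create per-host service
    principals and export their keytabs (the standard Hadoop
    Kerberos layout: service/host@REALM)."""
    cmds = []
    for host in hosts:
        for svc in services:
            principal = f"{svc}/{host}@{realm}"
            cmds.append(f'kadmin -q "addprinc -randkey {principal}"')
            cmds.append(f'kadmin -q "ktadd -k {keytab_dir}/{svc}.{host}.keytab {principal}"')
    return cmds
```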

Enable data encryption at rest and in motion with TLS/SSL to meet security standards.
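For in-motion encryption on Kafka, the broker side comes down to a handful of standard server.properties settings. A sketch that renders that fragment (keystore paths and password are placeholders):

```python
def kafka_tls_properties(keystore, truststore, password):
    """Render the broker-side server.properties fragment that
    enables TLS for client and inter-broker traffic, using
    Kafka's standard SSL configuration keys."""
    return "\n".join([
        "listeners=SSL://0.0.0.0:9093",
        "security.inter.broker.protocol=SSL",
        f"ssl.keystore.location={keystore}",
        f"ssl.keystore.password={password}",
        f"ssl.truststore.location={truststore}",
        f"ssl.truststore.password={password}",
        # Require client certificates (mutual TLS)
        "ssl.client.auth=required",
    ])
```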

Manage system backups and coordinate with the infrastructure team to ensure storage and rotation of backups is accomplished.

Onboard Big Data tenants; enable Sentry for role-based access control (RBAC) to grant privilege-level access to data in HDFS/Hive/Kafka per security policies.
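Sentry-based tenant onboarding typically follows a fixed pattern: create a role, bind it to a group, grant it the tenant's database. A sketch generating those statements (tenant and group names are hypothetical; the statements would run via beeline against HiveServer2):

```python
def tenant_onboarding_sql(tenant, group):
    """Generate the Sentry statements that create a tenant role,
    bind it to an LDAP/OS group, and grant access to the tenant's
    Hive database."""
    role = f"{tenant}_role"
    return [
        f"CREATE ROLE {role};",
        f"GRANT ROLE {role} TO GROUP {group};",
        f"GRANT ALL ON DATABASE {tenant} TO ROLE {role};",
    ]
```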

Perform cluster maintenance as well as creation and removal of nodes.
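Safe node removal in HDFS is a decommission, not a shutdown. A sketch of the standard sequence (the exclude-file path is a placeholder and varies by distribution):

```python
def decommission_plan(nodes, exclude_file="/etc/hadoop/conf/dfs.exclude"):
    """Sketch the HDFS/YARN decommissioning steps: list hosts in
    the exclude file, refresh the NameNode and ResourceManager,
    then wait for block re-replication before removing hardware."""
    steps = [f"echo {n} >> {exclude_file}" for n in nodes]
    steps.append("hdfs dfsadmin -refreshNodes")
    steps.append("yarn rmadmin -refreshNodes")
    # Placeholder step: poll until the report shows 'Decommissioned'
    steps.append("wait: hdfs dfsadmin -report shows nodes 'Decommissioned'")
    return steps
```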

Design and implement a Backup and Disaster Recovery strategy.

Participate in new-tool discovery and technical deep-dive sessions, and Proof-of-Concept (POC) development with prospects.

Screen Hadoop cluster job performance; manage, troubleshoot, and review Hadoop log files.

Utilize expertise in technologies and tools, such as Kafka, Hadoop, Spark, and other storage systems as well as other cutting-edge tools and applications in Big Data space.
Performance-tune Big Data components, including Hive queries, and address performance issues related to schedulers and YARN.
Participate in continuous performance improvement sessions to discuss opportunities to improve processes and standards.
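YARN-level tuning usually starts from container-sizing arithmetic before any query-specific work. A back-of-the-envelope sketch (the OS reservation and container-count heuristics are common starting points, not fixed rules):

```python
def yarn_memory_settings(node_ram_gb, reserved_gb=8, containers=None, cores=16):
    """Back-of-the-envelope YARN sizing: reserve RAM for the OS and
    daemons, then split the remainder evenly across containers.
    Container count defaults to min(2 * cores, usable_mb / 2 GB)."""
    usable_mb = (node_ram_gb - reserved_gb) * 1024
    n = containers or min(2 * cores, usable_mb // 2048)
    per_container = usable_mb // n
    return {
        "yarn.nodemanager.resource.memory-mb": usable_mb,
        "yarn.scheduler.maximum-allocation-mb": per_container,
        "yarn.scheduler.minimum-allocation-mb": per_container,
    }
```

These numbers are only a starting point; real tuning iterates against the actual workload mix and scheduler queue configuration.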

REQUIRED SKILLS/QUALIFICATIONS

Education: Master's or Bachelor's degree (or foreign equivalent) in Computer Science, IT, MIS, or a closely related field.
4+ years of experience in handling large-scale distributed platforms or integration pr
 
Call 502-379-4456 Ext 100 for more details. Please provide Requirement id: 107556 while calling.
 
     