Hadoop Cloudera Administrator
(Jobs in Helena, MT)
 
Requirement id: 94085
Job title: Administrator
Job location: Helena, MT
Skills required: Hadoop, Informatica Administration, Big Data, Cloudera Administrator
Open Date: 20-Jun-2019
Close Date:
Job type: Not specified
Duration: 6 Months
Compensation: DOE
Status requirement: ---
Job interview type: ---
Email Recruiter: coolsoft
Job Description: Administrator: Hadoop, Informatica Administration, Big Data, Cloudera Administrator

Note: Online Submission

Submission Deadline: 6/26/2019 2:00 PM MDT
Start Date: 7/1/2019
End Date: 12/31/2019

Client: Montana State Fund (MSF)

Description:

SCOPE OF SERVICES:

1. Review the existing MSF Big Data ecosystem, identify and document any gaps, and provide and implement a plan to address them (for 2019 only).
2. Create a detailed plan and repurpose the existing CDH cluster to carve out additional environment(s). This activity must be completed in the initial month of the service, and the detailed plan must be documented and reviewed with the MSF IT Director(s) before implementation (for 2019 only).
3. Maintenance of cluster(s), including addition and removal of nodes using cluster monitoring tools (for example, Ganglia, Nagios, or Cloudera Manager), configuring NameNode high availability, and keeping track of all running Hadoop jobs (see the sketch after this list).
4. Implementation, management and administration of the overall ecosystem.
5. Execution of day-to-day running of clusters including availability on a continuous basis.
6. Implementation of capacity planning and estimation of requirements for lowering or increasing the capacity of the clusters.
7. Provide recommendations on sizing of clusters based on MSF's data ingestion plan for 2019.
8. Monitoring of cluster connectivity and performance.
9. Management and review of log files.
10. Implementation of Backup and Recovery Tasks.
11. Implementation of resource and security management.
12. Implementation of data (in Big Data tables) backup and recovery, connectivity and security.
13. Implementation of all administrative tasks connected to Informatica BDM.
14. Documentation of all repeatable and recurring Tasks and Procedures.
15. Conduct periodic knowledge transfer sessions for in-house MSF Big Data Developers and System Administrators.
16. Assist MSF System Administrators in troubleshooting Linux issues.
17. Share weekly status (through a report) on activities planned and performed with MSF IT director(s).

18. The vendor is expected to adhere to the following Client-specified SLAs.
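For illustration only (this is not part of the statement of work): below is a minimal sketch of the kind of routine cluster check implied by scope items 3, 5, 8, 10, and 12, written in Python around the standard Hadoop command-line tools. The NameNode service IDs nn1/nn2 and the backup paths are assumptions for the example and would have to be taken from the cluster's actual hdfs-site.xml and backup plan.

#!/usr/bin/env python3
# Illustrative daily health-check sketch for a CDH-style cluster (not MSF's
# actual procedure). Assumes the hdfs/yarn/hadoop client tools are on PATH.
import subprocess

def run(cmd):
    # Run a CLI command and return its stdout; raises if the command fails.
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

if __name__ == "__main__":
    # Scope item 3: report the active/standby state of each configured NameNode.
    for nn in ("nn1", "nn2"):  # assumed service IDs; take from hdfs-site.xml
        state = run(["hdfs", "haadmin", "-getServiceState", nn]).strip()
        print(f"NameNode {nn}: {state}")

    # Scope items 5 and 8: capacity, live/dead DataNodes, and remaining space.
    print(run(["hdfs", "dfsadmin", "-report"]))

    # Scope item 3: keep track of the currently running Hadoop (YARN) jobs.
    print(run(["yarn", "application", "-list", "-appStates", "RUNNING"]))

    # Scope items 10 and 12 (backup sketch): copy a critical HDFS path to a
    # backup cluster with distcp. The paths below are placeholders only.
    # run(["hadoop", "distcp", "/data/warehouse", "hdfs://backup-nn/data/warehouse"])

A script of this kind would typically be scheduled (for example via cron) and its output folded into the weekly status report described in item 17.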

MINIMUM REQUIREMENTS:

1. Administration services must be performed onsite in Helena, MT. MSF requires that the staff resource assigned to this project be on-site an average of 40 hours per week and live in the greater Helena area. The services required include Cloudera administration as well as Informatica BDM administration. A second staff resource is acceptable ONLY if the two skill sets are not possessed by the same vendor staff resource. In that event, the Cloudera administration services must be performed onsite in Helena, while the BDM administration services may be performed remotely; however, that staff resource must be able to travel to Helena, MT at least once in 2019 and twice in each subsequent year.

2. Please provide a resume/curriculum vitae for each staff resource you propose. MSF will conduct a telephone interview of each staff resource prior to contract award. Each staff resource must possess the following minimum qualifications:
a. At least three years of experience in Hadoop (Cloudera) Administration and one year of experience in Informatica BDM Administration.
b. A Bachelor's degree.
c. Must have worked in this role (Cloudera and/or BDM) on at least one of the implementations showcased in the case studies provided in Item 3 below.

3. The vendor must be able to demonstrate its experience providing the services outlined above for a minimum of three clients (at least two of which must be insurance clients), as evidenced through case studies (the attached case study format must be used).

4. The vendor must provide a high-level proposal to repurpose the existing CDH cluster to carve out additional environment(s) –
 
Call 502-379-4456 Ext 100 for more details. Please provide Requirement id: 94085 while calling.
 
Other jobs in MT: Helena (1)
 
 Hadoop job openings in other states
Jobs List

Back End Java Developer - 85842
Create date: 03-Jun-2022
Description:

General skills (nice to have):

Databricks
Hadoop preferred, but any big data warehousing experience is acceptable

IT skills (must have):

Backend data experience
Data warehousing
Java/Spring Boot API knowledge
AWS
Data layer knowledge

(This job is for - Hadoop Jobs in New York, NY - Developer)

Data Warehouse Designer/Developer - 58707
Create date: 05-Oct-2021
Description:

Responsibilities

Our Client in Columbus, Ohio is seeking an experienced Data Warehouse Designer / Developer for a contract role.

CANDIDATES MUST BE LOCAL TO CENTRAL OHIO OR RELOCATED BY DAY ONE FOR CONSIDERATION.

The Data Warehouse Designer/Developer will be responsible for Enterprise data wa.... (This job is for - Hadoop ETL/ELT Jobs in Columbus, OH - Developer)

Technical Specialist - 58686
Create date: 30-Sep-2021
Description:

Responsibilities

Our client in Columbus, Ohio is seeking an Enterprise Data Warehouse / ETL Developer.

Local candidates are highly preferred. Candidates are required to report onsite on Day 1 of the assignment.

The Technical Specialist will be responsible for the client's Enterprise data wareho.... (This job is for - Hadoop ETL Jobs in Columbus, OH - Specialist)

Data Warehouse Designer / Developer - 58526
Create date: 20-Aug-2021
Description:

Our Client in Columbus, Ohio is seeking an experienced Data Warehouse Designer / Developer for a contract role.

CANDIDATES MUST BE LOCAL TO CENTRAL OHIO OR RELOCATED BY DAY ONE FOR CONSIDERATION.

The Data Warehouse Designer/Developer will be responsible for Enterprise data warehouse design, development, i.... (This job is for - Hadoop Hive Jobs in Columbus, OH - Developer)

Senior Software Development Engineer
Create date: 26-Apr-2021
Description:

Top 3 required technical skills:

Automation DevOps Engineering [Hadoop, Spark]
Operational experience in real-time, streaming, and data pipeline frameworks [NiFi or Airflow]
Java/Scala and Python programming
PCF or cloud experience in general.

Role

.... (This job is for - Hadoop Spark Scala Jobs in San Francisco, CA - Engineer)
 
 