Hadoop ETL Problem Solving Skills Spark Big Data SQL Developer
(Jobs in Wilmington, DE)
 
Requirement id: 67841
Job title: Developer
Job location: Wilmington, DE
Skills required: Mandatory Experience, Hadoop, ETL, Problem Solving Skills, Spark, Big Data, SQL
Open Date: 22-Feb-2018
Close Date:
Job type: Contract
Duration: 10 Months
Compensation: DOE
Status requirement: ---
Job interview type: Face to Face interview
Email Recruiter: coolsoft
Job Description: Developer: Mandatory Experience, Hadoop, ETL, Problem Solving Skills, Spark, Big Data, SQL

Client Info : Barclaycard - DE

Job Description :

Responsibilities
Develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels.
Load data from disparate data sets.
Write ETL code using Hive and Spark.
Translate complex functional and technical requirements into detailed designs.
Maintain security, data privacy, and data quality.
Continuously evaluate new technologies, innovate, and deliver solutions for business-critical applications.
Test prototypes and oversee handover to operational teams.
Propose best practices and standards.
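The ETL responsibilities above (load disparate sources, transform, store) follow the standard extract-transform-load pattern. The sketch below illustrates that pattern in plain Python with an invented record layout; in this role the same steps would be written as HiveQL or Spark transformations, not stdlib code.

```python
import csv
import io

# Hypothetical raw feed: one channel delivers CSV transaction records.
RAW_FEED = """txn_id,amount,currency
1001,25.40,USD
1002,9.99,USD
1003,bad_value,USD
"""

def extract(raw):
    """Extract: parse records from a raw source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and drop malformed rows (a data-quality step)."""
    clean = []
    for row in rows:
        try:
            clean.append({"txn_id": int(row["txn_id"]),
                          "amount": float(row["amount"]),
                          "currency": row["currency"]})
        except ValueError:
            continue  # quarantine malformed records instead of loading them
    return clean

def load(rows, store):
    """Load: append validated rows to the target store."""
    store.extend(rows)
    return store

warehouse = []
load(transform(extract(RAW_FEED)), warehouse)
print(len(warehouse))  # 2 valid rows survive the transform
```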

Mandatory Experience
Working with Hadoop.
Writing HQL and Spark code.
Loading data into Hadoop.
Working with job-scheduling tools.

Preferred Experience
Reading and writing data to Kafka.
Processing real-time data with Spark Streaming or other technologies.
Working with financial data, especially credit cards.
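Spark Streaming processes an unbounded input as a sequence of small batches over a time window. The stdlib sketch below illustrates only that micro-batch idea, using an invented event stream and a fixed batch size standing in for a time window; real Spark Streaming code would read from a source such as Kafka instead.

```python
from collections import Counter

def micro_batches(events, batch_size):
    """Group a stream-like iterable into fixed-size micro-batches."""
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush any partial final batch
        yield batch

# Hypothetical stream of card events (in Spark these would arrive from Kafka).
stream = ["approve", "decline", "approve", "approve", "decline", "approve"]

# Aggregate each micro-batch independently, as a streaming job would per window.
counts = [Counter(batch) for batch in micro_batches(stream, 3)]
print(counts)
```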

Mandatory Skills
Knowledge of Hadoop.
Writing high-performance, reliable, and maintainable code.
Ability to write Spark jobs.
Good knowledge of database structures, theories, principles, and practices.
Proficiency in writing SQL, including HiveQL.
Knowledge of workflow schedulers.
Analytical and problem-solving skills applied to the Big Data domain.
Proven understanding of Hadoop SQL-based tools such as Hive and Impala.
Good aptitude for multi-threading, concurrency, and distributed-processing concepts.
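HiveQL shares most of its surface syntax with ANSI SQL, so the GROUP BY aggregation below is representative of the SQL/HiveQL proficiency the role asks for. It runs against an in-memory SQLite database purely for illustration; the table and column names are invented, and a Hive or Impala query of the same shape would run over a warehouse table instead.

```python
import sqlite3

# In-memory database standing in for a Hive/Impala-backed table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (card_type TEXT, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)",
                 [("credit", 120.0), ("debit", 30.0), ("credit", 80.0)])

# The kind of aggregate a HiveQL query would express over a large table.
rows = conn.execute(
    "SELECT card_type, SUM(amount) AS total "
    "FROM txns GROUP BY card_type ORDER BY card_type"
).fetchall()
print(rows)  # [('credit', 200.0), ('debit', 30.0)]
```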

Preferred Skills
Familiarity with data-loading tools such as Flume and Sqoop.
Familiarity with real-time streaming tools such as Kafka and Spark Streaming.
Understanding of file formats such as Avro and Parquet, including serialization/deserialization.
Understanding of NoSQL technologies such as HBase, MongoDB, and Cassandra.
Solid understanding of Data Warehousing concepts.
 
Call 502-379-4456 Ext 100 for more details. Please provide Requirement id: 67841 while calling.
 
     