Hadoop ETL AbInitio Teradata Engineer
(Jobs in Charlotte, NC)
 
Requirement ID: 86647
Job title: Engineer
Job location: Charlotte, NC
Skills required: Big Data, Hadoop, ETL, Ab Initio, Teradata
Open date: 19-Dec-2018
Close date:
Job type: Contract to Hire
Duration: 24 months
Compensation: DOE
Status: requirement not found
Job interview type: ---
Job Description: Engineer (Big Data, Hadoop, ETL, Ab Initio, Teradata)

Candidate must be our own W2 Employee

We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens


Job Description:

The Artificial Intelligence Enterprise Solutions (AIES) Technology Data Engineering Team is looking for a highly motivated and experienced Senior Data Engineer. The right candidate will have expert-level experience supporting Artificial Intelligence / Machine Learning (AI/ML) platforms and products, and data ingestion/provisioning activities from different data sources to and from the Enterprise Data Lake. As a senior data engineer, you will work with Wells Fargo business and data science teams to gather business data requirements and perform the data engineering/provisioning activities that support building, exploring, training, and running business models. The senior data engineer will use ETL tools such as Informatica and Ab Initio, along with data warehouse tools, to deliver critical Model Operationalization services to the Enterprise.
In this role, the Senior Data Engineer will be responsible for:
• Data modeling, coding, analytical modeling, root cause analysis, investigation, debugging, testing, and collaboration with business partners, product managers, architects, and other engineering teams.
• Adopting and enforcing best practices for data ingestion and extraction on the big data platform.
• Extracting business data from multiple data sources and storing it in a MapR-DB/HDFS location.
• Working with data scientists and building scripts to meet their data needs.
• Working with the Enterprise Data Lake team to maintain data and information security for all use cases.
• Building automation scripts using AUTOSYS to automate the loads.
• Designing and developing scripts and configurations to load data using Data Ingestion Frameworks or Ab Initio (a brief illustrative sketch follows this list).
• Coordinating user access requests for data loaded in the Data Lake.
• Providing post-production support of the AIES Open Source Data Science (OSDS) Platform.
• Supporting end-to-end platform application delivery, including infrastructure provisioning and automation and integration with Continuous Integration/Continuous Delivery (CI/CD) platforms, using existing and emerging technologies.
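
For illustration only, here is a minimal PySpark sketch of the kind of ingestion step described in the list above: reading a raw source extract and landing a lightly standardized copy in an HDFS location. The paths, column names, and application name are hypothetical placeholders, not details from this posting; an actual AIES pipeline would use its own ingestion framework or Ab Initio graphs.

# Illustrative sketch only: a generic PySpark ingestion step.
# All paths and names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-ingest").getOrCreate()

# Read a raw source extract from a hypothetical HDFS landing path.
raw = spark.read.option("header", True).csv("hdfs:///landing/source_system/accounts/")

# Light standardization before provisioning to the lake:
# drop duplicate rows and stamp each record with its load date.
clean = raw.dropDuplicates().withColumn("load_dt", F.current_date())

# Write to a curated HDFS location, partitioned by load date
# so downstream consumers can read incrementally.
clean.write.mode("overwrite").partitionBy("load_dt").parquet("hdfs:///curated/accounts/")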
Required Qualifications

• BS/BA degree
• Possession of excellent analytical and problem-solving skills with high attention to detail and accuracy
• Demonstrated ability to transform business requirements to code, metadata specifications, specific analytical reports and tools
• Good verbal, written, and interpersonal communication skills
• Experience with SDLC (System Development Life Cycle) including understanding of project management methodologies used in Waterfall or Agile development projects
• Strong Hadoop scripting skills to process petabytes of data
• 5+ years of ETL (Extract, Transform, Load) Programming with tools including Informatica
• 2+ years of Unix or Linux systems with scripting experience in Shell, Perl or Python
• Experience with Advanced SQL (preferably Teradata); a brief illustrative sketch follows this list
• Experience working with large data sets and with distributed computing (MapReduce, Hadoop, Hive, HBase, Pig, Apache Spark, etc.)
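
As a hedged illustration of the "Advanced SQL (preferably Teradata)" item above, the sketch below runs a Teradata-specific window-function query (QUALIFY with ROW_NUMBER) from Python using the teradatasql driver. The host, credentials, schema, and table are hypothetical placeholders, not details from this posting.

# Illustrative sketch only: an advanced Teradata query run from Python.
# Host, credentials, and table names are hypothetical placeholders.
import teradatasql

with teradatasql.connect(host="td.example.com", user="user", password="****") as con:
    with con.cursor() as cur:
        # QUALIFY + ROW_NUMBER() keeps the latest snapshot row per account,
        # a window-function idiom supported by Teradata SQL.
        cur.execute("""
            SELECT account_id, balance, load_dt
            FROM edw.account_snapshot
            QUALIFY ROW_NUMBER() OVER (
                PARTITION BY account_id
                ORDER BY load_dt DESC
            ) = 1
        """)
        for row in cur.fetchall():
            print(row)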


Desired Qualifications

• MS/MA degree
• Experience with Java and Scala
• Experience with Ab Initio
• Experience with analytic databases, including Hive, Presto, and Impala
• Experience with multiple data modeling concepts, including XML and JSON
• Experience with loading and managing data using technologies such as Spark, Scala, NoSQL (MongoDB, Cassandra), and columnar MPP SQL stores (Redshift, Vertica)
• Experience with Change and Release Management processes
• Experience with stream frameworks including Kafka, Spark Streaming, Storm, or RabbitMQ (a brief illustrative sketch follows this list)
• Experience working with one or more of the following Amazon Web Services (AWS) Cloud services: EC2, EMR, ECS, S3, SNS, SQS, CloudFormation, CloudWatch
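
To illustrate the stream-framework item above: a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and writes the events to an HDFS parquet sink. The broker addresses, topic name, and paths are hypothetical placeholders, and the spark-sql-kafka connector is assumed to be available at submit time.

# Illustrative sketch only: Spark Structured Streaming from Kafka to HDFS.
# Brokers, topic, and paths are hypothetical; requires the
# spark-sql-kafka connector package at spark-submit time.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-stream").getOrCreate()

# Subscribe to a Kafka topic and decode key/value from bytes to strings.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
    .option("subscribe", "claims-events")
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"))
)

# Land the stream as parquet files; the checkpoint directory lets the
# query resume from its last committed Kafka offsets after a restart.
query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///curated/claims_events/")
    .option("checkpointLocation", "hdfs:///checkpoints/claims_events/")
    .start()
)
query.awaitTermination()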
 
Call 502-379-4456 Ext 100 for more details. Please provide Requirement ID 86647 when calling.
 
Other jobs in NC: Chapel Hill (2), Charlotte (58), Dix Campus (1), Greensboro (3), Morrisville (2), North Charlotte (2), Raleigh (96), St Raleigh (2), Winston Salem (3)
Big Data job openings in Charlotte, NC
Jobs List

Solutions/Application Architect - 49238
Create date: 27-Jun-2019
Candidate must be our W2 Employee


Description:
Enterprise Risk and Finance Technology organization is seeking a Systems Architect to work on various projects within our team. Responsibilities will be varied and are likely to cover a wide range of architecture-related tasks associated with the creation and development of solut....

Business/Data Analyst - Big Data - NTTJP00021043
Create date: 17-Jun-2019
Start date: 7/15/2019
End date: 07/31/2020
Submission deadline: 7/10/2019

Description:

We know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our abi....

Java Developer - 48678
Create date: 14-Jun-2019
Candidate must be our W2 Employee

We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens.

Description:

10+ years of core Java development experience (data structures, Spring Framework, REST APIs, SQL)
MongoDB experience (required for at least one of the 2 roles)
Big Data experience (Apach....

Data Analyst With BigData Experience - NTTJP00020761
Create date: 16-May-2019
Start date: 6/3/2019
End date: 12/31/2019
Submission deadline: 5/28/2019

Description:

We know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our ability to help our clients stay a step ahead of....

Senior ETL/BigData Developer - NTTJP00014772
Create date: 19-Feb-2019
Start date: 03/11/2019
End date: 12/31/2017
Submission deadline: 03/13/2020
Client info: App Svcs: Delivery - ADM - Existing Technologies

Description:

We know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees ar....
 
 Big Data job openings in other states
Jobs List

Technical Specialist 3/TS3 - 83945 - SP
Create date: 07-Apr-2021
Start date: 5/3/2021
End date: 06/30/2021
Submission deadline: 4/12/2021
Client info: MCD


Note:

* Interviews: TEAMS
* The selected candidate does not have to be local to Columbus, OH, and can work remotely
* REPOST OF 82361 AND 83069 - DO NOT RESUBMIT CANDIDATES

Data/Kafka Engineer - 21-00594
Create date: 26-Mar-2021
On-site/Remote: On-site
Job industry: Automotive and Transportation

Description:

Responsibilities

Job Title: Senior Data/Kafka Engineer
Location: Fremont, CA
Duration: 6+ months

Ursus is looking for a strong Data Engineer to support our client's Digital Ex....

Big Data Developer - 73185
Create date: 22-Mar-2021
Job Description:

This project is focused on a data factory: it pulls in data from other agencies, insurers, providers, etc.; the team cleans the data, prepares it for reporting, and loads it into the back end of other applications/reporting tools.

Strong Java application developer with big data experience.

Big Data Engineer
Create date: 22-Mar-2021
Job Description

Big Data Engineer (2 openings)

Remote

5 Months Contract

Client:

Client was built to help professionals achieve more in their careers, and every day millions of people use our products to make connections, discover opportunities, and gain insights. Their global reach means....

Technical Specialist 3/TS3 - 83069
Create date: 10-Mar-2021
Start date: 3/15/2021
End date: 06/30/2021
Submission deadline: Monday 3/15 @ 10 am EST
Client info: MCD

Note:

* Interviews: TEAMS

Description:

The Technical Specialist will be responsible for Medicaid Enterprise data warehou....
 
 Big Data job openings in NC
Jobs List

Senior Data Integration Engineer - 63757
Create date: 02-Jun-2020
Candidate must be our W2 Employee


We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens

Description:

Senior Data Integration Engineer

This position will be involved in loading data from Big Data infrastructure to Microsoft Azure and Snowflake and in performing mapping of data fields from a n....

Big Data Engineer - 59184
Create date: 10-Feb-2020
Note:

* Candidate must be our W2 Employee

* We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens

* Candidate's RTR form should be in the name of APEX Systems only

• The project is moving pharmaceutical claims healthcare Big Data with Kafka, Spark, and Hive, using Python and Java. This data is then boug....

PEDS - Sr. Python Engineer - 49685
Create date: 12-Jul-2019
Candidate must be our W2 Employee

Description:
Primary Responsibilities:
- Perform all phases of software engineering including requirements analysis, application design, code development and testing
- Design and implement product features including reusable components, frameworks and libraries in collaboration with business....

PEDS - Sr. Lead Tech - 49684
Create date: 12-Jul-2019
Candidate must be our W2 Employee

Description:

Senior-level, experienced (Tech Lead) resource with strong hands-on experience/skills in the areas of:

- Big Data Technologies
- Hadoop
- HBase/Hive
- MapR
- Cloud Technologies
- Java
- Python....

Oracle DBA - 48266
Create date: 06-Jun-2019
Candidate must be our W2 Employee

Description:

Our client is a large financial company located in the Durham, NC area that is looking for an Oracle DBA to fill a 6+ month contract position.

We are seeking an Oracle DBA to manage a large-scale environment, supporting all environment builds, including design, capacit....