Hadoop ABINITIO SQL Database Engineer
(Jobs in Charlotte, NC)
 
Requirement id: 91830
Job title: Engineer
Job location: Charlotte, NC
Skills required: Hadoop, ABINITIO, Informatica ETL, SQL Database
Open Date: 01-May-2019
Close Date:
Job type: Contract
Duration: 24 Months
Compensation: DOE
Status requirement: ---
Job interview type: ---
Email Recruiter: coolsoft
Job Description: Engineer - Hadoop, ABINITIO, Informatica ETL, SQL Database

Job Description:

The Artificial Intelligence Technology Data Engineering Team is looking for a highly motivated and experienced Senior Data Engineer/ETL Developer. The right candidate will have expert-level experience supporting Big Data platforms, products, and data ingestion/provisioning activities between different data sources and the Enterprise Data Lake. As a Senior Data Engineer/ETL Developer, you will work with Wells Fargo business and data science teams to gather business data requirements and perform the data engineering/provisioning activities that support building, exploring, training, and running business models. The senior data engineer will use ETL tools such as Informatica and Ab Initio, along with data warehouse tools, to deliver critical Artificial Intelligence Model Operationalization services to the Enterprise.

In this role you will be responsible for:

Data modeling, coding, analytical modeling, root cause analysis, investigation, debugging, testing, and collaboration with business partners, product managers, architects, and other engineering teams
Adopting and enforcing best practices for ingesting data into and extracting data from the big data platform
Extracting business data from multiple data sources and storing it in the MapR DB / HDFS location
Working with data scientists and building scripts to meet their data needs
Working with the Enterprise Data Lake team to maintain data and information security for all use cases
Building automation scripts using AUTOSYS to automate the loads
Designing and developing scripts and configurations to load data using data ingestion frameworks or Ab Initio (a short sketch of this kind of ingestion job follows the responsibilities list)
Coordinating user access requests for data loaded into the Data Lake
Providing post-production support for the AIES Open Source Data Science (OSDS) Platform
Supporting end-to-end Platform application delivery, including infrastructure provisioning and automation and integration with Continuous Integration/Continuous Delivery (CI/CD) platforms, using existing and emerging technologies

Providing design and development support for production enhancements, problem tickets, and other issue resolution
Following SDLC documentation requirements for code fixes
Developing new documentation, departmental technical procedures, and user guides
Monitoring production execution and responding to processing failures
Reviewing code execution and recommending optimizations for production processes
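
For illustration only (not part of the posting's requirements), the sketch below shows the kind of ingestion step the responsibilities above describe: pull a slice of source data, lightly standardize it, and land it as partitioned Parquet in an HDFS/MapR-FS location. Every name in it (job name, JDBC URL, table, credentials, output path) is a hypothetical placeholder; in practice the equivalent step might be an Ab Initio graph, an Informatica mapping, or the Enterprise Data Lake's own ingestion framework, scheduled through AUTOSYS.

    # Minimal PySpark ingestion sketch (illustrative only; all names are placeholders).
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("edl_ingest_customer_accounts")   # hypothetical job name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Pull a slice of source data over JDBC; URL, table, and credentials are placeholders,
    # and the matching JDBC driver would need to be on the cluster classpath.
    src_df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:teradata://source-host/DATABASE=SRC_DB")
        .option("dbtable", "SRC_DB.CUSTOMER_ACCOUNTS")
        .option("user", "svc_ingest")
        .option("password", "********")
        .load()
    )

    # Light standardization, plus a partition column for the landing zone.
    clean_df = (
        src_df.dropDuplicates(["account_id"])
        .withColumn("load_date", F.current_date())
    )

    # Land the data as partitioned Parquet in an HDFS/MapR-FS location (path is a placeholder).
    (
        clean_df.write.mode("overwrite")
        .partitionBy("load_date")
        .parquet("maprfs:///data/edl/raw/customer_accounts")
    )

    spark.stop()

In a scheduled setup, a thin shell wrapper around spark-submit would typically be what the AUTOSYS job invokes.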

Candidates must:

Be willing to work non-standard hours to support production execution or issue resolution
Be willing to provide on-call/pager support for production escalation

Required Qualifications



BS/BA degree
1+ year of experience with the Ab Initio suite of tools (GDE, Express>IT)
3+ years of experience with Big Data platforms (Hadoop, MapR, Hive, Parquet)
5+ years of ETL (Extract, Transform, Load) programming with tools including Informatica
2+ years of Unix or Linux systems experience, with scripting in Shell, Perl, or Python
Experience with advanced SQL, preferably Teradata
Strong Hadoop scripting skills for processing petabytes of data (see the sketch after this list)
Experience working with large data sets and with distributed computing (MapReduce, Hadoop, Hive, HBase, Pig, Apache Spark, etc.)
Excellent analytical and problem-solving skills with high attention to detail and accuracy
Demonstrated ability to translate business requirements into code, metadata specifications, and specific analytical reports and tools
Good verbal, written, and interpersonal communication skills
Experience with the SDLC (System Development Life Cycle), including an understanding of project management methodologies used in Waterfall or Agile development projects
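
As a rough illustration of the Hadoop scripting and advanced SQL skills listed above (again, not part of the posting), the snippet below aggregates one partition of a large Hive table into a feature table for data scientists; the database, table, column, and partition names are all hypothetical placeholders.

    # Illustrative PySpark SQL sketch; schema, table, and partition names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("model_feature_extract")   # hypothetical job name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Filtering on the partition column (load_date) prunes the scan, which is what
    # keeps queries like this workable on very large tables.
    features = spark.sql("""
        SELECT customer_id,
               COUNT(*)        AS txn_count,
               SUM(txn_amount) AS total_amount,
               MAX(txn_ts)     AS last_txn_ts
        FROM   edl_raw.transactions          -- hypothetical Hive table
        WHERE  load_date = '2019-05-01'      -- partition filter (placeholder date)
        GROUP  BY customer_id
    """)

    # Publish the result as a Hive table for downstream model work (placeholder name).
    features.write.mode("overwrite").saveAsTable("edl_curated.customer_txn_features")

    spark.stop()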

Desired Qualifications


 
Call 502-379-4456 Ext 100 for more details. Please provide Requirement id: 91830 while calling.
 
Other jobs in NC: Chapel Hill (2), Charlotte (58), Dix Campus (1), Greensboro (3), Morrisville (2), North Charlotte (2), Raleigh (96), St Raleigh (2), Winston Salem (3)
Hadoop job openings in Charlotte, NC
Jobs List

Data Engineer - 71995
Create date: 16-Feb-2021
Candidate must be our W2 Employee.

Job Description:

The Digital organization is looking for a self-driven individual who will combine strong technical abilities and collaboration skills to deliver business value in a fast-paced and collaborative development environment. We are looking for a seasoned engineer with strong passio.... (This job is for - GCP Hadoop Python Jobs in NC Charlotte Engineer - (in Charlotte, NC))

Big Data/Hadoop Support Consultant - 69809
Create date: 03-Dec-2020
Note: This is a markup account; please list the consultant's W2 hourly pay rate in the comments section when submitting. Upon offer, it is required to show the consultant's W2 take-home pay in the Exh A.

Candidate must be our W2 Employee.

Job Description :

Seeking a Senior Big Data/Hadoop Support Consultant to support the R.... (This job is for - HadoopKafka Jobs in NC Charlotte Consultant - (in Charlotte, NC))

Big Data/Oracle Developer - 62915
Create date: 04-May-2020
Candidate must be our W2 Employee.

Job Description :

1. 5+ years of overall backend development experience

2. MongoDB / Hadoop big data development experience

3. Oracle / PL/SQL experience along with NoSQL database experience

4. Oracle 12c or higher

Day to .... (This job is for - Hadoop Jobs in NC Charlotte Developer - (in Charlotte, NC))

Data Analyst - 61246
Create date: 17-Mar-2020
Job Description :

Requirements:

Tableau

Ability to do data extractions

Data validations

Data analysis

Data testing

Plusses: Hadoop and/or Data Lake

Job Description: The concentration will be on doing data analysis in Tableau-based reports. This resource will be.... (This job is for - Tableau Hadoop Jobs in NC Charlotte Analyst - (in Charlotte, NC))

Hadoop Developer - 60936
Create date: 11-Mar-2020
* Candidate must be our W2 Employee.

Job Description :

Develops specifications for extremely complex computer network security/protection technologies for company information and network systems/applications. Develops security solutions for the company's networks and virtual private networks, application systems, key public in.... (This job is for - Hadoop RBAC Jobs in NC Charlotte Developer - (in Charlotte, NC))
 
 Hadoop job openings in other states
Jobs List

Back End Java Developer - 85842
Create date: 03-Jun-2022
Description:

General skills (nice to have):

Databricks
Hadoop preferred, but any big data warehousing experience is acceptable

IT skills (must have):

Backend data experience
Data Warehousing
Java/Spring Boot API knowledge
AWS
Data layer knowledge (This job is for - Hadoop Jobs in NY NewYork Developer - (in New York, NY))

Data Warehouse Designer/Developer - 58707
Create date: 05-Oct-2021
Description:

Responsibilities

Our client in Columbus, Ohio, is seeking an experienced Data Warehouse Designer / Developer for a contract role.

CANDIDATES MUST BE LOCAL TO CENTRAL OHIO OR RELOCATED BY DAY ONE FOR CONSIDERATION.

The Data Warehouse Designer/Developer will be responsible for Enterprise data wa.... (This job is for - Hadoop ETL ELT Jobs in OH Columbus Developer - (in Columbus, OH))

Technical Specialist - 58686
Create date: 30-Sep-2021
Description:

Responsibilities

Our client in Columbus, Ohio, is seeking an Enterprise Data Warehouse / ETL Developer.

Local candidates are highly preferred. Candidates are required to report onsite on Day 1 of the assignment.

The Technical Specialist will be responsible for the client's Enterprise data wareho.... (This job is for - Hadoop ETL Jobs in OH Columbus Specialist - (in Columbus, OH))

Data Warehouse Designer / Developer - 58526
Create date: 20-Aug-2021
Description:

Our client in Columbus, Ohio, is seeking an experienced Data Warehouse Designer / Developer for a contract role.

CANDIDATES MUST BE LOCAL TO CENTRAL OHIO OR RELOCATED BY DAY ONE FOR CONSIDERATION.

The Data Warehouse Designer/Developer will be responsible for Enterprise data warehouse design, development, i.... (This job is for - Hadoop Hive Jobs in OH Columbus Developer - (in Columbus, OH))

Senior Software Development Engineer
Create date: 26-Apr-2021
Description:

Top 3 required technical skills:

Automation / DevOps engineering [Hadoop, Spark]
Operational experience with real-time and streaming data pipelines and relevant frameworks [NiFi or Airflow]
Java/Scala and Python programming
PCF or cloud experience in general

Role

.... (This job is for - Hadoop Spark Scala Jobs in CA SanFrancisco Engineer - (in San Francisco, CA))
 
 Hadoop job openings in NC
Jobs List

Sr. Data Engineer - 71519
Create date: 02-Feb-2021
Candidate must be our W2 Employee.

We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens.

Job Description:

What are the top 5-10 responsibilities for this position?

• Work collaboratively with business teams including data scientists and business intelligence analysts to understand their needs for.... (This job is for - HadoopETL Jobs in NC Raleigh Engineer - (in Raleigh, NC))

Big Data QA Tester - 68766
Create date: 22-Oct-2020
Candidate must be our W2 Employee.

We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens.

Job Description : Big Data QA Tester

Analyze and test software to ensure that products meet design specifications and are within total quality management limits and standards. Participate actively in the entire software.... (This job is for - ETL Hadoop Jobs in NC Durham Tester - (in Durham, NC))

Data Engineer - 64874
Create date: 10-Jul-2020
Note: We are trying to set up interviews for next week, so please send over your best candidates ASAP.

Candidate must be our W2 Employee.

Job Description : Data Engineer

The Data Engineer will work closely with senior engineers, data scientists and other stakeholders to design and maintain moderate to advanced data models. Th.... (This job is for - ETL Hadoop Hive Jobs in NC Durham Engineer - (in Durham, NC))

Hadoop Administrator - 61714
Create date: 30-Mar-2020
* Candidate must be local: Yes

* Candidate must be our W2 Employee

* We should submit only GC, GC-EAD, L2-EAD, H4-EAD, and Citizens

Job Description :

Hadoop Administrator:

Our client is seeking a Hadoop Administrator to assist in managing a large-scale 7x24x365 Hadoop cluster environment. The s.... (This job is for - Hadoop Jobs in NC Cary Administrator - (in CARY, NC))

Big Data Engineer - 61688
Create date: 30-Mar-2020
* Candidate must be our W2 Employee

Job Description :

Big Data Engineer Opening in Durham, NC!

Apex Systems, combined with its parent company On Assignment, is the 2nd-largest IT staffing agency in the country.

Apex has an opportunity for a Big Data Engineer in the Durham area. This is a 6+.... (This job is for - HadoopJava Jobs in NC Durham Engineer - (in Durham, NC))
     