Big Data Developer - Greenville, SC (9-month+ contract)
Ref No.: 18-00361
Role: Big Data Developer
Location: Greenville, South Carolina
Position Type: Contract
Duration: 9-month+ contract

Visa: US Citizen, GC, GC EAD, or TN only


Skills and Experience:

- Experience building large-scale data ingestion frameworks or leveraging COTS products for implementing batch frameworks (e.g., Spring XD, Kite SDK)
- Excellent understanding and implementation experience of Hadoop architecture, including the following technologies:
  - Hadoop distribution: Hortonworks (preferred)
  - Data storage: HDFS, HBase, Hive
  - Data processing, analysis, and integration: Spark (Python or Scala), Kafka, Impala, Sqoop
  - ETL tool: Talend
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming (see the sketch after this list)
- Work on MPP and Big Data technologies such as Hadoop, and on Data Engineering and Data Governance implementations and support
- Gather and process raw data at scale (including writing scripts, calling APIs, writing SQL queries, etc.)
- Design and develop data structures that support high-performing, scalable analytic applications
- Implement automation and related integration technologies with Ansible, Chef, or Puppet
- Work closely with the engineering team to integrate amazing innovations and algorithms into data lake systems
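
For illustration only, below is a minimal sketch of the kind of stream-processing work named above: a PySpark Structured Streaming job that consumes a Kafka topic and lands the records on HDFS. The broker address, topic name, paths, and application name are hypothetical placeholders, not details from this posting, and running it requires the Spark Kafka connector package on the cluster.

```python
# Minimal Spark Structured Streaming sketch: consume events from a Kafka topic
# and write them to HDFS. Broker, topic, and paths are placeholders (assumptions).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-stream-ingest")   # hypothetical application name
    .getOrCreate()
)

# Read a stream of records from Kafka; "events" and the broker list are assumed values.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to a string for downstream parsing.
parsed = raw.select(col("value").cast("string").alias("payload"))

# Land the stream on HDFS as Parquet with checkpointing; paths are placeholders.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/landing/events")
    .option("checkpointLocation", "hdfs:///checkpoints/events")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```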
 
Responsibilities:
- Work on technical architecture design, application design and development, testing, and deployment.
- Work on MPP and Big Data technologies such as Hadoop, and on Data Engineering and Data Governance implementations and support.
- Produce technical specifications and designs for development, solution development/migration, and systems integration requirements.
- Participate in and lead internal development and external collaboration meetings.
- Conduct or coordinate tests to ensure that intelligence is consistent with defined needs.
- Oversee testing of data acquisition processes and their implementation into production.
- Work closely with the engineering team to integrate amazing innovations and algorithms into data lake systems (a minimal batch-ingestion sketch follows below).
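
As a companion to the responsibilities above, here is a minimal, illustrative batch-ingestion sketch: a PySpark job that reads raw files landed on HDFS (for example by an upstream Sqoop or Talend job), applies a light transformation, and persists a partitioned Hive table. All paths, column names, database names, and table names are assumptions for the example only.

```python
# Minimal batch ETL sketch: read landed CSV files from HDFS, clean them, and
# write a partitioned Hive table. Paths, columns, and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import to_date, col

spark = (
    SparkSession.builder
    .appName("batch-ingest-to-hive")   # hypothetical application name
    .enableHiveSupport()
    .getOrCreate()
)

# Raw files as landed by an upstream Sqoop/Talend job (assumed layout).
orders = (
    spark.read
    .option("header", "true")
    .csv("hdfs:///data/raw/orders")
)

# Example transformation: type the order date and drop rows missing a key.
cleaned = (
    orders
    .withColumn("order_date", to_date(col("order_date")))
    .filter(col("order_id").isNotNull())
)

# Persist as a partitioned Hive table for downstream analytics
# (the "analytics" database is assumed to exist).
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders_cleaned")
)
```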