Hadoop Developer / Santa Clara, CA (36-month+ Contract)
Ref No.: 18-01355
Location: Santa Clara, CA
Position Type: Contract
Duration: 3 years
 
No H-1B or OPT/CPT EAD candidates
 
 
  • Must have Hadoop development experience
  • The candidate must have strong experience across the entire data landscape, with the ability to extend traditional data architecture techniques to include big data components. This includes data strategy and architecture development, data governance, master data management, metadata management, data integration/ingestion, data quality management, data modeling, data warehousing, business intelligence, and advanced analytics.
  • Must possess a thorough understanding of data management as well as big data technologies, tools, processes, and data architecture best practices to guide data- and big-data-centric initiatives.
  • Demonstrated ability to liaise closely with business and IT leadership to establish, defend, and negotiate the approach for implementing data management solutions.
  • Serve as an expert consultant to senior IT leadership on data architecture and enterprise data initiatives.
  • Strong communication, organizational, interpersonal, and time management skills; must be able to work independently with minimal direction or supervision in a team setting and dynamic environment.
  • Ability to identify and articulate domain-specific use cases that can take advantage of big data tools and technologies.
 
Required Job Functions:
  • 5+ years of experience with the Hadoop ecosystem and Big Data technologies
  • Expert level software development experience
  • Ability to adapt conventional big data frameworks and tools to the use cases required by the project
  • Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, HBase, Hive, Impala, Spark, Kafka, Kudu, Solr)
  • Experience building stream-processing systems with solutions such as Spark Streaming, Storm, or Flink
  • Experience with other open-source tools such as Druid, Elasticsearch, and Logstash is a plus
  • Knowledge of design strategies for developing a scalable, resilient, always-on data lake
  • Some knowledge of Agile (Scrum) development methodology is a plus
 
Desirable Skills:
  • Hadoop/Cloud Developer Certification
  • MongoDB Developer Certification
  • Experience deploying applications in a cloud environment; ability to architect, design, deploy, and manage cloud-based Hadoop clusters