Scala Developer
Ref No.: 19-03637
Location: Wayne, New Jersey

Hi All,

We are looking for a Scala Developer.

Location: Whippany, NJ; San Ramon, CA; or San Francisco, CA

Duration: Long Term

Requirements:
Experience in Scala development and 4+ years of experience in big data technology, spanning platform architecture, data management, data architecture, and application architecture.

Advanced knowledge of Spark, supported by 2+ years of experience, including at least one year with Spark 2.x.
Demonstrated, strong experience in a software engineering role, including the design, development, and operation of distributed, fault-tolerant applications, with particular attention to security, scalability, performance, availability, and optimization.

Proficiency working with the Hadoop platform, including Kafka, Spark/Scala, Spark SQL, HBase, Impala, Hive, and HDFS, in multi-tenant environments

Experience with Scala design and development

Advanced analytical thinking and problem-solving skills

Advanced knowledge of application, data and infrastructure architecture disciplines

Advanced knowledge of Scala or Python 3, supported by 5+ years of programming experience using functional and object-oriented paradigms, with at least some familiarity with the other language of the pair

At least 3 years of experience with the Hadoop ecosystem, including tools such as YARN, Hive, Impala, and HDFS, and some knowledge of Hadoop cluster architecture

Good to Haves
Experience with Apache NiFi, Atlas, and Hortonworks DPS is highly preferred
Understanding of database design, creation, manipulation, and querying of relational, Hadoop, and NoSQL datastores. Demonstrated abilities and/or a proven record of success in the following areas:
A solid foundation in data technologies such as warehousing, ETL, data quality, and Informatica

Keywords: Spark/Scala Developer, Spark Developer, Scala Developer, Kafka Developer, Big Data Developer, Hadoop Developer, Java Hadoop Developer

Please share your resume with me.