Big Data Engineer
Ref No.: 18-00099
Location: Costa Mesa, California
Position Type: Contract
Start Date: 03/13/2018
The Advanced Technologies Group is responsible for leveraging new and emerging open source technologies to solve key technical challenges for our business and clients, as well as integrating new data sets, products, and applications acquired by Experian.

As a Big Data Engineer on our team, you will be responsible for applying your broad and deep knowledge of enterprise application development and big data to integrate our wide array of data and tools into comprehensive, scalable solutions.

Key Responsibilities:
• Work in a collaborative manner with our scaled agile (SAFe) teams to rapidly deliver solutions
• Apply your software engineering skills in Java, Python, Scala, and Ruby to analyze disparate, complex systems and collaboratively design new products and services
• Integrate new data sources and tools
• Implement scalable and reliable distributed data replication strategies
• Leverage Amazon Web Services to provide innovative solutions
• Convert business requirements to working prototypes and then deployable solutions
• Design and implement high-performance, scalable data solutions
• Provide best-in-class security in everything you do
• Automate everything
Knowledge, Experience & Qualifications:
• BS degree in computer science, computer engineering or equivalent
• 5+ years' experience delivering enterprise software solutions
• Proficiency in Java, Python, and Ruby
• Familiarity with scripting languages
• Familiarity with AWS scripting and automation
• Ability to quickly understand technical and business requirements and translate them into technical implementations
• Experience with Agile Development methodologies
• 3+ years' experience across multiple Hadoop/Spark ecosystem technologies such as MapReduce, HDFS, Cassandra, HBase, Hive, Flume, Sqoop, Spark, and Kafka
• Experience with data ingestion and transformation
• Solid understanding of secure application development methodologies
• Experience in developing large-scale software platforms involving ETL, data quality, fusion of data, and real-time ingestion and delivery
• Experience with streaming data processing platforms such as Kafka
• Experience with data collection using public APIs
• Experience developing real-time solutions
• Experience with the Scaled Agile Framework (SAFe)