Job Description: Are you excited to make a difference in the fast-growing fields of aerospace and analytics? Are you a self-starter, an innovator, and a team player looking to enable pioneering solutions alongside thousands of Client data scientists and mathematicians working to benefit the aerospace industry? Here is an opportunity to join the Information & Analytics team to help transform data and reveal insights that empower a world of limitless possibilities.
Client AnalytX powers a portfolio of analytics-driven products and services provided by Client and its family of companies. These solutions, tailored for commercial and defense operators, include Flight/Mission Optimization and Management, Fleet Performance & Reliability Analytics, Maintenance & Engineering Optimization, Supply Chain & Inventory Optimization, and Optimized Training. Client AnalytX applies sophisticated data analytics to reveal relevant insights and facilitate swift decision-making.
You will be part of the AnalytX Platform and Data team in Bellevue, WA, or Charleston, SC, and will partner closely with a team of data technologists, data scientists, and business analysts driving Client's Big Data strategy. You will architect and implement technology road maps and bring revolutionary new analytics and insights to life. You will provide technical direction to the engineering and application teams, and collaborate with internal functions to adopt new big data tools.
* Lead a team of technical resources performing solution development in the Big Data ecosystem (Hadoop and Teradata)
* Provide architecture and technology leadership across Batch and Streaming data processing platforms
* Focus on one or more core areas: Hive, HBase, Kafka, Storm, Spark, Spark Streaming, NiFi, and other tools within the big data ecosystem
* Design and develop data pipelines (code, scripting, tooling) for both structured and unstructured data
* Participate in requirements gathering and design technical workshops with platform users
* Estimate new projects
* Evaluate new big data technologies
* Leverage the Big Data ecosystem (Hadoop and Teradata) to manage data at scale
* Integrate with external data sources and APIs
* Design, build, and deliver applications following industry best practices
* Work with developers, business analysts, and subject matter experts to understand complex technological systems and produce integrated end-to-end solution options
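To illustrate the kind of pipeline work described in the responsibilities above, here is a hypothetical, minimal sketch (not Client code) of a batch step that normalizes structured (CSV) and unstructured (free-text log) inputs into a common record shape before loading into a store such as Hive or Teradata. The field names and log format are invented for the example.

```python
import csv
import io
import json
import re

def parse_structured(csv_text):
    """Parse CSV rows (tail_number, flight_hours) into records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {"tail_number": row["tail_number"],
         "flight_hours": float(row["flight_hours"])}
        for row in reader
    ]

def parse_unstructured(log_text):
    """Extract the same fields from free-text maintenance log lines."""
    pattern = re.compile(r"aircraft (\S+) logged ([\d.]+) hours")
    records = []
    for line in log_text.splitlines():
        m = pattern.search(line)
        if m:
            records.append({"tail_number": m.group(1),
                            "flight_hours": float(m.group(2))})
    return records

def run_pipeline(csv_text, log_text):
    """Merge both sources and total flight hours per aircraft."""
    totals = {}
    for rec in parse_structured(csv_text) + parse_unstructured(log_text):
        key = rec["tail_number"]
        totals[key] = totals.get(key, 0.0) + rec["flight_hours"]
    return totals

if __name__ == "__main__":
    csv_text = "tail_number,flight_hours\nN101,5.0\nN202,3.5\n"
    log_text = "2024-01-02: aircraft N101 logged 2.0 hours on test route\n"
    print(json.dumps(run_pipeline(csv_text, log_text), sort_keys=True))
    # → {"N101": 7.0, "N202": 3.5}
```

In a production Big Data setting the same normalize-then-aggregate shape would typically be expressed in Spark or Hive rather than plain Python, with Kafka or NiFi feeding the unstructured side.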
Client is the world's largest aerospace company and leading manufacturer of commercial airplanes and defense, space and security systems. We are engineers and technicians. Skilled scientists and thinkers. Bold innovators and dreamers. Join us, and you can build something better for yourself, for our customers and for the world.
This position must meet Export Control compliance requirements; therefore, a "US Person" as defined by 22 C.F.R. § 120.15 is required. "US Person" includes US Citizen, lawful permanent resident, refugee, or asylee.
Preferred Qualifications (Desired Skills/Experience):
* Strong SQL experience
* Minimum 3 years of experience with the Big Data ecosystem (Hadoop and Teradata) in a production environment
* Minimum 3 years of experience working with Linux
* Minimum 3 years of experience building and deploying Java applications
* Minimum 3 years of experience leading a software development project
* Experience building data pipelines
* Experience collecting, organizing, synthesizing, and analyzing data with Teradata
* Experience with Hive, Spark, Kafka, and HBase
* Experience with the Hortonworks Hadoop Distribution
* Application development in Java
* Ability to "Read the Manual" and figure it out