- Architect and implement business requirements as data and real-time analytics solutions leveraging Hadoop, Spark, Scala, Kafka, Oracle, Spring Boot, and Java.
- Architect, develop and maintain complex frameworks in Spring Boot, PCF, UNIX shell scripts, Hive, Oozie, Sqoop, Spark and Scala.
- Build real-time data pipelines for collecting and processing data using Spark Streaming, Spark Structured Streaming, Kafka listeners, and DStreams.
- Write detailed design specifications for application requirements, including business rules, screens, interfaces, reports and definition of data and error messages.
- Work closely with product owners and application owners to build high-performance, scalable architecture; provide input to ensure timely delivery.
- Implement and support APIs and microservices using the Swagger framework.
Various unanticipated work locations throughout the United States; relocation may be required. Must be willing to relocate.
- Education: Applied Computer Science or Computer Science
- Experience: One (1) year
To apply for this job email your details to email@example.com