Hadoop – Spark/Scala Consultant (Full-time)

at USM Business Systems in Connecticut (Published 26-06-2018)

Please let me know if you are interested in the Spark/Kafka Consultant – Big Data Platform role in Stamford, CT.

Spark/Kafka Consultant – Big Data Platform
Full-time/Permanent role
Stamford, CT

Requirements
• 5+ years of relevant hands-on experience with Big Data technologies such as Hadoop, Spark, HDFS, YARN, Storm, Hive, HBase, Sqoop, Oozie, and NoSQL databases
• Very strong conceptual understanding of Spark and Kafka
• Hands-on experience with other distributed systems, e.g. Solr, Elasticsearch, and Kafka
• Strong hands-on experience writing Java or Scala programs using build tools such as Maven, Gradle, or SBT
• Deep knowledge of distributed processing frameworks; Spark knowledge is a must
• Strong development and automation skills.
• Comfortable leading and coordinating with offshore teams
• Knowledge of design strategies for developing scalable, resilient big data solutions.
• Experience with Agile (Scrum) development methodology
• Curiosity and passion for data engineering and solving data problems


Thank You
Sanjaya Jena | Yochana IT Solution Inc
Email: sanjaya@yochana.com

