Hadoop Interview questions



Please explain how workers work when a new job is submitted to them?

When a SparkContext is created, each worker starts one executor. An executor is a separate Java process (a fresh JVM), and the application jar is loaded into that JVM. The executors then connect back to the driver program, and the driver sends them work for operations such as filter, foreach, and map. The executors shut down when the driver quits.
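A minimal Scala sketch of this flow, assuming a standalone cluster, is shown below; the application name "worker-demo" and master URL "spark://master:7077" are illustrative placeholders rather than values taken from the question.

    import org.apache.spark.{SparkConf, SparkContext}

    object WorkerDemo {
      def main(args: Array[String]): Unit = {
        // Creating the SparkContext is what triggers each worker to launch an
        // executor (a fresh JVM) and load the application jar into it.
        val conf = new SparkConf()
          .setAppName("worker-demo")          // hypothetical application name
          .setMaster("spark://master:7077")   // hypothetical standalone master URL
        val sc = new SparkContext(conf)

        // The driver only builds the lineage here; map and filter are lazy.
        val evens = sc.parallelize(1 to 100)
          .map(_ * 2)
          .filter(_ % 4 == 0)

        // The action ships tasks to the executors; they run map/filter on
        // their partitions and report results back to the driver.
        val count = evens.count()
        println(s"count = $count")

        // Stopping the driver shuts down the executors it started.
        sc.stop()
      }
    }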




Next 5 interview question(s)

1. What are the workers?
2. What is the purpose of the Driver in the Spark architecture?
3. Define the Spark architecture.
4. What is checkpointing?
5. What is shuffling?