Hadoop Interview Questions


Total available count: 27
Subject - Apache
Subsubject - Hadoop

What are the workers?

Workers (also called slaves) are the compute nodes of a Spark cluster: each runs a Spark worker instance that hosts one or more executors, and the executors are where tasks actually run. A worker receives serialized (marshaled) tasks from the driver and executes them in an executor's thread pool.
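A minimal sketch of how this plays out in practice, assuming a standalone cluster reachable at spark://master-host:7077 (the master URL, app name, and partition count below are illustrative, not taken from the answer above): the driver serializes the closure and ships one task per partition to executor threads running on the workers.

import org.apache.spark.sql.SparkSession

object WorkerDemo {
  def main(args: Array[String]): Unit = {
    // Illustrative master URL; in a real deployment this points at the
    // standalone master (or a YARN/Kubernetes resource manager) that the
    // workers have registered with.
    val spark = SparkSession.builder()
      .appName("worker-demo")
      .master("spark://master-host:7077")
      .getOrCreate()

    // The driver splits this dataset into 8 partitions, so 8 tasks are created.
    // Each task (the serialized `_ % 10` closure plus its partition) is sent to
    // an executor on a worker and run in that executor's thread pool.
    val counts = spark.sparkContext
      .parallelize(1 to 1000000, 8)
      .map(_ % 10)
      .countByValue()

    counts.foreach { case (k, v) => println(s"$k -> $v") }
    spark.stop()
  }
}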




Next 5 interview questions

1. What is the purpose of the Driver in Spark architecture?
2. Define the Spark architecture.
3. What is checkpointing?
4. What is shuffling?
5. Data is spread across all the nodes of a cluster; how does Spark process this data?