Hadoop Interview Questions


Total available count: 27
Subject - Apache
Subsubject - Hadoop

How can you define SparkConf?

Spark properties control most application settings and are configured separately for each application. These properties can be set directly on a SparkConf object passed to your SparkContext. SparkConf lets you configure common properties (e.g., the application name and master URL), as well as arbitrary key-value pairs through the set() method.

For example, we could initialize an application with three threads as below.

Note that we run with local[3], meaning three threads, which represents minimal parallelism and can help detect bugs that only exist when we run in a distributed context.

import org.apache.spark.{SparkConf, SparkContext}

// Configure the application: run locally with three threads, and give it a name for the Spark UI
val conf = new SparkConf()
  .setMaster("local[3]")
  .setAppName("CountingSlightBooks")
val sc = new SparkContext(conf)
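
Beyond setMaster and setAppName, the set() method accepts arbitrary key-value pairs, as mentioned above. A brief sketch (the property keys below are standard Spark settings, chosen here purely for illustration):

// Arbitrary key-value pairs through set(); each call returns the same SparkConf, so calls can be chained
conf.set("spark.executor.memory", "2g")
conf.set("spark.ui.port", "4050")

// Read a value back, or dump every configured setting for inspection
println(conf.get("spark.executor.memory"))   // prints "2g"
println(conf.toDebugString)

Note that these calls must happen before the SparkContext is created: once a SparkConf is passed to Spark, it is cloned and can no longer be modified by the user.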

Next 3 interview questions:

1. How do you define SparkContext?
2. Why are both Spark and Hadoop needed?
3. Why Spark, when Hadoop already exists?