
Commit 41b011e

Yuto Akutsu (yutoacts) authored and committed
[SPARK-595][DOCS] Add local-cluster mode option in Documentation
### What changes were proposed in this pull request? Add local-cluster mode option to submitting-applications.md ### Why are the changes needed? Help users to find/use this option for unit tests. ### Does this PR introduce _any_ user-facing change? Yes, docs changed. ### How was this patch tested? `SKIP_API=1 bundle exec jekyll build` <img width="460" alt="docchange" src="https://user-images.githubusercontent.com/87687356/127125380-6beb4601-7cf4-4876-b2c6-459454ce2a02.png"> Closes #33537 from yutoacts/SPARK-595. Lead-authored-by: Yuto Akutsu <[email protected]> Co-authored-by: Yuto Akutsu <[email protected]> Co-authored-by: Yuto Akutsu <[email protected]> Signed-off-by: Thomas Graves <[email protected]>
1 parent e17612d · commit 41b011e

File tree

1 file changed (+2, -1 lines)


docs/submitting-applications.md

Lines changed: 2 additions & 1 deletion
```diff
@@ -162,9 +162,10 @@ The master URL passed to Spark can be in one of the following formats:
 <tr><th>Master URL</th><th>Meaning</th></tr>
 <tr><td> <code>local</code> </td><td> Run Spark locally with one worker thread (i.e. no parallelism at all). </td></tr>
 <tr><td> <code>local[K]</code> </td><td> Run Spark locally with K worker threads (ideally, set this to the number of cores on your machine). </td></tr>
-<tr><td> <code>local[K,F]</code> </td><td> Run Spark locally with K worker threads and F maxFailures (see <a href="configuration.html#scheduling">spark.task.maxFailures</a> for an explanation of this variable) </td></tr>
+<tr><td> <code>local[K,F]</code> </td><td> Run Spark locally with K worker threads and F maxFailures (see <a href="configuration.html#scheduling">spark.task.maxFailures</a> for an explanation of this variable). </td></tr>
 <tr><td> <code>local[*]</code> </td><td> Run Spark locally with as many worker threads as logical cores on your machine.</td></tr>
 <tr><td> <code>local[*,F]</code> </td><td> Run Spark locally with as many worker threads as logical cores on your machine and F maxFailures.</td></tr>
+<tr><td> <code>local-cluster[N,C,M]</code> </td><td> Local-cluster mode is only for unit tests. It emulates a distributed cluster in a single JVM with N number of workers, C cores per worker and M MiB of memory per worker.</td></tr>
 <tr><td> <code>spark://HOST:PORT</code> </td><td> Connect to the given <a href="spark-standalone.html">Spark standalone
 cluster</a> master. The port must be whichever one your master is configured to use, which is 7077 by default.
 </td></tr>
```
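To make the newly documented row concrete, here is a small sketch that validates master URLs of the `local-cluster[N,C,M]` shape described in the diff. This is an illustration only, not Spark's actual parsing code; the function name, regex, and return convention are our own:

```python
import re

# Sketch: recognize master URLs of the documented form local-cluster[N,C,M],
# where N = number of workers, C = cores per worker, M = MiB of memory per worker.
# This regex and helper are illustrative, not Spark's internal implementation.
LOCAL_CLUSTER = re.compile(r"local-cluster\[\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*\]")

def parse_local_cluster(master):
    """Return (workers, cores_per_worker, mib_per_worker) or None if not a match."""
    m = LOCAL_CLUSTER.fullmatch(master)
    if m is None:
        return None
    return tuple(int(g) for g in m.groups())

print(parse_local_cluster("local-cluster[2,1,1024]"))  # (2, 1, 1024)
print(parse_local_cluster("local[4]"))                 # None
```

In practice, such a URL would be passed to `spark-submit` as the master, e.g. `--master local-cluster[2,1,1024]`, which is why the docs note it is intended only for unit tests.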

0 commit comments
