Update guides/spark

Rob Genova 2018-06-22 20:55:12 +00:00 committed by Preetha Appan
parent 1ebdcf7688
commit 2e5f483dc2
No known key found for this signature in database
GPG key ID: 9F7C19990A50EAFC
4 changed files with 9 additions and 9 deletions


@@ -40,30 +40,30 @@ the first Nomad server contacted.
- `spark.nomad.docker.email` `(string: nil)` - Specifies the email address to
use when downloading the Docker image specified by
[spark.nomad.dockerImage](#spark.nomad.dockerImage). See the
-[Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+[Docker driver authentication](/docs/drivers/docker.html#authentication)
docs for more information.
- `spark.nomad.docker.password` `(string: nil)` - Specifies the password to use
when downloading the Docker image specified by
[spark.nomad.dockerImage](#spark.nomad.dockerImage). See the
-[Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+[Docker driver authentication](/docs/drivers/docker.html#authentication)
docs for more information.
- `spark.nomad.docker.serverAddress` `(string: nil)` - Specifies the server
address (domain/IP without the protocol) to use when downloading the Docker
image specified by [spark.nomad.dockerImage](#spark.nomad.dockerImage). Docker
Hub is used by default. See the
-[Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+[Docker driver authentication](/docs/drivers/docker.html#authentication)
docs for more information.
- `spark.nomad.docker.username` `(string: nil)` - Specifies the username to use
when downloading the Docker image specified by
[spark.nomad.dockerImage](#spark-nomad-dockerImage). See the
-[Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+[Docker driver authentication](/docs/drivers/docker.html#authentication)
docs for more information.
- `spark.nomad.dockerImage` `(string: nil)` - Specifies the `URL` for the
-[Docker image](https://www.nomadproject.io/docs/drivers/docker.html#image) to
+[Docker image](/docs/drivers/docker.html#image) to
use to run Spark with Nomad's `docker` driver. When not specified, Nomad's
`exec` driver will be used instead.
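Taken together, the `spark.nomad.docker.*` properties above might be passed to `spark-submit` as in the following sketch. The registry address, image name, credentials, and application JAR are all placeholders, and the exact `--master` URL form depends on the nomad-spark fork's documentation:

```shell
# Hypothetical example: run Spark on Nomad with an image from a private
# registry. All values below are placeholders, not taken from the guide.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master nomad \
  --deploy-mode cluster \
  --conf spark.nomad.dockerImage=registry.example.com/team/spark:2.1.0 \
  --conf spark.nomad.docker.serverAddress=registry.example.com \
  --conf spark.nomad.docker.username=ci-deploy \
  --conf spark.nomad.docker.password=example-password \
  --conf spark.nomad.docker.email=ci@example.com \
  https://example.com/spark-examples.jar 100
```

If `spark.nomad.dockerImage` were omitted here, Nomad's `exec` driver would run Spark directly on the client nodes instead.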


@@ -117,7 +117,7 @@ DataNodes to generically reference the NameNode:
```
Another viable option for the DataNode task group is to use a dedicated
-[system](https://www.nomadproject.io/docs/runtime/schedulers.html#system) job.
+[system](/docs/schedulers.html#system) job.
This will deploy a DataNode to every client node in the system, which may or may
not be desirable depending on your use case.
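A system job of that shape might look like the following minimal sketch. The job name, image, command, and resource figures are illustrative placeholders, not values from the guide:

```hcl
# Hypothetical sketch of a system job running an HDFS DataNode on every
# eligible Nomad client node. All names and values are placeholders.
job "hdfs-datanode" {
  datacenters = ["dc1"]
  type        = "system" # one instance per eligible client node

  group "datanode" {
    task "datanode" {
      driver = "docker"

      config {
        image        = "hadoop-hdfs:2.7.7" # placeholder image
        command      = "hdfs"
        args         = ["datanode"]
        network_mode = "host"
      }

      resources {
        cpu    = 500  # MHz
        memory = 1024 # MB
      }
    }
  }
}
```

Because the scheduler places one instance on every client, a system job trades placement control for the guarantee that each node that can run a DataNode does.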


@@ -127,9 +127,9 @@ $ spark-submit \
Nomad clients collect the `stderr` and `stdout` of running tasks. The CLI or the
HTTP API can be used to inspect logs, as documented in
-[Accessing Logs](https://www.nomadproject.io/guides/operating-a-job/accessing-logs.html).
+[Accessing Logs](/guides/operating-a-job/accessing-logs.html).
In cluster mode, the `stderr` and `stdout` of the `driver` application can be
-accessed in the same way. The [Log Shipper Pattern](https://www.nomadproject.io/guides/operating-a-job/accessing-logs.html#log-shipper-pattern) uses sidecar tasks to forward logs to a central location. This
+accessed in the same way. The [Log Shipper Pattern](/guides/operating-a-job/accessing-logs.html#log-shipper-pattern) uses sidecar tasks to forward logs to a central location. This
can be done using a job template as follows:
```hcl

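Inspecting the driver's logs from the CLI, as described above, might look like the following sketch. The job name and allocation ID are placeholders:

```shell
# Find the driver's allocation ID from the job's status output
# ("my-spark-job" is a placeholder job name).
nomad job status my-spark-job

# Stream the driver's stderr for that allocation
# ("8a31e347" is a placeholder allocation ID).
nomad logs -stderr -f 8a31e347

# Fetch the driver's stdout without following
nomad logs 8a31e347
```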

@@ -10,7 +10,7 @@ description: |-
Nomad is well-suited for analytical workloads, given its [performance
characteristics](https://www.hashicorp.com/c1m/) and first-class support for
-[batch scheduling](https://www.nomadproject.io/docs/runtime/schedulers.html).
+[batch scheduling](/docs/schedulers.html).
Apache Spark is a popular data processing engine/framework that has been
architected to use third-party schedulers. The Nomad ecosystem includes a
[fork of Apache Spark](https://github.com/hashicorp/nomad-spark) that natively