Update guides/spark
parent 1ebdcf7688
commit 2e5f483dc2
@@ -40,30 +40,30 @@ the first Nomad server contacted.

 - `spark.nomad.docker.email` `(string: nil)` - Specifies the email address to
   use when downloading the Docker image specified by
   [spark.nomad.dockerImage](#spark.nomad.dockerImage). See the
-  [Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+  [Docker driver authentication](/docs/drivers/docker.html#authentication)
   docs for more information.

 - `spark.nomad.docker.password` `(string: nil)` - Specifies the password to use
   when downloading the Docker image specified by
   [spark.nomad.dockerImage](#spark.nomad.dockerImage). See the
-  [Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+  [Docker driver authentication](/docs/drivers/docker.html#authentication)
   docs for more information.

 - `spark.nomad.docker.serverAddress` `(string: nil)` - Specifies the server
   address (domain/IP without the protocol) to use when downloading the Docker
   image specified by [spark.nomad.dockerImage](#spark.nomad.dockerImage). Docker
   Hub is used by default. See the
-  [Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+  [Docker driver authentication](/docs/drivers/docker.html#authentication)
   docs for more information.

 - `spark.nomad.docker.username` `(string: nil)` - Specifies the username to use
   when downloading the Docker image specified by
   [spark.nomad.dockerImage](#spark.nomad.dockerImage). See the
-  [Docker driver authentication](https://www.nomadproject.io/docs/drivers/docker.html#authentication)
+  [Docker driver authentication](/docs/drivers/docker.html#authentication)
   docs for more information.

 - `spark.nomad.dockerImage` `(string: nil)` - Specifies the `URL` for the
-  [Docker image](https://www.nomadproject.io/docs/drivers/docker.html#image) to
+  [Docker image](/docs/drivers/docker.html#image) to
   use to run Spark with Nomad's `docker` driver. When not specified, Nomad's
   `exec` driver will be used instead.
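Taken together, the properties above can be passed to `spark-submit` as `--conf` flags. A minimal sketch, assuming the `nomad` master URL form used by the Spark fork; the Nomad address, registry hostname, image name, credentials, and jar path are all placeholders, not values from this guide:

```shell
$ spark-submit \
    --master nomad:http://nomad.example.com:4646 \
    --conf spark.nomad.dockerImage=example/spark:2.1.0 \
    --conf spark.nomad.docker.username=exampleuser \
    --conf spark.nomad.docker.password=examplepass \
    --conf spark.nomad.docker.email=user@example.com \
    --conf spark.nomad.docker.serverAddress=registry.example.com \
    --class org.apache.spark.examples.SparkPi \
    local:///spark-examples.jar
```

If `spark.nomad.dockerImage` is omitted, the same submission runs under Nomad's `exec` driver instead.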
@@ -117,7 +117,7 @@ DataNodes to generically reference the NameNode:

 ```

 Another viable option for the DataNode task group is to use a dedicated
-[system](https://www.nomadproject.io/docs/runtime/schedulers.html#system) job.
+[system](/docs/schedulers.html#system) job.
 This will deploy a DataNode to every client node in the system, which may or may
 not be desirable depending on your use case.
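A system job for the DataNode case could be sketched as follows; the job, datacenter, image, and resource values below are illustrative assumptions, not taken from this guide:

```hcl
job "hdfs-datanode" {
  datacenters = ["dc1"]

  # The system scheduler places one instance on every eligible client node.
  type = "system"

  group "datanode" {
    task "datanode" {
      driver = "docker"

      config {
        image   = "example/hadoop:2.7.7"
        command = "hdfs"
        args    = ["datanode"]
      }

      resources {
        cpu    = 500 # MHz
        memory = 512 # MB
      }
    }
  }
}
```

Because every client node runs a copy, constrain the job (or size the cluster) so that only nodes intended for HDFS storage are eligible.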
@@ -127,9 +127,9 @@ $ spark-submit \

 Nomad clients collect the `stderr` and `stdout` of running tasks. The CLI or the
 HTTP API can be used to inspect logs, as documented in
-[Accessing Logs](https://www.nomadproject.io/guides/operating-a-job/accessing-logs.html).
+[Accessing Logs](/guides/operating-a-job/accessing-logs.html).
 In cluster mode, the `stderr` and `stdout` of the `driver` application can be
-accessed in the same way. The [Log Shipper Pattern](https://www.nomadproject.io/guides/operating-a-job/accessing-logs.html#log-shipper-pattern) uses sidecar tasks to forward logs to a central location. This
+accessed in the same way. The [Log Shipper Pattern](/guides/operating-a-job/accessing-logs.html#log-shipper-pattern) uses sidecar tasks to forward logs to a central location. This
 can be done using a job template as follows:

 ```hcl
@@ -10,7 +10,7 @@ description: |-

 Nomad is well-suited for analytical workloads, given its [performance
 characteristics](https://www.hashicorp.com/c1m/) and first-class support for
-[batch scheduling](https://www.nomadproject.io/docs/runtime/schedulers.html).
+[batch scheduling](/docs/schedulers.html).
 Apache Spark is a popular data processing engine/framework that has been
 architected to use third-party schedulers. The Nomad ecosystem includes a
 [fork of Apache Spark](https://github.com/hashicorp/nomad-spark) that natively