# Provision a Nomad cluster in the Cloud
Use this repo to easily provision a Nomad sandbox environment on AWS or Azure with
[Packer](https://packer.io) and [Terraform](https://terraform.io).
[Consul](https://www.consul.io/intro/index.html) and
[Vault](https://www.vaultproject.io/intro/index.html) are also installed
(colocated for convenience). The intention is to allow easy exploration of
Nomad and its integrations with the HashiCorp stack. This is *not* meant to be
a production ready environment. A demonstration of [Nomad's Apache Spark
integration](examples/spark/README.md) is included.
## Setup
Clone the repo and optionally use [Vagrant](https://www.vagrantup.com/intro/index.html)
to bootstrap a local staging environment:
```bash
$ git clone git@github.com:hashicorp/nomad.git
$ cd nomad/terraform
$ vagrant up && vagrant ssh
```
The Vagrant staging environment pre-installs Packer, Terraform, Docker and the
Azure CLI.
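To confirm the staging environment came up correctly, you can check that each of
those CLIs is on the `PATH` inside the Vagrant box. This is a minimal sketch; the
tool names (`packer`, `terraform`, `docker`, `az`) are the standard binary names
for the tools listed above:

```shell
# Print a status line for each CLI the Vagrant box is expected to pre-install.
for tool in packer terraform docker az; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```

Any tool reported as `missing` suggests the provisioning step did not complete;
re-running `vagrant provision` is a reasonable first step.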
## Provision a cluster
- Follow the steps [here](aws/README.md) to provision a cluster on AWS.
- Follow the steps [here](azure/README.md) to provision a cluster on Azure.
Continue with the steps below after a cluster has been provisioned.
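If you want to run the CLI commands below from outside the cluster rather than
from an SSH session on a server node, you can point the Nomad and Consul CLIs at
a server. This sketch assumes the default HTTP ports (4646 for Nomad, 8500 for
Consul); the IP shown is a placeholder to be replaced with one of your server's
addresses:

```shell
# Placeholder: substitute the public IP of one of your provisioned servers.
SERVER_IP=203.0.113.10

# Default HTTP API ports: Nomad listens on 4646, Consul on 8500.
export NOMAD_ADDR="http://${SERVER_IP}:4646"
export CONSUL_HTTP_ADDR="http://${SERVER_IP}:8500"

echo "$NOMAD_ADDR"
echo "$CONSUL_HTTP_ADDR"
```

`NOMAD_ADDR` and `CONSUL_HTTP_ADDR` are the standard environment variables the
respective CLIs read to locate their servers.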
## Test
Run a few basic status commands to verify that Consul and Nomad are up and running
properly:
```bash
$ consul members
$ nomad server members
$ nomad node status
```
## Unseal the Vault cluster (optional)
To initialize and unseal Vault, run:
```bash
$ vault operator init -key-shares=1 -key-threshold=1
$ vault operator unseal
$ export VAULT_TOKEN=[INITIAL_ROOT_TOKEN]
```
The `vault operator init` command above creates a single
[Vault unseal key](https://www.vaultproject.io/docs/concepts/seal.html) for
convenience. For a production environment, it is recommended that you create at
least five unseal key shares and securely distribute them to independent
operators; `vault operator init` defaults to five key shares and a key
threshold of three. If you provisioned more than one server, the others will
become standby nodes but should still be unsealed. You can query the active
and standby nodes independently:
```bash
$ dig active.vault.service.consul
$ dig active.vault.service.consul SRV
$ dig standby.vault.service.consul
```
See the [Getting Started guide](https://www.vaultproject.io/intro/getting-started/first-secret.html)
for an introduction to Vault.
## Getting started with Nomad & the HashiCorp stack
Use the following links to get started with Nomad and its HashiCorp integrations:
* [Getting Started with Nomad](https://www.nomadproject.io/intro/getting-started/jobs.html)
* [Consul integration](https://www.nomadproject.io/docs/service-discovery/index.html)
* [Vault integration](https://www.nomadproject.io/docs/vault-integration/index.html)
* [consul-template integration](https://www.nomadproject.io/docs/job-specification/template.html)
## Apache Spark integration
Nomad is well-suited for analytical workloads, given its performance
characteristics and first-class support for batch scheduling. Apache Spark is a
popular data processing engine/framework that has been architected to use
third-party schedulers. The Nomad ecosystem includes a [fork that natively
integrates Nomad with Spark](https://github.com/hashicorp/nomad-spark). A
detailed walkthrough of the integration is included [here](examples/spark/README.md).