8 changes: 8 additions & 0 deletions docs/get-started/quickstart.mdx
@@ -31,6 +31,14 @@ If you prefer [uv](https://docs.astral.sh/uv/):
uv tool install -U openshell
```

On Debian and Ubuntu, download the `.deb` package from the [GitHub releases page](https://github.com/NVIDIA/OpenShell/releases) and install it:

```shell
sudo dpkg -i openshell_<version>_amd64.deb
```
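If `dpkg` reports unmet dependencies during the install, a standard follow-up on Debian-based systems is to let `apt` resolve them:

```shell
# dpkg does not resolve dependencies itself; apt can complete a partial install
sudo apt-get install -f
```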

The package includes the CLI, the gateway binary, and the VM compute driver.

After installing the CLI, run `openshell --help` in your terminal to see the full CLI reference, including all commands and flags.

<Tip>
2 changes: 1 addition & 1 deletion docs/get-started/tutorials/first-network-policy.mdx
@@ -218,4 +218,4 @@ bash examples/sandbox-policy-quickstart/demo.sh

## Next Steps

- To walk through a full policy iteration with Claude Code, including diagnosing denials and applying fixes from outside the sandbox, refer to [GitHub Sandbox](/tutorials/github-sandbox).
- To walk through a full policy iteration with Claude Code, including diagnosing denials and applying fixes from outside the sandbox, refer to [GitHub Sandbox](/get-started/tutorials/github-sandbox).
6 changes: 3 additions & 3 deletions docs/inference/configure.mdx
@@ -64,7 +64,7 @@ openshell provider create \
Use `--config OPENAI_BASE_URL` to point to any OpenAI-compatible server running where the gateway runs. For host-backed local inference, use `host.openshell.internal` or the host's LAN IP. Avoid `127.0.0.1` and `localhost`. Set `OPENAI_API_KEY` to a dummy value if the server does not require authentication.
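Concretely, such a provider record might be created as follows. This is an illustrative sketch: only `--config OPENAI_BASE_URL`, `OPENAI_API_KEY`, and the `openshell provider create` command come from the text above; the `--name` and `--type` flags, the port, and the `/v1` path are assumptions.

```shell
# Hypothetical example: register a host-backed OpenAI-compatible server.
# host.openshell.internal resolves to the host from where the gateway runs;
# avoid 127.0.0.1 and localhost, which resolve inside the gateway itself.
openshell provider create \
  --name local-llm \
  --type openai \
  --config OPENAI_BASE_URL=http://host.openshell.internal:8000/v1 \
  --config OPENAI_API_KEY=dummy
```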

<Tip>
For a self-contained setup, the Ollama community sandbox bundles Ollama inside the sandbox itself — no host-level provider needed. See [Inference Ollama](/tutorials/inference-ollama) for details.
For a self-contained setup, the Ollama community sandbox bundles Ollama inside the sandbox itself — no host-level provider needed. See [Inference Ollama](/get-started/tutorials/inference-ollama) for details.

</Tip>

@@ -190,7 +190,7 @@ A successful response confirms the privacy router can reach the configured backend.
Explore related topics:

- To understand the inference routing flow and supported API patterns, refer to [Index](/inference/about).
- To follow a complete Ollama-based local setup, refer to [Inference Ollama](/tutorials/inference-ollama).
- To follow a complete LM Studio-based local setup, refer to [Local Inference Lmstudio](/tutorials/local-inference-lmstudio).
- To follow a complete Ollama-based local setup, refer to [Inference Ollama](/get-started/tutorials/inference-ollama).
- To follow a complete LM Studio-based local setup, refer to [Local Inference Lmstudio](/get-started/tutorials/local-inference-lmstudio).
- To control external endpoints, refer to [Policies](/sandboxes/policies).
- To manage provider records, refer to [Manage Providers](/sandboxes/manage-providers).
2 changes: 1 addition & 1 deletion docs/sandboxes/manage-providers.mdx
@@ -156,7 +156,7 @@ The following provider types are supported.
| `claude` | `ANTHROPIC_API_KEY`, `CLAUDE_API_KEY` | Claude Code, Anthropic API |
| `codex` | `OPENAI_API_KEY` | OpenAI Codex |
| `generic` | User-defined | Any service with custom credentials |
| `github` | `GITHUB_TOKEN`, `GH_TOKEN` | GitHub API, `gh` CLI — refer to [GitHub Sandbox](/tutorials/github-sandbox) |
| `github` | `GITHUB_TOKEN`, `GH_TOKEN` | GitHub API, `gh` CLI — refer to [GitHub Sandbox](/get-started/tutorials/github-sandbox) |
| `gitlab` | `GITLAB_TOKEN`, `GLAB_TOKEN`, `CI_JOB_TOKEN` | GitLab API, `glab` CLI |
| `nvidia` | `NVIDIA_API_KEY` | NVIDIA API Catalog |
| `openai` | `OPENAI_API_KEY` | Any OpenAI-compatible endpoint. Set `--config OPENAI_BASE_URL` to point to the provider. Refer to [Configure](/inference/configure). |
2 changes: 1 addition & 1 deletion docs/sandboxes/manage-sandboxes.mdx
@@ -235,7 +235,7 @@ openshell sandbox delete my-sandbox

## Next Steps

- To follow a complete end-to-end example, refer to the [GitHub Sandbox](/tutorials/github-sandbox) tutorial.
- To follow a complete end-to-end example, refer to the [GitHub Sandbox](/get-started/tutorials/github-sandbox) tutorial.
- To supply API keys or tokens, refer to [Manage Providers](/sandboxes/manage-providers).
- To control what the agent can access, refer to [Policies](/sandboxes/policies).
- To use a pre-built environment, refer to the [Community Sandboxes](/sandboxes/community-sandboxes) catalog.
2 changes: 1 addition & 1 deletion docs/sandboxes/policies.mdx
@@ -454,7 +454,7 @@ Endpoints without `protocol` use TCP passthrough, where the proxy allows the stream
Allow Claude and the GitHub CLI to reach `api.github.com` with per-path rules: read-only (GET, HEAD, OPTIONS) and GraphQL (POST) for all paths; full write access for `alpha-repo`; and create/edit issues only for `bravo-repo`. Replace `<org_name>` with your GitHub org or username.

<Tip>
For an end-to-end walkthrough that combines this policy with a GitHub credential provider and sandbox creation, refer to [GitHub Sandbox](/tutorials/github-sandbox).
For an end-to-end walkthrough that combines this policy with a GitHub credential provider and sandbox creation, refer to [GitHub Sandbox](/get-started/tutorials/github-sandbox).

</Tip>
