cleanup remote #id (#3391)
- follow-up to #3329
- see #3299 (comment)
casperdcl authored and iesahin committed Apr 11, 2022
1 parent 64822d8 · commit d06f509
Showing 6 changed files with 47 additions and 47 deletions.
14 changes: 7 additions & 7 deletions content/docs/command-reference/get-url.md
@@ -81,7 +81,7 @@ $ wget https://example.com/path/to/data.csv

<details>

-### Click for Amazon S3 example
+### Amazon S3

This command will copy an S3 object into the current working directory with the
same file name:
@@ -108,7 +108,7 @@ configuration, you can use the parameters described in `dvc remote modify`.

<details>

-### Click for Google Cloud Storage example
+### Google Cloud Storage

```dvc
$ dvc get-url gs://bucket/path file
@@ -120,7 +120,7 @@ The above command downloads the `/path` file (or directory) into `./file`.

<details>

-### Click for SSH example
+### SSH

```dvc
$ dvc get-url ssh://[email protected]/path/to/data
@@ -133,7 +133,7 @@ directory).

<details>

-### Click for HDFS example
+### HDFS

```dvc
$ dvc get-url hdfs://[email protected]/path/to/file
@@ -143,7 +143,7 @@ $ dvc get-url hdfs://[email protected]/path/to/file

<details>

-### Click for HTTP example
+### HTTP

> Both HTTP and HTTPS protocols are supported.
@@ -155,7 +155,7 @@ $ dvc get-url https://example.com/path/to/file

<details>

-### Click for WebHDFS example
+### WebHDFS

```dvc
$ dvc get-url webhdfs://[email protected]/path/to/file
@@ -165,7 +165,7 @@ $ dvc get-url webhdfs://[email protected]/path/to/file

<details>

-### Click and expand for a local example
+### local

```dvc
$ dvc get-url /local/path/to/data
24 changes: 12 additions & 12 deletions content/docs/command-reference/remote/add.md
@@ -90,7 +90,7 @@ The following are the types of remote storage (protocols) supported:

<details>

-### Click for Amazon S3
+### Amazon S3

> 💡 Before adding an S3 remote, be sure to
> [Create a Bucket](https://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html).
@@ -115,7 +115,7 @@ methods that are performed by DVC (`list_objects_v2` or `list_objects`,

<details>

-### Click for S3-compatible storage
+### S3-compatible storage

For object storage that supports an S3-compatible API (e.g.
[Minio](https://min.io/),
@@ -143,7 +143,7 @@ they're effective depends on each storage platform.

<details>

-### Click for Microsoft Azure Blob Storage
+### Microsoft Azure Blob Storage

```dvc
$ dvc remote add -d myremote azure://mycontainer/path
@@ -165,7 +165,7 @@ To use a custom authentication method, use the parameters described in

<details>

-### Click for Google Drive
+### Google Drive

To start using a GDrive remote, first add it with a
[valid URL format](/doc/user-guide/setup-google-drive-remote#url-format). Then
@@ -199,7 +199,7 @@ modified.

<details>

-### Click for Google Cloud Storage
+### Google Cloud Storage

> 💡 Before adding a GC Storage remote, be sure to
> [Create a storage bucket](https://cloud.google.com/storage/docs/creating-buckets).
@@ -221,7 +221,7 @@ parameters, use the parameters described in `dvc remote modify`.

<details>

-### Click for Aliyun OSS
+### Aliyun OSS

First you need to set up OSS storage on Aliyun Cloud. Then, use an S3 style URL
for OSS storage, and configure the
@@ -263,7 +263,7 @@ $ export OSS_ACCESS_KEY_SECRET='mysecret'

<details>

-### Click for SSH
+### SSH

```dvc
$ dvc remote add -d myremote ssh://[email protected]/path
@@ -281,7 +281,7 @@ Please check that you are able to connect both ways with tools like `ssh` and

<details>

-### Click for HDFS
+### HDFS

⚠️ Using HDFS with a Hadoop cluster might require additional setup. Our
assumption is that the client is set up to use it. Specifically, [`libhdfs`]
@@ -303,7 +303,7 @@ $ dvc remote add -d myremote hdfs://[email protected]/path

<details>

-### Click for WebHDFS
+### WebHDFS

⚠️ Using WebHDFS requires to enable REST API access in the cluster: set the
config property `dfs.webhdfs.enabled` to `true` in `hdfs-site.xml`.
@@ -331,7 +331,7 @@ active kerberos session.

<details>

-### Click for HTTP
+### HTTP

```dvc
$ dvc remote add -d myremote https://example.com/path
@@ -343,7 +343,7 @@ $ dvc remote add -d myremote https://example.com/path

<details>

-### Click for WebDAV
+### WebDAV

```dvc
$ dvc remote add -d myremote \
@@ -364,7 +364,7 @@ $ dvc remote add -d myremote \

<details>

-### Click for local remote
+### local remote

A "local remote" is a directory in the machine's file system. Not to be confused
with the `--local` option of `dvc remote` (and other config) commands!
22 changes: 11 additions & 11 deletions content/docs/command-reference/remote/modify.md
@@ -105,7 +105,7 @@ options:

<details>

-### Click for Amazon S3
+### Amazon S3

- `url` - remote location, in the `s3://<bucket>/<key>` format:

@@ -333,7 +333,7 @@ For more on the supported env vars, please see the

<details>

-### Click for S3-compatible storage
+### S3-compatible storage

- `endpointurl` - URL to connect to the S3-compatible storage server or service
(e.g. [Minio](https://min.io/),
@@ -352,7 +352,7 @@ storage. Whether they're effective depends on each storage platform.

<details>

-### Click for Microsoft Azure Blob Storage
+### Microsoft Azure Blob Storage

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
@@ -535,7 +535,7 @@ can propagate from an Azure configuration file (typically managed with

<details>

-### Click for Google Drive
+### Google Drive

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
@@ -638,7 +638,7 @@ more information.

<details>

-### Click for Google Cloud Storage
+### Google Cloud Storage

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
@@ -685,7 +685,7 @@ $ export GOOGLE_APPLICATION_CREDENTIALS='.../project-XXX.json'

<details>

-### Click for Aliyun OSS
+### Aliyun OSS

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
@@ -731,7 +731,7 @@ $ export OSS_ENDPOINT='endpoint'

<details>

-### Click for SSH
+### SSH

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
@@ -824,7 +824,7 @@ $ export OSS_ENDPOINT='endpoint'

<details>

-### Click for HDFS
+### HDFS

💡 Using a HDFS cluster as remote storage is also supported via the WebHDFS API.
Read more about by expanding the WebHDFS section in
@@ -858,7 +858,7 @@ Read more about by expanding the WebHDFS section in

<details>

-### Click for WebHDFS
+### WebHDFS

💡 WebHDFS serves as an alternative for using the same remote storage supported
by HDFS. Read more about by expanding the WebHDFS section in
@@ -947,7 +947,7 @@ by HDFS. Read more about by expanding the WebHDFS section in

<details>

-### Click for HTTP
+### HTTP

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
@@ -1039,7 +1039,7 @@ by HDFS. Read more about by expanding the WebHDFS section in

<details>

-### Click for WebDAV
+### WebDAV

> If any values given to the parameters below contain sensitive user info, add
> them with the `--local` option, so they're written to a Git-ignored config
10 changes: 5 additions & 5 deletions content/docs/user-guide/contributing/core.md
@@ -176,7 +176,7 @@ manipulations below.

<details>

-### Click for Amazon S3 instructions
+### Amazon S3

Install
[aws cli](https://docs.aws.amazon.com/en_us/cli/latest/userguide/cli-chap-install.html)
@@ -195,7 +195,7 @@ $ export DVC_TEST_AWS_REPO_BUCKET="...TEST-S3-BUCKET..."

<details>

-### Click for Microsoft Azure Blob Storage instructions
+### Microsoft Azure Blob Storage

Install [Node.js](https://nodejs.org/en/download/) and then install and run
Azurite:
@@ -217,7 +217,7 @@ $ export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=http;AccountN

<details>

-### Click for Google Drive instructions
+### Google Drive

> 💡 Please remember that Google Drive access tokens are personal credentials
> and should not be shared with anyone, otherwise risking unauthorized usage of
@@ -238,7 +238,7 @@ $ export GDRIVE_USER_CREDENTIALS_DATA='mysecret'

<details>

-### Click for Google Cloud Storage instructions
+### Google Cloud Storage

Go through the [quick start](https://cloud.google.com/sdk/docs/quickstarts) for
your OS. After that, you should have the `gcloud` command line tool available,
@@ -278,7 +278,7 @@ may use different names.

<details>

-### Click for HDFS instructions
+### HDFS

Tests currently only work on Linux. First you need to set up passwordless SSH
auth to localhost:
14 changes: 7 additions & 7 deletions content/docs/user-guide/external-dependencies.md
@@ -42,7 +42,7 @@ downloads a file from an external location, on all the supported location types.
<details>

-### Click for Amazon S3
+### Amazon S3

```dvc
$ dvc run -n download_file \
@@ -55,7 +55,7 @@ $ dvc run -n download_file \

<details>

-### Click for Microsoft Azure Blob Storage
+### Microsoft Azure Blob Storage

```dvc
$ dvc run -n download_file \
@@ -72,7 +72,7 @@ $ dvc run -n download_file \

<details>

-### Click for Google Cloud Storage
+### Google Cloud Storage

```dvc
$ dvc run -n download_file \
@@ -85,7 +85,7 @@ $ dvc run -n download_file \

<details>

-### Click for SSH
+### SSH

```dvc
$ dvc run -n download_file \
@@ -104,7 +104,7 @@ Please check that you are able to connect both ways with tools like `ssh` and

<details>

-### Click for HDFS
+### HDFS

```dvc
$ dvc run -n download_file \
@@ -118,7 +118,7 @@ $ dvc run -n download_file \

<details>

-### Click for HTTP
+### HTTP

> Including HTTPs
Expand All @@ -133,7 +133,7 @@ $ dvc run -n download_file \

<details>

-### Click for local file system paths
+### local file system paths

```dvc
$ dvc run -n download_file \