
Can't find --overwrite in az storage blob upload/upload-batch #21875

Closed
qwordy opened this issue Mar 31, 2022 · 11 comments · Fixed by #23329
Assignees
Labels
Auto-Assign Auto assign by bot feature-request Storage az storage
Milestone

Comments

@qwordy
Member

qwordy commented Mar 31, 2022

az feedback auto-generates most of the information requested below, as of CLI version 2.0.62

Describe the bug

I noticed a breaking change in 2.34.1. However, I can't see it on my machine. If I pass this parameter, it gives me an "unrecognized parameter" error.

[BREAKING CHANGE] az storage blob upload/upload-batch: Fix --overwrite that it no longer overwrite by default
https://docs.microsoft.com/en-us/cli/azure/release-notes-azure-cli

PS C:\> az -v
azure-cli                         2.34.1

core                              2.34.1
telemetry                          1.0.6

Extensions:
image-copy-extension               0.2.4
quantum                            0.2.0

Dependencies:
msal                              1.16.0
azure-mgmt-resource               20.0.0

Python location 'C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\python.exe'
Extensions directory 'C:\Users\yfy\.azure\cliextensions'
Development extension sources:
    C:\Users\yfy\work\azure-cli-extensions

Python (Windows) 3.8.9 (tags/v3.8.9:a743f81, Apr  6 2021, 13:22:56) [MSC v.1928 32 bit (Intel)]

Legal docs and information: aka.ms/AzureCliLegal


Your CLI is up-to-date.

Please let us know how we are doing: https://aka.ms/azureclihats
and let us know if you're interested in trying out our newest features: https://aka.ms/CLIUXstudy
PS C:\> az storage blob upload-batch -h

Command
    az storage blob upload-batch : Upload files from a local directory to a blob container.

Arguments
    --destination -d             [Required] : The blob container where the files will be uploaded.
        The destination can be the container URL or the container name. When the destination is the
        container URL, the storage account name will be parsed from the URL.
    --source -s                  [Required] : The directory where the files to be uploaded are
                                              located.
    --auth-mode                             : The mode in which to run the command. "login" mode
                                              will directly use your login credentials for the
                                              authentication. The legacy "key" mode will attempt to
                                              query for an account key if no authentication
                                              parameters for the account are provided. Environment
                                              variable: AZURE_STORAGE_AUTH_MODE.  Allowed values:
                                              key, login.
    --destination-path                      : The destination path that will be prepended to the
                                              blob name.
    --dryrun                                : Show the summary of the operations to be taken instead
                                              of actually uploading the file(s).
    --lease-id                              : The active lease id for the blob.
    --max-connections                       : Maximum number of parallel connections to use when the
                                              blob size exceeds 64MB.  Default: 2.
    --metadata                              : Metadata in space-separated key=value pairs. This
                                              overwrites any existing metadata.
    --no-progress                           : Include this flag to disable progress reporting for
                                              the command.
    --pattern                               : The pattern used for globbing files or blobs in the
                                              source. The supported patterns are '*', '?', '[seq]',
                                              and '[!seq]'. For more information, please refer to
                                              https://docs.python.org/3.7/library/fnmatch.html.
        When you use '*' in --pattern, it will match any character including the directory
        separator '/'.
    --socket-timeout                        : The socket timeout(secs), used by the service to
                                              regulate data flow.
    --timeout                               : Request timeout in seconds. Applies to each call to
                                              the service.
    --type -t                               : Defaults to 'page' for *.vhd files, or 'block'
                                              otherwise. The setting will override blob types for
                                              every file.  Allowed values: append, block, page.

Content Control Arguments
    --content-cache --content-cache-control : The cache control string.
    --content-disposition                   : Conveys additional information about how to process
                                              the response payload, and can also be used to attach
                                              additional metadata.
    --content-encoding                      : The content encoding type.
    --content-language                      : The content language.
    --content-md5                           : The content's MD5 hash.
    --content-type                          : The content MIME type.
    --maxsize-condition                     : The max length in bytes permitted for an append blob.
    --validate-content                      : Specifies that an MD5 hash shall be calculated for
                                              each chunk of the blob and verified by the service
                                              when the chunk has arrived.

Precondition Arguments
    --if-match                              : An ETag value, or the wildcard character (*). Specify
                                              this header to perform the operation only if the
                                              resource's ETag matches the value specified.
    --if-modified-since                     : Commence only if modified since supplied UTC datetime
                                              (Y-m-d'T'H:M'Z').
    --if-none-match                         : An ETag value, or the wildcard character (*).
        Specify this header to perform the operation only if the resource's ETag does not match the
        value specified. Specify the wildcard character (*) to perform the operation only if the
        resource does not exist, and fail the operation if it does exist.
    --if-unmodified-since                   : Commence only if unmodified since supplied UTC
                                              datetime (Y-m-d'T'H:M'Z').

Storage Account Arguments
    --account-key                           : Storage account key. Must be used in conjunction with
                                              storage account name. Environment variable:
                                              AZURE_STORAGE_KEY.
    --account-name                          : Storage account name. Related environment variable:
                                              AZURE_STORAGE_ACCOUNT. Must be used in conjunction
                                              with either storage account key or a SAS token. If
                                              neither are present, the command will try to query the
                                              storage account key using the authenticated Azure
                                              account. If a large number of storage commands are
                                              executed the API quota may be hit.
    --connection-string                     : Storage account connection string. Environment
                                              variable: AZURE_STORAGE_CONNECTION_STRING.
    --sas-token                             : A Shared Access Signature (SAS). Must be used in
                                              conjunction with storage account name. Environment
                                              variable: AZURE_STORAGE_SAS_TOKEN.

Global Arguments
    --debug                                 : Increase logging verbosity to show all debug logs.
    --help -h                               : Show this help message and exit.
    --only-show-errors                      : Only show errors, suppressing warnings.
    --output -o                             : Output format.  Allowed values: json, jsonc, none,
                                              table, tsv, yaml, yamlc.  Default: json.
    --query                                 : JMESPath query string. See http://jmespath.org/ for
                                              more information and examples.
    --subscription                          : Name or ID of subscription. You can configure the
                                              default subscription using `az account set -s
                                              NAME_OR_ID`.
    --verbose                               : Increase logging verbosity. Use --debug for full debug
                                              logs.

Examples
    Upload all files that end with .py unless blob exists and has been modified since given date.
        az storage blob upload-batch -d mycontainer --account-name mystorageaccount --account-key
        00000000 -s <path-to-directory> --pattern *.py --if-unmodified-since 2018-08-27T20:51Z


    Upload all files from local path directory to a container named "mycontainer".
        az storage blob upload-batch -d mycontainer -s <path-to-directory>


    Upload all files with the format 'cli-2018-xx-xx.txt' or 'cli-2019-xx-xx.txt' in local path
    directory.
        az storage blob upload-batch -d mycontainer -s <path-to-directory> --pattern
        cli-201[89]-??-??.txt


    Upload all files with the format 'cli-201x-xx-xx.txt' except 'cli-2018-xx-xx.txt' and
    'cli-2019-xx-xx.txt' in a container.
        az storage blob upload-batch -d mycontainer -s <path-to-directory> --pattern
        cli-201[!89]-??-??.txt


To search AI knowledge base for examples, use: az find "az storage blob upload-batch"

Please let us know how we are doing: https://aka.ms/azureclihats
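
As an aside on the --pattern examples above: the help text points to Python's fnmatch semantics, which can be checked directly. A small illustration (not part of the CLI itself):

```python
from fnmatch import fnmatch

# '?' matches exactly one character; '[89]' matches '8' or '9';
# '[!89]' matches any single character except '8' or '9'.
PATTERN_1819 = 'cli-201[89]-??-??.txt'
PATTERN_NOT_1819 = 'cli-201[!89]-??-??.txt'

def matches(name, pattern):
    """Return True if `name` matches the fnmatch-style `pattern`."""
    return fnmatch(name, pattern)
```

Note that, as the help text warns, fnmatch's '*' also matches the directory separator '/', unlike a shell glob.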

To Reproduce

Expected behavior

Environment summary

MSI
Windows 10

Additional context

@ghost ghost added the Storage az storage label Mar 31, 2022
@ghost ghost added this to the Backlog milestone Mar 31, 2022
@ghost ghost assigned evelyn-ys Mar 31, 2022
@ghost ghost added Auto-Assign Auto assign by bot Account az login/account labels Mar 31, 2022
@ghost ghost assigned jiasli Mar 31, 2022
@ghost ghost added Installation Batch az batch CXP Attention This issue is handled by CXP team. labels Mar 31, 2022
@qwordy qwordy modified the milestones: Backlog, Apr 2022 (2022-04-26) Mar 31, 2022
@yonzhan
Collaborator

yonzhan commented Mar 31, 2022

storage

@yonzhan yonzhan removed Account az login/account Batch az batch Installation CXP Attention This issue is handled by CXP team. labels Mar 31, 2022
@evelyn-ys
Member

evelyn-ys commented Mar 31, 2022

It seems your environment has some mismatch; I can't reproduce this.
[screenshot]

@calvinhzy
Member

Was not able to reproduce it either with the same python version 3.8.9. Please try rerunning pip install azure-cli

@qwordy
Member Author

qwordy commented Mar 31, 2022

Thanks for the quick response. I restarted the machine and reinstalled Azure CLI. I also tried WSL2 Ubuntu. The argument is not there. Azure CLI in the Portal works fine.

@kingsleyadam

I also cannot see this parameter. If I attempt to use it, the CLI reports the argument as unrecognized.

unrecognized arguments: --overwrite

Running Ubuntu 20.04, installed via apt repo.

$ apt-cache policy azure-cli
azure-cli:
  Installed: 2.35.0-1~focal
  Candidate: 2.35.0-1~focal
  Version table:
 *** 2.35.0-1~focal 500
        500 https://packages.microsoft.com/repos/azure-cli focal/main amd64 Packages
        100 /var/lib/dpkg/status
$ az storage blob download-batch --help

Command
    az storage blob download-batch : Download blobs from a blob container recursively.

Arguments
    --destination -d [Required] : The existing destination folder for this download operation.
    --source -s      [Required] : The blob container from where the files will be downloaded.
        The source can be the container URL or the container name. When the source is the container
        URL, the storage account name will be parsed from the URL.
    --auth-mode                 : The mode in which to run the command. "login" mode will directly
                                  use your login credentials for the authentication. The legacy
                                  "key" mode will attempt to query for an account key if no
                                  authentication parameters for the account are provided.
                                  Environment variable: AZURE_STORAGE_AUTH_MODE.  Allowed values:
                                  key, login.
    --dryrun                    : Show the summary of the operations to be taken instead of actually
                                  downloading the file(s).
    --max-connections           : The number of parallel connections with which to download.
                                  Default: 2.
    --no-progress               : Include this flag to disable progress reporting for the command.
    --pattern                   : The pattern used for globbing files or blobs in the source. The
                                  supported patterns are '*', '?', '[seq]', and '[!seq]'. For more
                                  information, please refer to
                                  https://docs.python.org/3.7/library/fnmatch.html.
        When you use '*' in --pattern, it will match any character including the directory
        separator '/'.

Storage Account Arguments
    --account-key               : Storage account key. Must be used in conjunction with storage
                                  account name. Environment variable: AZURE_STORAGE_KEY.
    --account-name              : Storage account name. Related environment variable:
                                  AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either
                                  storage account key or a SAS token. If neither are present, the
                                  command will try to query the storage account key using the
                                  authenticated Azure account. If a large number of storage commands
                                  are executed the API quota may be hit.
    --connection-string         : Storage account connection string. Environment variable:
                                  AZURE_STORAGE_CONNECTION_STRING.
    --sas-token                 : A Shared Access Signature (SAS). Must be used in conjunction with
                                  storage account name. Environment variable:
                                  AZURE_STORAGE_SAS_TOKEN.

Global Arguments
    --debug                     : Increase logging verbosity to show all debug logs.
    --help -h                   : Show this help message and exit.
    --only-show-errors          : Only show errors, suppressing warnings.
    --output -o                 : Output format.  Allowed values: json, jsonc, none, table, tsv,
                                  yaml, yamlc.  Default: json.
    --query                     : JMESPath query string. See http://jmespath.org/ for more
                                  information and examples.
    --subscription              : Name or ID of subscription. You can configure the default
                                  subscription using `az account set -s NAME_OR_ID`.
    --verbose                   : Increase logging verbosity. Use --debug for full debug logs.

Examples
    Download all blobs that end with .py
        az storage blob download-batch -d . --pattern *.py -s mycontainer --account-name
        mystorageaccount --account-key 00000000


    Download all blobs in a directory named "dir" from container named "mycontainer".
        az storage blob download-batch -d . -s mycontainer --pattern dir/*


    Download all blobs with the format 'cli-2018-xx-xx.txt' or 'cli-2019-xx-xx.txt' in container to
    current path.
        az storage blob download-batch -d . -s mycontainer --pattern cli-201[89]-??-??.txt


    Download all blobs with the format 'cli-201x-xx-xx.txt' except 'cli-2018-xx-xx.txt' and
    'cli-2019-xx-xx.txt' in container to current path.
        az storage blob download-batch -d . -s mycontainer --pattern cli-201[!89]-??-??.txt


To search AI knowledge base for examples, use: az find "az storage blob download-batch"

Please let us know how we are doing: https://aka.ms/azureclihats

@dciborow

dciborow commented Apr 7, 2022

Was not able to reproduce it either with the same python version 3.8.9. Please try rerunning pip install azure-cli

I don't think this is a good way to update the CLI; it has put my Anaconda env in a really bad state.

I think the only method you should use to upgrade the CLI is "az upgrade", and then upgrade Python packages with commands like "pip install --upgrade az-storage".

I think this is the best way to install the Azure CLI:
https://docs.microsoft.com/en-us/cli/azure/install-azure-cli-windows?tabs=azure-powershell

I had to go into my Anaconda env and manually delete "az.bat" from the Scripts folder, so that it would use my default system env instead.

@hpourreza

Any update on this issue? I am using az cli 2.37.0 and I cannot see the --overwrite option with the az storage blob download-batch command. Could you please let me know how I can fix this?

@calvinhzy
Member

calvinhzy commented Jun 20, 2022

This argument is only for upload/upload-batch. download/download-batch does not have an overwrite option; users will need to rename or remove the files in the destination directory.

@DerTim1

DerTim1 commented Jul 22, 2022

This argument is only for upload/upload-batch. download/download-batch overwrites by default now.

I think that changed in version 2.37.0. The following error occurred if the destination files already exist:

# az storage blob download-batch --only-show-errors -s common --account-name vaults --pattern '*.license' -d /tmp/cert/common
%s already exists in %s. Please rename existing file or choose another destination folder.

azure-cli 2.37.0 *
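
The literal %s placeholders in that message suggest the CLI printed the format string without interpolating the file name and folder. A minimal illustration of that kind of bug (an assumption about the cause, not the actual azure-cli source):

```python
# Hypothetical illustration of an un-interpolated format string,
# not the actual azure-cli code.
TEMPLATE = ('%s already exists in %s. Please rename existing file '
            'or choose another destination folder.')

def broken_message(blob, folder):
    # Bug: returns the template itself, leaving the placeholders unfilled.
    return TEMPLATE

def fixed_message(blob, folder):
    # Intended: interpolate the blob name and destination folder.
    return TEMPLATE % (blob, folder)
```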

@calvinhzy
Member

Sorry about this. It was an oversight; the current behavior is that download/download-batch does not have an overwrite option, so users will need to rename or remove the files in the destination directory.
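
Until --overwrite lands for download, one way to follow that advice is to move conflicting files aside before re-running download-batch. A minimal sketch (`backup_conflicts` is a hypothetical helper, not part of the CLI):

```python
from pathlib import Path

def backup_conflicts(dest_dir, pattern='*.license'):
    """Rename existing files matching `pattern` by appending '.bak',
    so a subsequent `az storage blob download-batch` will not collide."""
    renamed = []
    for f in Path(dest_dir).glob(pattern):
        target = f.with_name(f.name + '.bak')
        f.rename(target)
        renamed.append(target)
    return renamed
```

After the download succeeds, the .bak copies can be compared against the fresh files and deleted.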

@calvinhzy
Copy link
Member

We are adding --overwrite in 2.39.0 without changing existing behavior (for download, overwrite by default; for download-batch, no overwrite by default).

9 participants