AzCopy from an HDInsight cluster in a PowerShell script
I have a PowerShell script that generates some output using Hive on HDInsight. The output is written to a blob, and then I copy it to the local machine using AzCopy. I do this a lot to get the various pieces of data I need, often calling the script multiple times. The problem is that at some point AzCopy fails
with the message "The condition specified using HTTP conditional header(s) is not met", but only after numerous successful iterations.
I'm not sure what that means, and digging through the scripts hardly helped. I tried deleting the local file and retrying AzCopy,
but the error persisted, so it might have something to do with the AzCopy
HTTP session. Can anyone enlighten me?
PS C:\hive> AzCopy /Y /Source:https://msftcampusdata.blob.core.windows.net/crunch88-1 /Dest:c:\hive\extracts\data\ /SourceKey:attEwHZ9AGq7pzzTYwRvjWwcmwLvFqnkxIvJcTblYnZAs1GSsCCtvbBKz9T/TTtwDSVMDuU3DenBbmOYqPIMhQ== /Pattern:hivehost/stdout
AzCopy : [2015/05/10 15:08:44][ERROR] hivehost/stdout: The remote server returned an error: (412) The condition specified using HTTP conditional header(s)
is not met..
At line:1 char:1
+ AzCopy /Y /Source:https://msftcampusdata.blob.core.windows.net/crunch88-1 /Dest: ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: ([2015/05/10 15:...s) is not met..:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
The condition specified using HTTP conditional header(s) is not met.
To ensure data integrity during the download, AzCopy passes the ETag of the original blob in an "If-Match" HTTP header while reading data from the blob. Therefore, the HTTP status code 412 (Precondition Failed) with the message "The condition specified using HTTP conditional header(s) is not met." simply means that your blobs were changed while AzCopy was downloading them.
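You can see the same If-Match mechanism with any HTTP client. The sketch below is illustrative only: the ETag value is a made-up placeholder, and the request assumes the blob is readable without authentication.

```shell
# Ask the Blob service for the blob only if its ETag still matches.
# If the blob has been modified since, the service answers 412 Precondition Failed,
# which is exactly the error AzCopy surfaces.
curl -i -H 'If-Match: "0x8D2583F68478EA0"' \
  "https://msftcampusdata.blob.core.windows.net/crunch88-1/hivehost/stdout"
```

AzCopy issues conditional reads like this for every range it downloads, so a blob rewritten mid-transfer makes one of those reads fail.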
Please avoid modifying the original blobs while AzCopy is downloading them. If you must change the original blobs at the same time, you can try the following workaround:
First, take a snapshot of the original blob, then download it with the AzCopy /Snapshot option so that AzCopy tries to download the original blob and all of its snapshots. Although the download of the original blob may fail with 412 (Precondition Failed), the download of the snapshot should succeed, because a snapshot is read-only. The file name of the downloaded snapshot is {blob name without extension} ({snapshot timestamp}).{extension}.
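A sketch of the workaround against the command line from the question (the storage key is a placeholder, and the snapshot is assumed to have been taken beforehand, e.g. with Azure PowerShell):

```shell
# Download the blob and all of its snapshots; even if hivehost/stdout itself
# fails with 412 because it is still being written, its snapshots are immutable
# and should download successfully.
AzCopy /Y /Snapshot /Source:https://msftcampusdata.blob.core.windows.net/crunch88-1 /Dest:c:\hive\extracts\data\ /SourceKey:<storage-account-key> /Pattern:hivehost/stdout
```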
For more information about AzCopy and the /Snapshot option, see Getting Started with the AzCopy Command-Line Utility.
Some updates:
Did you terminate AzCopy and then resume it with the same command line? If so, you need to make sure that the original blob has not been modified since the previous AzCopy run, because AzCopy must ensure the original blob remains unchanged for the whole period between the first attempt and the run in which the blob finally downloads successfully. To check whether a resume is in progress, look for "Incomplete operation with the same command line found in the {Dir Path} log directory, AzCopy will start resume" in the AzCopy output.
Since /Y is specified on the command line, the answer will always be "Yes" at the resume prompt. To avoid this resume behavior, before executing AzCopy you can clear the default log folder "%LocalAppData%\Microsoft\Azure\AzCopy", or specify /Z: to configure a unique log folder for each execution.
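Both options sketched in PowerShell, matching the prompt shown in the question (the journal folder path and storage key are illustrative placeholders):

```shell
# Option 1: clear AzCopy's default log/journal folder so no stale resume
# state from an earlier, interrupted run is picked up.
Remove-Item "$env:LOCALAPPDATA\Microsoft\Azure\AzCopy\*" -Recurse -Force

# Option 2: give every invocation its own journal folder with /Z,
# so runs can never resume each other.
AzCopy /Y /Z:c:\hive\journal\run1\ /Source:https://msftcampusdata.blob.core.windows.net/crunch88-1 /Dest:c:\hive\extracts\data\ /SourceKey:<storage-account-key> /Pattern:hivehost/stdout
```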