Robocopy fails to copy files having .url extension

Brian Hart 21 Reputation points
2022-03-15T21:19:54.807+00:00

I back up my computer to a NAS using a batch file that runs robocopy and logs all activity to .txt files.

While reviewing the log files, I just discovered that many files (though nowhere near the majority) have been failing with this error:

There is not enough space on the disk.
ERROR: RETRY LIMIT EXCEEDED.

When I looked more deeply, I found that every file that failed this way has the extension ".url". There is nothing else I can see that the failed files have in common: some have short paths, some have long paths; some have a single word and no spaces in the name (e.g. "Login.url"), others have multiple words separated by spaces (e.g. "Support Forums - Dell Community.url").

Is there something inherent in robocopy that prevents copying .url files, or is it possible the NAS is denying the copy because of the .url suffix? Is there some way to get a more exact error message from robocopy? I know there is sufficient space, since all the other files, most of them much larger than the .url files, copy fine.

The NAS is a Western Digital My Cloud device, so it probably runs a Linux OS.


12 answers

  1. 3086Here 1 Reputation point
    2022-04-07T17:57:20.067+00:00

    Created a test1.log file with the same contents as the .url file using Notepad.
    Copied the test1.url file to test1.txt.
    Used FC to compare the three files; all compared identical.
    Copied the three files to my NAS: test1.log copied fine, but the .url and .txt files did not.
    Other .txt and .log files always copy to the NAS fine.
    Something is special about the .url file, even when it is copied to a different name with identical contents.
    (Note: the original .url files were created by dragging the address-bar icon from the Chrome web browser to the desktop.)

    Here are the details:


    C:\Users\Terry\Desktop\test1>dir
    Volume in drive C is Win10(SSD64)
    Volume Serial Number is 1070-4520

    Directory of C:\Users\Terry\Desktop\test1

    2022-04-07 12:34 PM <DIR> .
    2022-04-07 12:34 PM <DIR> ..
    2022-04-07 12:24 PM 312 Test1.log
    2018-10-04 11:41 AM 312 test1.url
    2022-04-07 12:27 PM <DIR> testit
    2 File(s) 624 bytes
    3 Dir(s) 4,885,045,248 bytes free

    C:\Users\Terry\Desktop\test1>copy test1.url test1.txt
    1 file(s) copied.

    C:\Users\Terry\Desktop\test1>fc test1.log test1.url
    Comparing files Test1.log and TEST1.URL
    FC: no differences encountered

    C:\Users\Terry\Desktop\test1>fc test1.log test1.txt
    Comparing files Test1.log and TEST1.TXT
    FC: no differences encountered

    C:\Users\Terry\Desktop\test1>copy * \\mycloudex2\2\testit
    Test1.log
    test1.txt
    There is not enough space on the disk.
    test1.url
    There is not enough space on the disk.
    1 file(s) copied.

    C:\Users\Terry\Desktop\test1>dir \\mycloudex2\2\testit
    Volume in drive \\mycloudex2\2 is 2
    Volume Serial Number is 61A9-5128

    Directory of \\mycloudex2\2\testit

    2022-04-07 12:35 PM <DIR> .
    2022-04-05 05:41 PM <DIR> ..
    2022-04-07 12:24 PM 312 Test1.log
    1 File(s) 312 bytes
    2 Dir(s) 5,741,343,797,248 bytes free


  2. 3086Here 1 Reputation point
    2022-04-08T01:21:15.413+00:00

    WOW, good suggestion, but with surprising results.

    Copying the original three files from my PC to another NTFS disk on my PC works fine.
    Copying the original three files from my PC to a FAT partition on my PC works fine.
    Copying the original three (NTFS) files to the NAS fails with the no-space error.

    COPYING THE FILES FROM THE FAT32 PARTITION TO THE NAS WORKS PERFECTLY (.TXT, .LOG & .URL).

    So I keep thinking there is something special about an NTFS .url file (or an NTFS .url file that has been renamed) that gets lost when it is copied to FAT, even though the files compare as identical:

    C:\Users\Terry\Desktop\test1>fc test1.url d:\test1.url
    Comparing files test1.url and D:\TEST1.URL
    FC: no differences encountered

    See details:

    C:\Users\Terry\Desktop\test1>dir *
    Volume in drive C is Win10(SSD64)
    Volume Serial Number is 1070-4520

    Directory of C:\Users\Terry\Desktop\test1

    2022-04-07 07:30 PM <DIR> .
    2022-04-07 07:30 PM <DIR> ..
    2022-04-07 07:29 PM 312 test1.log
    2022-04-07 07:30 PM 312 test1.txt
    2018-10-04 11:41 AM 312 test1.url
    2022-04-07 07:19 PM <DIR> testit
    3 File(s) 936 bytes
    3 Dir(s) 5,973,393,408 bytes free

    C:\Users\Terry\Desktop\test1>copy * \\mycloudex2\2\testit\
    test1.log
    There is not enough space on the disk.
    test1.txt
    test1.url
    There is not enough space on the disk.
    1 file(s) copied.

    C:\Users\Terry\Desktop\test1>copy * d:\
    test1.log
    test1.txt
    test1.url
    3 file(s) copied.

    C:\Users\Terry\Desktop\test1>copy d:\test1.* \\mycloudex2\2\testit\
    d:\test1.log
    d:\test1.txt
    Overwrite \\mycloudex2\2\testit\test1.txt? (Yes/No/All): y
    d:\test1.url
    3 file(s) copied.


  3. 3086Here 1 Reputation point
    2022-04-08T03:56:49.907+00:00

    A .url file in Windows is a pointer, like a link to a file. Is it possible that the copy operation is trying to find and copy the linked target rather than the pointer itself?

    That may be why, when I create a new .url file from the text of the original .url pointer file, the copy completes without error. Is it possible to have robocopy copy the pointer itself, and not the data that the URL points to?

    Hopefully that is the problem.
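
    (For reference, the visible contents of one of these shortcuts are just a couple of lines of INI-style text, so the file data itself is tiny. A hypothetical Login.url would hold little more than this:)

    [InternetShortcut]
    URL=https://www.example.com/login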


  4. 3086Here 1 Reputation point
    2022-04-09T15:37:32.603+00:00

    Simple solution: copy the offending file to a FAT32 partition on my computer, delete the source, and copy it back to the source location.

    Easily done by checking the log for error 112, doing the copies, and restarting robocopy.

    Works perfectly on lots of the deadly .url files! There has to be something special about web-copied .url files that cannot be seen and is hidden only in the NTFS copies.

    My problem is resolved, thanks.
    (I'm going to pipe the robocopy output to a function that checks the error code and does the copies, hopefully before the robocopy retry timeout expires. That will let everything back up in one robocopy operation.)
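
    A rough sketch of how that could be scripted in PowerShell (the log path, staging drive, and log-line pattern here are assumptions, and it parses the finished robocopy log rather than piping live output):

    # Find files that robocopy logged with error 112 and round-trip them through a
    # FAT32 staging folder to strip whatever NTFS-only metadata upsets the NAS.
    $log     = 'C:\Backup\robocopy.log'    # assumed log location
    $staging = 'E:\Staging'                # assumed FAT32 partition

    # Robocopy failure lines look roughly like:
    #   2022/04/07 12:34:56 ERROR 112 (0x00000070) Copying File C:\...\Something.url
    $failed = Select-String -Path $log -Pattern 'ERROR 112 \(0x00000070\) Copying File (.+)$' |
              ForEach-Object { $_.Matches[0].Groups[1].Value.Trim() }

    foreach ($file in $failed) {
        $temp = Join-Path $staging ([IO.Path]::GetFileName($file))
        Copy-Item $file $temp -Force       # stage on FAT32 (drops the NTFS extras)
        Remove-Item $file -Force           # delete the NTFS original
        Copy-Item $temp $file -Force       # copy it back, now clean
        Remove-Item $temp -Force
    }
    # Re-run robocopy afterwards so the cleaned files reach the NAS.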


  5. MotoX80 37,156 Reputation points
    2022-04-19T00:51:16.077+00:00

    The return code from the robocopy in the first call should not stop the second bat file from being called. Add some echo statements in each bat file and make sure that the scheduled task captures both stdout and stderr.

    Have the task run cmd.exe with arguments:

    /c C:\Scripts\MyScript.bat 1>C:\Scripts\MyScript.log 2>&1  
    

    The basic problem with copying the .url files is that I can't recreate your problem. I can't tell you what to fix because I don't know what is wrong.

    Try copying the files with PowerShell.

    # Find every .url file under $Source and copy it to the same relative path
    # under $Dest, creating the destination folders as needed.
    $Source = 'C:\'
    $Dest = 'D:\UrlTest\'
    $files = Get-ChildItem -Path $Source -Recurse -Filter '*.url' -ErrorAction SilentlyContinue
    foreach ($f in $files) {
        $NewName = $f.FullName.Replace($Source,$Dest)
        $NewFolder = $f.DirectoryName.Replace($Source,$Dest)
        New-Item $NewFolder -ItemType Directory -ErrorAction SilentlyContinue
        Copy-Item $f.FullName -Destination $NewName -Force
    }
    

    @Pavel A's suggestion of using the streams utility still appears to be the most likely solution to the .url file problem.

    C:\Users\madne\Downloads>streams VisualStudioSetup.exe  
      
    streams v1.60 - Reveal NTFS alternate streams.  
    Copyright (C) 2005-2016 Mark Russinovich  
    Sysinternals - www.sysinternals.com  
      
    C:\Users\madne\Downloads\VisualStudioSetup.exe:  
         :SmartScreen:$DATA 7  
       :Zone.Identifier:$DATA       378  
      
    C:\Users\madne\Downloads>streams -d  VisualStudioSetup.exe  
      
    streams v1.60 - Reveal NTFS alternate streams.  
    Copyright (C) 2005-2016 Mark Russinovich  
    Sysinternals - www.sysinternals.com  
      
    C:\Users\madne\Downloads\VisualStudioSetup.exe:  
       Deleted :SmartScreen:$DATA  
       Deleted :Zone.Identifier:$DATA  
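
    If you'd rather not pull down the Sysinternals tool, PowerShell can inspect and strip the same alternate streams (a sketch; the paths are placeholders):

    # List every stream on the shortcut (':$DATA' is the normal file content).
    Get-Item 'C:\Users\Terry\Desktop\test1\test1.url' -Stream *

    # Remove just the Zone.Identifier ("downloaded from the internet") stream.
    Remove-Item 'C:\Users\Terry\Desktop\test1\test1.url' -Stream Zone.Identifier

    # Or unblock a whole folder of shortcuts in one pass.
    Get-ChildItem 'C:\Users\Terry\Desktop\test1' -Filter '*.url' -Recurse | Unblock-File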
    
