
ErrorCode=SftpPermissionDenied using copy activity in ADF

Bhupinder Singh 0 Reputation points
2026-04-06T13:18:57.3233333+00:00

I'm trying to migrate data from a file server to ADLS. All the access permissions have been granted to my ID, but there are some restricted files that cause the pipeline to fail.

Is there any way to set up a pipeline that ignores these restricted files while maintaining the same folder structure in ADLS?

Error: ErrorCode=SftpPermissionDenied,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Permission denied to access 'Restricted_file_path'.,Source=Microsoft.DataTransfer.ClientLibrary.SftpConnector,''Type=Renci.SshNet.Common.SftpPermissionDeniedException,Message=Permission denied,Source=Renci.SshNet,'

The source and sink datasets are binary. Fault tolerance is not working here, and recursive is enabled.
I have tried all the possible options; it seems I need to set up a pipeline that can skip files across folders/subfolders.

Azure Data Factory

An Azure service for ingesting, preparing, and transforming data at scale.


2 answers

Sort by: Most helpful
  1. Manoj Kumar Boyini 11,600 Reputation points Microsoft External Staff Moderator
    2026-04-06T21:47:40.59+00:00

    Hi Bhupinder Singh,

    The error “SftpPermissionDenied” means the SFTP server is blocking access to specific files. Azure Data Factory can connect, but when it encounters a file where your SFTP user does not have read permission, the Copy activity fails immediately.

    Row-level fault tolerance in the Copy activity (skipping incompatible rows) applies to tabular formats such as CSV/JSON/Parquet. File-level skipping does exist for binary copies, but only under strict preconditions (binary source and sink, no compression, a multi-file source); outside those conditions, ADF cannot ignore restricted SFTP files automatically.

    To migrate all accessible files and skip restricted ones, you need to build a pipeline that copies files one by one and continues even if a single file fails. A common solution is:

    1. Use Get Metadata (fieldList = childItems) to list the files and folders. Note that childItems returns only immediate children, so to walk subfolders you typically recurse via a child pipeline or a loop.
    2. Add a ForEach loop over the returned items.
    3. For each file, run a Copy Data activity (or a child pipeline) and handle its failure, for example with an on-failure dependency path, so the loop continues.
      • This lets the pipeline skip only the restricted files.
    4. Use PreserveHierarchy or dynamic file paths so the folder structure is kept intact in ADLS.

    This pattern gives you:

    • Per‑file isolation (one failure doesn’t stop the pipeline)
    • Full folder structure preserved
    • Restricted files automatically skipped
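
    A minimal sketch of this pattern as pipeline JSON follows. All activity, dataset, and pipeline names here are hypothetical; the Wait activity on the Failed dependency path is the common "swallow the error" trick that lets an iteration fail quietly while the loop keeps going:

```json
{
  "name": "CopySkipRestricted",
  "properties": {
    "activities": [
      {
        "name": "ListSourceFiles",
        "type": "GetMetadata",
        "typeProperties": {
          "dataset": { "referenceName": "SftpBinaryFolder", "type": "DatasetReference" },
          "fieldList": [ "childItems" ]
        }
      },
      {
        "name": "ForEachFile",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "ListSourceFiles", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": { "value": "@activity('ListSourceFiles').output.childItems", "type": "Expression" },
          "activities": [
            {
              "name": "CopyOneFile",
              "type": "Copy",
              "inputs": [ { "referenceName": "SftpSingleFile", "type": "DatasetReference" } ],
              "outputs": [ { "referenceName": "AdlsSingleFile", "type": "DatasetReference" } ],
              "typeProperties": {
                "source": { "type": "BinarySource" },
                "sink": { "type": "BinarySink" }
              }
            },
            {
              "name": "SwallowPermissionError",
              "type": "Wait",
              "dependsOn": [
                { "activity": "CopyOneFile", "dependencyConditions": [ "Failed" ] }
              ],
              "typeProperties": { "waitTimeInSeconds": 1 }
            }
          ]
        }
      }
    ]
  }
}
```

    In practice the single-file datasets would be parameterized with @item().name so each iteration copies one path, and the sink path reuses the same relative path to preserve the hierarchy.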

    If you know restricted file patterns, you can also add a Filter activity before the loop to exclude them earlier.
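
    For example, if the restricted files shared a known prefix (the 'Restricted_' prefix below is purely hypothetical, as is the Get Metadata activity name ListSourceFiles), a Filter activity between Get Metadata and ForEach might look like:

```json
{
  "name": "ExcludeRestricted",
  "type": "Filter",
  "dependsOn": [
    { "activity": "ListSourceFiles", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "items": { "value": "@activity('ListSourceFiles').output.childItems", "type": "Expression" },
    "condition": { "value": "@not(startswith(item().name, 'Restricted_'))", "type": "Expression" }
  }
}
```

    The ForEach would then iterate over @activity('ExcludeRestricted').output.value instead of the raw childItems.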

    Helpful References:
    https://dori-uw-1.kuma-moon.com/en-us/azure/data-factory/copy-activity-fault-tolerance
    https://dori-uw-1.kuma-moon.com/en-us/azure/data-factory/connector-sftp?tabs=data-factory

    Hope this helps. Please let us know if you have any questions or concerns. If this solves your issue, kindly click “Accept Answer” so it can help others in the community.


  2. Q&A Assist
    2026-04-06T13:19:12.41+00:00
    AI-generated content. This answer was created with AI from Q&A Assist.

    Yes. Configure the copy activity to use fault tolerance and a multi-file source so that restricted files are skipped while preserving folder hierarchy.

    Key points based on the SFTP connector and fault-tolerance behavior:

    1. Ensure the source is configured as multiple files
      • Use one of these in the SFTP copy source so that more than one file is in scope:
        • A folder path (optionally with recursive = true), or
        • wildcardFolderPath + wildcardFileName, or
        • fileListPath pointing to a list of files.
      • When the source is a single explicit file, any error (including permission denied) fails the activity and cannot be skipped.
    2. Use binary format for both source and sink when skipping error files
      • To skip files whose access is forbidden, both the source dataset and sink dataset must be in binary format, and no compression type can be specified.
      • Supported stores for this behavior include SFTP and ADLS (via Azure Data Lake Storage Gen2).
    3. Enable fault tolerance to skip restricted files
      • In the copy activity Settings (UI):
        • Enable fault tolerance and configure it to skip incompatible/error files.
      • With the above conditions met (binary → binary, multiple files), files that return SftpPermissionDenied will be skipped instead of failing the activity.
    4. Preserve folder hierarchy in ADLS
      • In the copy activity Sink settings, choose Preserve hierarchy (default) so that the relative path from the SFTP root folder is kept in ADLS.
      • Result: all accessible files are copied into ADLS with the same folder structure; restricted files are skipped.
    5. Monitor skipped files
      • Configure copy activity session logging.
      • Skipped files (including those with permission issues) appear in the log CSV under OperationName = FileSkip, with OperationItem showing the file path and Message indicating the reason (for example, permission denied).
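
    Pulling these steps together, a hedged sketch of the Copy activity JSON (dataset and linked-service names are placeholders; verify the skipErrorFile and logSettings properties against the fault-tolerance documentation for your ADF version):

```json
{
  "name": "CopySftpToAdls",
  "type": "Copy",
  "inputs": [ { "referenceName": "SftpBinaryFolder", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "AdlsBinaryFolder", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": {
      "type": "BinarySource",
      "storeSettings": { "type": "SftpReadSettings", "recursive": true }
    },
    "sink": {
      "type": "BinarySink",
      "storeSettings": { "type": "AzureBlobFSWriteSettings", "copyBehavior": "PreserveHierarchy" }
    },
    "skipErrorFile": { "fileForbidden": true },
    "logSettings": {
      "enableCopyActivityLog": true,
      "copyActivityLogSettings": { "logLevel": "Warning" },
      "logLocationSettings": {
        "linkedServiceName": { "referenceName": "AdlsLinkedService", "type": "LinkedServiceReference" },
        "path": "container/copy-logs"
      }
    }
  }
}
```

    Files skipped this way then appear in the session log CSV with OperationName = FileSkip, as described in step 5.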

    This setup allows the pipeline to complete successfully, copy all accessible files, maintain the folder structure in ADLS, and skip only the restricted SFTP files.


