SSH to SFTP Server


APPLIES TO: Azure Data Factory and Azure Synapse Analytics

This article outlines how to copy data from and to an SFTP server. To learn about Azure Data Factory, read the introductory article.

Supported capabilities

The SFTP connector is supported for the following activities:

  • Copy activity with supported source/sink matrix

Specifically, the SFTP connector supports:

  • Copying files from and to the SFTP server by using Basic, SSH public key, or multi-factor authentication.
  • Copying files as is or by parsing or generating files with the supported file formats and compression codecs.

Prerequisites

If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.

Alternatively, if your data store is a managed cloud data service, you can use Azure integration runtime. If the access is restricted to IPs that are approved in the firewall rules, you can add Azure Integration Runtime IPs into the allow list.

For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.

Get started

You can perform the Copy activity with a pipeline by using any of the Data Factory authoring tools or SDKs.

The following sections provide details about properties that are used to define Data Factory entities specific to SFTP.

Linked service properties

The following properties are supported for the SFTP linked service:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property must be set to Sftp. | Yes |
| host | The name or IP address of the SFTP server. | Yes |
| port | The port on which the SFTP server is listening. The allowed value is an integer; the default value is 22. | No |
| skipHostKeyValidation | Specify whether to skip host key validation. Allowed values are true and false (default). | No |
| hostKeyFingerprint | Specify the fingerprint of the host key. | Yes, if skipHostKeyValidation is set to false |
| authenticationType | Specify the authentication type. Allowed values are Basic, SshPublicKey, and MultiFactor. For more properties, see the Use basic authentication section. For JSON examples, see the Use SSH public key authentication section. | Yes |
| connectVia | The integration runtime to be used to connect to the data store. To learn more, see the Prerequisites section. If the integration runtime isn't specified, the service uses the default Azure Integration Runtime. | No |

Use basic authentication

To use basic authentication, set the authenticationType property to Basic, and specify the following properties in addition to the SFTP connector generic properties that were introduced in the preceding section:

| Property | Description | Required |
|:--- |:--- |:--- |
| userName | The user who has access to the SFTP server. | Yes |
| password | The password for the user (userName). Mark this field as a SecureString to store it securely in your data factory, or reference a secret stored in an Azure key vault. | Yes |

Example:
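A minimal sketch of an SFTP linked service that uses basic authentication might look like the following (host, user name, password, and integration runtime name are placeholders to replace with your own values):

```json
{
    "name": "SftpLinkedService",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server name or IP address>",
            "port": 22,
            "skipHostKeyValidation": false,
            "hostKeyFingerprint": "<host key fingerprint>",
            "authenticationType": "Basic",
            "userName": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```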

Use SSH public key authentication

To use SSH public key authentication, set the authenticationType property to SshPublicKey, and specify the following properties in addition to the SFTP connector generic properties introduced in the preceding section:

| Property | Description | Required |
|:--- |:--- |:--- |
| userName | The user who has access to the SFTP server. | Yes |
| privateKeyPath | Specify the absolute path to the private key file that the integration runtime can access. This applies only when the self-hosted type of integration runtime is specified in connectVia. | Specify either privateKeyPath or privateKeyContent. |
| privateKeyContent | Base64-encoded SSH private key content. The SSH private key should be in OpenSSH format. Mark this field as a SecureString to store it securely in your data factory, or reference a secret stored in an Azure key vault. | Specify either privateKeyPath or privateKeyContent. |
| passPhrase | Specify the pass phrase or password to decrypt the private key if the key file or the key content is protected by a pass phrase. Mark this field as a SecureString to store it securely in your data factory, or reference a secret stored in an Azure key vault. | Yes, if the private key file or the key content is protected by a pass phrase. |

Note

The SFTP connector supports RSA and DSA OpenSSH keys. Make sure that your key file content starts with '-----BEGIN [RSA/DSA] PRIVATE KEY-----'. If the private key file is in PPK format, use the PuTTY tool to convert it to OpenSSH format.

Example 1: SshPublicKey authentication using private key filePath
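A minimal sketch, assuming a self-hosted integration runtime that can reach the key file on disk (all bracketed values are placeholders):

```json
{
    "name": "SftpLinkedService",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server name or IP address>",
            "port": 22,
            "skipHostKeyValidation": true,
            "authenticationType": "SshPublicKey",
            "userName": "<username>",
            "privateKeyPath": "<path to the private key file on the self-hosted integration runtime machine>",
            "passPhrase": {
                "type": "SecureString",
                "value": "<pass phrase>"
            }
        },
        "connectVia": {
            "referenceName": "<name of self-hosted integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```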

Example 2: SshPublicKey authentication using private key content
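A similar sketch that inlines the Base64-encoded OpenSSH private key as a SecureString instead of pointing to a file (bracketed values are placeholders):

```json
{
    "name": "SftpLinkedService",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server name or IP address>",
            "port": 22,
            "skipHostKeyValidation": true,
            "authenticationType": "SshPublicKey",
            "userName": "<username>",
            "privateKeyContent": {
                "type": "SecureString",
                "value": "<Base64-encoded OpenSSH private key content>"
            },
            "passPhrase": {
                "type": "SecureString",
                "value": "<pass phrase>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```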

Use multi-factor authentication

To use multi-factor authentication, which is a combination of basic and SSH public key authentication, specify the user name, the password, and the private key information described in the preceding sections.

Example: multi-factor authentication
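A sketch that combines the basic and SSH public key properties; as before, all bracketed values are placeholders:

```json
{
    "name": "SftpLinkedService",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server name or IP address>",
            "port": 22,
            "skipHostKeyValidation": true,
            "authenticationType": "MultiFactor",
            "userName": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            },
            "privateKeyContent": {
                "type": "SecureString",
                "value": "<Base64-encoded OpenSSH private key content>"
            },
            "passPhrase": {
                "type": "SecureString",
                "value": "<pass phrase>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```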

Dataset properties

For a full list of sections and properties that are available for defining datasets, see the Datasets article.

Azure Data Factory supports multiple file formats; refer to each format's article for its format-based settings.

The following properties are supported for SFTP under location settings in the format-based dataset:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property under location in the dataset must be set to SftpLocation. | Yes |
| folderPath | The path to the folder. If you want to use a wildcard to filter the folder, skip this setting and specify the path in the activity source settings. | No |
| fileName | The file name under the specified folderPath. If you want to use a wildcard to filter files, skip this setting and specify the file name in the activity source settings. | No |

Example:
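As a sketch, a delimited-text dataset that points at a folder and file on the SFTP linked service defined earlier might look like the following (dataset and linked service names, paths, and format settings are illustrative):

```json
{
    "name": "DelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "<SFTP linked service name>",
            "type": "LinkedServiceReference"
        },
        "schema": [],
        "typeProperties": {
            "location": {
                "type": "SftpLocation",
                "folderPath": "root/folder/subfolder",
                "fileName": "myfile.csv"
            },
            "columnDelimiter": ",",
            "quoteChar": "\"",
            "firstRowAsHeader": true
        }
    }
}
```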

Copy activity properties

For a full list of sections and properties that are available for defining activities, see the Pipelines article. This section provides a list of properties that are supported by the SFTP source.

SFTP as source

Azure Data Factory supports multiple file formats; refer to each format's article for its format-based settings.

The following properties are supported for SFTP under the storeSettings settings in the format-based Copy source:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property under storeSettings must be set to SftpReadSettings. | Yes |
| ***Locate the files to copy*** | | |
| OPTION 1: static path | Copy from the folder/file path that's specified in the dataset. If you want to copy all files from a folder, additionally specify wildcardFileName as *. | |
| OPTION 2: wildcard<br>- wildcardFolderPath | The folder path with wildcard characters to filter source folders. Allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual folder name has a wildcard or this escape char inside. For more examples, see Folder and file filter examples. | No |
| OPTION 2: wildcard<br>- wildcardFileName | The file name with wildcard characters under the specified folderPath/wildcardFolderPath to filter source files. Allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual file name has a wildcard or this escape char inside. For more examples, see Folder and file filter examples. | Yes |
| OPTION 3: a list of files<br>- fileListPath | Indicates to copy a specified file set. Point to a text file that includes a list of files you want to copy (one file per line, with the relative path to the path configured in the dataset). When you use this option, don't specify the file name in the dataset. For more examples, see File list examples. | No |
| ***Additional settings*** | | |
| recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. When recursive is set to true and the sink is a file-based store, an empty folder or subfolder isn't copied or created at the sink. Allowed values are true (default) and false. This property doesn't apply when you configure fileListPath. | No |
| deleteFilesAfterCompletion | Indicates whether the binary files will be deleted from the source store after successfully moving to the destination store. The file deletion is per file, so when the copy activity fails, you will see that some files have already been copied to the destination and deleted from the source, while others still remain in the source store. This property is valid only in the binary files copy scenario. The default value is false. | No |
| modifiedDatetimeStart | Files are filtered based on the attribute Last Modified. The files are selected if their last modified time is within the range of modifiedDatetimeStart to modifiedDatetimeEnd. The time is applied to the UTC time zone in the format of 2018-12-01T05:00:00Z. The properties can be NULL, which means that no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected. This property doesn't apply when you configure fileListPath. | No |
| modifiedDatetimeEnd | Same as above. | No |
| enablePartitionDiscovery | For files that are partitioned, specify whether to parse the partitions from the file path and add them as additional source columns. Allowed values are false (default) and true. | No |
| partitionRootPath | When partition discovery is enabled, specify the absolute root path in order to read partitioned folders as data columns. If it is not specified, by default:<br>- When you use a file path in the dataset or a list of files on the source, the partition root path is the path configured in the dataset.<br>- When you use a wildcard folder filter, the partition root path is the sub-path before the first wildcard.<br>For example, assuming you configure the path in the dataset as 'root/folder/year=2020/month=08/day=27':<br>- If you specify the partition root path as 'root/folder/year=2020', the copy activity generates two more columns, month and day, with values '08' and '27' respectively, in addition to the columns inside the files.<br>- If the partition root path is not specified, no extra column is generated. | No |
| maxConcurrentConnections | The upper limit of concurrent connections established to the data store during the activity run. Specify a value only when you want to limit concurrent connections. | No |

Example:
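As a sketch, a Copy activity that reads delimited-text files through SftpReadSettings with wildcard filters might look like the following (activity, dataset, and sink references are illustrative):

```json
{
    "name": "CopyFromSFTP",
    "type": "Copy",
    "inputs": [
        {
            "referenceName": "<delimited text input dataset name>",
            "type": "DatasetReference"
        }
    ],
    "outputs": [
        {
            "referenceName": "<output dataset name>",
            "type": "DatasetReference"
        }
    ],
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": {
                "type": "SftpReadSettings",
                "recursive": true,
                "wildcardFolderPath": "myfolder*",
                "wildcardFileName": "*.csv"
            },
            "formatSettings": {
                "type": "DelimitedTextReadSettings"
            }
        },
        "sink": {
            "type": "<sink type>"
        }
    }
}
```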

SFTP as a sink

Azure Data Factory supports multiple file formats; refer to each format's article for its format-based settings.

The following properties are supported for SFTP under storeSettings settings in a format-based Copy sink:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property under storeSettings must be set to SftpWriteSettings. | Yes |
| copyBehavior | Defines the copy behavior when the source is files from a file-based data store. Allowed values are:<br>- PreserveHierarchy (default): Preserves the file hierarchy in the target folder. The relative path of the source file to the source folder is identical to the relative path of the target file to the target folder.<br>- FlattenHierarchy: All files from the source folder are placed in the first level of the target folder. The target files have autogenerated names.<br>- MergeFiles: Merges all files from the source folder into one file. If the file name is specified, the merged file name is the specified name. Otherwise, it's an autogenerated file name. | No |
| maxConcurrentConnections | The upper limit of concurrent connections established to the data store during the activity run. Specify a value only when you want to limit concurrent connections. | No |
| useTempFileRename | Indicates whether to upload to temporary files and rename them, or to write directly to the target folder or file location. By default, Azure Data Factory first writes to temporary files and then renames them when the upload is finished. This sequence helps to (1) avoid conflicts that might result in a corrupted file if you have other processes writing to the same file, and (2) ensure that the original version of the file exists during the transfer. If your SFTP server doesn't support a rename operation, disable this option and make sure that you don't have a concurrent write to the target file. For more information, see the troubleshooting tip at the end of this table. | No. Default value is true. |
| operationTimeout | The wait time before each write request to the SFTP server times out. Default value is 60 min (01:00:00). | No |

Tip

If you receive the error 'UserErrorSftpPathNotFound,' 'UserErrorSftpPermissionDenied,' or 'SftpOperationFail' when you're writing data into SFTP, and the SFTP user you use does have the proper permissions, check whether your SFTP server supports the file rename operation. If it doesn't, disable the Upload with temp file (useTempFileRename) option and try again. To learn more about this property, see the preceding table. If you use a self-hosted integration runtime for the Copy activity, be sure to use version 4.6 or later.

Example:
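As a sketch, a binary copy into SFTP through SftpWriteSettings might look like the following (dataset references are placeholders; the useTempFileRename and operationTimeout values shown are the documented defaults):

```json
{
    "name": "CopyToSFTP",
    "type": "Copy",
    "inputs": [
        {
            "referenceName": "<binary input dataset name>",
            "type": "DatasetReference"
        }
    ],
    "outputs": [
        {
            "referenceName": "<binary output dataset name>",
            "type": "DatasetReference"
        }
    ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "<source store read settings type>"
            }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": {
                "type": "SftpWriteSettings",
                "copyBehavior": "PreserveHierarchy",
                "useTempFileRename": true,
                "operationTimeout": "01:00:00"
            }
        }
    }
}
```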

Folder and file filter examples

This section describes the behavior that results from using wildcard filters with folder paths and file names.

| folderPath | fileName | recursive | Source folder structure and filter result (files in **bold** are retrieved) |
|:--- |:--- |:--- |:--- |
| Folder* | (empty, use default) | false | FolderA<br>&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;**File2.json**<br>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3.csv<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4.json<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5.csv<br>AnotherFolderB<br>&nbsp;&nbsp;&nbsp;&nbsp;File6.csv |
| Folder* | (empty, use default) | true | FolderA<br>&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;**File2.json**<br>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File3.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File4.json**<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File5.csv**<br>AnotherFolderB<br>&nbsp;&nbsp;&nbsp;&nbsp;File6.csv |
| Folder* | *.csv | false | FolderA<br>&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;File2.json<br>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File3.csv<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4.json<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File5.csv<br>AnotherFolderB<br>&nbsp;&nbsp;&nbsp;&nbsp;File6.csv |
| Folder* | *.csv | true | FolderA<br>&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;File2.json<br>&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File3.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4.json<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File5.csv**<br>AnotherFolderB<br>&nbsp;&nbsp;&nbsp;&nbsp;File6.csv |

File list examples

This table describes the behavior that results from using a file list path in the Copy activity source. It assumes that you have the following source folder structure and want to copy the files that are in bold type:

| Sample source structure | Content in FileListToCopy.txt | Azure Data Factory configuration |
|:--- |:--- |:--- |
| root<br>&nbsp;&nbsp;&nbsp;&nbsp;FolderA<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File1.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File2.json<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Subfolder1<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File3.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;File4.json<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;**File5.csv**<br>&nbsp;&nbsp;&nbsp;&nbsp;Metadata<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;FileListToCopy.txt | File1.csv<br>Subfolder1/File3.csv<br>Subfolder1/File5.csv | In the dataset:<br>- Folder path: root/FolderA<br><br>In the Copy activity source:<br>- File list path: root/Metadata/FileListToCopy.txt<br><br>The file list path points to a text file in the same data store that includes a list of files you want to copy (one file per line, with the relative path to the path configured in the dataset). |

Lookup activity properties

For information about Lookup activity properties, see Lookup activity in Azure Data Factory.

GetMetadata activity properties

For information about GetMetadata activity properties, see GetMetadata activity in Azure Data Factory.

Delete activity properties

For information about Delete activity properties, see Delete activity in Azure Data Factory.

Legacy models

Note

The following models are still supported as is for backward compatibility. We recommend that you use the previously discussed new model, because the Azure Data Factory authoring UI has switched to generating the new model.


Legacy dataset model

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the dataset must be set to FileShare. | Yes |
| folderPath | The path to the folder. A wildcard filter is supported. Allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual folder name has a wildcard or this escape char inside.<br>Example: rootfolder/subfolder/. See more examples in Folder and file filter examples. | Yes |
| fileName | Name or wildcard filter for the files under the specified folderPath. If you don't specify a value for this property, the dataset points to all files in the folder.<br>For the filter, the allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character).<br>- Example 1: "fileName": "*.csv"<br>- Example 2: "fileName": "???20180427.txt"<br>Use ^ to escape if your actual file name has a wildcard or this escape char inside. | No |
| modifiedDatetimeStart | Files are filtered based on the attribute Last Modified. The files are selected if their last modified time is within the range of modifiedDatetimeStart to modifiedDatetimeEnd. The time is applied to the UTC time zone in the format of 2018-12-01T05:00:00Z.<br>Be aware that enabling this setting affects the overall performance of data movement when you filter large numbers of files.<br>The properties can be NULL, which means that no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected. | No |
| modifiedDatetimeEnd | Files are filtered based on the attribute Last Modified. The files are selected if their last modified time is within the range of modifiedDatetimeStart to modifiedDatetimeEnd. The time is applied to the UTC time zone in the format of 2018-12-01T05:00:00Z.<br>Be aware that enabling this setting affects the overall performance of data movement when you filter large numbers of files.<br>The properties can be NULL, which means that no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected. | No |
| format | If you want to copy files as is between file-based stores (binary copy), skip the format section in both the input and output dataset definitions.<br>If you want to parse files with a specific format, the following file format types are supported: TextFormat, JsonFormat, AvroFormat, OrcFormat, and ParquetFormat. Set the type property under format to one of these values. For more information, see the Text format, Json format, Avro format, Orc format, and Parquet format sections. | No (only for binary copy scenario) |
| compression | Specify the type and level of compression for the data. For more information, see Supported file formats and compression codecs.<br>Supported types are GZip, Deflate, BZip2, and ZipDeflate.<br>Supported levels are Optimal and Fastest. | No |

Tip

  • To copy all files under a folder, specify folderPath only.
  • To copy a single file with a specified name, specify folderPath with the folder part and fileName with the file name.
  • To copy a subset of files under a folder, specify folderPath with the folder part and fileName with the wildcard filter.

Note

If you were using the fileFilter property for the file filter, it is still supported as is, but we recommend that you use the new filter capability added to fileName from now on.

Example:
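As a sketch, a legacy FileShare dataset over the SFTP linked service might look like the following (linked service name, paths, and format/compression settings are illustrative):

```json
{
    "name": "SFTPDataset",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": {
            "referenceName": "<SFTP linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "folder/subfolder/",
            "fileName": "*.csv",
            "modifiedDatetimeStart": "2018-12-01T05:00:00Z",
            "modifiedDatetimeEnd": "2018-12-01T06:00:00Z",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "rowDelimiter": "\n"
            },
            "compression": {
                "type": "GZip",
                "level": "Optimal"
            }
        }
    }
}
```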

Legacy Copy activity source model

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the Copy activity source must be set to FileSystemSource. | Yes |
| recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. When recursive is set to true and the sink is a file-based store, empty folders and subfolders won't be copied or created at the sink.<br>Allowed values are true (default) and false. | No |
| maxConcurrentConnections | The upper limit of concurrent connections established to the data store during the activity run. Specify a value only when you want to limit concurrent connections. | No |

Example:
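As a sketch, a Copy activity that uses the legacy FileSystemSource might look like the following (dataset and sink references are placeholders):

```json
{
    "name": "CopyFromSFTP",
    "type": "Copy",
    "inputs": [
        {
            "referenceName": "<SFTP input dataset name>",
            "type": "DatasetReference"
        }
    ],
    "outputs": [
        {
            "referenceName": "<output dataset name>",
            "type": "DatasetReference"
        }
    ],
    "typeProperties": {
        "source": {
            "type": "FileSystemSource",
            "recursive": true
        },
        "sink": {
            "type": "<sink type>"
        }
    }
}
```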

Next steps

For a list of data stores that are supported as sources and sinks by the Copy activity in Azure Data Factory, see supported data stores.

Hello RDMers,

We all know how sometimes, between geeks, we can start a discussion and quickly realize that those outside of our little group seem somewhat confused about our conversation. Starting a conversation about FTPS, SFTP, or FTP over SSH can quickly get confusing, so I thought I would clear that up and give a little crash course!


FTP, or File Transfer Protocol, is a rather standard way to transfer files over a network, and even over the internet. It is an age-old protocol designed at a time when the only network users were computer nerds (like me!) whose only malice was to create more software. Since then, things have changed and security has become a serious concern. FTP accounts need passwords for access, but those passwords are transferred in the clear, and it would be easy for an attacker to get them by watching the network traffic. This is why it was necessary to improve on FTP and add security to encrypt the network traffic as well as authenticate both the client and the server. This is where several flavors of FTP appeared: FTPS, SFTP, and FTP over SSH. These terms can be quite confusing for a new user, and even amongst aficionados.

FTPS (implicit vs explicit)

FTPS stands for FTP over SSL. It is the same protocol as FTP, but it adds a security layer through the use of SSL (Secure Sockets Layer). SSL can be used in two ways: implicitly or explicitly.

Implicit FTPS starts with a security negotiation and then uses the FTP protocol normally over the encrypted connection. The advantage is that the FTP protocol can be used unchanged after the connection is established: it is implicitly encrypted by the SSL connection. The disadvantage is that the client must be aware of SSL, which breaks compatibility with old clients.

This is where explicit FTPS comes in. The connection starts normally over an insecure channel, and the client can then try to upgrade the connection to an encrypted one using FTP extended commands. This lets old clients access the server in the old, insecure way (although the server administrator can forbid it), while new clients can negotiate a secure connection.


SFTP

SFTP stands for SSH File Transfer Protocol. SSH is an encrypted and secure communication protocol, and it provides an extension to transfer files. In fact, SFTP is completely different from FTP. It does essentially the same job, but securely, and with better compatibility and formality than FTP, especially regarding directory listings, which are quite a challenge with FTP because there is no normalized way for an FTP server to respond to a client requesting a list. I will admit this is mostly a programmer's concern: connecting to an SFTP server will reliably yield correct operations, while some obscure FTP server could respond in a way that baffles any FTP client GUI.

FTP over SSH

FTP over SSH is quite different from SFTP. It is standard FTP tunneled through an SSH connection. Those of you who know SSH forwarding might ask: "Then, I just need to open a tunnel for the FTP port and I have FTP over SSH?" Well… not really. This is because FTP uses more than one connection to work. If you open an SSH tunnel for the FTP port, you successfully secure the FTP "control" connection. However, data is transferred over another port, which is usually chosen at the discretion of either the server or the client, so it is difficult to open a tunnel in advance for the port required by the data connection. For FTP over SSH to be completely secured, the FTP client needs to be tightly integrated with the SSH client.

Active vs Passive

This gives me the opportunity to clarify another aspect of FTP: active vs. passive mode, which specifies how the data connection is established. In active mode, the server actively connects to a client data port; it's up to the FTP client to inform the server of its data port number. In passive mode, the server waits for the client to connect to its data port; it's up to the FTP client to query the server for that port number. The choice between active and passive is also the client's responsibility. Passive mode is usually preferred, because it makes it easier to pass through many types of proxies, some of which allow only connections from the client to the server. I, for one, am unaware of situations where active mode might be useful, but FTP has the flexibility to accommodate such a case. SFTP does not have this problem, as it uses the same connection for both control and data transfer.

Binary vs ASCII


There is one last point I would like to bring up: binary transfer vs. ASCII transfer. FTP clients have this option, but what does it really do? Remember I stated earlier that FTP is an age-old protocol? Well, it was designed at a time when computers did not all share the same internal data representation, which was especially true for text. This is where the ASCII transfer option was necessary: in order for a text file to be readable across different systems, a standard was designed, and it was up to the server to convert between its internal text representation and the network ASCII representation. Nowadays, all commonly used computers use the ASCII representation for their simple text files, with only one notable difference remaining: end of lines on Windows systems. The FTP standard does not formally specify how to treat these end of lines, so many servers will transmit and receive text files the same way in both ASCII and binary transfer modes.

I hope this was helpful. Now you can transfer files like never before, knowing a little more about what happens on the wire, or the radio waves, because we now also have Wi-Fi.

As always, please let us know your thoughts by using the comment feature of the blog. You can also visit our forums to get help and submit feature requests, you can find them here.