The NHR systems in Berlin ("Lise") and Göttingen ("Emmy") are each equipped with several file systems. Their properties and their intended use are described here.
Disk quotas based on group ownership are enforced on each site's global (non-local) file systems.
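To check your group's current usage and limits on one of these file systems, a quota query like the following can be used (a sketch, assuming a Lustre file system; <groupname> and <fullpath_of_filesystem> are placeholders):
$ lfs quota -h -g <groupname> <fullpath_of_filesystem>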
We support file transfer tools like scp and rsync, which use the SSH protocol under the hood to establish the connection and to encrypt the data transfer. A working SSH connection is therefore a prerequisite for any data transfer. Each of the following sections deals with a specific direction for establishing the transfer connection. Independent of the connection direction, data can always be transferred from or to the connected target host.
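A quick way to verify this prerequisite is to run a single remote command over SSH, for example (shown for the Berlin login nodes):
$ ssh <username>@blogin.hlrn.de hostname
If this prints the remote host name, the connection is working and the transfer tools below can use it.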
Data Transfer Connecting from the Outside World
External connections to the NHR systems in Berlin or Göttingen require an SSH key pair for authentication. More details can be found here. The location of the private key file can be specified when calling scp or rsync on the user's local machine. Some examples covering both data transfer directions are shown below.
Using scp, the option -i <fullpath_of_privatekeyfile> can be added:
$ scp -i <fullpath_of_privatekeyfile> <username>@blogin.hlrn.de:<remote_source> <local_target>
$ scp -i <fullpath_of_privatekeyfile> <local_source> <username>@blogin.hlrn.de:<remote_target>
With rsync, the nested option -e 'ssh -i <fullpath_of_privatekeyfile>' can be added:
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' <username>@blogin.hlrn.de:<remote_source> <local_target>
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' <local_source> <username>@blogin.hlrn.de:<remote_target>
Alternatively, the options shown above for specifying the location of the private key file can be omitted. In this case, a corresponding SSH configuration must exist on the user's local machine, as described here. To verify it, the SSH connection must work without specifying the private key file on the command line.
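A minimal example of such a configuration entry in ~/.ssh/config might look as follows (the host alias blogin is an illustrative choice):
Host blogin
    HostName blogin.hlrn.de
    User <username>
    IdentityFile <fullpath_of_privatekeyfile>
With this entry in place, short forms like scp blogin:<remote_source> <local_target> work without specifying the key on the command line.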
Data Transfer Connecting to the Outside World
Connections to external machines located anywhere in the world can be established interactively from the login nodes. In this case, the SSH key pair mentioned above for external connections to the login nodes is not required. However, additional rules imposed by the external host or institution may apply.
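For example, an outbound transfer started on a login node looks like any regular rsync call; the external host name below is a placeholder:
$ rsync <local_source> <username>@<external_host>:<remote_target>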
Data transfer in the context of a batch job is restricted due to limited network access of the compute nodes. Please send a message to the support mailing list in case you need further help.
Internal Data Transfer
Internal data transfer between a Berlin and a Göttingen login node using scp or rsync works right out of the box - that is, without specifying any keys or passwords. This is enabled through host-based authentication which is active by default.
For internal data transfer, please always use the plain host name, omitting the domain suffix ".hlrn.de". You can use the generic names blogin and glogin or specific names like blogin5, glogin2, etc. This way, the fast NHR-internal links are used. Fully qualified host names (blogin.hlrn.de or glogin.hlrn.de) would route the transfer over slower external connections and are therefore not recommended here.
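For example, run on a Berlin login node, the following copies a file to Göttingen over the internal link:
$ rsync <local_source> glogin:<remote_target>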
Data Transfer Between Emmy (Göttingen) And The GWDG SCC
If you have previously been working on the SCC at the GWDG in Göttingen, you can follow these steps to transfer data to/from the Emmy system:
- On an Emmy frontend node (glogin.hlrn.de or glogin[1-9].hlrn.de), generate a new SSH key (also documented at the SCC; see the example command at the end of this section).
- Add the SSH key at the GWDG Website -> My Account -> Security.
- From an Emmy frontend node (glogin9.hlrn.de has access to both the Emmy and Grete scratches, while glogin.hlrn.de and glogin[1-8].hlrn.de only have access to the Emmy scratch; all of them have access to $HOME), transfer the files using rsync (see the SCC documentation and the rsync man page) to/from the SCC transfer node transfer-scc.gwdg.de. Some examples are given below.
Copy a single file FOO from the SCC $HOME into your current directory on Emmy:
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' GWDGUSERNAME@transfer-scc.gwdg.de:/usr/users/GWDGUSERNAME/FOO .
Copy a single file FOO in your current directory on Emmy to $HOME on the SCC:
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' FOO GWDGUSERNAME@transfer-scc.gwdg.de:/usr/users/GWDGUSERNAME/
Copy a directory in your SCC /scratch to your current directory on Emmy:
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' -r GWDGUSERNAME@transfer-scc.gwdg.de:/scratch/projects/workshops/forest/synthetic_trees .
Copy a directory in your current directory on Emmy to /scratch on the SCC:
$ rsync -e 'ssh -i <fullpath_of_privatekeyfile>' -r synthetic_trees GWDGUSERNAME@transfer-scc.gwdg.de:/scratch/projects/workshops/forest/
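For the key generation mentioned in the first step above, a typical invocation on an Emmy frontend node might look like this (a sketch; the ed25519 key type and file name are illustrative, see the SCC documentation for the accepted key types):
$ ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519_gwdg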
If you have terabytes of data that need to be transferred, please contact us so that we can provide a custom solution.