# Launch a simulation on a CECI cluster

## Access to CECI cluster
In order to use these clusters, you need to create an account:

[login.ceci-hpc.be](https://login.ceci-hpc.be/init/)
Then, follow the tutorial to log in to the cluster (you need to be connected to the UCL network):

[support.ceci-hpc.be](https://support.ceci-hpc.be/doc/_contents/QuickStart/ConnectingToTheClusters/FromAUnixComputer.html)
Create a file named `config` in your `.ssh` folder:

```bash
vim .ssh/config
```
If you are not familiar with vim, you can find the basic commands here:

[coderwall.com](https://coderwall.com/p/adv71w/basic-vim-commands-for-getting-started)
Add an entry for lemaitre (or another cluster) to your config file:

```
Host lemaitre
    # add the Hostname line for the cluster's frontend here (see the CECI documentation)
    User YourUserName
    IdentityFile ~/.ssh/id_rsa.ceci
```
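Note that SSH refuses private keys whose permissions are too open. A quick check is sketched below on a temporary placeholder file, so the snippet is self-contained; on your machine, apply the same `chmod` to `~/.ssh/id_rsa.ceci`:

```shell
# placeholder file standing in for ~/.ssh/id_rsa.ceci
KEY=$(mktemp)
chmod 600 "$KEY"        # owner read/write only, as ssh requires
stat -c '%a' "$KEY"     # prints 600
rm -f "$KEY"
```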
## Connect to the CECI cluster

In a terminal, write:
```bash
ssh lemaitre
```
where `lemaitre` is the Host you defined in your `.ssh/config` file.
## Install SLIM
```bash
make -j8
```
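`-j8` tells `make` to run 8 compile jobs in parallel. A common choice, sketched below, is to match the number of cores reported by `nproc` (Linux only; this snippet only prints the command it would run):

```shell
# derive the parallelism from the number of available cores
JOBS=$(nproc)
echo "building with make -j$JOBS"
```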
If this does not work, come to the question session on Friday at 4 p.m. and ask for Jonathan.
## Submit a job

Create a `submissions` folder. In this folder, create a `submission.sh` file:

```bash
vim submission.sh
```

Write the following in `submission.sh`:
```bash
#!/bin/bash

#SBATCH --job-name=job1
#SBATCH --mail-user=prenom.nom@student.uclouvain.be
#SBATCH --mail-type=ALL
#
##SBATCH --time=16:00:00       # HH:MM:SS
#SBATCH --time=1-00:00:0       # days-hours
#SBATCH --ntasks=16            # number of CPUs
#SBATCH --exclusive            # do not share the nodes with other jobs
##SBATCH --ntasks-per-node=16
##SBATCH --nodes=2
#SBATCH --mem-per-cpu=2048     # memory per CPU, in MB

cd path_to_my_run

mpirun --bind-to none slim run.py
```
Do not forget to write your output files in the scratch folder (modify the `path_output` argument of the loop object); there is not enough space in the home directories on the cluster for the output files:

```bash
/scratch/ucl/yourDepartment/yourID
```
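If you script the submission, the scratch path can be assembled from variables. A minimal sketch (the `DEPARTMENT`, `USER_ID`, and `run1` names are placeholders, and the path is only echoed rather than created so the snippet runs anywhere):

```shell
# hypothetical variables; replace with your department and user id
DEPARTMENT=yourDepartment
USER_ID=yourID
OUTPUT_DIR=/scratch/ucl/$DEPARTMENT/$USER_ID/run1
echo "$OUTPUT_DIR"      # prints /scratch/ucl/yourDepartment/yourID/run1
```

On the cluster you would then pass `$OUTPUT_DIR` as the `path_output` argument, after creating it with `mkdir -p "$OUTPUT_DIR"`.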
To launch a simulation, write (while being in the `submissions` folder):

```bash
sbatch submission.sh
```
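If you need several runs, command-line options passed to `sbatch` override the `#SBATCH` directives in the script, so a small loop can submit one job per parameter. A sketch that only prints the commands, since `sbatch` exists only on the cluster (the values 1, 2, 3 are placeholders):

```shell
# print one sbatch command per parameter value
for param in 1 2 3; do
  echo "sbatch --job-name=job$param submission.sh"
done
```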
To see the state of your simulation:

```bash
squeue -u yourName
```
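When you submit, `sbatch` prints a line of the form `Submitted batch job <id>`; capturing that id makes it easy to find the matching output file. A sketch with the `sbatch` output simulated by a fixed string, so it runs off the cluster:

```shell
# simulated output; on the cluster: SBATCH_OUTPUT=$(sbatch submission.sh)
SBATCH_OUTPUT="Submitted batch job 123456"
JOB_ID=${SBATCH_OUTPUT##* }     # keep the last word of the line
echo "$JOB_ID"                  # prints 123456
echo "slurm-$JOB_ID.out"        # the log file name used by SLURM
```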
The cluster writes its messages to a file named `slurm-jobID.out`. You can follow these messages using `tail`:

```bash
tail -f slurm-jobID.out
```