Instructions for Batch Jobs and an Introduction to Parallel Programming

MATLAB is widely used by scientists, particularly when cluster-based calculations are combined with GUI applications. At present, our license covers only a single node for cluster calculations. However, it is still possible to allocate partitions with over 2048 GB of RAM and a maximum of 12 workers/processes. Below are first batch job instructions for MATLAB. These serve as a starting point for working on the VSC and, later in the course, for parallel programming with MATLAB on the VSC. To keep things as simple as possible, the following notation is used: input files have the file extension .m, batch jobs have the file extension .job, and output files are plain text files. Further information about the VSC can be found in the VSC WIKI.

First Batch Jobs with MATLAB Input Files

  • Execute the MATLAB test script from the terminal as a batch job:
sbatch BatchJob1.job

Note: Batch scripts should be written in a Linux editor! Sometimes the conversion (from Windows text file to Linux text file) does not work!
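A minimal sketch of what a job script like BatchJob1.job might look like (the job name, time limit, output file name, and script name main1.m are assumptions; adapt them to your project and insert the module version available on the VSC):

```shell
#!/bin/bash
#SBATCH --job-name=matlab_test      # assumed job name
#SBATCH --output=output_%j.txt      # plain-text output file, %j = job ID
#SBATCH --ntasks=1
#SBATCH --time=00:10:00             # assumed time limit

module purge
module load Matlab/(...Version...)

# Run the input file non-interactively and exit MATLAB afterwards.
matlab -nodisplay -r "run('main1.m'); exit"
```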

With the command:

squeue -u "$(whoami)"

you can check the progress (whether your job is running).

Note: After the shell shows that the job is finished, some time may pass before the output file is written!

  • Open MATLAB directly from the command line and run your program using the commands:
module purge
module load Matlab/(...Version...)
matlab -nodisplay
run main2.m
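The interactive steps above can also be combined into a single non-interactive call, which is convenient inside batch scripts (the script name main2.m is taken from above; -nodisplay and -r are standard MATLAB command-line options):

```shell
module purge
module load Matlab/(...Version...)
matlab -nodisplay -r "run('main2.m'); exit"
```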

or start MATLAB with its graphical user interface using the following commands:

module purge
module load Matlab/(...Version...)
gpurender matlab &

Parameter Changes in Batch Jobs for MATLAB Input Files

With the current MATLAB license, the settings for parallel programming must be made directly in the program, and a maximum of 12 parallel workers can be set up.
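Since the settings must be made in the program itself, the worker pool is opened directly in the MATLAB script, for example (a minimal sketch; parpool and delete are the standard Parallel Computing Toolbox functions, and the profile name 'local' is an assumption):

```matlab
% Open a pool with the maximum of 12 workers allowed by the license.
pool = parpool('local', 12);

% ... parallel code (e.g. a parfor loop) goes here ...

% Shut the pool down again at the end of the job.
delete(pool);
```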

Parallel Programming with MATLAB

  • A program with a normal for loop and one with a parfor loop (for parallel programming) are used to investigate the MATLAB function parfor, and thus the time and the data transfer during test program execution. The programs are slightly modified programs from MathWorks. The results:

and for the scalability:


With the function parfor (parallel programming), 12 workers are nevertheless shown on the node by entering the following lines in the terminal/shell after batch job execution:

squeue -u "$(whoami)"
ssh node{ID}

but with zero data transfer in between! The scalability results for this program are similar to the scalability results from MathWorks.
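A minimal sketch of such a for/parfor timing comparison (modelled loosely on the MathWorks examples mentioned above; the eig workload, matrix size, and iteration count are assumptions):

```matlab
n = 200;        % number of independent iterations
A = rand(500);  % test matrix (size is an assumption)

% Serial version with a normal for loop.
tic
for i = 1:n
    e = eig(A);
end
tSerial = toc;

% Parallel version with parfor; the independent iterations
% are distributed across the available workers.
tic
parfor i = 1:n
    e = eig(A);
end
tParallel = toc;

fprintf('serial: %.2f s, parfor: %.2f s\n', tSerial, tParallel);
```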
