Automate a scalability analysis with Autosubmit

Performing a bi-dimensional scalability analysis requires a large number of simulations. To deal with this, the whole analysis can be automated with the script. You need to modify some variables to adapt the scalability analysis to your case, such as the number of MPI tasks (in the two loops), your MareNostrum username, and all the initialization variables.
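The two nested loops over MPI task counts can be sketched as follows (the task counts and the experiment-creation step are illustrative assumptions, not the actual script):

```shell
#!/bin/bash
# Sketch of the bi-dimensional sweep: one experiment per combination of
# MPI task counts in the two dimensions (the values below are illustrative).
TASKS_X="48 96 192"
TASKS_Y="48 96 192"

count=0
for nx in $TASKS_X; do
  for ny in $TASKS_Y; do
    echo "combination: ${nx} x ${ny} MPI tasks"
    # the real script would create and launch an Autosubmit experiment here
    count=$((count + 1))
  done
done
echo "total combinations: $count"
```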

Once the experiments have finished, the execution times can be gathered using the script. The previous script generates a file called “experiments-ids.list” containing all the experiment identifiers, which is needed to process all the execution times and output the results. You also have to set the “RESULTS_BASE_DIR” variable properly. Once executed, the script generates two files, “execution_times.txt” and “usage_resources_in_time.txt”, containing the execution times and the total CPU hours used, respectively, for each combination of MPI tasks.
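As a rough illustration of what the gathering step does (the log layout, experiment ids, and time format here are made up; only the file names come from the text above):

```shell
#!/bin/bash
# Self-contained sketch: build a fake results tree, then write one
# "id time" line per experiment id listed in experiments-ids.list.
RESULTS_BASE_DIR=$(mktemp -d)            # stand-in for the real results dir

# fake input: two hypothetical experiment ids and their run logs
printf 'a001\na002\n' > experiments-ids.list
echo "runtime 512.4" > "$RESULTS_BASE_DIR/a001.log"
echo "runtime 498.7" > "$RESULTS_BASE_DIR/a002.log"

: > execution_times.txt
while read -r expid; do
  t=$(awk '/runtime/ {print $2}' "$RESULTS_BASE_DIR/$expid.log")
  echo "$expid $t" >> execution_times.txt
done < experiments-ids.list
cat execution_times.txt
```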

Finally, since we are only interested in the execution times, the outputs can be removed to save disk space. The script removes the run directory but keeps the logs. It also uses “experiments-ids.list”, and the “RESULTS_BASE_DIR” variable must be set as well.
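The effect of the cleanup step can be sketched like this (the run/ and logs/ directory names and file names are assumptions):

```shell
#!/bin/bash
# Sketch of the cleanup: drop the run directory, keep the logs.
expdir=$(mktemp -d)                      # stand-in for one experiment dir
mkdir -p "$expdir/run" "$expdir/logs"
touch "$expdir/run/big_output.nc" "$expdir/logs/job.out"

rm -rf "$expdir/run"                     # outputs deleted to save disk space
ls "$expdir"                             # only "logs" is left
```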

Note that all the scripts are coded to average 3 executions for each combination of MPI tasks.
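The averaging itself is just the mean of the 3 measured times; for example, with awk (the times are made-up sample values):

```shell
#!/bin/bash
# Mean of the 3 execution times for one MPI-task combination (sample values).
avg=$(printf '%s\n' 512.4 498.7 505.1 | awk '{s += $1; n++} END {printf "%.1f", s / n}')
echo "average runtime: ${avg} s"
```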

All this was tested with version 3.2.0 of Autosubmit; it may not work with newer versions. For any problem or question, please contact me:

Autosubmit database setup and install

It is recommended to perform this kind of scalability analysis in an isolated environment, using a database separate from the one used for production experiments. The script creates a large number of experiments automatically, so having them in the same database would be confusing for production users.

To create an isolated environment with its own database, follow these steps:

1- Create a new directory in any place you like (e.g. your $HOME dir):

mkdir ~/performance_autosubmit

2- Edit .autosubmitrc file (located by default in your $HOME dir), and change DATABASE, LOCAL and CONF paths to make them point to the directory created above. You need to replace any path pointing to /esnas/autosubmit by the path to the newly created directory.

If the file is not present, create it and fill it with the contents below, replacing the user name with yours:

vi ~/.autosubmitrc

[database]
path = /home/Earth/xyepes/performance_autosubmit
filename = performance_autosubmit.db

[local]
path = /home/Earth/xyepes/performance_autosubmit

[conf]
jobs = /home/Earth/xyepes/performance_autosubmit/default
platforms = /home/Earth/xyepes/performance_autosubmit/default

3- Create a default directory for the default configuration files that will be used for any new experiment of the scalability analysis:

mkdir ~/performance_autosubmit/default

4- Copy default jobs.conf and platforms.conf files from /esnas/autosubmit/default, or any other experiment you'd like, to the newly created directory:

cp /esnas/autosubmit/a0mo/conf/platforms_a0mo.conf ~/performance_autosubmit/default/platforms.conf
cp /esnas/autosubmit/a0mo/conf/jobs_a0mo.conf ~/performance_autosubmit/default/jobs.conf  

5- Execute the autosubmit install command:

autosubmit install

Autosubmit is then ready to be used.

scalability_with_autosubmit.txt · Last modified: 2017/07/03 11:41 by adegimel