The cron job functionality is used for executing tasks, called cron jobs, at regular points in time. Typical uses include creating data for backups, updating catalog contents, or recalculating prices. The key idea of cron jobs is to start a long-running or periodic process in the background, with the possibility to log each run and to easily check its result. The concept of cron jobs in SAP Commerce Cloud is explained in detail here.
In this article, we will go through some best practices for running cron jobs and business processes in SAP Commerce Cloud.
Writing an Abortable Job:
Custom jobs are not abortable by default, which means they cannot be stopped if they run for too long.
Because job execution may take a lot of time, you may wish to abort a job. To support this, add a code snippet that checks whether an abort has been requested. Because you cannot easily and safely abort a JobPerformable at an arbitrary point without killing its thread, you need to place hooks for the abort check before each stage that you expect to be time-consuming. See How to write an abortable job.
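As a sketch of this pattern (the class name and the batch-processing helpers are illustrative; the platform calls `isAbortable()` and `clearAbortRequestedIfNeeded(...)` come from `AbstractJobPerformable`), an abortable job might look like this:

```java
import de.hybris.platform.cronjob.enums.CronJobResult;
import de.hybris.platform.cronjob.enums.CronJobStatus;
import de.hybris.platform.cronjob.model.CronJobModel;
import de.hybris.platform.servicelayer.cronjob.AbstractJobPerformable;
import de.hybris.platform.servicelayer.cronjob.PerformResult;

public class MyBatchJobPerformable extends AbstractJobPerformable<CronJobModel>
{
    @Override
    public boolean isAbortable()
    {
        // Without this, abort requests for the cron job are ignored.
        return true;
    }

    @Override
    public PerformResult perform(final CronJobModel cronJob)
    {
        for (final Object batch : fetchBatches()) // placeholder for your data access
        {
            // Abort hook before each expensive stage: returns true
            // if an abort was requested (and clears the request flag).
            if (clearAbortRequestedIfNeeded(cronJob))
            {
                return new PerformResult(CronJobResult.UNKNOWN, CronJobStatus.ABORTED);
            }
            processBatch(batch); // placeholder for the time-consuming work
        }
        return new PerformResult(CronJobResult.SUCCESS, CronJobStatus.FINISHED);
    }

    private Iterable<Object> fetchBatches() { return java.util.List.of(); }

    private void processBatch(final Object batch) { /* ... */ }
}
```

The important point is that the abort check runs between batches, so the job always stops at a safe, consistent point rather than mid-operation.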
Assigning a Cron Job to a Group of Nodes:
You can assign a cron job to a group of nodes. It can then be executed by any of the nodes belonging to that group. If a node that is currently executing a cron job stops operating, one of the other nodes from that group takes over and executes it.
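As a hedged example of this setup: cluster node groups are declared per node via the `cluster.node.groups` property, and a cron job is pinned to a group via its `nodeGroup` attribute. The group name and cron job code below are illustrative:

```
# local.properties on the nodes that should execute background jobs
# ("backgroundProcessing" is an example group name)
cluster.node.groups=backgroundProcessing

# ImpEx: assign an existing cron job to that node group
UPDATE CronJob;code[unique=true];nodeGroup
;myHeavyCronJob;backgroundProcessing
```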
Avoid running cron jobs on frontend nodes, such as accstorefront and API nodes, as these are primarily dedicated to maintaining website stability. Running cron jobs on accstorefront nodes, for example, can result in high CPU usage, memory issues, and other performance-related problems. Similar issues can occur if cron jobs are run on API pods.
Avoid Running Multiple Resource-Intensive Jobs Simultaneously:
When dealing with resource-intensive cron jobs or business processes, it’s important to avoid running them simultaneously, as this can quickly exhaust system resources. The simultaneous execution of multiple heavy jobs can lead to performance issues, out-of-memory errors, high DTU consumption, high CPU utilization, and so on, all of which can compromise the stability of the application.
To minimize these risks, schedule jobs with intervals to spread out resource-heavy tasks. For example, if you have multiple cron jobs that need to run regularly, schedule them using triggers at different times to prevent peak load on the system. See Trigger.
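As a sketch of such staggering (the job codes are illustrative), two heavy jobs can be given triggers that keep their execution windows apart:

```
# ImpEx: stagger two resource-intensive jobs so they never run at the same time
INSERT_UPDATE Trigger;cronJob(code)[unique=true];cronExpression
# full Solr index every day at 01:00
;fullSolrIndexerCronJob;0 0 1 * * ?
# catalog synchronization every day at 03:00, after indexing has finished
;catalogSyncCronJob;0 0 3 * * ?
```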
If you run a heavy job on a Backoffice node, it can affect the performance of users browsing the Backoffice, just as it can affect storefront users when run on a frontend node.
Running CronJobs in Sequence:
If you have multiple jobs that need to run sequentially, you can set up a Composite CronJob to run multiple jobs one after the other. See Running CronJobs in Sequence.
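A minimal sketch of such a setup (the composite and entry codes are illustrative; the referenced cron job codes must already exist in your system):

```
# ImpEx: a Composite CronJob that runs two existing cron jobs in sequence
INSERT_UPDATE CompositeCronJob;code[unique=true];job(code);sessionLanguage(isoCode)[default=en]
;nightlyMaintenanceCompositeCronJob;compositeJobPerformable;

INSERT_UPDATE CompositeEntry;code[unique=true];executableCronJob(code);compositeCronJob(code)
;nightlyEntry1;cronjobRetentionCronJob;nightlyMaintenanceCompositeCronJob
;nightlyEntry2;cronjobLogCleanupCronjob;nightlyMaintenanceCompositeCronJob
```

The entries are executed in order, so the second job only starts once the first one has finished.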
This section covers the technical data that the platform accumulates over time and how to properly clean it up.
The platform ships with cleanup capabilities; however, some of them require additional configuration to be enabled. The usual areas of consideration are described below.
All sample configuration in this section is provided on a best-effort basis. Make sure to verify it and adapt it to your project!
Cronjobs:
Over time, many cron job instances will accumulate in your SAP Commerce Cloud database.
The most frequently created instances are of types such as ImpExImportCronJob, CatalogVersionSyncCronJob, and SolrIndexerCronJob.
To clean those up, you can easily configure a retention job with an ImpEx script like the following:
```
$twoWeeks = 1209600

INSERT_UPDATE FlexibleSearchRetentionRule;code[unique=true];searchQuery;retentionTimeSeconds;actionReference;
;cronjobCleanupRule;"select {c:pk}, {c:itemType} from {CronJob as c join ComposedType as t on {c:itemtype} = {t:pk} left join Trigger as trg on {trg:cronjob} = {c:pk} } where {trg:pk} is null and {c:code} like '00______%' and {t:code} in ( 'ImpExImportCronJob', 'CatalogVersionSyncCronJob', 'SolrIndexerCronJob' ) and {c:endTime} < ?CALC_RETIREMENT_TIME";$twoWeeks;basicRemoveCleanupAction;

INSERT_UPDATE RetentionJob;code[unique=true];retentionRule(code);batchSize
;cronjobRetentionJob;cronjobCleanupRule;1000

INSERT_UPDATE CronJob;code[unique=true];job(code);sessionLanguage(isoCode)[default=en]
;cronjobRetentionCronJob;cronjobRetentionJob;

INSERT_UPDATE Trigger;cronJob(code)[unique=true];cronExpression
;cronjobRetentionCronJob;0 0 0 * * ?
```
A few notes regarding the above configuration:
The good part about using retention rules is that they are easily configurable, as shown in the example above.
Note: An alternative way for cleaning up cronjobs would be the CleanupCronJobStrategy for the legacy Maintenance Framework. However, that strategy requires customization if you want to change which cronjobs it processes.
Cronjob Logs:
To actually clean up old cronjob log files as described in CronJob Logs Clean-up, ensure that you configure a cronjob and trigger to delete the logs.
The platform does not clean up old log files out-of-the-box!
The following is a sample ImpEx script which can be used to generate and run a cleanup job:
```
INSERT_UPDATE CronJob;code[unique=true];job(code);sessionLanguage(isoCode)[default=en]
;cronjobLogCleanupCronjob;cleanUpLogsJobPerformable;

INSERT_UPDATE Trigger;cronJob(code)[unique=true];cronExpression
# every hour
;cronjobLogCleanupCronjob;0 0 0/1 * * ?
```
If you have cronjobs that run very frequently (for example, every few minutes), you should schedule the log file cleanup even more frequently. Running the cleanup more frequently avoids building up too many log files that need to be deleted.
SAP Commerce Cloud uses Cronjob Histories to track the progress of cronjobs. Similar to cronjob logs, they can accumulate quickly for frequently running jobs.
Starting with SAP Commerce 2005, the platform includes a cleanup cronjob for Cronjob Histories (documentation).
If you are on an older patch release and cannot upgrade, refer to SAP Knowledge Base Article 2848601 for the ImpEx file that sets up the cleanup job.
Cleaning up cron job histories is critically important for performance.
Make sure that the Cronjob histories cleanup job is enabled and active!
Every ImpEx import or export generates at least one ImpexMedia. These media stay in the system; the platform does not delete them when it deletes the ImpEx jobs they belonged to (ImpEx media can in theory be reused by other ImpEx jobs, but that rarely happens). To set up a retention job for these media, use the following sample ImpEx script:
```
$twoWeeks = 1209600

INSERT_UPDATE FlexibleSearchRetentionRule;code[unique=true];searchQuery;retentionTimeSeconds;actionReference;
;impexMediaCleanupRule;"select {i:pk}, {i:itemtype} from {ImpexMedia as i} where {i:code} like '00______' and {i:modifiedTime} < ?CALC_RETIREMENT_TIME";$twoWeeks;basicRemoveCleanupAction;

INSERT_UPDATE RetentionJob;code[unique=true];retentionRule(code);batchSize
;impexMediaCleanupJob;impexMediaCleanupRule;1000

INSERT_UPDATE CronJob;code[unique=true];job(code);sessionLanguage(isoCode)[default=en]
;impexMediaCleanupCronJob;impexMediaCleanupJob;

INSERT_UPDATE Trigger;cronJob(code)[unique=true];cronExpression
# every day at midnight
;impexMediaCleanupCronJob;0 0 0 * * ?
```
The retention time should be the same as for the cronjob cleanup.