Execute commands before/after each pipeline or step
Pipeline hooks allow you to run specific actions at the beginning and end of a pipeline, as well as before or after individual steps.
A hook can be a freestyle step, in which case you need to define:
- A Docker image that will be used to run specific commands.
- One or more commands to run within the context of that Docker image.
For simple commands we suggest you use a small image such as alpine, but any Docker image can be used in hooks.
Hooks can also use typed steps/plugins, in which case you need to define:
- The type field for the step/plugin.
- The arguments needed for the step/plugin.
Codefresh allows you to run a specific step before each pipeline starts, as well as after it has finished.
Running a step at the end of the pipeline
You can easily run a step at the end of the pipeline that will execute even if one of the steps has failed (and thus the pipeline was stopped midway):
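The original YAML snippet is not shown on this page; a representative reconstruction (the step name, title, and echo messages are illustrative) would look like this:

```yaml
version: "1.0"
hooks:
  on_finish:
    exec:
      image: alpine:3.9
      commands:
        - echo "cleanup after the end of the pipeline"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```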
In the example above we define a hook for the whole pipeline that will run a step (the exec keyword) inside alpine:3.9 and will simply execute an echo command. Because we have used the on_finish keyword, this step will execute even if the whole pipeline fails.
This scenario is very common if you have a cleanup step or a notification step that you always want to run at the end of the pipeline. You will see the cleanup logs in the top pipeline step.
Apart from the on_finish keyword, you can also use on_fail if you want the step to execute only according to a specific result of the pipeline. It is also possible to use multiple hooks at the same time:
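The combined example is missing from this page; a representative reconstruction (messages are illustrative) with both an on_finish and an on_fail hook:

```yaml
version: "1.0"
hooks:
  on_finish:
    exec:
      image: alpine:3.9
      commands:
        - echo "cleanup after the end of the pipeline"
  on_fail:
    exec:
      image: alpine:3.9
      commands:
        - echo "the pipeline has failed"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```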
Note that if you have multiple hooks like the example above, the on_finish segment will always execute after any on_fail segments (if they are applicable).
Running a step at the start of the pipeline
Similar to the end of the pipeline, you can also execute a step at the beginning of the pipeline with the on_elected keyword:
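The referenced snippet is not included here; a representative reconstruction (the echo message is illustrative):

```yaml
version: "1.0"
hooks:
  on_elected:
    exec:
      image: alpine:3.9
      commands:
        - echo "pipeline is starting"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```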
All pipeline hooks will be shown in the “initializing process” logs:
It is possible to define all possible hooks (on_elected, on_finish, on_success, on_fail) in a single pipeline, if this is required by your development process.
Running hooks in individual steps
Hooks can also be defined for individual steps inside a pipeline. This capability allows for more granular control when defining prepare/cleanup phases for specific steps.
The syntax for step hooks is the same as for pipeline hooks (on_elected, on_finish, on_success, on_fail); you just need to put the respective segment under a step instead of at the root of the pipeline.
For example, this pipeline will always run a cleanup step after integration tests (even if the tests themselves fail).
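The example itself is not shown on this page; a representative reconstruction (step names, images, and commands are illustrative) of a step-level on_finish hook:

```yaml
version: "1.0"
steps:
  run_integration_tests:
    title: "Integration tests"
    image: node:alpine
    commands:
      - npm run integration-tests
    hooks:
      on_finish:
        exec:
          image: alpine:3.9
          commands:
            - echo "cleanup after tests"
```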
Logs for step hooks are shown in the log window of the step itself.
As with pipeline hooks, it is possible to define multiple hook conditions for each step.
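The example discussed next is missing from this page; a representative reconstruction (the scanner image and commands are illustrative) of a step with multiple hook conditions:

```yaml
version: "1.0"
steps:
  security_scan:
    title: "Security scan"
    image: my-scanner:latest   # illustrative image name
    commands:
      - ./run-scan.sh
    hooks:
      on_elected:
        exec:
          image: alpine:3.9
          commands:
            - echo "authenticating to the scanning service"
      on_fail:
        exec:
          image: alpine:3.9
          commands:
            - echo "the scan returned an error code"
      on_finish:
        exec:
          image: alpine:3.9
          commands:
            - echo "the scan step has finished"
```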
The order of events in the example above is the following:
- The on_elected segment executes first (authentication)
- The step itself executes (the security scan)
- The on_fail segment executes (only if the step throws an error code)
- The on_finish segment always executes at the end
Controlling errors inside pipeline/step hooks
By default, if a step fails within a pipeline, the whole pipeline will stop and be marked as failed. This is also true for on_elected segments: if they fail, the whole pipeline will fail (regardless of the position of the segment in a pipeline or step).
For example, the following pipeline will fail right away, because the pipeline hook fails at the beginning.
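The failing example is not shown here; a representative reconstruction (the failing command is illustrative):

```yaml
version: "1.0"
hooks:
  on_elected:
    exec:
      image: alpine:3.9
      commands:
        - echo "extra validation"
        - exit 1   # the hook fails, so the whole pipeline fails right away
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```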
You can change this behavior by using the existing fail_fast property inside an on_elected hook:
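The corrected example is missing from this page; a representative reconstruction (the failing command is illustrative):

```yaml
version: "1.0"
hooks:
  on_elected:
    exec:
      image: alpine:3.9
      fail_fast: false   # a failure of this hook no longer fails the pipeline
      commands:
        - echo "extra validation"
        - exit 1
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```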
This pipeline will now execute successfully and step1 will still run as normal, because we have used the fail_fast property. You can also use the fail_fast property on step hooks:
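The step-hook example is not shown here; a representative reconstruction (names and commands are illustrative):

```yaml
version: "1.0"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
    hooks:
      on_elected:
        exec:
          image: alpine:3.9
          fail_fast: false   # a failure of this hook does not fail the step
          commands:
            - exit 1
```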
Notice that the fail_fast property is only available for on_elected hooks. The other types of hooks (on_finish, on_success, on_fail) do not affect the outcome of the pipeline in any way: even if they fail, the pipeline will continue running to completion. This behavior is not configurable.
Using multiple steps for hooks
In all the previous examples, each hook was a single step running on a single Docker image. You can also define multiple steps for each hook. This is possible by inserting an extra steps keyword inside the hook and listing multiple Docker images under it:
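The multi-step example is missing from this page; a representative reconstruction (the child step names and commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_finish:
    steps:
      mycleanup:
        image: alpine:3.9
        commands:
          - echo "cleanup"
      mynotification:
        image: alpine:3.9
        commands:
          - echo "send notification"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```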
By default all steps in a single hook segment are executed one after the other. But you can also run them in parallel:
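The parallel example is not shown here; a representative reconstruction, assuming the same mode keyword used for parallel pipeline steps:

```yaml
version: "1.0"
hooks:
  on_finish:
    steps:
      mode: parallel   # run the hook steps concurrently instead of sequentially
      mycleanup:
        image: alpine:3.9
        commands:
          - echo "cleanup"
      mynotification:
        image: alpine:3.9
        commands:
          - echo "send notification"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```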
You can use multiple steps in a hook in both the pipeline and the step level.
Using annotations and labels in hooks
Note, however, that if you decide to use annotations and metadata inside hooks, you cannot mix and match the old annotation syntax with the new one.
The following pipeline is NOT valid:
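The invalid example is missing from this page; a representative reconstruction (the build step name, image name, and annotation values are illustrative) that mixes the two syntaxes:

```yaml
version: "1.0"
steps:
  my_build_step:
    title: "Building docker image"
    type: build
    image_name: my-app-image   # illustrative image name
    tag: master
    dockerfile: Dockerfile
    on_success:                # old syntax, directly under the step
      metadata:
        set:
          - '${{my_build_step.imageId}}':
              - status: "success"
    hooks:                     # new syntax, under hooks
      on_success:
        annotations:
          set:
            - entity_type: build
              annotations:
                - my_annotation: "example"
```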
The pipeline is not correct, because the first segment of annotations is directly under on_success (the old syntax), while the second segment is under hooks/on_success (the new syntax).
Syntactic sugar syntax
To simplify the syntax for hooks, the following simplifications are also offered:
If you do not want to use metadata or annotations in your hook, the exec keyword can be omitted:
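The shortened example is not shown here; a representative reconstruction (the command is illustrative) with image and commands placed directly under the hook name:

```yaml
version: "1.0"
hooks:
  on_finish:
    image: alpine:3.9
    commands:
      - echo "cleanup after the end of the pipeline"
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```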
If you do not want to specify the Docker image, you can simply omit it. Codefresh will use the alpine image in that case to run the hook:
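The example is missing from this page; a representative reconstruction (the command is illustrative) where exec holds only the list of commands:

```yaml
version: "1.0"
hooks:
  on_finish:
    exec:
      - echo "cleanup after the end of the pipeline"   # runs inside the default alpine image
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```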
If you don’t use metadata or annotations, you can also completely remove the
exec keyword and just mention the commands you want to run (
alpine image will be used by default):
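The most compact form is not shown here; a representative reconstruction (the command is illustrative) where the hook is just a list of commands:

```yaml
version: "1.0"
hooks:
  on_finish:
    - echo "cleanup after the end of the pipeline"   # runs inside the default alpine image
steps:
  step1:
    title: "Freestyle step"
    image: alpine:3.9
    commands:
      - echo "Hello world"
```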
Limitations of pipeline/step hooks
With the current implementation of hooks, the following limitations are present:
- The debugger cannot inspect commands inside hook segments
- Hooks are not supported for parallel steps
- Storage integrations don’t resolve in hooks (for example, test reports)