Hooks in pipelines
Execute commands before/after each pipeline or step
Hooks in pipelines allow you to run specific actions at the end and the beginning of the pipeline, as well as before/after a step.
A hook is essentially a simplified freestyle step, as you need to define:
- A Docker image that will be used to run specific commands.
- One or more commands to run within the context of that Docker image.
For simple commands we suggest a small image such as alpine, but any Docker image can be used in hooks.
Codefresh allows you to run a specific step before each pipeline as well as after it has finished.
Running a step at the end of the pipeline
You can easily run a step at the end of the pipeline that will execute even if one of the steps has failed (and thus the pipeline is stopped in the middle):
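A minimal sketch of such a pipeline (step names and commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_finish:
    exec:
      image: alpine:3.9
      commands:
        - echo "cleanup after the pipeline"
steps:
  step1:
    title: "Run tests"
    image: alpine:3.9
    commands:
      - echo "running tests"
```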
In the example above we define a hook for the whole pipeline that will run a step (the exec keyword) inside alpine:3.9 and will simply execute an echo command. Because we have used the on_finish keyword, this step will execute even if the whole pipeline fails.
This scenario is very common if you have a cleanup step or a notification step that you always want to run at the end of the pipeline. You will see the cleanup logs in the top pipeline step.
Apart from the on_finish keyword you can also use on_fail if you want the step to execute only for a specific result of the pipeline. It is also possible to use multiple hooks at the same time:
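A sketch with two pipeline hooks (the commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_fail:
    exec:
      image: alpine:3.9
      commands:
        - echo "pipeline failed"
  on_finish:
    exec:
      image: alpine:3.9
      commands:
        - echo "pipeline finished"
steps:
  step1:
    image: alpine:3.9
    commands:
      - echo "running tests"
```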
Note that if you have multiple hooks, as in the example above, the on_finish segment will always execute after any on_fail segments (if they are applicable).
Running a step at the start of the pipeline
Similar to the end of the pipeline, you can also execute a step at the beginning of the pipeline with the on_elected keyword:
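A sketch of a hook that runs before any pipeline step (the command is illustrative):

```yaml
version: "1.0"
hooks:
  on_elected:
    exec:
      image: alpine:3.9
      commands:
        - echo "pipeline is starting"
steps:
  step1:
    image: alpine:3.9
    commands:
      - echo "running tests"
```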
All pipeline hooks will be shown in the “initializing process” section of the pipeline logs.
It is possible to define all possible hooks (on_elected, on_finish, on_fail) in a single pipeline, if this is required by your development process.
Hooks in steps
You can also define hooks for individual steps inside a pipeline. This gives you more granular control for defining prepare/cleanup phases for specific steps.
The syntax for step hooks is the same as for pipeline hooks: on_elected, on_finish, and on_fail. The only difference is that you need to add the respective segment within a step instead of at the root of the pipeline.
For example, this pipeline will always run a cleanup step after integration tests (even if the tests themselves fail).
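A sketch of such a step hook (the step name, image, and commands are illustrative):

```yaml
version: "1.0"
steps:
  run_integration_tests:
    title: "Integration tests"
    image: alpine:3.9
    commands:
      - ./run_integration_tests.sh   # illustrative test command
    hooks:
      on_finish:
        exec:
          image: alpine:3.9
          commands:
            - echo "cleanup after the tests"
```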
Logs for step hooks are displayed in the log window of the step itself.
As with pipeline hooks, you can also define multiple hook conditions for each step.
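A sketch of a step with several hook conditions (the step name and commands are illustrative):

```yaml
version: "1.0"
steps:
  security_scan:
    title: "Security Scanning"
    image: alpine:3.9
    commands:
      - ./run_security_scan.sh   # illustrative scan command
    hooks:
      on_elected:
        exec:
          image: alpine:3.9
          commands:
            - echo "authenticating to the scanning service"
      on_fail:
        exec:
          image: alpine:3.9
          commands:
            - echo "scan step failed"
      on_finish:
        exec:
          image: alpine:3.9
          commands:
            - echo "scan step finished"
```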
The order of events in the example above is the following:
- The on_elected segment executes first (authentication).
- The Security Scanning step itself executes the security scan.
- The on_fail segment executes only if the step throws an error code.
- The on_finish segment always executes at the end.
Running steps/plugins in hooks
Hooks can use steps/plugins.
With plugins you have to specify:
- The type field for the step/plugin
- The arguments needed for the step/plugin
Controlling errors inside pipeline/step hooks
By default, if a step fails within a pipeline, pipeline execution is terminated and the pipeline is marked as failed. Hooks affect this behavior as follows:
- If an on_elected segment fails, regardless of the position of the segment in a pipeline or step, the whole pipeline will fail.
- on_finish and on_fail segments do not affect the pipeline outcome. A pipeline will continue execution even if one of these segments fails.
For example, the following pipeline will fail right away, because the pipeline hook fails at the beginning:
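A sketch of such a failing pipeline hook (the commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_elected:
    exec:
      image: alpine:3.9
      commands:
        - exit 1   # the hook fails, so the whole pipeline fails
steps:
  step1:
    image: alpine:3.9
    commands:
      - echo "this step never runs"
```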
You can change this behavior by using the existing fail_fast property inside the exec segment of an on_elected hook:
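A sketch with fail_fast disabled for the hook (the commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_elected:
    exec:
      fail_fast: false
      image: alpine:3.9
      commands:
        - exit 1   # the hook still fails, but the pipeline continues
steps:
  step1:
    image: alpine:3.9
    commands:
      - echo "step1 runs as normal"
```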
This pipeline will now execute successfully and step1 will still run as normal, because we have used the fail_fast property. You can use the fail_fast property on step hooks as well:
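A sketch of fail_fast on a step hook (the step name and commands are illustrative):

```yaml
version: "1.0"
steps:
  step1:
    image: alpine:3.9
    commands:
      - echo "main step"
    hooks:
      on_elected:
        exec:
          fail_fast: false
          image: alpine:3.9
          commands:
            - exit 1   # the hook fails, but step1 and the pipeline continue
```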
Notice that the fail_fast property is only available for on_elected hooks. The other types of hooks (on_finish, on_fail) do not affect the outcome of the pipeline in any way. Even if they fail, the pipeline will continue running to completion. This behavior is not configurable.
Using multiple steps for hooks
In all the previous examples, each hook was a single step running on a single Docker image. You can also define multiple steps for each hook. This is possible by inserting an extra steps keyword inside the hook and listing multiple Docker images under it:
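A sketch of a hook with two sequential steps (names and commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_finish:
    steps:
      mycleanup:
        image: alpine:3.9
        commands:
          - echo "cleanup"
      mynotification:
        image: alpine:3.9
        commands:
          - echo "sending notification"
steps:
  step1:
    image: alpine:3.9
    commands:
      - echo "running tests"
```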
By default all steps in a single hook segment are executed one after the other. But you can also run them in parallel:
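A sketch of parallel hook steps, assuming the mode: parallel property inside the steps segment (names and commands are illustrative):

```yaml
version: "1.0"
hooks:
  on_finish:
    steps:
      mode: parallel   # assumption: run the hook steps concurrently
      mycleanup:
        image: alpine:3.9
        commands:
          - echo "cleanup"
      mynotification:
        image: alpine:3.9
        commands:
          - echo "sending notification"
```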
You can use multiple steps in a hook in both the pipeline and the step level.
Referencing the ‘working_directory’ in step hooks
To access the working_directory of a regular step through a hook, use the prefix parentSteps.<step-name>. For example, to access the working_directory of the clone step, use ${{parentSteps.clone.working_directory}}.
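A sketch of a step hook referencing the clone step's working directory, assuming the stated parentSteps prefix (step names and the repository are illustrative):

```yaml
version: "1.0"
steps:
  clone:
    type: git-clone
    repo: my-org/my-repo   # illustrative repository
  build:
    image: alpine:3.9
    commands:
      - echo "building"
    hooks:
      on_finish:
        exec:
          image: alpine:3.9
          working_directory: ${{parentSteps.clone.working_directory}}
          commands:
            - ls
```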
Using annotations and labels in hooks
Note however, that if you decide to use annotations and metadata inside hooks, you cannot mix and match the old syntax with the new syntax.
The following pipeline is NOT valid:
The pipeline is not valid, because the first segment of annotations is directly under on_success (the old syntax), while the second segment is under hooks/on_success (the new syntax).
Syntactic sugar syntax
We offer the following options to simplify the syntax for hooks.
Not using metadata or annotations in your hook
If you do not use metadata or annotations in your hook, you can omit the exec keyword and define the image and commands directly under the hook condition:
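A sketch of this shorthand (the command is illustrative):

```yaml
version: "1.0"
hooks:
  on_finish:
    image: alpine:3.9
    commands:
      - echo "pipeline finished"
```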
Not specifying the Docker image
If you do not want to specify the Docker image you can simply omit it. Codefresh will use the alpine image in that case to run the hook:
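A sketch of a hook with no image, which falls back to alpine (the command is illustrative):

```yaml
version: "1.0"
hooks:
  on_finish:
    commands:
      - echo "pipeline finished"   # runs on the default alpine image
```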
Using typed steps/plugins in hooks
You can use a typed step/plugin in hooks. With this, you will need to change exec to steps with the information needed for the step/plugin.
Below is an example pipeline hook using the slack-notifier step/plugin for when the pipeline starts.
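A sketch of such a hook; the argument names and the SLACK_WEBHOOK_URL variable are assumptions and should be checked against the slack-notifier plugin's own documentation:

```yaml
version: "1.0"
hooks:
  on_elected:
    steps:
      slack_notify:
        type: slack-notifier
        arguments:
          SLACK_HOOK_URL: '${{SLACK_WEBHOOK_URL}}'   # hypothetical variable
          SLACK_TEXT: 'Pipeline has started'
```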
Limitations of pipeline/step hooks
With the current implementation of hooks, the following limitations are present:
- The debugger cannot inspect commands inside hook segments.
- Hooks are not supported for parallel steps.
- Storage integrations don’t resolve in hooks (for example, test reports).