Samples
On this page you will find a few basic and a few advanced samples of what you can do with Respresso’s flows.
Basic samples
Webhook
In some cases you will need a webhook to integrate Respresso in your workflow.
- Use cases:
Trigger CI build
Notify the team of a change in resources
Sync the converted resources and do your own processing with them (e.g. back them up or host them for dynamic usage)
- Used processors: WebhookProcessor:v1
<flow xmlns="https://app.respresso.io/public/schema/flow.xsd">
    <nodes>
        <processor id="webhook" name="WebhookProcessor" version="1">
            {
                "url": "https://your_webhook_page_url/"
            }
        </processor>
    </nodes>
    <connections>
        <connection from="@input" to="webhook" mergeType="none"/>
        <connection from="webhook" to="@output" mergeType="none"/>
    </connections>
</flow>
Explanation
When this flow is called, the webhook processor will be executed, while the data at the input will be ignored due to mergeType="none". This also applies to the output of the webhook (it will return an empty object).
How to trigger a webhook after a resource category has changed?
When you integrate Respresso with a CI, you have to make sure that by the time the webhook fires, the conversion has finished and all data is persisted. Unfortunately, flow execution currently runs in a single transaction, so while the flow is being executed only that transaction can access the fresh data. The easiest way around this is to call a webhook which closes the connection and waits a few seconds before requesting data from Respresso, to ensure that the transaction has already been committed. So you have to execute the webhook after the changes have been stored in this transaction, which is usually StoreChangedResourceCategoryProcessor:v1 in the make flow. To achieve this, make the webhook depend on this commit point and the output depend on the webhook, so that it gets executed.
Let’s see how to do that:
<flow xmlns="https://app.respresso.io/public/schema/flow.xsd">
    <nodes>
        <processor id="convert" name="ResourceCategoryConversionExecutorProcessor" version="1">
            { ... Your category dependent config ... }
        </processor>
        <processor id="executeActions" name="StoreChangedResourceCategoryProcessor" version="1"/>
        <processor id="webhook" name="WebhookProcessor" version="1">
            {
                "url": "https://your_webhook_page_url/"
            }
        </processor>
    </nodes>
    <connections>
        <connection from="@input" to="convert"/>
        <connection from="convert" to="executeActions"/>
        <connection from="executeActions" to="@output"/>
        <connection from="executeActions" to="webhook" mergeType="none"/>
        <connection from="webhook" to="@output" mergeType="none"/>
    </connections>
</flow>
Note
Make sure to use mergeType="none" to ensure that the output will not be overridden by the empty object returned by the webhook processor.
Note
In this example we did not use any special feature of WebhookProcessor:v1. For more details, please read its docs.
Warning
Make sure that after the webhook’s connection is closed, the called server waits a few seconds before accessing Respresso’s data. If you do not need any further data, you do not have to worry about this.
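Here is a minimal sketch of such a receiver in Python (Flask). The /respresso-hook path, the download URL and the 5 second delay are only placeholders for your own integration, not something Respresso prescribes:
import threading

import requests
from flask import Flask

app = Flask(__name__)

# Placeholder: replace with the Respresso endpoint your integration actually reads from.
RESPRESSO_DOWNLOAD_URL = "https://app.respresso.io/..."

def fetch_fresh_resources():
    # Runs a few seconds after the webhook call, so the flow's transaction
    # has had time to commit before we request the fresh data.
    response = requests.get(RESPRESSO_DOWNLOAD_URL)
    response.raise_for_status()
    # ... trigger your CI build, back up the files, etc. ...

@app.route("/respresso-hook", methods=["POST"])
def respresso_hook():
    # Acknowledge immediately so the webhook's connection is closed,
    # then touch Respresso's data only after a short delay.
    threading.Timer(5.0, fetch_fresh_resources).start()
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)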
There is another, safer way of triggering a webhook with the fresh data, but it is a bit more complex.
The idea is to wait for the store processor (like in the example above), then read the resulting snapshot file and send it to a given URL.
You can achieve this using HttpSendFileProcessor by sending the root/<category_name>.respresso file. This file matches the CategorySnapshotStructure.
Note that this method and naming may change in the future.
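A matching receiver could be as simple as the following sketch in Python (Flask). That the snapshot arrives as the raw request body is only an assumption, so check the HttpSendFileProcessor docs for the actual request format:
from pathlib import Path

from flask import Flask, request

app = Flask(__name__)

@app.route("/snapshot", methods=["POST"])
def receive_snapshot():
    # Assumption: the .respresso snapshot is posted as the raw request body.
    # If HttpSendFileProcessor uses a multipart upload instead, read request.files.
    Path("latest-snapshot.respresso").write_bytes(request.get_data())
    # ... parse the CategorySnapshotStructure, notify your systems, etc. ...
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)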
Advanced samples
Custom conversion
In some cases you may find that you need another format for a resource category which is currently not supported by Respresso. Fortunately, you can extend Respresso with your own conversion relatively simply.
For this you will need to implement the conversion and host it somewhere that is accessible to the Respresso server. When your conversion has to be executed, Respresso sends an HTTP POST request with all the data provided to the input of that node in JSON format and puts the parsed JSON response to the output. That’s it, your conversion is added to Respresso.
- Use cases:
You need a custom file format
The target platform is currently not supported by Respresso
- Used processors: AllLocalizationsParserProcessor:v1, HttpFilesStructureRemoteProcessor:v1
Note
Before the request, the input data of HttpFilesStructureRemoteProcessor:v1 is resolved from any Lazy values. This means every Lazy is executed and each Lazy in the input is replaced with its resolved value.
Note
Respresso serializes Binary data to a base64 encoded JSON string, so you will need to decode it in your converter.
Let’s see what it would look like in the case of localization.
<flow xmlns="https://app.respresso.io/public/schema/flow.xsd">
    <nodes>
        <processor id="parser" name="AllLocalizationsParserProcessor" version="1"/>
        <processor id="custom" name="HttpFilesStructureRemoteProcessor" version="1">
            {"url": "https://your_converter_url/"}
        </processor>
        <!-- Other processors... Not part of this example. -->
    </nodes>
    <connections>
        <connection from="@input" to="parser"/>
        <connection from="parser" to="custom"/>
        <!-- Other connections... Not part of this example. -->
        <connection read="files" from="custom" write="files[+]" to="@output"/>
    </connections>
</flow>
This method is a simplified version of a custom conversion which requires you to return the converted files, but you can check HttpRemoteProcessor:v1 for a more flexible way of custom processing.
Explanation
When this flow is called, after the resource category has been parsed a LocalizationParsedStructure:v1 object is posted to the URL configured for HttpFilesStructureRemoteProcessor:v1. The result is parsed as an HttpFilesStructure:v1 object and converted to Respresso’s internally used format with Lazy values.
Note
In the response, the fileContent fields must contain a base64 encoded string.
Note
In this example we used write="files[+]" in the connection to convertFiles. This ensures that you can concatenate multiple files at the input of convertFiles, so multiple conversions can be joined this way.
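To make the request and response shapes concrete, here is a minimal converter sketch in Python (Flask). It only echoes the incoming LocalizationParsedStructure:v1 payload into a single JSON file; every field name other than files and fileContent (for example fileName) is an assumption, so check HttpFilesStructure:v1 for the real ones:
import base64
import json

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/convert", methods=["POST"])
def convert():
    # Respresso posts the parsed resource category (LocalizationParsedStructure:v1) as JSON.
    parsed = request.get_json()

    # Produce your custom format here; this sketch just re-serializes the input.
    converted = json.dumps(parsed, indent=2).encode("utf-8")

    # The response describes the generated files; fileContent must be a
    # base64 encoded string. "fileName" is an assumed field name.
    return jsonify({
        "files": [
            {
                "fileName": "localization.custom.json",
                "fileContent": base64.b64encode(converted).decode("ascii"),
            }
        ]
    })

if __name__ == "__main__":
    app.run(port=8080)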
Best practices
- When you use HttpFilesStructureRemoteProcessor:v1 you may want to use a token to ensure that no one else can use your publicly exposed conversion service.
- When possible, use https to ensure that no one can steal your resources during the communication. They are a business secret, don’t forget about it.
- For a quick implementation you may want to use AWS Lambda or Google Cloud Functions. Both provide an easy to deploy and host model with SSL in a free tier, so they will probably fit your needs for a simple conversion task.
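As a sketch of the token advice above, you could embed a shared secret in the URL you configure in the flow and reject every request that does not carry it. The token query parameter and the hook below are our own convention, not part of any Respresso API; they build on the converter sketch above:
# Addition to the converter sketch above: reject requests without the shared token.
import secrets

from flask import abort, request

# Also embed this value in the configured URL, e.g. https://your_converter_url/convert?token=change-me
EXPECTED_TOKEN = "change-me"

@app.before_request
def check_token():
    # Constant-time comparison, so the token cannot be guessed via response timing.
    if not secrets.compare_digest(request.args.get("token", ""), EXPECTED_TOKEN):
        abort(401)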