DC 5.8 Integration Toolbox & Patterns
The Digizuite DAM comes with a toolbox of functionality that allows you to standardize the integration process by following certain patterns. If you need 100% control and access to all functionality, you can choose our open API or our SDK, which are documented further down. Before going down that route, it is well worth investigating the standard toolbox that Digizuite already provides.
The essence of what is described here is having Digizuite automation and workflow as the glue which connects your platforms when it makes sense.
We focus a lot on reusability and open examples, so if you want to see examples of use, go to:
Embedded UI Examples: https://github.com/Digizuite/embedded-ui-examples
API, Postman Collection or Automation Examples: https://github.com/Digizuite/integration-api-examples
The below patterns can be supported, but not necessarily with out-of-the-box capabilities. Some require custom implementation using the SDK, whereas others can be taken 80% of the way using, for instance, Integration Endpoints or Automation in combination with the SDK or custom code.
First, assess the complexity of the integration by also considering the use-cases:
Essential Integration - Low Complexity & Risk
Enhanced Integration - Medium Complexity & Risk
Embedded Integration - Higher Complexity & Risk
Patterns that are very common in the enterprise ecosystem are:
Broadcast or one-way Data Exchange (Basic Integration)
A form of integration where, based on some business logic within the DAM, data is sent to a receiving system. It can be very specific business logic that triggers it, such as a certain metadata field changing to a certain value, or it can be all creates, modifications and deletes of assets that must flow to the receiving end. Easily supported by out-of-the-box Digizuite capabilities such as Automation or an Integration Endpoint. Easily extended by the use of your own service or a simple Azure Function, which is described further down on this page.
Bi-directional Data Exchange (Enhanced Integration)
A combination of the above, where certain changes must be broadcasted, but then, based on the response, something is updated in the broadcasting end (Digizuite). It can be more complicated, where certain changes in the receiving system must trigger updates in the Digizuite platform. Here, the question is the amount of data, but often the Digizuite API or SDK is a great fit for the data coming in, whereas the broadcast elements are handled as stated in 1.
Correlation Data Exchange (Enhanced or Advanced depending on use-cases)
Data from the integrated systems must be correlated in certain ways. An example could be a PIM platform that must not only receive assets, but where those assets must automatically be linked to a product based on the Product ID in a metadata field on the asset in Digizuite. Here, the recommendation is to extend the platform with Azure Functions or your own 3rd party service that is called by Automation or an Integration Endpoint.
Enhanced UI with an Asset Picker (Basic Integration)
An integration pattern that does not as such move data around is enhancing a 3rd party UI with more capability coming from another system: for example, an asset picker from Digizuite which is popped up on a certain action. When an asset is selected or clicked in the asset picker, it must at that time be integrated into the 3rd party platform (this could be adding the asset into a media library, adding it to an email template, or adding it to your product in a PIM as done with our InRiver connector).
Combined Data Exchange and UI (Enhanced or Advanced depending on use-cases)
This one combines data exchange (where assets are moved back and forth) with UI capability in the 3rd party platform as described above. Essentially, you have a flow of data that is then further complemented with a UI that allows you to handle assets in Digizuite right there when needed. An example is making a crop that then automatically flows from Digizuite to the receiving system. It could also be updating metadata, such as assigning a product ID, without leaving your host platform, which will then trigger the integration in real time.
To support the above, a list of mechanisms and tools within the Digizuite platform follows here. It can give you a head start on development and reduce implementation time, cost and risk in the project.
Automation Flows (No or low-code)
Digizuite has a very powerful automation engine which allows customers to construct business logic in a drag & drop fashion (no- or low-code). The logic consists of a number of steps that can each be a trigger, an action, a filter or a for-loop (iterating elements). So an automation could be triggered based on metadata changes on an asset, or an asset arriving at a certain state in a workflow, and the subsequent steps can filter whether that change is relevant and, if it is, perform certain actions within the DAM or externally. Full docs are here: DC 5.8 Automations.
From 5.6.0, we allow any customer or partner to add their own actions via our SDK. We also have a public example of how to do that using our SDK (https://github.com/Digizuite/integration-api-examples/tree/main/AWExternalAction) and documentation here: DC 5.8 API & SDK - Register external action in AW (webhook)
It comes out-of-the-box with your Media Manager:
In this context, it is the 'externally' part that is interesting for integration. An automation has an action which can invoke an external endpoint with values from the current automation (such as an asset ID or workflow ID) as query parameters. The external call could simply utilize a 3rd party service, but it could also be a data integration where a downstream system must be notified (broadcast).
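As a minimal sketch of how such an invoke URL is composed (the base URL and parameter names here are illustrative, not prescribed by the platform), the automation simply appends its values as query parameters to the endpoint you configure:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class InvokeUrlBuilder
{
    // Builds the URL an automation's invoke action would call, appending
    // automation values (e.g. an asset ID) as URL-encoded query parameters.
    public static string Build(string baseUrl, IDictionary<string, string> values)
    {
        var query = string.Join("&",
            values.Select(kv => $"{Uri.EscapeDataString(kv.Key)}={Uri.EscapeDataString(kv.Value)}"));
        return query.Length == 0 ? baseUrl : $"{baseUrl}?{query}";
    }
}
```

For example, `InvokeUrlBuilder.Build("https://my-service.example.com/api/notify", new Dictionary<string, string> { { "assetId", "1234" } })` yields `https://my-service.example.com/api/notify?assetId=1234`.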
Many 3rd party platforms have APIs which could be triggered directly with these invoke endpoints, but in many cases you would need to extend the integration further. The Digizuite Automation could handle all business logic to narrow down what should trigger the integration, and the Invoke Endpoint could then call a service that simply takes care of transporting what is received to the downstream platform.
One way to extend the Automation is by using your own services, or by creating small purpose-built Azure Functions. For a simple Teams or Slack integration, or if you wish to extend the platform to talk to your favorite Task & Project Management software, simply add that API logic in an Azure Function with 4-5 lines of code and let it be invoked from automation as described above.
If you need to synchronize certain elements from a downstream system (two-way), an option could be to use a scheduler that triggers external logic to handle the fetch. This could be an Azure Function using the SDK to update metadata or similar, but always be careful with data-heavy integrations.
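One way to keep such a scheduled fetch from becoming data-heavy is to only push the delta since the last run. A minimal sketch of that selection step follows; the `DownstreamProduct` type is hypothetical, and the selected records would then be written to Digizuite via the SDK:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical shape of a record fetched from the downstream system.
public record DownstreamProduct(string ProductId, string Name, DateTimeOffset LastModified);

public static class ScheduledSync
{
    // Selects only the records modified since the previous scheduled run,
    // keeping each sync's data volume small.
    public static IReadOnlyList<DownstreamProduct> ChangedSince(
        IEnumerable<DownstreamProduct> fetched, DateTimeOffset lastRun) =>
        fetched.Where(p => p.LastModified > lastRun).ToList();
}
```

The scheduler (for instance a timer-triggered Azure Function) would store the timestamp of its last successful run and pass it to `ChangedSince` on the next invocation.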
Integration Endpoints
Integration Endpoints make it simple to be notified of all asset changes (create, modify, delete) and of changes to any metadata definition within the platform. The main difference is their sole focus on outbound integration, making it as easy and centralized as possible to create, maintain and monitor an integration that requires a middle layer to handle what should happen to those assets as they change. The endpoints listen to events in real time, but the events end up on a queue to keep the integration asynchronous.
Use Automation when you need to construct business logic in a no-code fashion and have a more lightweight integration, such as Teams, Slack or maybe even Project Management tools, since you can easily tie it together with Digizuite Workflows using out-of-the-box capabilities. Integration Endpoints are the go-to for data-heavy integrations where volume is high and any failed asset delivery must always be retried.
Any integration endpoint can be clicked to show the notification overview below, allowing you to easily ensure that all failed deliveries can be retried and delivered.
The Integration Endpoint can, similarly to the Invoke Endpoint in Automation, invoke any external service with a URL and custom headers for authorization.
All the custom integration logic can then be placed in that service. Again, this could be an Azure Function that acts as the connection to your downstream platform.
Your service would then receive the following body:
{
  "data": {
    "itemId": 1234
  }
}
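Your service can deserialize that body into a small DTO before looking up the item. A minimal sketch using System.Text.Json (the type names are illustrative):

```csharp
using System.Text.Json;

// Matches the notification body shown above: { "data": { "itemId": 1234 } }
public record NotificationData(int ItemId);
public record IntegrationNotification(NotificationData Data);

public static class NotificationParser
{
    private static readonly JsonSerializerOptions Options =
        new() { PropertyNameCaseInsensitive = true };

    public static IntegrationNotification Parse(string body) =>
        JsonSerializer.Deserialize<IntegrationNotification>(body, Options);
}
```

The `itemId` can then be used with the SDK or API to fetch the full asset and decide what to push downstream.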
Extensions with Azure Functions (or own Service)
A simple approach to extending integration capabilities is using the Azure platform, and more specifically Azure Functions (keep in mind this is just an example; it would work just as well with any service that exposes an API endpoint). Azure Functions provide an on-demand, serverless compute service where you focus on the pieces of code that matter to your integration while the platform handles the rest, including scaling.
A simple example would be a Teams integration that must trigger based on certain things happening in a workflow, or on metadata changing, so that you are notified.
A simple Azure Function that could be deployed within your own infrastructure in minutes is:
[FunctionName("TriggerTeamsIntegration")]
[OpenApiOperation(operationId: "TeamsIntegration", tags: new[] { "Teams", "Integration" })]
[OpenApiSecurity("function_key", SecuritySchemeType.ApiKey, Name = "code", In = OpenApiSecurityLocationType.Query)]
[OpenApiParameter(name: "title", In = ParameterLocation.Query, Required = true, Type = typeof(string), Description = "The **Title** parameter")]
[OpenApiParameter(name: "text", In = ParameterLocation.Query, Required = true, Type = typeof(string), Description = "The **Text** parameter")]
[OpenApiParameter(name: "color", In = ParameterLocation.Query, Required = true, Type = typeof(string), Description = "The **Color** parameter")]
[OpenApiParameter(name: "assetId", In = ParameterLocation.Query, Required = true, Type = typeof(string), Description = "The **AssetId** parameter")]
[OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "text/plain", bodyType: typeof(string), Description = "The OK response")]
public async Task<IActionResult> RunTeams(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req)
{
    _logger.LogInformation("C# HTTP trigger function processed a request.");

    string title = req.Query["title"];
    string text = req.Query["text"];
    string color = req.Query["color"];
    string assetId = req.Query["assetId"];
    var assetUrl = $"https://demo-dam-dam-dam.my-domain.com/asset/{assetId}/asset";

    if (string.IsNullOrEmpty(title) || string.IsNullOrEmpty(text) || string.IsNullOrEmpty(color))
    {
        return new BadRequestErrorMessageResult("Invalid query parameters. Title, text and color are all mandatory");
    }

    // Add the asset name to the title
    var asset = await GetAssetById(assetId);
    if (asset != null)
    {
        title += $" ({asset.Name})";
    }

    // Build the Teams message card and post it to the webhook
    var url = "<INSERT WEBHOOK URL HERE>";
    var client = new TeamsNotificationClient(url);
    var message = new MessageCard();
    message.Title = title;
    message.Text = text;
    message.Color = color;
    message.Sections = new List<MessageSection>();
    message.PotentialActions = new List<PotentialAction>();
    message.PotentialActions.Add(new PotentialAction()
    {
        Name = "Open Asset in Media Manager",
        Targets = new List<PotentialActionLink>()
        {
            new()
            {
                Value = assetUrl
            }
        }
    });
    await client.PostMessage(message);

    _logger.LogInformation($"Good to go {JsonConvert.SerializeObject(new { message })}");
    return new OkObjectResult(message);
}
The Azure Functions can also use the Digizuite SDK if there is a need to fetch more information about the assets. This can be done by getting our Digizuite SDK (Digizuite C# SDK) and then using simple dependency injection as described here: https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection. Then asset information (and all other available functions in the SDK) can be fetched like this:
// ...

public async Task<Asset> GetAssetById(string assetId)
{
    var parameters = new SearchParameters("GetAssets")
    {
        {"sAssetId", assetId}
    };
    var response = await _searchService.Search<Asset>(parameters);
    var listOfAssets = response.Items.ToList();
    return listOfAssets.Count > 0 ? listOfAssets.First() : null;
}
Adding the above to an Automation or an Integration Endpoint is a simple exercise. Specifically for Automation, one would add the following URL:
https://integration-functions.azurewebsites.net/api/TriggerTeamsIntegration?title=myTitle&text=myText&color=f0ad4e&assetId=123
to the Invoke Endpoint created above. Simply take the URL and add it there (remembering to add the right asset ID parameter):
PIM & DAM
Integrating PIM and DAM is a typical use-case for Digizuite. The below gives a high-level view.
Digizuite Integration Endpoints are key and will be used to push assets to an external service, which will handle the data exchange. The Integration Endpoint in Digizuite will provide asset changes (for instance, on approval and when a product ID has been added) as well as the metadata fields that are required**. On your end, you will receive item IDs from the Integration Endpoint, which can be used to look up the relevant metadata (using our SDK or API). This metadata contains the product ID, the relevant channel and all information needed to be carried to the resource in PIM, and for linking it to the right product.
**If you have a lot of fields to trigger on, we ask you to use an Integration Endpoint rather than automation triggers, so that you receive only one event (as opposed to multiple events per changed metadata field).
Digizuite is responsible for the Integration Endpoint, which defines all triggers for asset changes (described further up). The integration service is a custom part handled by a partner, which facilitates the data exchange. Custom development is required.
To summarize the push flow from Digizuite:
The asset is approved and gets a product ID, which triggers a message to the integration service.
The integration service will look up the asset based on its ID, read all the metadata and attach (or update) the asset on the right product. It will also add relevant metadata to the asset in PIM (such as a description) if desired.
The integration service will also (based on the product ID) set metadata back on the asset in DAM if desired.
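The three steps above can be sketched as a small handler in the integration service. The delegates below stand in for real clients (the DAM side would use the Digizuite SDK, the PIM side the vendor's API); their names and signatures are assumptions for illustration:

```csharp
using System;
using System.Threading.Tasks;

// Sketch of the push flow in the custom integration service.
public class PushFlowHandler
{
    private readonly Func<int, Task<(string ProductId, string Description)>> _getAssetMetadata;
    private readonly Func<string, int, string, Task> _attachAssetToProduct;
    private readonly Func<int, string, string, Task> _setDamMetadata;

    public PushFlowHandler(
        Func<int, Task<(string ProductId, string Description)>> getAssetMetadata,
        Func<string, int, string, Task> attachAssetToProduct,
        Func<int, string, string, Task> setDamMetadata)
    {
        _getAssetMetadata = getAssetMetadata;
        _attachAssetToProduct = attachAssetToProduct;
        _setDamMetadata = setDamMetadata;
    }

    // Handles one item id received from the Integration Endpoint.
    public async Task HandleAsync(int itemId)
    {
        // 1. Look up the asset's metadata (product ID, description, ...) in the DAM.
        var (productId, description) = await _getAssetMetadata(itemId);

        // 2. Attach (or update) the asset on the matching product in PIM.
        await _attachAssetToProduct(productId, itemId, description);

        // 3. Optionally write a status back to the asset in the DAM.
        await _setDamMetadata(itemId, "PimSyncStatus", "Synced");
    }
}
```

Using delegates keeps the flow itself free of any vendor dependency, so the same handler can be exercised in tests with stub implementations before wiring up the real SDK and PIM clients.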
Data must also flow the other way, into Digizuite. The relevant data:
Synchronizing the Product Hierarchy from PIM to DAM
This is important because it can be used to select the relevant product directly from DAM. And since it is synced from PIM, you know that those products exist.
Updating product data when it changes in PIM. Either this can be handled by a scheduled job within Digizuite Automations, which fetches updated products at certain intervals, or by webhooks from PIM to the integration service layer, which will then update the relevant assets in DAM.
Important decisions for PIM and DAM
What should trigger an asset to be added or updated on a product
Approval steps, required metadata and so on.
What asset metadata should be added to the asset in PIM
What product attribution should flow back from PIM to DAM
What happens when an asset is expired? Should it be removed immediately in PIM or what is the process?
Decide how to approach the data flow from PIM to DAM. Should it be a scheduled job which pulls data from PIM at intervals, or should it be webhooks?
Ensure that there is clear responsibility for the partner who develops the Integration Service.
Relevant examples on how to communicate with the DAM can be found in our GitHub API examples: https://github.com/Digizuite/integration-api-examples/
The repository shows a full Azure Function service and how to use it with the SDK.
Custom Integration using our API or SDK
Please read here for more: DC 5.8 API & SDK