
This section provides information on how to install Digizuite DAM Center.

Info

SolR has been removed from the 5.6.1 installer. Existing configuration settings in Web.Config (SolRUrl and SolRUncPath) are maintained during upgrade.

Important for DamForOptimizely or other DC customers that still need SolR: the Windows group "AppPools-<SITENAME>" needs R/W permissions for the SolRUncPath folder.
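If SolR is still needed, the permission can be granted with icacls; a minimal sketch, where the share path and the site name are placeholders to be replaced with the actual values:

# Grant the application pool group modify (read/write) rights on the SolR UNC path.
# "\\server\solr" and AppPools-<SITENAME> are placeholders.
icacls "\\server\solr" /grant "AppPools-<SITENAME>:(OI)(CI)M"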


Before upgrading

If you are installing a new DC, skip this step.


Info

If you are upgrading from a version before 5.3 to this version, you must run this application!

This app should be run to clean up a table which is likely to have data errors. It is done as an app to avoid a potentially long-running SQL script as part of the DAM update process.

The app: CleanupAssetDigiupload.zip (attached file)

Ensure all queues in RabbitMQ are empty. We do not support upgrading with data left in RabbitMQ.
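A quick way to verify this is to list the queues and their message counts with rabbitmqctl (run it from the RabbitMQ sbin folder or with rabbitmqctl on the PATH); this is a sketch, and every queue must report 0 messages before you continue:

# List all queues and their message counts - all counts must be 0 before upgrading.
rabbitmqctl list_queues name messages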


Install ElasticSearch: DC 5.6 Elastic Search

Running the app

Do a database backup first.

  1. Unzip the zip-file to a new folder.
  2. Edit appsettings.json and set the connection string to match that of the DC. It can be copy-pasted from the DC web.config.
  3. Open a command prompt in the folder made for the app. All logging is written to the prompt, so this is the only way to follow progress.
  4. Run CleanupAssetDigiupload.exe.
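As a sketch of step 2, the connection string can be copied from the DC web.config into appsettings.json with PowerShell; the paths and the JSON property name below are assumptions, so check them against the files in your environment:

# Read the DC connection string from web.config (path is an example).
[xml]$webConfig = Get-Content 'C:\Webs\dam.company.com\web.config'
$connectionString = ($webConfig.configuration.connectionStrings.add | Select-Object -First 1).connectionString

# Write it into the cleanup app's appsettings.json.
# The property name "ConnectionString" is an assumption - use the key present in the shipped file.
$appSettings = Get-Content '.\appsettings.json' -Raw | ConvertFrom-Json
$appSettings.ConnectionString = $connectionString
$appSettings | ConvertTo-Json -Depth 10 | Set-Content '.\appsettings.json'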

Questions

Why do I need to do this?

There is a table belonging to the core asset data which is likely to have data errors that prevent the normal load into the memory cache introduced in DC 5.4.

What happens with the asset data load if there are data errors?

The DC has a simple self-healing mechanism which will ensure normal operation when loading assets; however, it will not fix all the errors.

Is this error found on any DC?

Yes, and significantly so for any DC which has ever done an asset replace. The number of data errors scales linearly with the number of asset replaces.

Why is the error important to fix now if we always lived with it?

The load of asset data needs to be very quick, and to achieve that we need stricter control over the asset data structure.

Installing/Upgrading Digizuite

Info

Always remember to back up the database before upgrading!


The Digizuite DAM Center is installed/upgraded using PowerShell. The package, provided by the Digizuite DAM vendor, is a zip file with the following structure:


(Image: structure of the installation package zip file)

How to install/Upgrade

To install/upgrade the Digizuite DAM Center, do the following:

  1. Edit the Install.ps1 - see the Variables section below for a description of all the parameters
  2. Run Install.ps1 as administrator, either from PowerShell or the PowerShell ISE (see the example below)
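For example, from an elevated PowerShell prompt in the unzipped package folder (the execution-policy line is only needed if script execution is restricted on the server):

# Allow scripts to run for this session only, then start the installer.
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
.\Install.ps1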

Variables

All the parameters that need to be filled out are described in the table below.

Variable | Mandatory | Default value | Example/options | Note | Description
NewInstallation | True | $True | $False | | This variable controls whether it is an update or a new installation (True = new installation, False = update).
PreStagedDatabases | True | $False | $True | | This variable controls whether the installer should create the databases and user if it is a new installation. Set this to True only if the databases have been pre-staged; see the Installation scenario below.

Users

assetstreamUsername | True | | assetstream | Removed in 5.6.1 | This variable is for the username of the storage user. See /wiki/spaces/DD/pages/1322483793, section 2.1.
assetstreamPassword | True | | SuperSecretPassword | Removed in 5.6.1 | This variable is for the password of the storage user.
digiadminUsername | True | | digiadmin | | This variable is for the username of the administrative user.
digiadminPassword | True | | SuperSecretPassword | Removed in 5.6.1 | This variable is for the password of the administrative user.
useActiveDirectory | True | $False | $True | | This variable will set up the web site to require Windows authentication. Do NOT set this to true on new installations. Only use it on updates that are already set up for Active Directory.

Web

port | True | | 443 | | This is the port on which the web interface is installed in the IIS. Usually port 80 or 443.
protocol | True | | https | | The protocol used for the web interface installed in the IIS. Typically HTTP or HTTPS.
siteUrl | True | | dam.company.com | | The DNS name for the web interface which is installed in the IIS.
CorsAllowOrigins | True | @("$protocol://$siteurl") | @("https://dam.company.com", "https://mm.company.com") | | A comma-separated list of allowed CORS origins.

Database

SqlServerWindowsLogin | True | $False | $True | Added in 5.6.1 | True: use integrated security in the master connection string (Windows login) - the current user needs permissions in the database. False: use a SQL login.
dbServerName | True | | sqlserver | | This is used for the SQL server name.
dbUsername | True | | sqluser | | The user who has access to the database.
dbPassword | True | | SuperSecretPassword | | The password of the database user.
collation | True | SQL_Latin1_General_CP1_CI_AS | SQL_Latin1_General_CP1_CI_AS | | The database collation.
dbName | True | | company_dam | | The name of the database to be installed/updated. The name has to end with _dam.
DamdbPassword | False | | SuperSecretPassword | | Password for the DAM database user. If blank it defaults to: admin_{#dbName}
DamRecoveryModel | True | SIMPLE | FULL | | This variable defines what the recovery model of the DAM database should be.

Locations

localStoragePath | True | C:\Storage | C:\Storage | | The local storage path. This is only used if the storage is located on the webserver. Leave as "C:\Storage" if Azure storage is used.
uncStoragePath | False | | \\server\storage | | The UNC storage path. If on the webserver, the installer creates the share itself. If remote, then it has to exist.
SetStorageFilesystemRights | False | $True | $true / $false | Removed in 5.6.1 | Allows disabling the update of storage FileSystemRights. IF DISABLED, RIGHTS MUST BE HANDLED MANUALLY.
logRoot | True | "C:\LogFiles\" + $siteUrl | D:\logFile\dam.company.com | | This is the path to which the log files for the web interface are written.
servicesFolder | True | C:\Services\Digizuite | | | The folder to install the different monitoring services in, such as Prometheus and Grafana; this is the path on which the binaries of the services are placed.
sitePath | True | "C:\Webs\" + $siteUrl | D:\Webs\dam.company.com | | This is the path on which the files for the web interface are placed.

Monitoring

LokiStorageFolder | True | $($ServicesFolder)\loki | | Removed in 5.6.1 | The folder to install Loki in. Logs are stored as part of this folder also.
LokiHostName | True | $env:ComputerName | 10.0.0.5 | | The hostname/IP address of the server that is hosting Loki. Usually it is the web server. If you have multiple web servers, you need to point to the same web server on all machines.

RabbitMQ

InstallRabbitMQ | True | $True | $True / $False | | Should RabbitMQ be installed.
RabbitMQHost | True | | dam.company.com | | The hostname of the RabbitMQ server.
RabbitMQUsername | True | | [aRabbitMqUsername] | | Username for connecting to the RabbitMQ server. Note: guest is not allowed anymore.
RabbitMQPassword | True | | | | Password for connecting to the RabbitMQ server.
RabbitMQVirtualHost | True | | | | VHost on RabbitMQ, normally set to the DAM database name, excluding _dam.
RabbitMQStorage | True | C:\ProgramData\RabbitMQ | | | The location where the RabbitMQ database and logs are stored – only used when RabbitMQ is installed for the first time on the server!
RabbitMQPath | True | C:\Program Files | | Removed in 5.6.1 - uses servicesFolder | Parent folder where the RabbitMQ application is installed.
FirewallRemoteIpAddresses | True | | @("192.168.1.0/24","172.16.1.22") | | IP addresses or address ranges that are allowed for the RabbitMQ and Erlang ports; these are needed because batch machines need access to RabbitMQ. If an empty array is used, all IP addresses are allowed: @(). IPv4 and IPv6 are possible; ranges only with CIDR notation.

SearchService

ForceRepopulateAllSearches | True | $false | | | True = all searches will be repopulated, false = only changed searches will be repopulated.

Elastic search

InstallElasticSearchOnPremise | False | False | True | | Install Elastic Search on premise.
ElasticSearchUrl | True | "http://localhost:9200" | | | The URL to Elasticsearch. See this documentation for more information about Elasticsearch: DC 5.6 Elastic Search.
ElasticSearchApiKey | True | "" | | | The API key required to access Elastic. Leave empty if no API key is required.
ElasticSearchVirtualHost | True | $local:RabbitMQVirtualHost | "mm-dam" | | To support multiple DCs running on the same Elasticsearch cluster, the DC code can separate the names of the indices and aliases it creates, such that indices do not overlap with each other.
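Before running the installer you can check that the configured ElasticSearchUrl is reachable; a sketch (adjust the URL, and add the API key header if one is required):

# Returns cluster name and version information if Elasticsearch is reachable.
Invoke-RestMethod -Uri 'http://localhost:9200'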


SMTP

smtpUsername | False | | smtpuser | | The username of the SMTP user.
smtpPassword | False | | SuperSecretPassword | | The password of the SMTP user.
smtpFromEmail | False | | noreply@company.com | | The e-mail address which should be used to send from.
smtpPort | False | 587 | 587 | | The port used for the SMTP server.
smtpServer | False | | smtp.company.com | | The SMTP server.

Licenses

licenseName | True | | Company | | The license holder name.
licenseSerialNumber | True | | 42 | | The serial number of the license.
licenseApplication | True | | - | | The license for the main application.
licenseAssets | True | | - | | The license for the assets.
licenseUsers | True | | - | | The license for the users.
licenseMetadata | True | | - | | The license for metadata.
licenseOffice | False | | - | | The license for the Office connector.
licenseDamForSitecore | False | | - | | The license for the DAM For Sitecore connector.

Azure

azureStorageAccountBe | False | | BackendStorageCompany | | The storage account name of backend storage.
azureAccessKeyBe | False | | - | | The access key for the backend storage.
azureStorageAccountFe | False | | FrontendStorageCompany | | The storage account name of frontend storage.
azureAccessKeyFe | False | | - | | The access key for the frontend storage.
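As an illustration only (the actual variable block in the shipped Install.ps1 may be named or structured differently), a typical set of values for a small installation could look like this:

# Illustrative values - copy the real variable names from the Install.ps1 in the package.
$NewInstallation    = $True             # $True = new installation, $False = update
$PreStagedDatabases = $False            # $True only if databases/users were pre-staged
$siteUrl            = 'dam.company.com'
$protocol           = 'https'
$port               = 443
$dbServerName       = 'sqlserver'
$dbName             = 'company_dam'     # the name has to end with _dam
$localStoragePath   = 'C:\Storage'
$uncStoragePath     = '\\server\storage'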

Install scenarios

As the same script is used for multiple purposes, this section describes the different scenarios and which parameters to be especially aware of.

Installation

To install a new Digizuite DAM Center with UNC storage, the important parameters are the following:


Important note: The installation script is NOT idempotent, which means if it is run again it will not validate and correct changes to reflect the config that was used in the installation process.
Issues arising from this could be that for example services and webs folder have been created on the C: drive instead of a dedicated drive. 
The only way of correcting these issues are to completely remove everything that the installation script created in this order (some services might need to be deleted by manually killing processes for that service):

  1. Rabbitmq (service, services folder, references in registry)
  2. Erlang (service, services folder, references in registry)

  3. Nats (service, services folder)

  4. Grafana  (service, services folder)

  5. Loki  (service, services folder)

  6. Promtail  (service, services folder)

  7. Windows exporter  (service, services folder)

  8. nssm  (service, services folder)

  9. delete DAM database via SQL management studio

  10. delete SQL security login : "admin_<customername>_dam"
  11. Delete all IIS sites and App pools in IIS and restart IIS
  12. check that all these services have been removed (to delete each: use sc delete "<servicename>" via CMD, not Powershell, and if they still don't get removed, restart server)

    Keep webs files as these are still relevant and can be moved into the new webs folder on the dedicated drive fx. H: drive

    Then finally run the install script again
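For step 12, a service can be checked and removed like this (the service name is an example; sc.exe is used explicitly because plain sc is an alias for Set-Content in PowerShell):

# Check whether the service is still registered, then delete it.
sc.exe query "RabbitMQ"
sc.exe delete "RabbitMQ"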


Variable | Value | Description
NewInstallation | $True | This has to be true on new installations, as otherwise the databases won't be created.
PrestagedDatabases | $True/$False | If the databases and users have not been pre-staged (created), set this to $False, otherwise $True; pre-staging is for installing without sysadmin rights. If it is $False, the databases and the user mapping are checked and corrected (if wrong), but that requires sysadmin rights on the SQL server.
Azure parameters | | Leave them blank if UNC storage is used.

Upgrade

To upgrade Digizuite DAM Center, the following parameters are important:

Variable | Value | Description
NewInstallation | $False | Has to be false on upgrades (True = new installation, False = update).
localStoragePath | Local storage path | Has to point to the local storage path (e.g. C:\Storage).
uncStoragePath | UNC storage path | Has to point to the local share (\\Webserver\Storage).


Local UNC

To use a Local UNC (Local means it is located on the webserver), the following parameters are important:

Variable | Value | Description
localStoragePath | C:\Storage | Has to point to the local storage path (e.g. C:\Storage).
uncStoragePath | UNC storage path | Has to point to the local share (\\Webserver\Storage).

External UNC

To use an external UNC (External means that storage is not located on the webserver), the following parameters are important:


Variable | Value | Description
localStoragePath | C:\Storage | Leave it as: C:\Storage.
uncStoragePath | UNC storage path | Has to point to the external share (\\SomeExternalServer\Storage).

Azure storage (New install)

To use Azure storage, the following parameters are important:

Variable | Value | Description
localStoragePath | Local storage path | If provided, it configures local storage that can be enabled in the future.
uncStoragePath | UNC storage path | If provided, it configures a local UNC share that can be enabled in the future.
azureStorageAccountBe | | Has to be a Microsoft Azure storage account name.
azureAccessKeyBe | | Access key of the above.
azureStorageAccountFe | | Has to be another Microsoft Azure storage account name.
azureAccessKeyFe | | Access key of the above.

The reason there are two storage accounts is that frontend storage (used by satellite products, for instance DAM For Sitecore) and backend storage (used by the administration web interface) are kept separate.

Azure storage (Upgrade)

The same parameters as for a new install are important; the difference is that the Azure storage account and access keys have to correspond to what is already configured in the existing system.

Solr

SolR is now removed from the installer.

Info

Note: For new installs that use the old Optimizely connector, SolR needs to be installed manually.

The following Web.Config AppSettings are still respected by the DC:

<add key="SolRUrl" value="http://localhost:8983/solr" />
<add key="SolRUncPath" value="" />
<add key="SolRUsername" value="" />
<add key="SolRDomain" value="" />
<add key="SolRPassword" value="" />


For updates, the above settings can be copied from web.config.old.
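A small sketch for finding those values again after an upgrade (the path is an example):

# List the SolR-related appSettings from the previous configuration so they can be copied into the new web.config.
[xml]$old = Get-Content 'C:\Webs\dam.company.com\web.config.old'
$old.configuration.appSettings.add | Where-Object { $_.key -like 'SolR*' } | Format-Table key, value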


Azure SQL

Info

The databases created in the Azure portal have to be postfixed with _dam (i.e. YourCompanyName_dam).

Antivirus exclusions

The following areas should be excluded from antivirus scanning.

NOTE: If another antivirus is used, these paths should be excluded via the management interface of that antivirus.

Variable | Value | Description
StorageArea | localStoragePath | Storage area, e.g. G:\Storage\
PrometeusArea | PrometeusDataFolder | Prometheus data folder, e.g. G:\Services\prometheus-2.19.0.windows-amd64\data
SolrArea | SolrFolder | Solr folder - can be found via the AppSetting 'SolRUncPath' in web.config, e.g. C:\Bitnami\solr-8.3.1-1\apache-solr\server\solr
ElasticSearchArea | OnPrem Elastic Search data folder | Elastic Search data folder, e.g. G:\Services\ElasticSearch\data
LogArea | LogRoot | Log area, e.g. G:\LogFiles\dc.example.com
RabbitStorage | RabbitMQ data area | RabbitMQ data area, e.g. the RABBITMQ_BASE environment variable
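If Windows Defender is used, the areas can be excluded with Add-MpPreference; a sketch with example paths matching the table above:

# Add antivirus exclusions for the storage, log and RabbitMQ data areas (paths are examples).
Add-MpPreference -ExclusionPath 'G:\Storage\', 'G:\LogFiles\dc.example.com', 'C:\ProgramData\RabbitMQ'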