This section provides information on how to install Digizuite DAM Center.
Info |
---|
SolR has been removed from the 5.6.1 installer. Existing configuration settings in Web.Config (SolRUrl and SolRUncPath) are maintained during upgrade. Important for DamForOptimizely or other DC customers that still need SolR: the Windows group "AppPools-<SITENAME>" needs R/W permissions for the SolRUncPath folder. |
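If SolR is still required, the permission mentioned above can be granted from PowerShell. This is a minimal sketch; the group name and UNC path below are placeholders for your own site name and SolRUncPath value.

```powershell
# Minimal sketch: grant the site's application-pool group modify (read/write) access
# to the SolR UNC folder. The group name and path are placeholders.
$group = 'AppPools-dam.company.com'
$path  = '\\server\solr'
icacls $path /grant "${group}:(OI)(CI)M" /T
```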
Before upgrading
If you are installing a new DC, skip this step.
Info |
---|
If you are upgrading from before 5.3 to this version, you must run this application! |
This app should be run to clean up a table which is likely to have data errors. This is done as an app to avoid a potentially long-running SQL script as part of the DAM update process.
The app: CleanupAssetDigiupload
Ensure all queues in RabbitMQ are empty. We do not support upgrading with data left in RabbitMQ.
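One way to verify this is to list the queues and their message counts with rabbitmqctl; every queue should report 0 messages before you upgrade. The virtual host name below is an example.

```powershell
# List all queues on the DAM virtual host with their message counts (they should all be 0).
# Replace "company" with your RabbitMQVirtualHost value.
rabbitmqctl list_queues -p company name messages
```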
Install ElasticSearch: DC 5.6 Elastic Search
Running the app
Do a database backup first.
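A minimal backup sketch, assuming the SqlServer PowerShell module is available; the server instance, database name, and backup path are example values.

```powershell
# Back up the DAM database before running the cleanup app.
# ServerInstance, Database and BackupFile are example values - use your own.
Backup-SqlDatabase -ServerInstance 'sqlserver' -Database 'company_dam' `
    -BackupFile 'D:\Backups\company_dam_pre_upgrade.bak'
```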
- Unzip the zip-file to a new folder.
- Edit appsettings.json and set the connection string to match that of the DC. It can be copy-pasted from the DC web.config (see the sketch after this list).
- Open a command prompt in the folder made for the app. The only logging is written to the prompt, so this is the only way to follow progress.
- Run CleanupAssetDigiupload.exe.
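As a sketch of the appsettings.json step, the connection string can be copied in with a few lines of PowerShell. The property name "ConnectionString" is an assumption; check the file shipped with the app before relying on it.

```powershell
# Copy the DC connection string into the cleanup app's appsettings.json.
# The property name "ConnectionString" is assumed - verify it against the shipped file.
$settingsPath = '.\appsettings.json'
$settings = Get-Content $settingsPath -Raw | ConvertFrom-Json
$settings.ConnectionString = 'Data Source=sqlserver;Initial Catalog=company_dam;User ID=sqluser;Password=SuperSecretPassword'
$settings | ConvertTo-Json -Depth 10 | Set-Content $settingsPath
```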
Questions
Why do I need to do this?
There is a table belonging to the core asset data which has data errors that prevent the normal load into the memory cache introduced in DC 5.4.
What happens with the asset data load if there are data errors?
The DC has a simple self-healing mechanism which will ensure normal operation when loading assets, however it will not fix all the errors.
Is this error found on any DC?
Yes, significantly so for any DC which has ever done an asset replace. The number of data errors scales linearly with the number of asset replaces.
Why is the error important to fix now if we have always lived with it?
The load of asset data needs to be very quick, and to achieve that we need stricter control of the asset data structure.
Installing/Upgrading Digizuite
Info |
---|
Always remember to back up the database before upgrading! |
The Digizuite DAM Center is installed/upgraded using PowerShell. The package, provided by the Digizuite DAM vendor, is a zip file.
How to install/Upgrade
To install/upgrade the Digizuite DAM Center, do the following:
- Edit the Install.ps1 - see section 1.2 for a description of all the parameters
- Run the Install.ps1 as administrator, either from PowerShell or the PowerShell ISE (see the sketch after this list)
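A minimal sketch of the second step, run from an elevated PowerShell session; the folder name is an example of wherever the package was unzipped, and the execution-policy bypass is just one way to allow the script to run.

```powershell
# Run the installer from an elevated PowerShell prompt.
# The folder below is an example of where the package was unzipped.
Set-Location 'C:\Install\DigizuiteDAMCenter'
powershell.exe -ExecutionPolicy Bypass -File .\Install.ps1
```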
Variables
All the parameters that need to be filled out are described in the table below:
Variable | Mandatory | Default value | Example /options | Note | Description |
---|---|---|---|---|---|
NewInstallation | True | $True | $False | This variable controls whether it is an update or a new installation (True = new installation, false = update) | |
PreStagedDatabases | True | $False | $True | This variable controls whether the installer should create the databases and user if it is a new installation. Set this to True only if the databases have been pre-staged, see section | |
Users | |||||
assetstreamUsername | True | assetstream | Removed In 5.6.1 | This variable is for the username of the storage user. See /wiki/spaces/DD/pages/1322483793, section 2.1 | |
assetstreamPassword | True | SuperSecretPassword | Removed In 5.6.1 | This variable is for the password of the storage user. | |
digiadminUsername | True | digiadmin | This variable is for the username of the administrative user. | ||
digiadminPassword | True | SuperSecretPassword | Removed In 5.6.1 | This variable is for the password of the administrative user | |
useActiveDirectory | True | $False | $True | This variable will setup the web site to require windows authentication. Do NOT set this to true on NewInstallations. Only use it on updates that are already set up for active directory. | |
Web | |||||
port | True | 443 | This is the port on which the web interface is installed in the IIS. Usually port 80 or 443 | ||
protocol | True | https | The protocol used for the web interface installed in the IIS. Typically HTTP or HTTPS | ||
siteUrl | True | dam.company.com | The DNS for the web interface which is installed in the IIS | ||
CorsAllowOrigins | True | @("$protocol://$siteurl") | @("https://dam.company.com", "https://mm.company.com") | A comma-separated list of allowed cors origins | |
Database | |||||
SqlServerWindowsLogin | True | $False | $True | Added in 5.6.1 | True: Use Integrated security in Master Connection String (Windows login) - Current user needs permissions in Db False: Use SQL Login |
dbServerName | True | sqlserver | This is used for the SQL server name | ||
dbUsername | True | sqluser | The user who has access to the database | ||
dbPassword | True | SuperSecretPassword | The password of the database user | ||
collation | True | SQL_Latin1_General_CP1_CI_AS | SQL_Latin1_General_CP1_CI_AS | The database collation | |
dbName | True | company_dam | The name of the database to be installed/updated. The name has to end with _dam. | |
DamdbPassword | False | SuperSecretPassword | Password for the DAM database user. If blank it defaults to: admin_{#dbName} | |
DamRecoveryModel | True | SIMPLE | FULL | This variable defines what the recovery model of the DAM database should be | |
Locations | |||||
localStoragePath | True | C:\Storage | C:\Storage | The local storage path. This is only used if the storage is located on the webserver. Leave as "C:\Storage" if Azure storage is used | |
uncStoragePath | False | \\server\storage | The UNC storage path. If on the webserver, the installer creates the share itself. If remote, then it has to exist. | ||
SetStorageFilesystemRights | False | $True | $true $false | Removed in 5.6.1 | Allow disabling update of Storage FileSystemRights IF DISABLED RIGHTS MUST BE HANDLED MANUALLY |
logRoot | True | "C:\LogFiles\" + $siteUrl | D:\logFile\dam.company.com | | This is the path to which the log files are written |
servicesFolder | True | C:\Services\Digizuite | C:\Services\Digizuite | | This is the path on which all the binaries of the services are placed. |
sitePath | True | "C:\Webs\" + $siteUrl | D:\Webs\dam.company.com | | This is the path on which the files for the web interface are placed. |
Monitoring | |||||
LokiStorageFolder | True | $($MonitoringServicesFolder)\loki | | Removed in 5.6.1 | The folder to install Loki in. Logs are also stored as part of this folder. |
LokiHostName | True | | 10.0.0.5 | | The hostname/IP address of the server that is hosting Loki. Usually it is the web server. If you have multiple web servers, you need to point to the same web server on all machines. |
RabbitMQ | |||||
InstallRabbitMQ | True | $True | $True $False | Should RabbitMQ be installed | |
RabbitMQHost | True | dam.company.com | The hostname of the RabbitMQ Server | ||
RabbitMQUsername | True | [aRabbitMqUsername] | Username for connecting to RabbitMQ Server Note: guest is not allowed anymore | ||
RabbitMQPassword | True | Password for connecting to RabbitMQ Server | |||
RabbitMQVirtualHost | True | VHost on RabbitMQ, normally set to dam database name, excluding _dam | |||
RabbitMQStorage | True | C:\ProgramData\RabbitMQ | The location where RabbitMQ Database and logs are stored – only used when RabbitMQ is installed for the first time on the server! | ||
RabbitMQPath | True | C:\Program Files | | Removed in 5.6.1 - uses servicesFolder | Parent folder where the RabbitMQ application is installed. |
FirewallRemoteIpAddresses | True | | @("192.168.1.0/24","172.16.1.22") | | IP addresses or address ranges that are allowed for the RabbitMQ and Erlang ports; these are needed because the Batch machines need access to RabbitMQ. If an empty array (@()) is used, all IP addresses are allowed. IPv4 and IPv6 are possible, ranges only with CIDR notation. |
SearchService | |||||
ForceRepopulateAllSearches | True | $false | | | True = all searches will be repopulated, false = only changed searches will be repopulated |
Elastic search | |||||
InstallElasticSearchOnPremise | False | False | True | | Install Elastic Search on-premise |
ElasticSearchUrl | True | "http://localhost:9200" | | | The URL to Elasticsearch. See this documentation for more information about Elasticsearch: DC 5.6 Elastic Search Cloud |
ElasticSearchApiKey | True | "" | | | The API key required to access Elastic. Leave empty if no API key is required. |
ElasticSearchVirtualHost | True | $local:RabbitMQVirtualHost | "mm-dam" | | To support multiple DCs running on the same Elasticsearch cluster, the DC code can separate the names of indices and aliases it creates, such that indices do not overlap with each other. |
SMTP | |||||
smtpUsername | False | smtpuser | The username of the smtp user | ||
smtpPassword | False | SuperSecretPassword | The password of the smtp user | ||
smtpFromEmail | False | noreply@company.com | The e-mail which should be used to send from | ||
smtpPort | False | 587 | 587 | The port used for the smtp server | |
smtpServer | False | smtp.company.com | The smtp server | ||
Licenses | |||||
licenseName | True | Company | The license holder name | ||
licenseSerialNumber | True | 42 | The serial number of the license | ||
licenseApplication | True | - | The license for the main application | ||
licenseAssets | True | - | The license for the assets | ||
licenseUsers | True | - | The license for the users | ||
licenseMetadata | True | - | The license for metadata | ||
licenseOffice | False | - | The license for the office connector | ||
licenseDamForSitecore | False | - | The license for the Dam For Sitecore connector | ||
Azure | |||||
azureStorageAccountBe | False | BackendStorageCompany | The storage account name of backend storage | ||
azureAccessKeyBe | False | - | The access key for the backend storage | ||
azureStorageAccountFe | False | FrontendStorageCompany | The storage account name of frontend storage | ||
azureAccessKeyFe | False | - | The access key for the frontend storage |
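As an illustration of what the edited values in Install.ps1 might look like, the snippet below sets a handful of the parameters from the table. The exact variable syntax in the shipped script may differ, so treat this as a sketch rather than a template.

```powershell
# Example values only - replace with your own environment before running Install.ps1.
$NewInstallation     = $True
$PreStagedDatabases  = $False
$digiadminUsername   = 'digiadmin'
$port                = 443
$protocol            = 'https'
$siteUrl             = 'dam.company.com'
$CorsAllowOrigins    = @("${protocol}://${siteUrl}")
$dbServerName        = 'sqlserver'
$dbName              = 'company_dam'
$RabbitMQVirtualHost = 'company'
$licenseName         = 'Company'
```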
Install scenarios
As the same script is used for multiple purposes, this section describes the different scenarios and which parameters to be especially aware of in each.
New installation
To install a new Digizuite DAM Center with UNC storage, the important parameters are the following:
Variable | Value | Description |
---|---|---|
NewInstallation | $True | This has to be true on new installations, as otherwise databases won't be created |
PreStagedDatabases | $True/$False | If databases and users have not been pre-staged (created), set this to $False; otherwise set it to $True. This is for installing without sysadmin rights |
Azure parameters | | Leave them blank if UNC storage is used |
Upgrade
To upgrade Digizuite DAM Center, the following parameters are important:
Variable | Value | Description |
---|---|---|
NewInstallation | $False | Has to be false on upgrades (True = new installation, False = update) |
PreStagedDatabases | $True/$False | If this is false, the databases and the user mapping are checked and corrected (if wrong), but that requires sysadmin rights on the SQL server |
Important note: The installation script is NOT idempotent, which means that if it is run again it will not validate and correct changes to reflect the configuration that was used in the installation process.
Issues arising from this could be, for example, that the services and webs folders have been created on the C: drive instead of a dedicated drive.
The only way of correcting these issues is to completely remove everything that the installation script created, in this order (some services might need to be deleted by manually killing the processes for that service):
- RabbitMQ (service, services folder, references in registry)
- Erlang (service, services folder, references in registry)
- Nats (service, services folder)
- Grafana (service, services folder)
- Loki (service, services folder)
- Promtail (service, services folder)
- Windows exporter (service, services folder)
- nssm (service, services folder)
- Delete the DAM database via SQL Management Studio
- Delete the SQL security login: "admin_<customername>_dam"
- Delete all IIS sites and app pools in IIS and restart IIS
- Check that all these services have been removed (to delete one, use sc delete "<servicename>" via CMD, not PowerShell - see the sketch after this list; if a service still does not get removed, restart the server)
- Keep the webs files, as these are still relevant and can be moved into the new webs folder on the dedicated drive, e.g. the H: drive
Then finally run the install script again.
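The sketch below shows one way to look for leftover services and remove one of them. Note that sc.exe must be called with its full name from PowerShell, because plain sc is an alias for Set-Content (which is why the step above recommends CMD). The service name pattern is illustrative.

```powershell
# List any leftover services from a previous installation attempt, then delete one of them.
# "sc" alone is an alias for Set-Content in PowerShell, so use sc.exe (or run from CMD).
Get-Service | Where-Object { $_.Name -match 'rabbitmq|erlang|nats|grafana|loki|promtail|windows_exporter|nssm' }
sc.exe delete "RabbitMQ"
```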
Local UNC
To use a Local UNC (Local means it is located on the webserver), the following parameters are important:
Variable | Value | Description |
---|---|---|
localStoragePath | Local storage path | Has to point to the local storage path (e.g. C:\Storage) |
uncStoragePath | UNC storage path | Has to point to the local share (\\Webserver\Storage) |
External UNC
To use an external UNC (External means that storage is not located on the webserver), the following parameters are important:
Variable | Value | Description |
---|---|---|
localStoragePath | C:\Storage | Leave it as: C:\Storage |
uncStoragePath | UNC storage path | Has to point to the external share (\\SomeExternalServer\Storage) |
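If the external share does not exist yet, it can be created on the storage server beforehand. A minimal sketch using the SmbShare module is shown below; the share name, path, and account are example values.

```powershell
# Run on the external storage server. Share name, path and account are example values;
# the account must match whatever identity the DC uses to access the storage.
New-Item -ItemType Directory -Path 'D:\Storage' -Force | Out-Null
New-SmbShare -Name 'Storage' -Path 'D:\Storage' -FullAccess 'DOMAIN\DamServiceAccount'
```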
Azure storage (New install)
To use Azure storage, the following parameters are important
Variable | Value | Description |
---|---|---|
localStoragePath | Local storage path | If provided, it configures local storage that can be enabled in the future |
uncStoragePath | UNC storage path | If provided, it configures a local UNC share that can be enabled in the future |
azureStorageAccountBe | | Has to be a Microsoft Azure storage account name |
azureAccessKeyBe | | Access key of the above |
azureStorageAccountFe | | Has to be another Microsoft Azure Storage account name |
azureAccessKeyFe | | Access key of the above |
The reason there are two storage accounts is that frontend (used by satellite products, for instance, DAM For Sitecore) and backend (used by the administration web interface) storage is separated.
Azure storage (Upgrade)
The same parameters as the new install are important, but the difference is, that the azure storage account and accesskeys have to correspond to what is already configured in the existing system.
Solr
SolR is now removed from the installer.
Info |
---|
Note: For new installations that use the old Optimizely connector, SolR needs to be installed manually. The following Web.Config AppSetting is still respected by the DC: <add key="SolRUrl" value="http://localhost:8983/solr" />. For updates, the above setting can be copied from web.config.old. |
Azure SQL
Info |
---|
The databases created in the Azure portal have to be postfixed with _dam (i.e. YourCompanyName_dam) |
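For reference, a pre-created Azure SQL database following this naming rule could be provisioned as sketched below, assuming the Az.Sql PowerShell module; the resource group and server names are placeholders.

```powershell
# Create the DAM database in Azure SQL with the required _dam postfix.
# Resource group, server and database names are placeholders.
New-AzSqlDatabase -ResourceGroupName 'rg-dam' -ServerName 'sql-dam' -DatabaseName 'YourCompanyName_dam'
```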
Antivirus exclusions
NOTE: If another antivirus is used, the paths below should be excluded via the management interface of that antivirus.
Variable | Value | Description |
---|---|---|
StorageArea | localStoragePath | Storage area, e.g. G:\Storage\ |
PrometeusArea | PrometeusDataFolder | Prometheus data folder, e.g. G:\Services\prometheus-2.19.0.windows-amd64\data |
ElasticSearchArea | OnPrem Elastic Search data folder | Elastic Search data folder, e.g. G:\Services\ElasticSearch\data |
LogArea | LogRoot | Log area, e.g. G:\LogFiles\dc.example.com |
RabbitStorage | RabbitMQ data area | RabbitMQ data area, e.g. the RABBITMQ_BASE environment variable |
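If Windows Defender is used and the exclusions need to be added manually, Add-MpPreference can register the folders from the table above. The paths below are the same example values as in the table.

```powershell
# Add the storage, monitoring, search, log and RabbitMQ data folders as Defender exclusions.
# Paths are the example values from the table above - use your own locations.
$paths = @(
    'G:\Storage\',
    'G:\Services\prometheus-2.19.0.windows-amd64\data',
    'G:\Services\ElasticSearch\data',
    'G:\LogFiles\dc.example.com',
    $env:RABBITMQ_BASE
)
$paths | Where-Object { $_ } | ForEach-Object { Add-MpPreference -ExclusionPath $_ }
```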