BS Cloud Streams

So what are Cloud Streams?

A Cloud Stream is a packaged part of a Contentserv project that can be transported between two projects or downloaded from the Contentserv Marketplace.

Cloud Streams are identified by a six-character Cloud Stream ID, such as A533C0.
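If you automate sending or receiving, a quick plausibility check on a Stream ID entered by a user can avoid failed lookups. A minimal sketch in Python, assuming six hexadecimal characters as suggested by the example above (this restriction is an assumption, the documentation only states the length):

import re

# Hypothetical helper: the documentation only guarantees six characters
# (e.g. A533C0); limiting them to hexadecimal digits is an assumption.
STREAM_ID_PATTERN = re.compile(r"^[0-9A-Fa-f]{6}$")

def looks_like_stream_id(value: str) -> bool:
    """Return True if the given string plausibly is a Cloud Stream ID."""
    return bool(STREAM_ID_PATTERN.match(value.strip()))

print(looks_like_stream_id("A533C0"))  # True
print(looks_like_stream_id("A533"))    # False, too short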

The transport is executed via a cloud storage broker, which allows persisting the package on the cloud server for future download. This way Cloud Streams can be

  • sent like a letter from one project to another, but also be

  • sent to “the cloud” and received by as many projects, and as many times, as needed.

The diagram above shows

  • Cloud Tenant 1, which wants to send or publish a stream

  • the cloud in the middle, which is the virtual repository that stores the stream permanently

  • Cloud Tenant 2, which wants to receive or subscribe to a stream

As streams are stored in the cloud, they can be received not only by Cloud Tenant 2 but also by any other tenant that knows the Stream ID and, if required, the security token.

All in all, Cloud Streams support three major scenarios:

  • Quickly transfer configurations, for example from a Test System to a QA System or from a QA System to a Production System

  • Go “shopping” on the Contentserv Marketplace and download data models, value ranges, workflows, etc.

  • Once the transfer of products is implemented, Cloud Streams could even connect an onboarding supplier system with an internal productive PIM.

Shop the Contentserv Marketplace

Contentserv Business Solutions, as well as Contentserv partners, can publish certified Cloud Streams on the Contentserv Marketplace.

 

Sending Cloud Streams

Cloud Streams can be sent within the Contentserv application through the context menu in the list or toolbar menu of the editors for all supported data types.

To send, for example, a list of selected currencies, call up the context menu and choose “Send Stream”:

In the upcoming dialog, you can define

  • an optional recipient

  • a description

  • an expiration definition

  • an optional security token, which is required to receive the stream

  • an optional admin URL of the target system plus, optionally, a user name and password

  • and of course, you can redefine the list of selected currencies:

When you press OK, the stream is created and uploaded to the cloud repository. As a result

  • you will see a confirmation with the Stream ID and

  • the optional recipient will receive an email with the Stream ID and installation instructions.
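Conceptually, the dialog collects nothing more than a small set of send options. The following sketch models them as a plain data structure; the field names are illustrative only and do not reflect the internal Contentserv format:

from datetime import datetime, timedelta, timezone

# Illustrative model of the "Send Stream" dialog options (not the real API).
send_options = {
    "recipient": "pim-admin@example.com",                 # optional
    "description": "Selected currencies for the QA system",
    "expires_at": (datetime.now(timezone.utc) + timedelta(days=30)).isoformat(),
    "security_token": "s3cret-token",                     # optional, needed to receive
    "target_admin_url": "https://qa.example.com/admin/",  # optional
    "target_user": "streamuser",                          # optional
    "selected_items": ["EUR", "USD", "CHF"],              # the records to stream
}

print(send_options["description"])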

Receiving Cloud Streams

To receive or install a Cloud Stream, at least the Stream ID is required. Additionally, Cloud Streams can be sent along with a security token, which is then also required when receiving the stream.

To install a stream simply call up the main hamburger menu with the three dashes and select “Receive Cloud Stream”:

In the upcoming dialog, you can

  • either search for a published stream on the marketplace

  • or directly provide the Cloud Stream ID at the bottom right, next to the OK button:

Click on “OK” to continue with the receiving options:

Here you can provide

  • the optional security token

  • opt for a dry run to see what will be changed first

  • and select or deselect the stream contents.

Click OK to finally receive the stream.

The provider and receiver of the stream will be notified by email during the installation:

When the stream that was sent from the source system is received, it is rebuilt as faithfully as possible on the target system:

Streams will

  • update existing records

  • create missing records while trying to recreate the same database ID if available

  • and will not delete records that are only on the target system

To identify existing records, each data type decides for itself which field to use as the unique identifier. Typically the external key is used, but if it does not exist, the name or label of the record is used instead (for example the “Name” of a currency).
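The merge rules above can be summarized in a short sketch. This is a simplified model of the described behavior, not the actual Contentserv implementation; the identifier fallback and the database ID handling follow the text:

def merge_record(incoming: dict, target_records: dict) -> dict:
    """Apply the documented merge rules to one incoming stream record.

    target_records maps unique identifiers to existing records; records
    that exist only on the target system are never deleted.
    """
    # Identifier fallback as described: external key first, then name/label.
    identifier = incoming.get("external_key") or incoming.get("name")

    existing = target_records.get(identifier)
    if existing is not None:
        existing.update(incoming)              # update the existing record
        return existing

    new_record = dict(incoming)
    used_ids = {r["id"] for r in target_records.values() if "id" in r}
    wanted_id = incoming.get("id")
    if wanted_id is None or wanted_id in used_ids:
        # The original database ID is taken, so a new one is assigned.
        new_record["id"] = max(used_ids, default=0) + 1
    target_records[identifier] = new_record    # create the missing record
    return new_record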

Language Mapping between sending and receiving System Languages

Many stream types have language-specific data to transport. When the data is packed in the source system, the header of the Cloud Stream contains a list of all languages of the source system, including their IDs and Short Names. This allows the receiving system to understand which Language Short Names are packed and allows the receiver to map these languages to the local target languages:
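A simplified sketch of such a mapping, assuming the stream header carries the source languages as ID/short-name pairs (the exact header structure shown here is an assumption):

# Hypothetical stream header with the source system languages, as described
# above; the exact structure is an assumption for illustration.
stream_header_languages = [
    {"id": 1, "short_name": "eng"},
    {"id": 2, "short_name": "ger"},
    {"id": 3, "short_name": "fra"},
]

# Languages of the local target system, keyed by short name.
target_languages = {"eng": 10, "ger": 20, "spa": 30}

def map_languages(source, target):
    """Map source language IDs to target language IDs by short name."""
    mapping, unmapped = {}, []
    for lang in source:
        if lang["short_name"] in target:
            mapping[lang["id"]] = target[lang["short_name"]]
        else:
            unmapped.append(lang["short_name"])   # needs manual mapping
    return mapping, unmapped

mapping, unmapped = map_languages(stream_header_languages, target_languages)
print(mapping)    # {1: 10, 2: 20}
print(unmapped)   # ['fra'] – must be mapped manually in the dialog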

Streaming Configuration and Implementation

Some Cloud Streams transfer not only configuration (for example a Smart Preset) but also implementation (the corresponding Smart Template). In such a case the corresponding implementation files are packed into the Cloud Stream as a ZIP file and then unpacked on the target system, creating or overwriting the required files. So be careful when installing such streams: make sure you do not overwrite important changes to such files on the target system, or have proper backups in place before the installation.

Streaming Plugins and Modules

Note: In the Contentserv Cloud, streaming plugins is not possible. You can, however, embed plugins in a module and then stream the module.

It is also possible to transfer pure implementation folders containing plugins, icons, skins, or complete modules with Cloud Streams. To do so, select the data type “Plugins” when configuring a stream. The complete folder, including all subfolders and files, will be packed into the stream.

When such a stream is received, the exact same folder with all subfolders and files is deployed at the exact same position as on the sending system. If a folder with the same name exists at the target location, it is merged with the folders and files of the Cloud Stream: existing files are completely replaced by the Cloud Stream version, while other files and folders found there remain untouched.
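The folder merge behavior can be illustrated with a short sketch; this is a simplified model of the described deployment, not the actual code:

import shutil
from pathlib import Path

def deploy_stream_folder(stream_folder: str, target_folder: str) -> None:
    """Merge an unpacked stream folder into the target location.

    Files contained in the stream replace their counterparts; files and
    folders that exist only on the target system are left untouched.
    """
    source = Path(stream_folder)
    target = Path(target_folder)
    for path in source.rglob("*"):
        destination = target / path.relative_to(source)
        if path.is_dir():
            destination.mkdir(parents=True, exist_ok=True)
        else:
            destination.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, destination)   # overwrite any existing file

# Example call with hypothetical paths:
# deploy_stream_folder("data/cache/bs/cloudstream/2ab450/plugins/myplugin",
#                      "plugins/myplugin")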

Note: Receiving plugin or module streams must be permitted by the Cloud Stream security policy setting on the target server. While plugin streams can always be sent, they can only be retrieved when they comply with the global option “Security Policy for Code Streams”:

See the chapters “Roles and Rights” and “Options and Security Policies” for further information.

Permanent Streams

One of the basic ideas of Cloud Streams is to let data and configuration flow regularly from one system to another. This permanent streaming is actually the motivation for the name Cloud Streams.

As the sending and receiving of Cloud Streams are built on top of the Active Script layer, such a permanent stream can be configured as a special Active Script that is then executed at a custom frequency.

All Active Scripts around Cloud Streams can be found in the System Preferences tree in the sub-node “Cloud Streams”:

To be accessible here, make sure that the user's role has the required right

BS / Cloud Stream / Administrate Cloud Streams:

Within the Cloud Stream node, the following sub-nodes can be found:

Permanent Published Streams

Published streams are streams that are permanently stored and can be sent or resent at any time, including on a schedule. When a published stream is sent for the very first time, a random Stream ID is created and stored with the stream. Future updates to the same Stream ID then overwrite the published stream. This way, future changes to the stream data will sooner or later be resent and update the published stream.
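The Stream ID handling of a published stream can be modeled as follows (an illustrative sketch of the described behavior; the upload helper is a placeholder):

import secrets

def upload_to_cloud(stream_id: str, config: dict) -> None:
    """Placeholder for the actual upload to the cloud repository."""
    print(f"uploading stream {stream_id} ...")

def publish(stream_config: dict) -> str:
    """Send a published stream, reusing its Stream ID on later runs."""
    stream_id = stream_config.get("stream_id")
    if stream_id is None:
        # First run: create and persist a random six-character Stream ID.
        stream_id = secrets.token_hex(3).upper()   # e.g. 'A533C0'
        stream_config["stream_id"] = stream_id
    # Every later run overwrites the published stream under the same ID.
    upload_to_cloud(stream_id, stream_config)
    return stream_id

config = {"description": "Currency list"}
print(publish(config))   # a new random ID on the first run
print(publish(config))   # the same ID is reused and the stream is overwritten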

Permanent Subscribed Streams

Subscribed streams receive published streams. To identify the right stream, the same Stream ID must be provided that was used to send the published stream.

Transport Streams

It is possible to create permanent streams that are completely transported end-to-end from the sender to the receiver. To create such a transport stream, create a permanent published stream and provide the

  • Receiving URL to the target system admin folder

  • Receiving user on the target system

  • Password for this receiving user

When you run this script, the stream is sent to the cloud as usual, and the target system is then called up via REST Service to install the stream immediately.
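Conceptually, the end-to-end transport combines the usual cloud upload with an immediate remote install call against the receiving admin URL. A hedged sketch using Python and the requests library; the endpoint path shown is purely hypothetical, as the real internal REST route is not documented here:

import requests

def trigger_remote_install(admin_url: str, user: str, password: str,
                           stream_id: str, token: str | None = None) -> None:
    """Ask the target system to receive a stream immediately.

    The endpoint path "rest/cloudstream/receive" is a hypothetical
    illustration; the REST route used internally may differ.
    """
    response = requests.post(
        admin_url.rstrip("/") + "/rest/cloudstream/receive",
        auth=(user, password),
        json={"streamId": stream_id, "securityToken": token},
        timeout=60,
    )
    response.raise_for_status()

# trigger_remote_install("https://qa.example.com/admin/", "streamuser",
#                        "secret", "A533C0")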

The log on the sender system indicates this at the end of the script:

The import script on the target system will be the internal “BS CloudStream Receive” script; its log can be found here:

Internal Streams

When a user simply selects a bunch of items in a list and calls up the Cloud Stream / Send Stream context menu, then for the sake of simplicity the data is transferred using an internal default Cloud Stream Active Script. Likewise, when simply receiving a Stream ID, such an internal default Cloud Stream is used. Without these internal scripts, the user would have to create a permanent stream every time some data should be sent.

These internal streams are called

  • BS Cloud Stream Receiver

  • BS Cloud Stream Sender

They can be listed here so that the internal processes can be tracked and traced.

Log of Streams

As all streams are sent and received via Active Jobs there is a complete list and history of all streaming activities:

When you click on a stream log, you will see the messages generated in the lower area. Click on the “Level” drop-down to focus on problems or even show additional debug messages.

Notarization of Cloud Streams

When publishing or receiving a Cloud Stream, the metadata of the transaction is notarized by the central Cloud Stream server. This makes it possible to verify the stream content, sender, and receiver at any time and also enables support to review issues. Each transaction receives a Cloud Stream notarization transaction number (TAN) that is displayed to the sender and receiver in the Cloud Stream log.

The notarization is processed automatically in the background against the Contentserv Cloud Server and is mandatory unless a private streaming server is configured.

Installation and Setup

To use Cloud Streams you need to check out at least the following modules within the Contentserv project folder:

  • [Project]/modules/alani (which is, as usual, the base foundation of all business modules)

  • [Project]/modules/cloudstream (where the cloud streams are located)

The SVN branches for Business Solutions are listed at https://contentserv.atlassian.net/wiki/spaces/DOCU/pages/1209172593/Installation+and+Licensing.

After the initial checkout, log off and back on to your project to flush the plugin cache.

Then make sure to activate the Cloud Streams in the global System Preferences Settings at BS / License / BS CloudStream:

Finally activate the required rights for the user roles:

Roles and Rights

To send or retrieve Cloud Streams, additional rights are required for the users' roles:

The following rights are available at BS / Cloud Stream:

  • Access to Studio: Required to use the Cloud Stream Studio

  • Send Cloud Streams: Required to create and send Cloud Streams

  • Receive Cloud Streams: Required to receive Cloud Streams

  • Administrate Cloud Streams: Required to see the Cloud Streams in the global system preferences tree

List of 58 available Stream Types

To be “streamable”, every specific area of configuration requires an explicit implementation provided through a Stream Plugin. If you, for example, want to send a Smart Preset along with the corresponding Smart Template in a Cloud Stream, a developer first has to provide a Cloud Stream Plugin that defines exactly what has to be packed into the stream and how to unpack it again on the target system.
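Conceptually, each Stream Plugin answers two questions: how to pack its data on the source system and how to unpack it on the target system. The following is a purely conceptual sketch in Python; the real plugins are implemented against the Contentserv plugin API, which looks different and is not shown here:

from abc import ABC, abstractmethod

class StreamTypePlugin(ABC):
    """Conceptual model of a Cloud Stream plugin (not the real API)."""

    @abstractmethod
    def pack(self, selected_ids: list[int]) -> dict:
        """Collect everything the stream needs on the source system."""

    @abstractmethod
    def unpack(self, payload: dict) -> None:
        """Recreate the records on the target system."""

class CurrencyStreamPlugin(StreamTypePlugin):
    def pack(self, selected_ids):
        # In a real plugin this would read the selected currencies.
        return {"type": "Currency", "records": [{"id": i} for i in selected_ids]}

    def unpack(self, payload):
        for record in payload["records"]:
            print("creating or updating currency", record["id"])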

For the following areas Cloud Stream Plugins are available:

  • Active Scripts

  • Amazon Attributes

  • Cockpits

  • Color

  • Comet Files

  • Comet Page Templates

  • Comet Product Templates

  • Comet Shares

  • Currency

  • Dashboards

  • Data Monitor

  • Data Quality Rules

  • DeepSearch Areas

  • DeepSearch Contents

  • DeepSearch Facets

  • DeepSearch Themes

  • Flex Table Presets

  • Languages

  • MAM Archives

  • MAM Attribute Rights

  • MAM Attributes

  • MAM Classes

  • MAM Export Presets

  • MAM File Deletions (Pro)

  • MAM Files (Pro)

  • Measure Units

  • Modules

  • OpenSearch Areas

  • Options

  • PIM Attribute Rights

  • PIM Attributes

  • PIM Channel Deletions (Pro)

  • PIM Channels (Pro)

  • PIM Classes

  • PIM Product Deletions (Pro)

  • PIM Products (Pro)

  • Plugins

  • Portals

  • REST Services

  • Roles

  • Search Favorites

  • Skin Files

  • Skin Settings

  • Smart Presets and Templates

  • Tags

  • Task Attribute Rights

  • Task Attributes

  • Task Classes

  • Terminology Rules

  • Terminology Terms

  • Translation Jobs

  • User (Pro)

  • User Attribute Rights

  • User Attributes

  • User Classes

  • Value List Items

  • Value List Ranges

  • Workflows

Streaming PIM Products and Channels, MAM Files and Users

Please note that the streaming of Products, Channels, Files, and Users has been added recently and is still in beta testing. Do not transfer these types in production environments. In particular, the transfer of references, table data, and the data model along with the streamed master data is not supported yet.

Options and Security Policies

In the global options of BS Cloud Streams, several options are available:

  • Security Token Enforcement: Defines whether users have to opt for a security token when sending streams.

  • Disabled Stream Types: Allows limiting the types of data that can be sent and retrieved.

Streaming Plugins and Modules

Within the Stream Types, there are two very critical stream types that allow deploying software code files between servers:

  • Module

  • Plugin

Modules can only be selected for transportation on the source system if they contain a file “manifest.json” in the module root folder with the “moduleType” set to “custom”. The Cloud Stream module will automatically generate a default manifest file for each module found in the project.

The JSON file will have the following structure:

{
  "manifestSpecificationVersion": "1.0",
  "projectInformation": {
    "partnerName": "[Name of the Partner]",
    "partnerToken": "[Identifier Token of the Partner]",
    "projectName": "[Name of the Project]",
    "projectId": "[ID of the Project]"
  },
  "moduleInformation": {
    "moduleType": "[core|bs|custom]",
    "moduleName_DE": "[Name of the module in German]",
    "moduleName_EN": "[Name of the module in English]",
    "moduleDescription_DE": "[Description of the module in German]",
    "moduleDescription_EN": "[Description of the module in English]",
    "moduleVersion": "[Version of the module]",
    "moduleMinRequiredCSVersion": "[Minimum Contentserv version supported]",
    "moduleMaxRequiredCSVersion": "[Maximum Contentserv version supported]"
  },
  "deploymentInformation": {
    "deploymentName": "[Name for Deployment]",
    "deploymentType": "[Type of Deployment]",
    "deploymentRevision": "",
    "deploymentOptions": {
      "updateDataModel": [true|false]
    }
  },
  "additionalInformation": {
    "documentationLink": "[URL to the Documentation]",
    "releaseNotes": "[URL to the Release Notes]"
  }
}
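A small helper for checking whether a module would be selectable for streaming, based on the rule above (an illustrative sketch, not part of the product):

import json
from pathlib import Path

def is_streamable_module(module_folder: str) -> bool:
    """A module is selectable only if its manifest declares a custom type."""
    manifest_path = Path(module_folder) / "manifest.json"
    if not manifest_path.is_file():
        return False
    with open(manifest_path, encoding="utf-8") as handle:
        manifest = json.load(handle)
    module_info = manifest.get("moduleInformation", {})
    return module_info.get("moduleType") == "custom"

# print(is_streamable_module("modules/mycustommodule"))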

As such a code transport can also transport malicious software components, there are additional security policies available in the Cloud Stream settings:

  • Security Policy: Defines whether Code Streams from other domains or uncertified streams are allowed.
    Same Domain: Code Streams that were added from a server with the same domain.
    Certified Streams: Streams that are listed as certified by Contentserv.

  • Trusted Code Stream IDs: Allows a line-by-line listing of explicit Code Stream IDs that can be installed.

  • Code Validator URL: Allows providing an alternative Code Validator REST Service URL.

The Code Validator is a REST Service on the server that can check and validate transported code

  • before uploading to a stream

  • after downloading into a cache folder but before moving it into place for action

The REST Service will retrieve

  • the folder to be scanned, relative to the project cache directory, as directory parameters appended to the URL, as well as

  • the ActiveScript JobID as GET parameter “JobID” and

  • the original packed source folder or file name as GET parameter “Folder”.

Example:

The original file folder ../BSLive/images/library should be transported, and the URL provided as Code Validator URL in the Cloud Stream settings is “../admin/rest/foo/validate/”.

When the temporary folder to be checked is [Project]/data/cache/bs/cloudstream/2ab450/, the URL ../admin/rest/foo/validate/bs/stream/cloudstream/2ab450?JobID=1234&Folder=..%2FBSLive%2Fimages%2Flibrary will be called.

The REST Service is expected to return a JSON array with error messages if the code in the folder is rejected, otherwise an empty JSON array.
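A minimal sketch of the folder check behind such a custom validator; the suspicious-pattern rules are purely illustrative, and a real implementation would wire this function into a REST route and apply far stricter checks:

import json
import re
from pathlib import Path

# Purely illustrative rule: flag files containing obviously dangerous calls.
SUSPICIOUS = re.compile(r"\b(eval|exec|system|shell_exec)\s*\(")

def validate_code_folder(cache_folder: str) -> str:
    """Scan a cache folder and return the JSON the REST contract expects.

    Returns a JSON array of error messages, or an empty JSON array ("[]")
    if nothing suspicious was found.
    """
    errors = []
    for path in Path(cache_folder).rglob("*"):
        if path.is_file():
            try:
                text = path.read_text(encoding="utf-8", errors="ignore")
            except OSError:
                continue
            if SUSPICIOUS.search(text):
                errors.append(f"Suspicious call found in {path.name}")
    return json.dumps(errors)

# A REST wrapper would call validate_code_folder() with the folder taken
# from the URL path and return the resulting string as the response body.
# print(validate_code_folder("data/cache/bs/cloudstream/2ab450"))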

If no Code Validator URL is provided, the internal default REST Service is called up at

../admin/rest/cloudstream/validatecode/

which does some very basic checking but gives no guarantee that the code is not malicious.

Streaming Skins, Icons and Translations

The stream type “Skins” is helpful if you need to transfer the typical customization files of a project, for example:

  • plugins/deepsearch (for the custom deepsearch skin files)

  • plugins/opensearch/skin.css (for the custom opensearch skin files)

  • plugins/portal/skins/ (for portal skins)

  • images/library/contentserv/flat/vector (for the custom class icons in the PIM and DAM trees)

  • core/extensions/stringfunctions (for custom string functions)

  • templates/Dictionary.*.csv (for all project translations)

  • etc.

Options for E-Mail Notifications

To enable E-Mail notifications you must first configure the SMTP settings as described here: https://contentserv.atlassian.net/wiki/spaces/DOCU/pages/1394607080/Creating+E-Mails+using+BS+Skills+API

Within the Cloud Stream options there is also a section “E-Mail Notifications”:

To notify publishers and subscribers there are several checkboxes and settings:

When sending Cloud Streams:

  • E-Mail to the publisher, who is the user sending the stream

  • E-Mail to the subscriber, who is the receiver of the stream

When receiving Cloud Streams:

  • E-Mail to the publisher, who created the stream

  • E-Mail to the subscriber, who receives the stream

Additionally, a comma-separated list of other E-Mail addresses can be provided, which will always be notified by E-Mail when any stream is sent or received.

Further, an E-Mail Template can be selected for all Cloud Stream mails, which overrides the default BS Mail Template. See the corresponding documentation for further information about creating such templates.

Using a private Streaming Server

Further, it is possible to define a custom streaming server in the global options. This can be important when the server cannot access the internet and therefore cannot reach the standard Contentserv streaming server. The streaming server must be a standards-compliant WebDAV server. All Cloud Stream transactions will then run against this server, and the public marketplace streams will not be available.
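Since WebDAV is plain HTTP underneath, the reachability of a private streaming server can be verified with standard PUT/GET/DELETE requests. A small sketch using the Python requests library; URL, credentials, and path are placeholders:

import requests

WEBDAV_URL = "https://streams.example.com/webdav/cloudstreams/"   # placeholder
AUTH = ("streamuser", "secret")                                    # placeholder

def check_webdav_server() -> bool:
    """Upload and download a tiny test file to verify WebDAV access."""
    test_url = WEBDAV_URL + "connection-test.txt"
    put = requests.put(test_url, data=b"cloud stream connectivity test",
                       auth=AUTH, timeout=30)
    if put.status_code not in (200, 201, 204):
        return False
    get = requests.get(test_url, auth=AUTH, timeout=30)
    requests.delete(test_url, auth=AUTH, timeout=30)   # clean up the test file
    return get.status_code == 200

# print(check_webdav_server())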

Watch the Video

There is a webinar recording focusing on the Cloud Streams at

https://register.gotowebinar.com/recording/3728834662254366989

Changelog


May 01, 2021

Added Cloud Stream License Options Table

Added an article explaining the Cloud Streams and Cloud Streams Pro license scenarios.

Mar 21, 2021

Added Language Mapping

Added language mapping settings between the incoming stream languages and the local project languages

Feb 20, 2021

Added Skin Settings Stream

A new Cloud Stream data type plugin "Skin Settings" was added to stream core skin settings from the global preferences.

Feb 20, 2021

Added Comet Template Streams

A new Cloud Stream data type plugin was added to stream Comet page and product templates.

Feb 07, 2021

Added notarization of Cloud Streams

For improved security and a better support overview, Cloud Stream transactions are now automatically notarized when publishing and receiving.

Feb 01, 2021

Various improvements and bug fixes

Cloud Streams are constantly being improved and have become more reliable and flexible. The transport of products and their classes, attributes, and value ranges is quite stable now. This feature will leave beta status as soon as we get more positive feedback from testers.

Jan 27, 2021

Multiple Improvements

Multiple bug fixes and features like adding PIM classes and attributes when sending products, support for Windows file system when sending modules and more.

Jan 24, 2021

Added CloudStream with German Commodity Codes 2021 ("Zolltarifnummern")

Added a CloudStream to the marketplace that allows downloading the German Commodity Codes ("Zolltarifnummern") as Value Range

Dec 14, 2020

Added stream type for Measuring Units

Added a specialized and fully functional stream type taking care of streaming Measuring Units

Sep 03, 2020

Streaming Mamfiles

Consider the case that a sent file already exists in the path of the receiving system. The algorithm works the following way:

  • we first try to find the target item by its CloudStreamID

  • we then try to find the item by its filename and the target path

  • the target path is determined first by the ParentCloudStreamID and then by the Target Folder set in the configuration

Sep 03, 2020

Streaming Items

Standard fields like LastChangeDate are now transferred from the sending system to the receiving system

Sep 03, 2020

Streaming attribute values for items

We now differentiate between different item types, which allows us to pack and unpack certain attribute types in a special way. E.g. values of value lists are identified by their external key. Supported types are:

  • Value Ranges

  • Normal types like Caption, Checkbox, etc.

References are not supported yet.

Sep 03, 2020

Streaming Items

The class assignments that belong to an item are now identified by their external key. If a class does not have an external key, this classification cannot be streamed and a warning is issued.

Sep 03, 2020

Streaming Items

  • If a class in the receiving system for a received item is inherited from above, then nothing needs to be done.

  • If the class is not inherited, then it must be assigned to the transferred root folder. If the class does not exist in the receiving system, there is a warning / log entry.

Sep 03, 2020

Streaming Items

Users for streamed Items (Author and LastEditor) are identified by their Username. If the same user exists in the receiving system, the user will be set in the item data. As a fallback, we set those fields to the user executing the script.