Online shops involve a large number of data import and export processes. Product data and price data have to be imported, order data has to be exported. The product and price data often originate from files on another server, and the order data often has another server as its target. The Transport Framework can transport files from one location on the network to another, where they can then be processed.
The source code belongs to the cartridges bc_transport and bc_transport_orm. Both are part of the p_platform component set.
The UI source code is added to the corresponding Commerce Management cartridge via an extension point.
The Transport Framework allows performing file transports via SFTP, FTP, HTTP(S), EMAIL, or Azure Storage Account File Shares.
For SFTP and FTP, both pushing and pulling are possible. Due to the nature of HTTP and EMAIL, files can only be read from HTTP(S) and sent by mail.
Note
If using SFTP with key authentication, the private key must be provided in the correct format (legacy PEM). Newer OpenSSH implementations generate key pairs in the RFC4716 format by default rather than in the PEM format. If such a key is used, the transport job throws an exception. To avoid this, use the following command to generate the key pair: ssh-keygen -t rsa -m PEM
(additional parameters such as -f to set the target file and -b to set the key strength may be included).
To access the Transport Framework, the user must have the access privilege Transport Manager.
A configuration of the Transport Framework is stored in the database tables TRANSPORTCONFIG and TRANSPORTCONFIG_AV.
The feature requires a DBinit of the cartridge bc_transport_orm from p_platform.
A Transport Configuration can also be created with a DBinit step or DBmigrate step.
The preparers are:
com.intershop.component.transport.dbinit.PrepareTransportConfiguration
com.intershop.component.transport.dbmigrate.AddTransportConfiguration
Each preparer requires a property file for configuration purposes. The keys defined in the class TransportConstants are read from the property file and the result is written to the database. There is no parameter validation. This way, a blueprint can be created via DBmigrate and completed later.
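Like any other preparer, it has to be registered as a DBInit or DBMigrate step of the cartridge. The following is only a hypothetical sketch of such a registration in dbinit.properties; the step number and the way the configuration property file is passed as an argument are assumptions and have to be checked against the existing entries and the shipped examples:

# Hypothetical dbinit.properties entry (step number and resource argument are placeholders)
Class50 = com.intershop.component.transport.dbinit.PrepareTransportConfiguration \
          com.intershop.component.transport.dbinit.testSFTP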
Examples can be found in the source code in the folder bc_transport_orm/staticfiles/cartridge/lib/com/intershop/component/transport/dbinit.
domain = inSPIRED-inTRONICS
process.type = SFTP

# Common settings
process.displayname = testSFTP
process.id = testSFTP
file.include.pattern =
file.exclude.pattern =
location.local = ${IS_SHARE}\\sites\\${SITE_NAME}\\units\\${UNIT_NAME}\\impex\\export
location.archive =

# remote location settings
remote.protocol = SFTP
remote.hostname = localhost
remote.port =
location.remote = ./test

# authentication settings
authentication.method = PASSPHRASE
authentication.username = tester
authentication.password = password
authentication.keyfilepath =

# transfer settings
process.direction = PUSH
process.transferlimit =
process.delete = true
Click Transport Configuration in the left navigation.
Each configuration belongs to a domain.
Select a domain and click Apply.
All available configurations for the domain are displayed in the list.
Note
A configuration cannot be saved until all mandatory fields are filled with valid parameters. Until then, a message is displayed stating that the configuration is invalid.
Note that only basic validation is performed on the input fields, e.g., if a URL is required, the system only checks whether the entered string can be parsed as a URL, not whether the endpoint actually exists.
The following example shows settings for SFTP:
For a push or pull, the host and the remote location must be entered. The remote location is the path to the folder that serves as the source or target of a push or pull event.
The following example shows settings for Azure storage:
Account name and key can be found in the Azure portal:
For testing purposes it is recommended to use the Microsoft Azure Storage Explorer:
A transport configuration can be executed in the System Management application.
Enter all required parameters and click Apply.
| Parameter | Value |
|---|---|
| Pipeline | FileTransportJob |
| Startnode | Start |
Switch to the Attributes tab.
The job needs two parameters:
To add an additional transport type, a class implementing com.intershop.component.transport.capi.provider.TransportProvider has to be created. This class is responsible for creating new and looking up existing TransportConfiguration objects and for creating the corresponding TransportExecutor, which implements the actual file transport.
public interface TransportProvider
{
    /**
     * The type of the transport.
     */
    String getType();

    /**
     * The name under which the created objects are put into the pipeline dictionary.
     */
    String getDictionaryKey();

    /**
     * Get a business object for the given persistent object.
     * @param anID the process ID
     * @param someContext a business object context
     * @param transportConfiguration the given persistent object
     * @return the transport process configuration business object
     */
    TransportProcessConfigBO getTransportProcessConfigBO(String anID, BusinessObjectContext someContext,
                    TransportConfiguration transportConfiguration);

    /**
     * Create a new business object for a transport configuration.
     * @param displayName the display name
     * @param domain the owning domain
     * @param someContext a business object context
     * @return the new transport process configuration business object
     */
    TransportProcessConfigBO createTransportProcessConfigBOByName(String displayName, Domain domain,
                    BusinessObjectContext someContext);

    /**
     * Create a new business object for a transport configuration.
     * @param processID a process ID
     * @param domain the owning domain
     * @param someContext a business object context
     * @return the new transport process configuration business object
     */
    TransportProcessConfigBO createTransportProcessConfigBO(String processID, Domain domain,
                    BusinessObjectContext someContext);

    /**
     * Create a new transport executor to execute a file transport.
     * @param anID a process ID
     * @param someContext a business object context
     * @param aConfig the transport process configuration business object
     * @return the new transport executor business object
     */
    TransportExecutorBO createTransportExecutorBO(String anID, BusinessObjectContext someContext,
                    TransportProcessConfigBO aConfig);

    /**
     * Update a transport configuration for the special config type during dbinit / dbmigrate.
     * @param transportConfig the transport config to update
     * @param config map of properties
     */
    default void updateTransportConfiguration(TransportConfiguration transportConfig, Map<String, String> config)
    {
    }
}
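For illustration, a minimal structural sketch of such a provider is shown below. The class name MyCloudTransportProvider, its TYPE constant, and the dictionary key are made-up placeholders, and the method bodies only mark where the actual logic belongs; this is not a working transport implementation.

public class MyCloudTransportProvider implements TransportProvider
{
    // Placeholder type string of the new transport type (assumption, not part of the framework)
    public static final String TYPE = "MYCLOUD";

    @Override
    public String getType()
    {
        return TYPE;
    }

    @Override
    public String getDictionaryKey()
    {
        // Name under which the created objects are put into the pipeline dictionary
        return "MyCloudTransportProcessConfigBO";
    }

    @Override
    public TransportProcessConfigBO getTransportProcessConfigBO(String anID, BusinessObjectContext someContext,
                    TransportConfiguration transportConfiguration)
    {
        // Wrap the given persistent TransportConfiguration into a business object here
        throw new UnsupportedOperationException("Not implemented yet");
    }

    @Override
    public TransportProcessConfigBO createTransportProcessConfigBOByName(String displayName, Domain domain,
                    BusinessObjectContext someContext)
    {
        // Create a new configuration business object identified by its display name here
        throw new UnsupportedOperationException("Not implemented yet");
    }

    @Override
    public TransportProcessConfigBO createTransportProcessConfigBO(String processID, Domain domain,
                    BusinessObjectContext someContext)
    {
        // Create a new configuration business object identified by its process ID here
        throw new UnsupportedOperationException("Not implemented yet");
    }

    @Override
    public TransportExecutorBO createTransportExecutorBO(String anID, BusinessObjectContext someContext,
                    TransportProcessConfigBO aConfig)
    {
        // Create the executor that performs the actual file transfer for this transport type here
        throw new UnsupportedOperationException("Not implemented yet");
    }
}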
Afterwards this class can be bound to the object graph:
public class AzureTransportModule extends AbstractNamingModule
{
    @Override
    protected void configure()
    {
        MapBinder.newMapBinder(binder(), String.class, TransportProvider.class)
            .addBinding(AzureTransportProvider.TYPE)
            .to(AzureTransportProvider.class)
            .in(Singleton.class);
    }
}
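The string key passed to addBinding() should be the same type string that the provider returns from getType(); presumably this is also the value stored as process.type in the transport configuration, which allows the framework to look up the matching provider for a configured transport type.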