The Intershop Commerce Management (ICM) provides the necessary data for running the default Intershop PWA deployment via a REST API.
Since release 0.23, it uses the new headless application type (see Migrations / From 0.22 to 0.23).
Using another backend is also possible as long as it provides a compatible REST API.
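To give an impression of what such a REST API looks like, the following sketch issues a plain product search request; the host name, site path, and query parameter are placeholder assumptions, and the exact resource paths depend on the ICM installation.

```typescript
// Minimal sketch of a request against an ICM-style REST endpoint.
// Host and site path are placeholders for an actual ICM installation.
const icmRestBase = 'https://icm.example.com/INTERSHOP/rest/WFS/inSPIRED-inTRONICS-Site/-';

async function searchProducts(searchTerm: string): Promise<unknown> {
  const response = await fetch(`${icmRestBase}/products?searchTerm=${encodeURIComponent(searchTerm)}`);
  if (!response.ok) {
    throw new Error(`ICM request failed with status ${response.status}`);
  }
  return response.json();
}
```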
In order to facilitate server-side rendering (SSR), the default deployment uses dockerized express.js servers running Angular Universal, orchestrated by PM2.
On a new request, Angular Universal pre-renders the page and instantly provides the browser with meaningful content.
For an architectural overview of how SSR works in the Intershop PWA, see Deployment Scenarios.
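For orientation, a minimal Angular Universal setup on an express.js server looks roughly like the following sketch; the PWA's actual server.ts is considerably more elaborate, and the AppServerModule import path, output folder, and port used here are assumptions.

```typescript
import 'zone.js/node';

import * as express from 'express';
import { ngExpressEngine } from '@nguniversal/express-engine';

import { AppServerModule } from './src/main.server'; // import path is an assumption

// One express.js process; in the default deployment PM2 starts and supervises
// several of these inside the SSR container.
const server = express();

// Use Angular Universal as the HTML rendering engine.
server.engine('html', ngExpressEngine({ bootstrap: AppServerModule }));
server.set('view engine', 'html');
server.set('views', 'dist/browser'); // assumed build output folder

// Serve static assets directly, pre-render every other request.
server.get('*.*', express.static('dist/browser'));
server.get('*', (req, res) => res.render('index', { req }));

server.listen(Number(process.env.PORT) || 4200);
```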
Pre-rendering pages enables a number of additional features, most notably meaningful content for search engine crawlers and other clients that do not execute JavaScript.
For an overview of the ever-growing list of third-party integrations relating to SSR and deployment in general, see Third-party Integrations.
As the first point of contact for any browser request directed at a default deployment, the custom nginx reverse proxy performs a number of functions.
Each of these is separately configurable (see Building and Running nginx Docker Image).
Nginx enables a number of additional features in an Intershop PWA deployment, most notably caching of pre-rendered SSR responses.
For an overview of the ever-growing list of third-party integrations relating to nginx and deployment in general, see Third-party Integrations.
The browser runs the bootstrapped/pre-rendered Angular application.
After the initial communication with nginx, subsequent REST requests are directed to the configured ICM endpoint or custom backend.
For more information on the browser's role in rendering the Intershop PWA, see Deployment Scenarios for Angular Applications.
Chaining the building blocks together results in the depicted system.
Read on for a step-by-step walkthrough of the initial connection request.
1. The browser requests the page by URL from nginx.
2. Nginx passes the request on to the SSR container, where the node express.js server runs Angular Universal pre-rendering for the requested URL.
3. Angular Universal fills the requested page with content retrieved via the ICM REST API.
4. The response is delivered to nginx, where it is also cached if caching is enabled.
5. The response is delivered to the browser.
6. The initial page response is displayed to the user, and the Angular client application boots up in the browser.
7. Once booted up, additional REST calls are directed straight to the ICM, and the PWA acts as a single-page application. No further HTML pages are requested.
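To make steps 3 and 7 more tangible, here is a simplified sketch of an Angular service as it could issue such REST calls; the actual PWA services are structured differently, and the base URL constant, service name, and resource path are assumptions for illustration.

```typescript
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';

// Placeholder base URL; in the real PWA the REST endpoint is resolved from configuration.
const ICM_REST_BASE = 'https://icm.example.com/INTERSHOP/rest/WFS/inSPIRED-inTRONICS-Site/-';

@Injectable({ providedIn: 'root' })
export class ProductService {
  constructor(private http: HttpClient) {}

  // During pre-rendering (step 3) this call is executed on the server by Angular Universal;
  // once the client application has booted (step 7), the same call is made directly
  // from the browser to the ICM.
  getProduct(sku: string): Observable<unknown> {
    return this.http.get(`${ICM_REST_BASE}/products/${sku}`);
  }
}
```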
Deployment without nginx is theoretically possible, even though many useful features of an nginx deployment are then unavailable.
Warning
Do not enable service workers when deploying without nginx; the Intershop PWA will not function as intended if you do.
For security reasons, it may be desirable to hide the backend address and prevent direct access to it.
The Intershop PWA supports this, and the default deployment makes use of it.
To enable it, set the PROXY_ICM environment variable on the SSR container to the URL that should be exposed in place of the ICM address.
Instead of directing REST calls straight to the ICM (see step seven in the Default Production Deployment), traffic is routed through the SSR container.
The express.js server is set up to proxy these requests to the ICM.
After Angular Universal pre-rendering completes, all URLs referring explicitly to the configured ICM (links, images, configuration) are replaced with URLs pointing to the proxy.
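A minimal sketch of such a proxy setup, using the http-proxy-middleware package rather than the PWA's actual server code, could look as follows; the ICM_BASE_URL variable name and the /INTERSHOP path prefix are assumptions for illustration.

```typescript
import * as express from 'express';
import { createProxyMiddleware } from 'http-proxy-middleware';

const server = express();

// URL of the actual ICM installation that should stay hidden from the browser.
// The variable name ICM_BASE_URL is an assumption for this sketch.
const icmUrl = process.env.ICM_BASE_URL ?? 'https://icm.example.com';

// When PROXY_ICM is set, REST traffic is accepted by the SSR container itself
// and transparently forwarded to the ICM.
if (process.env.PROXY_ICM) {
  server.use(
    '/INTERSHOP', // assumed path prefix of ICM REST calls
    createProxyMiddleware({ target: icmUrl, changeOrigin: true })
  );
}

// ... Angular Universal rendering routes would follow here.
```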
For scalability and parallelization, it is important to consider whether each building block is stateful or stateless.
The ICM deals with large databases, caches read and write requests, and therefore manages a large amount of internal state.
Substituting ICM instances at runtime and managing fail-over capacities are not trivial tasks.
The SSR container, in contrast, acts like a pure function: the same request always returns the same result.
This makes it possible to set up multiple instances, reboot containers at runtime, or even use serverless deployments.
It is important to keep the stateless nature of the Intershop PWA in mind when writing new code and expanding its functionality.
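As a hypothetical illustration of what to avoid (this is not code from the PWA), module-scope state written during server-side rendering survives across requests handled by the same node process and would break the pure-function behavior described above:

```typescript
import { Injectable } from '@angular/core';

// Anti-pattern sketch: module-scope state survives across SSR requests handled
// by the same node process and therefore breaks the "pure function" assumption.
let lastVisitedSku: string | undefined;

@Injectable({ providedIn: 'root' })
export class RecentlyViewedService {
  // Avoid this: during SSR, 'lastVisitedSku' could be written by one user's
  // request and then read while rendering another user's request.
  rememberProduct(sku: string): void {
    lastVisitedSku = sku;
  }

  getLastVisited(): string | undefined {
    return lastVisitedSku;
  }
}
```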
The nginx container is not technically stateless, since it handles caching of SSR responses.
However, nginx is not functionally dependent on its internal state like the ICM is.
If an nginx container were to lose its internal state, requests that would otherwise be answered from the cache would instead be passed on to the express.js server and answered at the cost of a delay.