Remote Staging Setup for Portal 6.2

This article describes how to configure a basic remote staging setup on Liferay Portal.

The troubleshooting guide at the end of the article pertains to Liferay Portal 6.2. Users who have upgraded to Portal 6.2 will see many changes in functionality from previous versions. Although some of the information below pertains specifically to Portal 6.2, most of it is generally applicable.


1. Set up two instances of Liferay.

Make sure that the same patches are applied to all servers being used in order to avoid possible staging issues. For more information, see Common Errors With Staging or Export/Import Between Servers.

This can be set up via a VM, or via a second instance of Liferay running on another port (the ports can be changed in conf/server.xml). If neither of these is an option, a separate site on the same instance can be used. Keep in mind that these options are intended for testing only. The following directions assume a VM, but can be adjusted to your needs.
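For example, when running a second Tomcat bundle on the same machine, the ports in tomcat/conf/server.xml can be shifted so the two instances do not collide. A minimal sketch, showing only the relevant elements and attributes (the port numbers below are illustrative):

```xml
<!-- conf/server.xml on the second instance: shift the default ports -->
<Server port="9005" shutdown="SHUTDOWN">
  <Service name="Catalina">
    <!-- HTTP connector moved from 8080 to 9080; redirect port shifted likewise -->
    <Connector port="9080" protocol="HTTP/1.1"
               connectionTimeout="20000"
               redirectPort="9443" />
  </Service>
</Server>
```

Any other ports defined in the file (for example the AJP connector) should be shifted in the same way.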

2. Configure both instances of Liferay.

In order to publish your site's changes to a remote site, the remote server must be added to the staging server's list of allowed servers, and vice versa. On Liferay 6.2, you also need to specify an authentication key to be shared by your staging and remote servers, and enable each Liferay server's tunneling servlet authentication verifier.

To do this, you need to set up your portal-ext.properties on both Liferay instances.

Staging Server Liferay 6.1 example:
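A minimal portal-ext.properties sketch; the IP addresses are placeholders, and the remote server is assumed here to reside at 10.0.0.2 (SERVER_IP is a literal token that Liferay resolves to the server's own address):

```properties
# Staging server (Liferay 6.1) - portal-ext.properties
# Allow the remote server (assumed 10.0.0.2) to reach this server's servlets
axis.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.2
tunnel.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.2
```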


Remote Server Liferay 6.1 example:
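The mirror image of the staging configuration; the staging server is assumed here to reside at 10.0.0.1 (placeholder):

```properties
# Remote server (Liferay 6.1) - portal-ext.properties
# Allow the staging server (assumed 10.0.0.1) to reach this server's servlets
axis.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.1
tunnel.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.1
```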


Staging Server Liferay 6.2 example:
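A sketch for 6.2, which adds the shared secret and the tunneling servlet authentication verifier; all IP addresses and the key are placeholders. Depending on your 6.2 patch level, the secret may need to be a hex-encoded AES key of valid length (e.g. 32 hex digits = 16 bytes) with the .hex property set:

```properties
# Staging server (Liferay 6.2) - portal-ext.properties
# Allow the remote server (assumed 10.0.0.2) to reach this server's servlets
axis.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.2
tunnel.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.2
# Enable the tunneling servlet auth verifier for the same hosts
auth.verifier.TunnelingServletAuthVerifier.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.2
# Pre-shared key; must be identical on both servers (placeholder value)
tunneling.servlet.shared.secret=d43a4a61b586f1d0e1e1c5f6b2e1a0c9
tunneling.servlet.shared.secret.hex=true
```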


Remote Server Liferay 6.2 example:
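Again the mirror image, with the staging server assumed at 10.0.0.1 (placeholder) and the same shared secret as on the staging side:

```properties
# Remote server (Liferay 6.2) - portal-ext.properties
# Allow the staging server (assumed 10.0.0.1) to reach this server's servlets
axis.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.1
tunnel.servlet.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.1
# Enable the tunneling servlet auth verifier for the same hosts
auth.verifier.TunnelingServletAuthVerifier.hosts.allowed=127.0.0.1,SERVER_IP,10.0.0.1
# Must match the staging server's key exactly (placeholder value)
tunneling.servlet.shared.secret=d43a4a61b586f1d0e1e1c5f6b2e1a0c9
tunneling.servlet.shared.secret.hex=true
```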


3. Set up an administrator user with the same credentials on both instances of Liferay.

When a user attempts to publish changes from the staging server to the remote server, the portal passes the user's email address, screen name, or user ID to the remote server to perform a permission check. In order for a publishing operation to succeed, the operation must be performed by a user that has identical credentials and permissions on both the staging and the remote environments. On Liferay 6.1, the user's password also needs to be the same, as there is no pre-shared key between the two servers.

4. On the Staging Server

1. Navigate to the Control Panel.
2. Create a new blank site called "Test Staging". This is the site on which you will create content to publish to the remote server: your "source" site.
3. Create a page on "Test Staging" site, and add some web content articles on the page.

5. On the Remote Server

4. Navigate to the Control Panel.
5. Create a new blank site called "Test Remote". This will be the site that you would like to publish to, your "target" site.
6. Take note of the site ID (e.g. "10401") after it is created. If you navigate away from this, you can always find it in the Site Settings panel of your site.

6. On the Staging Server

7. Return to "Test Staging".
8. Navigate to your site's site settings panel, and click on the staging option on the right.
9. Select Remote Live; a Configuration Settings section appears.
10. In the first Remote Host/IP field, type the IP address of the remote/target server that hosts the "Test Remote" site. If you are using the same instance, you can type: localhost.
11. The port will be your remote/target server's port (8080 by default).
12. The Remote Site ID will be the ID of the "Test Remote" site, as noted in step 6.
13. Leave all the other settings as default. Click save.
14. Leave the Control Panel and go to your "Test Staging" Site.
15. Click on the "Staging" button at the top, and select "Publish to Remote Live".
16. Verify the page(s) that you want to publish and that your Remote Live Connection Settings are correct.
17. Click Publish. On Liferay 6.2, a progress bar will appear that should end in a "Successful" state. On Liferay 6.1, a green bar should appear that says "Your Request Completed Successfully." Close the pop-up.
18. Now click on the "Go to Remote Live" button at the top.
19. You have now published to the "Test Remote" site (the remote/target site). Notice that the web content article from the "Test Staging" site is now displayed.


Liferay remote staging can fail in one or more of the following seven areas:

  1. Connection: the staging server establishes a connection with the remote server in order to check the configuration.
  2. Export: the staging server exports the desired site and its content as an archive (.lar) to local temporary storage.
  3. Data transfer: the staging server transfers the archive to the remote server.
  4. Checksum: the remote server validates the archive's integrity.
  5. Validation: the remote server checks for missing references and invalid content.
  6. Import: the remote server imports the content. If this fails, the entire import is rolled back.
  7. Cleanup: both servers clean up their temporary files.

If remote staging fails, first check the error logs, which can reveal in which of the above phases the error occurred. The publishing process will not state which step failed, but administrators can examine the logs to find the error. Once the error has been identified, administrators can use some of the following steps to fix the problem.

  1. Connection Errors
    • Are the two servers connected properly? Verify that the tunneling.servlet.shared.secret= and axis.servlet.hosts.allowed= values have been declared in portal-ext.properties (see above).
    • Web proxy servers: if there is a web proxy server, make sure that it preserves the origin host in the proxy request, or the request will be rejected by the remote server due to an invalid IP address.
    • Warning: the following workaround is not secure. If the proxy server cannot be configured to preserve the origin host, admins can change the axis.servlet.hosts.allowed= value on the remote server to match the IP address of the proxy server. The risk is that this allows anyone going through the proxy to access Liferay's remote API.
  2. Data Transfer
    Common issues in this phase are:
    • Timeout. If this process fails, there might be other rules on proxies, servers, switches, application servers, or antivirus which affected this.
    • Size. By default, if the file is bigger than 10MB, the process will send the .lar file in multiple pieces. This buffer size can be changed with the staging.remote.transfer.buffer.size= property. As a general rule, "export often and export small."
  3. Export
    Export rarely fails. If it does:
    • Make sure there is enough space on the disk drive.
    • Export the site as a .lar from the Control Panel.
    • Install the latest patches.
  4. Checksum
    This step should technically never fail. If it does:
    • Try to publish again; a transferred file may have become corrupted during the transfer.
    • In a clustered environment, check where the transferred pieces (when the archive is larger than 10MB) end up. They should all arrive on the same node; if they do not, Liferay will end up with 50% of the pieces on one server and 50% on the other (assuming a 2-node cluster), and the checksum will always fail.
    • Check the size of the archive. (see Data Transfer #2)
  5. Validation
    An error during validation will provide more information in the "remote publishing" interface (history):
    • Try to identify the missing reference and understand why the system thinks this reference is missing.
    • If the site has global references (structures, templates, categories, etc.), make sure to publish the Global site first. The Global site can be published from the Control Panel > Sites.
    • If the site does not have any external reference, try publishing the entire content. In the "remote publishing" options, choose "All Content".
  6. Import
    There are many possible areas where the import process may fail. Possible solutions are:
    • If there are errors about duplicated content, publish the entire site again.
    • Disable "Version history" to reduce the size of the publication and see if that makes any difference.
    • Check logs on the remote server for more detailed information:
      • If it is an OutOfMemoryException, increase the memory available for the server application (-Xmx).
      • If it is a GenericJDBCException, check the JDBC connection pool. It might be that all connections have been used and none released. Restart the application server to free them. Check the JDBC settings according to your needs.
    • If the import process is really slow, monitor the infrastructure (CPU, I/O) and allocate more resources to the virtual machine or server. If the import process takes too long (typically 30 minutes or more), the staging server will time out and will not clean up properly, even though the import still continues and completes (see next point).
  7. Cleanup
    Cleanup itself will never fail, but remote staging can still fail at this stage. At this point, the following happens in Liferay:
    1. The staging server keeps waiting for the remote server to finish the import. After some time, the staging server will get a timeout and its connection will be reset.
    2. When this happens, the staging server will clean up on its side and set the status of the process to "failed".
    3. However, the import process could continue on the other end (the remote server) and might succeed. The remote server then displays a wrong status, and the "last publication date" is not correctly set.
    If the publication technically succeeded but was reported as a failure because of a timeout, start another publication and set the date range to "Last 12 hours". This smaller publication should execute faster, succeed, and update the "last publication date" correctly.
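For the size issues discussed under Data Transfer and Checksum, the transfer chunk size can be tuned in portal-ext.properties on the staging server; the value below is illustrative and expressed in bytes:

```properties
# Chunk size used when sending the .lar to the remote server.
# 10485760 bytes = 10 MB, the default threshold mentioned above;
# a larger value means fewer pieces per publication.
staging.remote.transfer.buffer.size=10485760
```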

Still not publishing?

If remote publishing is still failing, consider trying one of these:

  • If on a clustered environment, try to disable all but one node.
  • If a proxy server is installed in front of the application server, try publishing directly to the application server, opening firewall ports if necessary.
  • If using an SSL connection for the remote publishing process, try without it. Another option is to regenerate the Java security certificate; some certificates may have expired and thus prevent the correct "handshake" between the staged and remote servers.
  • Try moving the remote server to the same network.
  • Try moving the remote server to the same machine.
