This article documents a common performance issue in Liferay DXP 7.0: too many CPU-intensive tasks running at the same time. Continue reading to learn how to identify this issue and how to work through it.
Some subscribers may encounter a performance issue when many CPU intensive tasks are running at the same time. This is a generalized pattern that could come from a variety of use cases. Depending on the severity of the issue being faced, the thread dump may contain anywhere from tens to hundreds of threads working on such tasks. Customers will see higher CPU usage and slower response times, and in rare cases, the Liferay platform will become seemingly unresponsive.
Please review the following information regarding the more common causes and their solutions.
The minifier is enabled by default but can be turned off by setting the minifier.enabled property to false in the portal-ext.properties file.
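For example, the override in portal-ext.properties would look like this:

```properties
# portal-ext.properties -- disable CSS/JS minification
minifier.enabled=false
```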
In a thread dump, this issue manifests as the following string appearing dozens, or in some cases hundreds, of times:
com.liferay.portal.util.MinifierUtil._minifyCss

"http-/0.0.0.0:8280-336" daemon prio=10 tid=0x00007f8bfc282ea0 nid=0x2c0d runnable [0x00007f8aa6d27000]
   java.lang.Thread.State: RUNNABLE
    at java.util.regex.Pattern$CharProperty$1.isSatisfiedBy(Pattern.java:3689)
    at java.util.regex.Pattern$7.isSatisfiedBy(Pattern.java:5171)
    at java.util.regex.Pattern$7.isSatisfiedBy(Pattern.java:5171)
    at java.util.regex.Pattern$7.isSatisfiedBy(Pattern.java:5171)
    at java.util.regex.Pattern$CharProperty.match(Pattern.java:3694)
    at java.util.regex.Pattern$Curly.match0(Pattern.java:4158)
    at java.util.regex.Pattern$Curly.match(Pattern.java:4132)
    at java.util.regex.Pattern$Start.match(Pattern.java:3408)
    at java.util.regex.Matcher.search(Matcher.java:1199)
    at java.util.regex.Matcher.find(Matcher.java:592)
    at java.util.regex.Matcher.replaceAll(Matcher.java:902)
    at java.lang.String.replaceAll(String.java:2162)
    at com.yahoo.platform.yui.compressor.CssCompressor.compress(CssCompressor.java:345)
    at com.liferay.portal.util.MinifierUtil._minifyCss(MinifierUtil.java:59)
    at com.liferay.portal.util.MinifierUtil.minifyCss(MinifierUtil.java:38)
Another occurrence from the same thread dump can be viewed in Spotify's Online Thread Dump Analyzer.
In such cases, note that the exact stack traces may differ: since all of these threads are executing, each is at a different stage of its life. All of them are competing for CPU time in order to progress, which leads to resource starvation.
Disabling the minifier via the minifier.enabled property should ease the stress on the CPU.
The StripFilter is similar to the minifier in that the end result is smaller files in exchange for CPU usage. See the comment from Liferay Portal 6.2: "The strip filter will remove blank lines from the outputted content. This will speed up page rendering for users that are on dial up."
We recommend turning this filter off from the start because there is no real benefit compared to its CPU cost. Removing newlines from the response does save some space. However, gzipping the content is very efficient, and it matters little whether a file contains newlines or not (except in extreme edge cases). Furthermore, the time it takes the browser to load a page is only minutely affected; the majority of page load time is spent waiting for resources to download anyway.
In short, stripping newlines benefits neither the network nor the end user, yet it has a real performance impact on the CPU.
To find such threads, search for the string StripFilter.strip. If it appears more than five times in any given thread dump, be aware that response times are suffering because of it. Included below is the top of a stack trace that is actually executing the StripFilter:
"http-/0.0.0.0:8280-186" daemon prio=10 tid=0x00007f796c343250 nid=0x175e runnable [0x00007f7854134000]
   java.lang.Thread.State: RUNNABLE
    at java.nio.Buffer.checkIndex(Buffer.java:537)
    at java.nio.CharBuffer.charAt(CharBuffer.java:1238)
    at com.liferay.portal.kernel.util.KMPSearch.search(KMPSearch.java:218)
    at com.liferay.portal.kernel.util.KMPSearch.search(KMPSearch.java:200)
    at com.liferay.portal.servlet.filters.strip.StripFilter.processInput(StripFilter.java:410)
    at com.liferay.portal.servlet.filters.strip.StripFilter.strip(StripFilter.java:668)
    at com.liferay.portal.servlet.filters.strip.StripFilter.processFilter(StripFilter.java:399)
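If you decide to disable the filter, the toggle lives in portal-ext.properties; as with Liferay's other servlet filters, the property is keyed by the filter's fully qualified class name:

```properties
# portal-ext.properties -- disable the StripFilter
com.liferay.portal.servlet.filters.strip.StripFilter=false
```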
The GZipFilter's job is to compress the response going back to the browser to save on bandwidth and, ultimately, page load times. It is controlled by the below property:
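The toggle follows the same class-name convention as Liferay's other servlet filters; it is enabled (true) by default, and setting it to false in portal-ext.properties disables it:

```properties
# portal-ext.properties -- turn off the GZipFilter (enabled by default)
com.liferay.portal.servlet.filters.gzip.GZipFilter=false
```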
The GZipFilter came up earlier when discussing the StripFilter's impact; there, we stated that the StripFilter is not needed because gzipping the response is more beneficial anyway. The choice of the word gzipping instead of GZipFilter is intentional: if at all possible, Liferay recommends gzipping the content by other means. For example, Apache's mod_deflate has the benefit of running on another box, which lets another server's CPU do the work of compressing the response. The GZipFilter is hard to catch in a thread dump because its main job is to compress everything once the response is ready. Its execution is therefore visible as GZip-related Java classes executing at the end of a stack trace and a GZipResponse$1 object being locked, for example:
"http-/0.0.0.0:8280-324" daemon prio=10 tid=0x00007f8bfc26f460 nid=0x2c01 runnable [0x00007f8aa7933000]
   java.lang.Thread.State: RUNNABLE
    at java.util.zip.Deflater.deflateBytes(Native Method)
    at java.util.zip.Deflater.deflate(Deflater.java:430)
    - locked <0x00000006dd9bf380> (a java.util.zip.ZStreamRef)
    at java.util.zip.Deflater.deflate(Deflater.java:352)
    at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:251)
    at java.util.zip.DeflaterOutputStream.write(DeflaterOutputStream.java:211)
    at java.util.zip.GZIPOutputStream.write(GZIPOutputStream.java:146)
    - locked <0x00000006dd9bf2f8> (a com.liferay.portal.servlet.filters.gzip.GZipResponse$1)
    at com.liferay.portal.kernel.servlet.ServletOutputStreamAdapter.write(ServletOutputStreamAdapter.java:54)
As mentioned, our recommendation is to turn the GZipFilter off and rely on other services where possible, even if there are currently no performance issues caused by or related to the GZipFilter.
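As a sketch of the mod_deflate alternative mentioned above (the directives are standard Apache httpd configuration, but the MIME-type list is illustrative and should be tailored to your content):

```apache
# httpd.conf sketch -- compress responses at the web server tier instead
# of inside Liferay (assumes mod_deflate is loaded)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/json
</IfModule>
```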
Comparing passwords is done by hashing the password that the user submitted in the form, then comparing it with the already hashed (and saved) password in the database. By default, the Liferay platform uses a computationally heavy algorithm: PBKDF2 with HMAC SHA1, 160-bit hashes, and 128,000 rounds.
Below is the portal property that sets the encryption algorithm:
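In a default DXP 7.0 installation, the property value matches the algorithm described in this section:

```properties
# portal.properties default -- PBKDF2 with HMAC SHA1, 160-bit hashes, 128,000 rounds
passwords.encryption.algorithm=PBKDF2WithHmacSHA1/160/128000
```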
The reason it takes such a toll on the CPU lies in how it works. It applies a pseudo-random function to the input text plus a salt to create a hash of a configurable length (default: 160 bits), then repeats the process a configurable number of times (default: 128,000 rounds). As a result, many users logging in at the same time could potentially lock up the portal for a while until the CPU can handle all the requests. Below is the top of a stack trace from a thread working on a user login:
"ajp--127.0.0.1-8009-198" daemon prio=10 tid=0x00007fcef1575600 nid=0xfe4d runnable [0x00007fced4138000]
   java.lang.Thread.State: RUNNABLE
    at com.sun.crypto.provider.HmacSHA1.engineReset(HmacSHA1.java:118)
    at javax.crypto.Mac.doFinal(Mac.java:547)
    at javax.crypto.Mac.doFinal(Mac.java:589)
    at com.sun.crypto.provider.PBKDF2KeyImpl.deriveKey(PBKDF2KeyImpl.java:178)
    at com.sun.crypto.provider.PBKDF2KeyImpl.<init>(PBKDF2KeyImpl.java:121)
    at com.sun.crypto.provider.PBKDF2HmacSHA1Factory.engineGenerateSecret(PBKDF2HmacSHA1Factory.java:71)
    at javax.crypto.SecretKeyFactory.generateSecret(SecretKeyFactory.java:335)
    at com.liferay.portal.security.pwd.PBKDF2PasswordEncryptor.doEncrypt(PBKDF2PasswordEncryptor.java:77)
Seeing multiple threads, like this one, working on encrypting the password is an indication that the Liferay platform is suffering and users will report (or have already reported) long wait times when they are trying to log in.
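The cost is easy to reproduce outside of Liferay with the JDK's built-in PBKDF2 implementation. The sketch below is illustrative only: the password, salt, and round counts are made up, and Liferay's own PBKDF2PasswordEncryptor adds salting and encoding logic not shown here.

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;

public class Pbkdf2Cost {

    // Derives a 160-bit PBKDF2WithHmacSHA1 key and returns the elapsed time in ms.
    public static long timeDerive(char[] password, byte[] salt, int rounds) throws Exception {
        long start = System.nanoTime();
        PBEKeySpec spec = new PBEKeySpec(password, salt, rounds, 160);
        SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
        byte[] hash = factory.generateSecret(spec).getEncoded(); // 20 bytes = 160 bits
        spec.clearPassword();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        char[] password = "hypothetical-password".toCharArray();

        // One login's worth of work at the default round count...
        System.out.println("128,000 rounds: " + timeDerive(password, salt, 128_000) + " ms");
        // ...versus a much cheaper (and correspondingly weaker) setting.
        System.out.println("  1,300 rounds: " + timeDerive(password, salt, 1_300) + " ms");
    }
}
```

Running this on a loaded server makes the problem tangible: every concurrent login pays the full derivation cost, so the per-hash time multiplies across threads.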
There are a few solutions to this issue.
- Tuning the algorithm's parameters:
- Either lower the size of the hash that is generated or the number of rounds.
- Both of these steps, however, lower the strength of the password encryption which needs to be taken into consideration as well.
- User passwords may need to be forcibly reset.
- Using another algorithm altogether:
- Moving from PBKDF2 to another, less computationally heavy algorithm would also ease the impact on the CPU.
- This also has the downside of moving to a less secure encryption.
- User passwords may need to be forcibly reset.
- Using a third party authentication service like LDAP or SSO solutions.
- The process of authentication is offloaded and the Liferay platform simply takes care of negotiating with the third party services by sending credentials, managing sessions, etc.
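To illustrate the tuning option above: the hash size and round count are encoded in the passwords.encryption.algorithm value, so a deliberately cheaper configuration might look like the following (the numbers here are examples, not recommendations, and weaken the hashes accordingly):

```properties
# Example only -- 128-bit hashes and 64,000 rounds: faster, but weaker than the default
passwords.encryption.algorithm=PBKDF2WithHmacSHA1/128/64000
```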