Tony Okusanya Respected Contributor.

Timeout Deploying Content pack

Greetings

I am having problems deploying a content pack into HPOO 10.6 using the browser or the OO Shell on one of the Central servers.

I have updated the session-timeout in /central/tomcat/conf/web.xml, but this doesn't help.
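For reference, the Tomcat session timeout is configured in minutes in web.xml. Assuming a stock Tomcat layout (the exact surrounding elements may differ per installation), the relevant entry looks like this:

```xml
<!-- /central/tomcat/conf/web.xml — value is in MINUTES, not seconds -->
<session-config>
    <session-timeout>120</session-timeout>
</session-config>
```

Note that this only controls idle HTTP session expiry in Tomcat; it would not help if the deployment itself fails server-side, as turned out to be the case below.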

Any ideas on why this would time out when a content pack is being deployed? The CP is only 50 MB, so I don't think size is the issue.

Any help would be appreciated.


Accepted Solutions
Tony Okusanya Respected Contributor.

Re: Timeout Deploying Content pack


I did check the logs on the central server (see below)

server.log error:
2017-06-27 18:16:28,143 [taskExecutor-2] (DeploymentProcessServiceImpl.java:376) ERROR - Exception During Deployment:
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded

I was able to make the following changes from the Tuning Guide and got it to deploy, though it took a while.

Changes
1. File: <installation_folder>/central/conf/central-wrapper.conf
wrapper.java.initmemory=4096
wrapper.java.maxmemory=4096
wrapper.java.additional.25=-Dcloudslang.worker.numberOfExecutionThreads=30


2. File: <installation_folder>/central/conf/database.properties
db.pool.maxPoolSize=100
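As a sanity check after editing, both files above are plain key=value properties, so a small script can confirm the tuned values are actually in place before restarting Central. This is only a sketch; the conf directory path is whatever `<installation_folder>/central/conf` resolves to on your server, and the expected values are simply the ones from this post:

```python
import os

def read_props(path):
    """Parse key=value lines, ignoring blanks and # comments."""
    props = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, sep, value = line.partition("=")
            if sep:
                props[key.strip()] = value.strip()
    return props

# Tuned values from the Tuning Guide changes described above.
EXPECTED = {
    "central-wrapper.conf": {
        "wrapper.java.initmemory": "4096",
        "wrapper.java.maxmemory": "4096",
    },
    "database.properties": {
        "db.pool.maxPoolSize": "100",
    },
}

def check_tuning(conf_dir):
    """Return a list of (file, key) pairs that are missing or mismatched."""
    problems = []
    for fname, expected in EXPECTED.items():
        props = read_props(os.path.join(conf_dir, fname))
        for key, value in expected.items():
            if props.get(key) != value:
                problems.append((fname, key))
    return problems
```

An empty result from `check_tuning` means both files carry the tuned values; remember Central still needs a service restart to pick them up.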

Thanks for the assist. I am hoping this gets better after I break the project up into smaller pieces.

4 Replies
AndreiTruta Outstanding Contributor.

Re: Timeout Deploying Content pack


It might happen if, for instance, your MySQL server is not configured according to the database specs.

If you have doubts about the database configuration, I suggest checking the database requirements and matching them against the current settings of the OO database.

docs.software.hpe.com should get you to the Operations Orchestration docs, where you can find the database-related information too.

Andrei Vasile Truta
Tony Okusanya Respected Contributor.

Re: Timeout Deploying Content pack


Thanks for the response Andrei.

We have Central installed on Windows Server 2012 R2 x64 and are using Microsoft SQL Server 2014.

When I try to deploy the CP using OOShell, it loses the connection before completing:

*****************************
oosh.bat deploy --url https://central-url/oo --user admin --password ********** --files D:\OODeploy\MyOO9_base-cp-1.0.3.jar
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
unable to deploy, reason: Lost connection, please reconnect.

*************************
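Not a fix for the root cause, but when an OOShell call drops like this, a thin retry wrapper around the same `oosh.bat deploy` command can at least rule out a transient network glitch. A minimal sketch, not part of OO itself; `run` is any callable returning a process exit code (everything here is hypothetical):

```python
import time

def deploy_with_retries(run, attempts=3, delay=30):
    """Call run() (e.g. a subprocess invocation of oosh.bat deploy)
    until it returns exit code 0 or attempts run out.
    Returns the last exit code seen."""
    code = 1
    for attempt in range(attempts):
        code = run()
        if code == 0:
            return 0
        if attempt < attempts - 1:
            time.sleep(delay)  # back off before retrying
    return code
```

With the standard library, `run` could be something like `lambda: subprocess.call(["oosh.bat", "deploy", "--url", "https://central-url/oo", "--user", "admin", "--password", "...", "--files", r"D:\OODeploy\MyOO9_base-cp-1.0.3.jar"])`. If every attempt fails the same way, the problem is server-side, as the logs below confirm.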

AndreiTruta Outstanding Contributor.

Re: Timeout Deploying Content pack


You can ignore the SLF4J messages at this stage.

As for the lost-connection message: it would help to be able to check the oosh logs and the Central logs.

Do you know whether the MSSQL 2014 server is configured according to the documented database requirements? If not, please make sure it is before doing any other checks.
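To make the Central log check concrete: server.log is plain text (its location varies by install; the path and the filter strings here are assumptions), so a quick substring filter is enough to surface deployment failures:

```python
def find_errors(log_lines, needles=("ERROR", "OutOfMemoryError")):
    """Return log lines containing any of the needle substrings."""
    return [line for line in log_lines if any(n in line for n in needles)]

def scan_log(path):
    """Open a server.log-style file and return its error lines."""
    with open(path, encoding="utf-8", errors="replace") as fh:
        return find_errors(fh)
```

Running something like this against Central's server.log is exactly what surfaced the real cause in the next reply.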

Hope it helps,

Andrei Vasile Truta