A fix is available
APAR status
Closed as program error.
Error description
In the <TPC dir>\wlp\usr\servers\deviceServer\logs\messages.log:

[2019/08/02 0:12:47:314 CAT] 0000002f SystemErr R java.lang.OutOfMemoryError
[2019/08/02 0:12:47:314 CAT] 0000002f SystemErr R :
[2019/08/02 0:12:47:314 CAT] 0000002f SystemErr R Java heap space
[2019/08/02 0:12:47:314 CAT] 0000002f SystemErr R at
[2019/08/02 0:12:47:314 CAT] 00002de0 com.ibm.ws.webcontainer.util.ApplicationErrorUtils E SRVE0777E: Exception thrown by application class 'com.tivoli.sanmgmt.middleware.TSNMServiceManager.doPost:520' java.io.IOException: java.lang.OutOfMemoryError: Java heap space

The OOM errors occur when more than 50 hypervisor probes are started at the same time. Each of those probes has obtained a connection to the vCenter, and each such connection is about 17 MB. The heap dump analyzer identifies the VIClientPool, containing 56 connections in the active state, as the leak suspect, for example:

- com/sun/xml/internal/ws/client/sei/SEIStub holding 17,438,344 bytes at 0x1555e7108

Javacores show 50 or more threads blocked on the same lock:
3XMTHREADINFO "Process Processor (Thread-3428)" J9VMThread:0x000000000355B000, omrthread_t:0x00000000233498E8, java/lang/Thread:0x000000011DC41190, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3429)" J9VMThread:0x000000000355BC00, omrthread_t:0x0000000023349DB0, java/lang/Thread:0x000000011A7ACFC0, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3430)" J9VMThread:0x000000000355C700, omrthread_t:0x000000002334A430, java/lang/Thread:0x000000011D55B820, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3431)" J9VMThread:0x000000000355D300, omrthread_t:0x000000002334A8F8, java/lang/Thread:0x000000011053D420, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3432)" J9VMThread:0x0000000003566100, omrthread_t:0x000000002334ADC0, java/lang/Thread:0x00000001105413A0, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3435)" J9VMThread:0x0000000003568400, omrthread_t:0x0000000023344D60, java/lang/Thread:0x0000000110555AE8, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3437)" J9VMThread:0x000000000359B200, omrthread_t:0x0000000023343888, java/lang/Thread:0x000000011054BFD0, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3438)" J9VMThread:0x000000000359BD00, omrthread_t:0x0000000023343D50, java/lang/Thread:0x000000011DC41868, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3439)" J9VMThread:0x000000000359C900, omrthread_t:0x000000002334C450, java/lang/Thread:0x000000011A7AD718, state:B, prio=5
3XMTHREADINFO "Process Processor (Thread-3440)" J9VMThread:0x000000000359D500, omrthread_t:0x000000002334C918, java/lang/Thread:0x00000001105415E8, state:B, prio=5
4XESTACKTRACE at com/ibm/tpc/vmmgr/client/VIClientWrapperProxy.executeSvcCall(VIClientWrapperProxy.java:243(Compiled Code))
4XESTACKTRACE at com/ibm/tpc/vmmgr/client/VIClientWrapperProxy.invoke(VIClientWrapperProxy.java:153(Compiled Code))
4XESTACKTRACE at com/sun/proxy/$Proxy58.getAPIVersion(Bytecode PC:11)
4XESTACKTRACE at com/ibm/tpc/vmmgr/api/impl/VMDatasourceManager.testSingleConnection(VMDatasourceManager.java:1083)
4XESTACKTRACE at com/ibm/tpc/vmmgr/api/impl/VMDatasourceManager.testConnection(VMDatasourceManager.java:984)
4XESTACKTRACE at com/ibm/tpc/vmmgr/api/impl/VMManagerService.testConnection(VMManagerService.java:582)
4XESTACKTRACE at sun/reflect/NativeMethodAccessorImpl.invoke0(Native Method)
4XESTACKTRACE at sun/reflect/NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:90(Compiled Code))
4XESTACKTRACE at sun/reflect/DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55(Compiled Code))
4XESTACKTRACE at java/lang/reflect/Method.invoke(Method.java:508(Compiled Code))
4XESTACKTRACE at com/tivoli/sanmgmt/middleware/data/LocalServiceProxy.invoke(LocalServiceProxy.java:41(Compiled Code))
4XESTACKTRACE at com/sun/proxy/$Proxy33.testConnection(Bytecode PC:18)
4XESTACKTRACE at com/ibm/tpc/vmmgr/collection/VMMgrCollectDatasourceVersionsProcess.process(VMMgrCollectDatasourceVersionsProcess.java:108)
4XESTACKTRACE at com/ibm/tpc/Router.perform(Router.java:724(Compiled Code))
4XESTACKTRACE at com/ibm/tpc/Router.perform(Router.java:452(Compiled Code))
4XESTACKTRACE at com/ibm/tpc/vmmgr/collection/step/VMMgrDiscoverStep.process(VMMgrDiscoverStep.java:108)
4XESTACKTRACE at com/ibm/tpc/vmmgr/collection/VMMgrProbeESXHypervisorProcess.process(VMMgrProbeESXHypervisorProcess.java:186)
4XESTACKTRACE at com/ibm/tpc/Router.perform(Router.java:724(Compiled Code))
4XESTACKTRACE at com/ibm/tpc/Router.perform(Router.java:452(Compiled Code))
Local fix
Perform one or more of the following actions:
- restart the device server
- spread the probe start times further apart
- increase the device server memory to 3 GB
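For the memory increase: the device server is a WebSphere Liberty server (the messages.log path above sits under wlp\usr\servers\deviceServer), and Liberty servers read JVM arguments from a jvm.options file in the server directory. The sketch below assumes that mechanism applies to this installation; the exact file location under <TPC dir> is an assumption, and the device server must be restarted for the change to take effect.

```
# <TPC dir>\wlp\usr\servers\deviceServer\jvm.options
# Raise the maximum Java heap of the device server to 3 GB
-Xmx3g
```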
Problem summary
****************************************************************
* USERS AFFECTED:                                              *
* IBM Spectrum Control users running 20 or more ESX probes at  *
* the same time.                                               *
****************************************************************
* PROBLEM DESCRIPTION:                                         *
* When Spectrum Control runs 20 or more ESX probes at the      *
* same time, the Spectrum Control server may throw             *
* out-of-memory (OOM) errors.                                  *
*                                                              *
* During an ESX probe, Spectrum Control generally limits the   *
* number of VMware API calls that can run at the same time.    *
* However, at the start of a probe, Spectrum Control first     *
* retrieves the vCenter version through a VMware API call      *
* without regard for other probes making the same call at      *
* the same time.                                               *
*                                                              *
* Each such call requires a significant amount of memory,      *
* which is felt especially when the call takes a long time,    *
* as happens when the vCenter must respond to many such        *
* calls at once.                                               *
*                                                              *
* The fix is to wait for one such call to return and reuse     *
* its result for all hypervisors managed by the same           *
* vCenter.                                                     *
****************************************************************
* RECOMMENDATION:                                              *
****************************************************************
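The deduplication described in the fix can be sketched as a cache of in-flight futures keyed by vCenter: the first probe to ask for the API version starts the call, and every later probe for the same vCenter waits on that same future and reuses its result. This is a minimal illustration, not the actual Spectrum Control code; the class and method names (VersionCache, getApiVersion) are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

/**
 * Hypothetical sketch of the APAR fix: at most one in-flight
 * "get API version" request per vCenter, whose result is shared
 * by every probe that asks for it.
 */
public class VersionCache {

    // One future per vCenter. computeIfAbsent guarantees that only
    // the first caller starts the expensive call; later callers
    // receive the same future.
    private final Map<String, CompletableFuture<String>> inFlight =
            new ConcurrentHashMap<>();

    public String getApiVersion(String vcenter, Supplier<String> fetch) {
        return inFlight
                .computeIfAbsent(vcenter,
                        v -> CompletableFuture.supplyAsync(fetch))
                .join();   // waiters block here until the single call returns
    }

    public static void main(String[] args) {
        VersionCache cache = new VersionCache();
        AtomicInteger fetches = new AtomicInteger();
        Supplier<String> fetch = () -> {
            fetches.incrementAndGet();   // count how often the API is really hit
            return "6.7.0";              // stand-in for the real VMware API call
        };
        String version = null;
        for (int i = 0; i < 50; i++) {   // 50 probes ask about the same vCenter
            version = cache.getApiVersion("vcenter-1", fetch);
        }
        System.out.println(version + " fetched " + fetches.get() + " time(s)");
    }
}
```

With 50 probes targeting one vCenter, the expensive call runs once instead of 50 times, which is exactly what removes the memory spike described above.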
Problem conclusion
The fix for this APAR is targeted for the following release:

IBM Spectrum Control 5.3.5 [ 5.3.5-IBM-SC ] (release target: November 2019)
http://www.ibm.com/support/docview.wss?&uid=swg21320822

The target dates for future releases do not represent a formal commitment by IBM and are subject to change without notice.
Temporary fix
Comments
APAR Information
APAR number
IT29939
Reported component name
TPC ADVANCED
Reported component ID
5608TPCA0
Reported release
533
Status
CLOSED PER
PE
NoPE
HIPER
NoHIPER
Special Attention
NoSpecatt / Xsystem
Submitted date
2019-08-08
Closed date
2019-09-23
Last modified date
2019-09-23
APAR is sysrouted FROM one or more of the following:
APAR is sysrouted TO one or more of the following:
Fix information
Fixed component name
TPC ADVANCED
Fixed component ID
5608TPCA0
Applicable component levels
[{"Business Unit":{"code":"BU029","label":"Software"},"Product":{"code":"SSNECY","label":"Tivoli Storage Productivity Center Advanced"},"Platform":[{"code":"PF025","label":"Platform Independent"}],"Version":"533"}]
Document Information
Modified date:
24 June 2022