IBM Support

PH28141: OUT OF MEMORY IN CELL POOL USING 500 CONNECTIONS.


APAR status

  • Closed as program error.

Error description

  • Using a connection pool of 500 connections in Liberty:

    <connectionManager maxPoolSize="500" />

    causes connection failures with message CWWKX8057I and an
    out-of-memory condition in cell pool storage.
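
    For reference, a connectionManager of this size typically sits
    inside a dataSource element in server.xml. The fragment below is
    an illustrative sketch; the jndiName and maxIdleTime values are
    assumptions, and only maxPoolSize="500" comes from this report.

    ```xml
    <!-- Illustrative only: jndiName and maxIdleTime are assumed
         values; only maxPoolSize="500" is taken from this report. -->
    <dataSource jndiName="jdbc/sampleDS">
        <connectionManager maxPoolSize="500" maxIdleTime="30m" />
    </dataSource>
    ```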
    

Local fix

  • Reduce the number of connections.
    

Problem summary

  • ****************************************************************
    * USERS AFFECTED:  All users of IBM WebSphere Application      *
    *                  Server Liberty for z/OS                     *
    ****************************************************************
    * PROBLEM DESCRIPTION: z/OS Local Comm above the bar storage   *
    *                      issues in client address space          *
    ****************************************************************
    * RECOMMENDATION:                                              *
    ****************************************************************
    The OLA getConnection API has a cell pool limit of 768
    concurrent connections.  If the client address space exceeds
    that limit, it receives an out-of-memory style return code.
    The connection storage is obtained from a Local
    Communication (LCOM) cell pool within the BBGZLOCL structure,
    which currently supports a maximum of 768 allocated cells.
    
    In addition, a couple of small LCOM structure leaks were
    detected in the client address space.
    
    Under two scenarios there is a chance of leaking Local Comm
    queue elements.  The leak is in shared above-the-bar storage.
    This storage is released when the bind between the server and
    the client is broken.  However, if the server and client are
    long-lived, the leak may accumulate over time.
    
    There are eye-catchers in the leaked queue elements: BBGZLDQE
    and BBGZLWQE.  Both leak scenarios occur during an attempt to
    queue a new element.
    
    First, a queue limit has been reached, which indicates that
    one side of the connection is being severely overrun.
    
    Second, the associated connection is in a closing or cleanup
    state at the time of the queuing attempt.
    
    In both scenarios, the newly created queue element fails to
    queue and is not freed.
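
    The fixed-size cell pool behavior described above can be
    sketched as follows. This is a minimal illustration under
    assumed names (CellPool, CellPoolExhausted); the actual
    BBGZLOCL cell pool is native z/OS storage management, not
    this simple free-list design.

    ```python
    # Minimal sketch of a fixed-size cell pool, analogous to the LCOM
    # cell pool described above.  All names and the free-list design
    # are illustrative assumptions, not the real implementation.

    class CellPoolExhausted(Exception):
        """Raised when no free cell remains (the OOM-style return)."""

    class CellPool:
        def __init__(self, max_cells=768):
            self.max_cells = max_cells
            self.in_use = set()
            self.free_list = list(range(max_cells))

        def allocate(self):
            if not self.free_list:
                # Exceeding the cell limit yields an OOM-style failure
                raise CellPoolExhausted(
                    f"all {self.max_cells} cells allocated")
            cell = self.free_list.pop()
            self.in_use.add(cell)
            return cell

        def release(self, cell):
            self.in_use.discard(cell)
            self.free_list.append(cell)

    pool = CellPool(max_cells=768)
    cells = [pool.allocate() for _ in range(768)]  # fills the pool
    try:
        pool.allocate()        # a 769th concurrent allocation fails
        overflowed = False
    except CellPoolExhausted:
        overflowed = True
    pool.release(cells[0])     # freeing one cell allows allocation again
    recycled = pool.allocate()
    ```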
    

Problem conclusion

  • Code has been modified in Local Comm storage management for
    connections in the client to allow more concurrent connections.
    Also, code has been modified in the Local Comm native queue
    routines to free a queue element if it fails to be added to the
    target queue.
    
    The fix for this APAR is currently targeted for inclusion in fix
    pack 20.0.0.10.  Please refer to the Recommended Updates page
    for delivery information:
    http://www.ibm.com/support/docview.wss?rs=180&uid=swg27004980
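
    The free-on-failure behavior in the fix can be sketched as
    below. All names (Queue, QUEUE_LIMIT, the state strings) are
    assumptions for illustration only; the real code is native
    LCOM queue management.

    ```python
    # Illustrative sketch of the leak and its fix: an element that
    # cannot be added to the target queue must be freed, not dropped.
    # Names and limits here are assumed, not from the actual code.

    QUEUE_LIMIT = 4  # small limit for illustration; real limits differ

    freed = []
    def free(element):
        """Stand-in for releasing the element's storage."""
        freed.append(element)

    class Queue:
        def __init__(self):
            self.elements = []
            self.state = "OPEN"  # or "CLOSING" / "CLEANUP"
            self.leaked = []     # storage lost by the pre-fix path

        def enqueue_buggy(self, element):
            # Pre-fix behavior: on either failure path the element is
            # neither queued nor freed, so its storage leaks.
            if len(self.elements) >= QUEUE_LIMIT or self.state != "OPEN":
                self.leaked.append(element)
                return False
            self.elements.append(element)
            return True

        def enqueue_fixed(self, element):
            # Post-fix behavior: if the element cannot be added to
            # the target queue, it is freed immediately.
            if len(self.elements) >= QUEUE_LIMIT or self.state != "OPEN":
                free(element)
                return False
            self.elements.append(element)
            return True

    q = Queue()
    for i in range(QUEUE_LIMIT):
        q.enqueue_fixed(i)       # fills the queue to its limit
    ok = q.enqueue_fixed(99)     # overrun: rejected but freed, no leak
    q.state = "CLOSING"
    ok2 = q.enqueue_fixed(100)   # connection closing: rejected and freed
    ```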
    

Temporary fix

Comments

APAR Information

  • APAR number

    PH28141

  • Reported component name

    LIBERTY PROF -

  • Reported component ID

    5655W6514

  • Reported release

    CD0

  • Status

    CLOSED PER

  • PE

    NoPE

  • HIPER

    NoHIPER

  • Special Attention

    NoSpecatt / Xsystem

  • Submitted date

    2020-08-04

  • Closed date

    2020-08-27

  • Last modified date

    2020-08-27

  • APAR is sysrouted FROM one or more of the following:

  • APAR is sysrouted TO one or more of the following:

Fix information

  • Fixed component name

    LIBERTY PROF -

  • Fixed component ID

    5655W6514

Applicable component levels


Document Information

Modified date:
28 August 2020