IBM Support

POWER8 Watts, Temp, SSP I/O & Server/LPAR stats from HMC REST API - Version 10

How To


Summary

Learn how to extract server information from the HMC REST API with Python programs and modules.

Download updated to Version 17

Objective


Steps

WARNING:

  1. The Server and LPAR stats collection parts are superseded by a new tool called nextract_plus, which you can find here: https://www.ibm.com/support/pages/nextract-plus-hmc-rest-api-performance-statistics
  2. The electrical Watts and Temperature stats are only available on these models: S822, S824, E850, S922, and S924. The E950 has not been tested.
  3. The Shared Storage Pool stats are fine but cover only the Pool and VIOS levels. Tier, failure group, and individual disk stats are not captured.

Update 10 July 2019:
 New functions in this version:

  • This version has additions for HMC 930+ support and robustness to handle more unexpected data returned from the HMC!
  • Extra debug output:
    • See the top of the Python programs: set debug=True for detailed debug output, or False to disable it.
    • And uncomment the print statements to detail the data structure.
  • Extra option for output statistics format: output_csv, output_html, output_influxdb

  • NOTE: you need to edit the Python programs to decide which forms of output you want from CSV, HTML webpages with graphs, or InfluxDB. The default in the programs supplied is
    output_influxdb=True

    for which you then need the InfluxDB Python client library. For Ubuntu Linux, the package looks like this:
    dpkg -l | grep influx
    ii  python3-influxdb 4.1.1-2 all Client for InfluxDB - Python 3.x

    For Ubuntu, install with:
    sudo apt install python3-influxdb
  • On other Linux distributions:
    python3 -m pip install influxdb

    For more details check: https://www.influxdata.com/blog/getting-started-python-influxdb/ 
  • Support for sending the data to an InfluxDB database and graphing it with Grafana. These open-source tools are already used for njmon, the AIX, VIOS, and Linux performance stats tool (see http://nmon.sourceforge.net/pmwiki.php?n=Site.Njmon)
  • Alternative to command-line passwords (so that ps -ef does not display them to other users): the passwords can be embedded in the Python programs. See the top of the Python programs:
    hostname="", user="" and password=""

    Or in a separate nextract_config.json file like the following example.
    The first three items:
    Hostname of the HMC
    User name
    Password
  • The second half of the items are optional:
    ihost for the hostname of an InfluxDB server
    idbname for the InfluxDB database name
  • Be careful with the syntax. Note: the last data line has no trailing comma.
    
    {
    "hostname": "hmc42",  
    "user": "pcmadmin",  
    "password": "SECRET1", 
    "ihost": "influxdb.server.acme.com",  
    "iport": 8086,  
    "iuser": "admin",  
    "ipassword": "SECRET2",  
    "idbname": "nextract"  
    }
  • PowerVM Shared Storage Pool (SSP) Influx and Grafana graphs (click to see full size)
  • Energy Influx and Grafana graphs: CPU, inlet temperature, and electrical Watts
  • Server stats Influx and Grafana graphs (a small sample), and there are LPAR stats too
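The nextract_config.json file above can be read with a few lines of Python. This is a sketch of the idea, not the exact loading code in the supplied programs:

```python
import json

def load_config(path="nextract_config.json"):
    # Read the HMC (and optional InfluxDB) settings from the JSON file.
    with open(path) as f:
        cfg = json.load(f)
    # The first three items are required; the InfluxDB items are optional.
    for key in ("hostname", "user", "password"):
        if key not in cfg:
            raise KeyError("missing required setting: " + key)
    return cfg

# Example: write a sample file (matching the example above) and load it back.
sample = {"hostname": "hmc42", "user": "pcmadmin", "password": "SECRET1",
          "ihost": "influxdb.server.acme.com", "iport": 8086,
          "iuser": "admin", "ipassword": "SECRET2", "idbname": "nextract"}
with open("nextract_config.json", "w") as f:
    json.dump(sample, f, indent=2)

cfg = load_config()
print(cfg["hostname"])  # hmc42
```

A malformed file (for example, a trailing comma after the last item) makes json.load() raise an error, which is why the syntax warning above matters.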

Objectives:

In this version, POWER8 energy stats (Watts of electricity and temperatures) are extracted in 34 lines of Python code that use a Python module.

The Shared Storage Pool I/O and Server + LPAR stats work the same way; as there are more statistics, they take a little more code.

Output:

Sample output for Energy stats; for SSP I/O and Server + LPAR performance stats, find the graphs later on.

graph

Sample Comma Separated Value (CSV) file:

  • From a CSV file, you can load the data into a "database" for longer-term storage and graphs
    time,   watts,   mb0,mb1,mb2,mb3,   cpu0,cpu1,cpu2,cpu3,cpu4,cpu5,cpu6,cpu7,   inlet    
    2017-10-13T13:13:00, 837, 30.0,32.0,33.0,0.0, 46.0,43.0,44.0,46.0,0.0,0.0,0.0,0.0, 30.0  
    2017-10-13T13:13:30, 837, 30.0,32.0,33.0,0.0, 45.0,44.0,46.0,45.0,0.0,0.0,0.0,0.0, 30.0  
    2017-10-13T13:14:00, 828, 30.0,32.0,33.0,0.0, 44.0,41.0,46.0,45.0,0.0,0.0,0.0,0.0, 30.0  
    2017-10-13T13:14:30, 810, 30.0,32.0,33.0,0.0, 44.0,40.0,45.0,46.0,0.0,0.0,0.0,0.0, 30.0  
    2017-10-13T13:15:00, 811, 30.0,32.0,33.0,0.0, 41.0,40.0,45.0,46.0,0.0,0.0,0.0,0.0, 30.0  
    2017-10-13T13:15:30, 808, 30.0,31.0,33.0,0.0, 43.0,39.0,45.0,45.0,0.0,0.0,0.0,0.0, 30.0  
    2017-10-13T13:16:00, 822, 30.0,31.0,33.0,0.0, 44.0,42.0,46.0,45.0,0.0,0.0,0.0,0.0, 29.0  
    2017-10-13T13:16:30, 823, 30.0,31.0,33.0,0.0, 44.0,42.0,45.0,45.0,0.0,0.0,0.0,0.0, 29.0  
    2017-10-13T13:17:00, 830, 30.0,31.0,33.0,0.0, 44.0,43.0,46.0,45.0,0.0,0.0,0.0,0.0, 29.0  
    2017-10-13T13:17:30, 825, 30.0,32.0,33.0,0.0, 43.0,43.0,44.0,45.0,0.0,0.0,0.0,0.0, 29.0  
    2017-10-13T13:18:00, 790, 30.0,32.0,33.0,0.0, 41.0,43.0,42.0,42.0,0.0,0.0,0.0,0.0, 29.0  
    2017-10-13T13:18:30, 777, 30.0,31.0,33.0,0.0, 41.0,42.0,44.0,43.0,0.0,0.0,0.0,0.0, 29.0 
  • Time is in the ISO 8601 "date-T-time" format: yyyy-mm-ddThh:mm:ss
  • Temperature units are Celsius (Centigrade)
  • Note some values are always zero as different POWER server models have different numbers of planars and CPU sensors.
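Reading the CSV back for analysis is straightforward with the Python standard library. A small sketch using the first few lines of the sample above, parsing the ISO 8601 timestamps and averaging the Watts column:

```python
import csv
import io
from datetime import datetime

# First few lines of the sample CSV output above.
data = """time,watts,mb0,mb1,mb2,mb3,cpu0,cpu1,cpu2,cpu3,cpu4,cpu5,cpu6,cpu7,inlet
2017-10-13T13:13:00,837,30.0,32.0,33.0,0.0,46.0,43.0,44.0,46.0,0.0,0.0,0.0,0.0,30.0
2017-10-13T13:13:30,837,30.0,32.0,33.0,0.0,45.0,44.0,46.0,45.0,0.0,0.0,0.0,0.0,30.0
2017-10-13T13:14:00,828,30.0,32.0,33.0,0.0,44.0,41.0,46.0,45.0,0.0,0.0,0.0,0.0,30.0
"""

rows = list(csv.DictReader(io.StringIO(data)))
# The time field is ISO 8601 "yyyy-mm-ddThh:mm:ss".
times = [datetime.strptime(r["time"], "%Y-%m-%dT%H:%M:%S") for r in rows]
avg_watts = sum(int(r["watts"]) for r in rows) / len(rows)
print(avg_watts)  # 834.0
```

The same pattern works for the inlet temperature or any of the cpu/mb sensor columns.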

Code Sample for Energy

The 34-line Python program does the following:

  1. Log in to the HMC.
  2. Fetch the Preferences for Energy, then set the wanted options. We assume they were switched "on" for capable Servers (POWER8 S822, S824, and E850). We have a second small program with this function.
  3. Next, we get the energy stats (with SSP I/O and Server + LPAR performance stats, we have to get a list of file names as an intermediate stage).
  4. The stats header information is then converted to a convenient Python dictionary.
  5. The stats themselves are converted to an array of Python dictionaries for transforming into whatever is needed - examples here are a JavaScript web page or Comma Separated Values.

Comments are in green, and the Python data structures being returned are highlighted.

Note: the HMC hostname, HMC username, and password are hardcoded in the programs. Alternatively, they can be supplied on the command line with a half dozen extra lines of Python code.
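For orientation on step 1 (logging in), the HMC REST API Logon service takes a small XML request on port 12443 and returns a session token in the X-API-Session response header. The standalone sketch below uses only the standard library (the supplied programs use the "requests" module via hmc_pcm.py, which hides all of this):

```python
import ssl
import urllib.request

def build_logon_xml(user, password):
    # XML body of the HMC REST API LogonRequest.
    return ('<LogonRequest xmlns="http://www.ibm.com/xmlns/systems/power/'
            'firmware/web/mc/2012_10/" schemaVersion="V1_0">'
            '<UserID>%s</UserID><Password>%s</Password></LogonRequest>'
            % (user, password))

def hmc_logon(hostname, user, password):
    # PUT the logon request to port 12443; the session token for all later
    # GETs comes back in the X-API-Session response header.
    req = urllib.request.Request(
        "https://%s:12443/rest/api/web/Logon" % hostname,
        data=build_logon_xml(user, password).encode(),
        method="PUT",
        headers={"Content-Type":
                 "application/vnd.ibm.powervm.web+xml; type=LogonRequest"})
    ctx = ssl._create_unverified_context()  # HMCs often use self-signed certs
    with urllib.request.urlopen(req, context=ctx) as resp:
        return resp.headers["X-API-Session"]

print(build_logon_xml("pcmadmin", "secret").startswith("<LogonRequest"))  # True
```

In practice you would call hmc_logon() with your real HMC hostname and credentials; hmc_pcm.HMC() does this plus the matching logoff at exit.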

Nigel’s Python Modules 

All the programs are simplified by two Python modules (called libraries in other languages) that hide the full gory detail of dealing with the HMC directly, which sends back complicated XML and JSON data.  Here are the details:

  • Based on: Python v3
  • Modules used (default or easy to install on Linux; using another OS? You need to check):
    1. "requests" for HTTP REST API access to the HMC over the network.
    2. "ElementTree" for handling XML data.
    3. "json" for JSON data.
    4. "os" for operating systems details.
    5. "sys" for system functions.
    6. "time" for time measurement.
    7. "atexit" is used to automatically disconnect from the HMC when we end Python.

"hmc_pcm.py" Module

  •  60 KB = ~1150 lines of Python.
  •  42 Functions = core function and data specific.
  •  Note this module logs out of the HMC automatically - even if the code crashes.
It includes Python functions used in our example, like:
  • hmc_pcm.HMC(hostname, user, password)
    hmc.get_server_details_pcm()
    JSONdata = hmc.get_energy(atomid,server_name)
    info = hmc.extract_energy_info(JSONdata)
    headline, stats = hmc.extract_energy_stats(JSONdata)
There are similar functions for Shared Storage Pool and Performance Stats
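The headline and stats returned by calls like extract_energy_stats() can then be written out in whatever format you enabled. A sketch of a CSV writer, assuming headline is a list of column names and stats is a list of per-sample dictionaries (hmc_pcm.py defines the real data layout):

```python
import csv

def save_csv(filename, headline, stats):
    # headline: list of column names; stats: list of dicts, one per sample.
    # (Assumed shapes for this sketch -- hmc_pcm.py defines the real layout.)
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=headline)
        writer.writeheader()
        writer.writerows(stats)

headline = ["time", "watts", "inlet"]
stats = [{"time": "2017-10-13T13:13:00", "watts": 837, "inlet": 30.0},
         {"time": "2017-10-13T13:13:30", "watts": 837, "inlet": 30.0}]
save_csv("Energy-sample.csv", headline, stats)
with open("Energy-sample.csv") as f:
    lines = f.read().splitlines()
print(lines[0])  # time,watts,inlet
```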

nchart.py Module

  • This module uses the JavaScript Google Charts library to make good-looking graphs.
  • 20 KB = 371 lines of Python.
  • 17 core functions.
  • Including specific graph functions for our REST API data as well as general-purpose graphing functions.
nchart_energy(self, filename, information, data)
nchart_ssp(self, filename, information, SSP, header, vios)
nchart_server(self, filename, information, data)
nchart_lpar(self, filename, information, data)
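For a flavour of what these functions produce, here is a minimal self-contained sketch of a Google Charts line-graph page. The real nchart.py page layout and chart options differ; this just shows the DataTable/LineChart pattern:

```python
def make_chart_page(title, columns, rows):
    # Build one HTML page that draws a line chart with the Google Charts
    # JavaScript library (fetched from gstatic.com when the page is opened).
    add_columns = "".join("  data.addColumn('number', '%s');\n" % c
                          for c in columns[1:])
    add_rows = ",\n".join("    " + str(list(r)) for r in rows)
    return """<html><head>
<script src="https://www.gstatic.com/charts/loader.js"></script>
<script>
google.charts.load('current', {packages:['corechart']});
google.charts.setOnLoadCallback(draw);
function draw() {
  var data = new google.visualization.DataTable();
  data.addColumn('string', '%s');
%s  data.addRows([
%s
  ]);
  new google.visualization.LineChart(
      document.getElementById('chart')).draw(data, {title: '%s'});
}
</script></head>
<body><div id="chart" style="width:900px;height:500px"></div></body></html>
""" % (columns[0], add_columns, add_rows, title)

page = make_chart_page("Watts", ["time", "watts"],
                       [("13:13:00", 837), ("13:13:30", 837), ("13:14:00", 828)])
with open("Energy-sample.html", "w") as f:
    f.write(page)
```

Open the resulting .html file in any browser with network access and the chart renders client-side; no web server is needed.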

Nigel’s Python Extraction Programs

Small example user scripts for HMC 860+ software

nextract_energy.py

  • Extracts from the HMC Watts of electricity and multiple Temperatures in Celsius per server.
  • For selected POWER8 servers or later.
    • Supported: POWER8 Scale-Out S8nn(L) & E850
    • Not supported: POWER8 Enterprise E870 or E880 or LC models
  • 74 lines of Python code.
  • Output format: .html, Comma Separated Values (CSV), or older-style CSV.
  • Last 2 hours at 30-second intervals.
  • "nextract_energy_on.py" to switch on energy collection.

nextract_ssp.py - for VIOS 2.2.5.10 or later version

  • Extracts from the HMC the overall Shared Storage Pool I/O stats (read and write: KB/s, I/O per second, response times, Pool size, and Pool used) plus VIOS-level I/O KB/s.
  • 127 lines of Python code.
  • Output format: .html, CSV, or old-style CSV.
  • Last 25 hours at 5-minute intervals.

nextract_server.py - Performance stats for POWER8 onwards

  • Extracts from the HMC the Server-level CPU busy, memory allocated, Network, and Storage I/O stats (read and write: KB/s, ops, response times, Pool size, and Pool used) plus stats at the VIOS level: the VIOS I/O KB/s, which shows the busiest VIOS in the SSP.
  • 150 lines of Python code.
  • Covers Managed System (server) stats and LPAR stats.
  • Output format .html or CSV.
  • Last 2 hours at 30-second intervals.

Python Code Downloads

Python Modules and Python Programs

  • Version 10 - July 2019 -  nextract_v10.tar - Current release
    • Support for new HMC 9 series.
    • Added InfluxDB output for the Energy, Server, and LPAR performance stats and Shared Storage Pool (SSP) programs.
    • Merged the regular and InfluxDB versions.
    • All updated to version 10 for easier identification.
    • Embedded HMC, user, and password option.
    • HMC, user, and password can be placed in a file: "nextract_config.json".
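The InfluxDB output added in this version uses the python-influxdb client, which takes a list of point dictionaries. A minimal sketch; the measurement and tag names below are illustrative (not the programs' actual schema), and the connection details echo the nextract_config.json example:

```python
def make_points(server, stats):
    # Shape the samples into the list-of-dicts form that the python-influxdb
    # client's write_points() expects. Measurement and tag names here are
    # illustrative, not the programs' actual schema.
    points = []
    for s in stats:
        points.append({
            "measurement": "energy",
            "tags": {"server": server},
            "time": s["time"],
            "fields": {"watts": float(s["watts"]),
                       "inlet": float(s["inlet"])},
        })
    return points

def send_points(points, host="influxdb.server.acme.com", port=8086,
                user="admin", password="SECRET2", dbname="nextract"):
    # Imported here so the sketch runs even without the library installed.
    from influxdb import InfluxDBClient
    client = InfluxDBClient(host, port, user, password, dbname)
    client.write_points(points)

pts = make_points("P8-S824-emerald",
                  [{"time": "2017-10-13T13:13:00Z",
                    "watts": 837, "inlet": 30.0}])
print(pts[0]["fields"]["watts"])  # 837.0
```

Once the points are in InfluxDB, Grafana can be pointed at the "nextract" database to build the dashboards shown above.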

Install and running the programs

  1. Install by extracting the Python modules and Python Programs into one directory on a server with Python3 installed and network access to your HMC
    $ tar xvf nextract_vXX.tar
  2. The modules do not need installing as official Python modules; keep them in the same directory as the programs.
  3. Ensure you have Python 3 installed, for example:
    $ python --version
    Python 3.5.2
  4. Then, run the programs with your HMC hostname, HMC user, and HMC password
  5. The nextract_energy, nextract_ssp, and nextract_server (and LPAR) programs create a CSV file and webpages with graphs.
    You can switch output formats on and off near the top of the program. See the lines:
    output_html=True
    output_csv=True

    Set True to False to stop that format being generated.
  6. For energy stats, run nextract_energy_on once first, then wait 15 minutes.
  7. If there are problems in the hmc_pcm.py module, you can switch debug output from the nextract program on or off as follows.
    • After the line: 
      hmc = hmc_pcm.HMC(hostname, user, password) # Log on to the HMC
    • Then, add the following line:
      hmc.set_debug(True)
    • If necessary, a subdirectory called "debug" is created and hundreds of files are placed there.

Live Graphs of Sample Output

  1. Server Energy graphs (Watts & Celsius) 
  2. Shared Storage Pool I/O graphs for whole SSP and VIOS level  
  3. Performance stats Server level graphs
  4. Performance stats LPAR level graphs

Examples of running the Programs

Switch on Energy Collection

  $ ./nextract_energy_on.py mint pcmadmin pcm123pcm  
HMC hostname=mint 
User=pcmadmin 
Password=pcm123pcm  
-> Logging on to mint as user pcmadmin  
-->Server 0 P8-S822-lemon        not capable of supplying Energy stats  
-->Server 1 P7-p750-pear         not capable of supplying Energy stats  
-->Server 2 P7-p750-peach        not capable of supplying Energy stats  
-->Server 3 P7-p730b-green       not capable of supplying Energy stats  
-->Server 4 P7-p710b-cyan        not capable of supplying Energy stats  
-->Server 5 P7-p770-purple       not capable of supplying Energy stats  
-->Server 6 P8-E850-ruby         capable of collecting Energy stats and enabled  
-->Server 7 P7-p710c-indigo      not capable of supplying Energy stats  
-->Server 8 P8-S824-emerald      capable of collecting Energy stats and enabled  
-->Server 9 P8-S822-lime         capable of collecting Energy stats and enabled 
 -> Finished  $

Energy

Note: the server P8-E850-ruby currently reports an error number ("errno") and no data

  $ ./nextract_energy.py mint pcmadmin pcm123pcm  
HMC hostname=mint 
User=pcmadmin 
Password=pcm123pcm  
-> Logging on to mint as user pcmadmin  
-->Server 1 P8-S822-lemon not capable of supplying energy stats  
-->Server 2 P7-p750-pear not capable of supplying energy stats  
-->Server 3 P7-p750-peach not capable of supplying energy stats  
-->Server 4 P7-p730b-green not capable of supplying energy stats  
-->Server 5 P7-p710b-cyan not capable of supplying energy stats  
-->Server 6 P7-p770-purple not capable of supplying energy stats  
-->Server 7 P8-E850-ruby collecting Energy stats  
Error 204: returned for GET "Energy-P8-E850-ruby" file of filenames  Hint: 204=No Content  
-->Server 8 P7-p710c-indigo not capable of supplying energy stats  
-->Server 9 P8-S824-emerald collecting Energy stats  
-->Summary:  {'starttime': '2017-10-31T20:51:30+0000', 'server': 'P8-S824-emerald', 'freq': 30, 'mtm': '8286-42A', 'serial': '100EC7V'}  
Create Energy-P8-S824-emerald.html  
Saved webpage to Energy-P8-S824-emerald.html  

Saved comma separated values to Energy-P8-S824-emerald.csv  
-->Server 10 P8-S822-lime collecting Energy stats  
-->Summary:  {'starttime': '2017-10-31T20:51:30+0000', 'server': 'P8-S822-lime', 'freq': 30, 'mtm': '8284-22A', 'serial': '215296V'}  
Create Energy-P8-S822-lime.html  
Saved webpage to Energy-P8-S822-lime.html  
Saved comma separated values to Energy-P8-S822-lime.csv

Shared Storage Pools

Note: the Spiral SSP is not collecting stats at the start

  $ ./nextract_ssp.py mint pcmadmin pcm123pcm  
HMC hostname=mint 
User=pcmadmin 
Password=pcm123pcm  
-> Logging on to mint as user pcmadmin  
-> Get Stripped Preferences  -> Parse Preferences  
-> cluster=spiral     pool=spiral     AggregrateEnabled=false Monitoring Enabled=false =BAD  
-> cluster=orbit      pool=orbit      AggregrateEnabled= true Monitoring Enabled= true =GOOD  
-> Set Preferences - please wait 10+ minutes for stats to appear!  
-> Processing SSP  --> SSP=1 Getting filenames for cluster=orbit pool=orbit  
---> Requesting orbit as monitoring enabled  
---> Received 1 file(s) in 32.95 seconds  
---> File=1 Getting stats from SharedStoragePool_13ca2caa-87e2-34af-a2d3-48757a46d28f_20171030T220000+0000_20171031T225500+0000_300.json  
---> Processing JSON data size=57838962 bytes  
Create SSP-orbit.html  Saved webpage to SSP-orbit.html  Logging off the HMC

Server and LPAR Performance Stats

  $ time ./nextract_server.py mint pcmadmin pcm123pcm  
HMC hostname=mint 
User=pcmadmin 
Password=pcm123pcm  
-> Logging on to mint as user pcmadmin  
-> Get Preferences  -> Parse Preferences  
-> ALL servers: 
 -> Server name=P8-S822-lemon    agg=false longterm=false - remove 
 -> Server name=P7-p750-pear     agg=false longterm=false - remove  
-> Server name=P7-p750-peach    agg=true  longterm=true  - OK  
-> Server name=P7-p730b-green   agg=false longterm=false - remove  
-> Server name=P7-p710b-cyan    agg=false longterm=false - remove  
-> Server name=P7-p770-purple   agg=false longterm=false - remove  
-> Server name=P8-E850-ruby     agg=true  longterm=true  - OK  
-> Server name=P7-p710c-indigo  agg=false longterm=false - remove  
-> Server name=P8-S824-emerald  agg=true  longterm=true  - OK  
-> Server name=P8-S822-lime     agg=true  longterm=true  - OK  
-> Servers with Perf Stats    --> Server=1 Getting filenames for P7-p750-peach  
---> Received 5 file(s) in 6.14 seconds  
---> Server=P7-p750-peach File=1 ManagedSystem_6502c9d1-2b91-3bfe-8745-e2ce2547c791_20171031T205900+0000_20171031T225800+0000_30.json  
---> Server=P7-p750-peach File=2 LogicalPartition_6F32128A-AB6D-4F6E-B7BD-DEEA6079DF88  
---> Server=P7-p750-peach File=3 LogicalPartition_26BF85CC-F519-402C-BEA1-E00D8ACE721A  
---> Server=P7-p750-peach File=4 LogicalPartition_7713D33D-E1C0-457A-A1B5-8A4B6352191C  
---> Server=P7-p750-peach File=5 LogicalPartition_6BE66483-A110-4C08-9DDB-1357199004D4      

ManagedSystem  
---> Save readable JSON File=1 bytes=1849517 name=ManagedSystem_6502c9d1-2b91-3bfe-8745-e2ce2547c791_20171031T205900+0000_20171031T225800+0000_30.JSON  
----> ServerInfo name=P7-p750-peach mtms=8233-E8B*100272P type=Processed frequency=30 seconds 
 ----> ServerInfo Date=2017-10-31 start=20:59:00 end=22:58:00  
*** Error Server P7-p750-peach: status=2 (not zero)  
**** mgs=To get Network Bridge utilization the VIOS should be running 2.2.3 or later and System Firmware level should be 780 or later  
*** Error Server P7-p750-peach: status=2 (not zero)  
**** mgs=vios: 1 is not in running state on Managed System: 8233-E8B*100272P  
*** Error Server P7-p750-peach: status=2 (not zero)  
**** mgs=vios: 1 is not in running state on Managed System: 8233-E8B*100272P  
----> Records=0 Errors=3  Stopping processing of this server P7-p750-peach due to errors    
--> Server=2 Getting filenames for P8-E850-ruby  
---> Received 13 file(s) in 7.32 seconds  
---> Server=P8-E850-ruby File=1 ManagedSystem_caac7a03-4eac-3256-b32b-5b203809d674_20171031T205930+0000_20171031T225800+0000_30.json  
---> Server=P8-E850-ruby File=2 LogicalPartition_22360EE8-3423-4A44-A10E-F1AD87861DBF  
---> Server=P8-E850-ruby File=3 LogicalPartition_2DF7AC9E-9F02-4726-887C-3788D931C802  
---> Server=P8-E850-ruby File=4 LogicalPartition_2EC5F7EC-78D2-49CF-B557-B4A7178AA006  
---> Server=P8-E850-ruby File=5 LogicalPartition_24202A43-7629-4053-9C45-98328FE4D53B  
---> Server=P8-E850-ruby File=6 LogicalPartition_7CEA0762-1B47-455D-B20E-54F8C9BC41ED  
---> Server=P8-E850-ruby File=7 LogicalPartition_311920CE-9218-47D1-B306-5693004EB3CF  
---> Server=P8-E850-ruby File=8 LogicalPartition_6F1286D2-9FE5-456F-B165-DE831AECF799  
---> Server=P8-E850-ruby File=9 LogicalPartition_1E56DA75-EC9E-48A1-8CD6-AEE085BD8950  
---> Server=P8-E850-ruby File=10 LogicalPartition_5088EE02-51E0-4264-9F4B-5C02C9FA134E  
---> Server=P8-E850-ruby File=11 LogicalPartition_7BED2B9F-A418-43D8-B2BD-5D1500C8497A  
---> Server=P8-E850-ruby File=12 LogicalPartition_484C1F38-F8F8-4C18-A713-A28BA1A29859  
---> Server=P8-E850-ruby File=13 LogicalPartition_31A2873B-34BF-4F47-8AA3-1D25DF8567F4      ManagedSystem  ---> Save readable JSON File=1 bytes=3749872 name=ManagedSystem_caac7a03-4eac-3256-b32b-5b203809d674_20171031T205930+0000_20171031T225800+0000_30.JSON  
----> ServerInfo name=P8-E850-ruby mtms=8408-E8E*21D494V type=Processed frequency=30 seconds  
----> ServerInfo Date=2017-10-31 start=20:59:30 end=22:58:00  
----> Records=238 Errors=0  Saved webpage to Server-P8-E850-ruby.html  Saved comma separated values to Server-P8-E850-ruby.csv    
----> Server=P8-E850-ruby Filenames XML File=2 bytes1317  Created webpage LPAR-vm91-AIX72.html  Saved comma separated values to LPAR-vm91-AIX72.csv    
----> Server=P8-E850-ruby Filenames XML File=3 bytes1317  Created webpage LPAR-vm16-PowerVC132.html  Saved comma separated values to LPAR-vm16-PowerVC132.csv    
----> Server=P8-E850-ruby Filenames XML File=4 bytes1317  Created webpage LPAR-vm6-bee7a81e-0000001f.html  Saved comma separated values to LPAR-vm6-bee7a81e-0000001f.csv    
----> Server=P8-E850-ruby Filenames XML File=5 bytes1317  Created webpage LPAR-vm17-PowerVC132cloud.html  Saved comma separated values to LPAR-vm17-PowerVC132cloud.csv    
----> Server=P8-E850-ruby Filenames XML File=6 bytes1317  Created webpage LPAR-vm96withSSDviaVIOS3-4.html  Saved comma separated values to LPAR-vm96withSSDviaVIOS3-4.csv    
...    

--> Server=3 Getting filenames for P8-S824-emerald  
---> Received 12 file(s) in 9.53 seconds  
---> Server=P8-S824-emerald File=1 ManagedSystem_be63c7f1-6222-3a57-bc7d-d0510aa2705a_20171031T205930+0000_20171031T225830+0000_30.json  
---> Server=P8-S824-emerald File=2 LogicalPartition_19085F46-7771-4243-9DF7-AA90897BD514  
---> Server=P8-S824-emerald File=3 LogicalPartition_22B3CF31-B429-4011-BAE2-9E91760258CA 
---> Server=P8-S824-emerald File=4 LogicalPartition_647F2479-80CF-47B0-B490-826E29D25B42  
---> Server=P8-S824-emerald File=5 LogicalPartition_32931DBA-AB3F-4618-8074-F042ADA55B9F  
---> Server=P8-S824-emerald File=6 LogicalPartition_5BFABC30-FD13-4870-9C07-8061C1F6CCA3  
---> Server=P8-S824-emerald File=7 LogicalPartition_1EBB0B53-B85F-436D-BE6D-05658243246B  
---> Server=P8-S824-emerald File=8 LogicalPartition_458ECBC4-0A7F-46FC-AC3D-2B16FA854FFA 
 ---> Server=P8-S824-emerald File=9 LogicalPartition_6BE25438-3AE3-449E-8725-900D7AC4492E  
---> Server=P8-S824-emerald File=10 LogicalPartition_754DA588-AC65-4854-9869-A87CA5C2A0A0  
---> Server=P8-S824-emerald File=11 LogicalPartition_7E29F9D6-10A4-4F89-A4D0-B5A2DBB1CB87  
---> Server=P8-S824-emerald File=12 LogicalPartition_5C1E377A-EEF2-4942-BE3D-6AB7FC821829      ManagedSystem  
---> Save readable JSON File=1 bytes=3873347 name=ManagedSystem_be63c7f1-6222-3a57-bc7d-d0510aa2705a_20171031T205930+0000_20171031T225830+0000_30.JSON  
----> ServerInfo name=P8-S824-emerald mtms=8286-42A*100EC7V type=Processed frequency=30 seconds  
----> ServerInfo Date=2017-10-31 start=20:59:30 end=22:58:30  
----> Records=239 Errors=0  Saved webpage to Server-P8-S824-emerald.html  Saved comma separated values to Server-P8-S824-emerald.csv    
----> Server=P8-S824-emerald Filenames XML File=2 bytes1317  Created webpage LPAR-download-repo.html  Saved comma separated values to LPAR-download-repo.csv    
----> Server=P8-S824-emerald Filenames XML File=3 bytes1317  Created webpage LPAR-vm26.html  Saved comma separated values to LPAR-vm26.csv    
----> Server=P8-S824-emerald Filenames XML File=4 bytes1317  Created webpage LPAR-vm178.html  Saved comma separated values to LPAR-vm178.csv    
----> Server=P8-S824-emerald Filenames XML File=5 bytes1317  Created webpage LPAR-vm20.html  Saved comma separated values to LPAR-vm20.csv    
----> Server=P8-S824-emerald Filenames XML File=6 bytes1317  Created webpage LPAR-emeraldbackup-76665db5-00000009.html  Saved comma separated values to LPAR-emeraldbackup-76665db5-00000009.csv   
 ...    
Created webpage LPAR-vm51.html  Saved comma separated values to LPAR-vm51.csv    
--> Server=4 Getting filenames for P8-S822-lime  
---> Received 1 file(s) in 9.12 seconds  
---> Server=P8-S822-lime File=1 ManagedSystem_6ed2156b-342c-38cc-93d2-e01dad3c3fc7_20171031T210030+0000_20171031T225900+0000_30.json      ManagedSystem  
---> Save readable JSON File=1 bytes=1771368 name=ManagedSystem_6ed2156b-342c-38cc-93d2-e01dad3c3fc7_20171031T210030+0000_20171031T225900+0000_30.JSON  
----> ServerInfo name=P8-S822-lime mtms=8284-22A*215296V type=Processed frequency=30 seconds  
----> ServerInfo Date=2017-10-31 start=21:00:30 end=22:59:00  
----> Records=238 Errors=0  Saved webpage to Server-P8-S822-lime.html  
Saved comma separated values to Server-P8-S822-lime.csv  
Logging off the HMC    real    1m15.409s  user    0m5.948s  sys     0m0.128s  $

You need the HMC to be collecting Server data. Make the change on the HMC Enhanced+ GUI:

graph

- - - The End - - -

    Additional Information


    If you find errors or have questions, email me: 

    • Subject: nextract original
    • Email: n a g @ u k . i b m . c o m  


    Document Location

    Worldwide


    Document Information

    Modified date:
    28 October 2021

    UID

    ibm11115601