Average Hourly Memory Usage

Hi,

 

Can someone help me find the average hourly memory utilization, node-wise, from vsql?

We don't have access to the MC, but I need to understand the trend of memory utilization while we load data into Vertica. I need to identify the peak hours of the day for memory utilization.
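(For reference, a hedged sketch of the kind of query that answers this directly from vsql, using Vertica's v_monitor.memory_usage system table; column names assume a recent Vertica version, and note the system table only retains recent history:)

```sql
-- Hourly average memory utilization per node, from the built-in
-- v_monitor.memory_usage system table (limited retention).
SELECT node_name,
       DATE_TRUNC('hour', start_time) AS hour,
       AVG(average_memory_usage_percent) AS avg_mem_pct
FROM v_monitor.memory_usage
GROUP BY 1, 2
ORDER BY 1, 2;
```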

 

Thanks

Dilip

Comments

  •  

    You can create a crontab job to populate an operational table.

     

      -- see tbl definition

    CREATE TABLE dba.memory_usage
    (
    "time" timestamptz,
    node_name varchar(128),
    avg_mem_useage_percent float
    );



    CREATE PROJECTION dba.memory_usage /*+createtype(P)*/
    (
    "time" ENCODING DELTARANGE_COMP,
    node_name ENCODING RLE,
    avg_mem_useage_percent
    )
    AS
    SELECT memory_usage."time",
    memory_usage.node_name,
    memory_usage.avg_mem_useage_percent
    FROM dba.memory_usage
    ORDER BY memory_usage.node_name,
    memory_usage."time",
    memory_usage.avg_mem_useage_percent
    SEGMENTED BY hash(memory_usage.node_name, memory_usage."time", memory_usage.avg_mem_useage_percent) ALL NODES KSAFE 1;

     

    The crontab entry runs every 30 minutes:

     

    */30 * * * * /home/dbadmin/scripts/dba/memory_usage.sh

    Where your .sh script would be 

     

     

    #!/bin/bash

    # Credentials ($username, $password, $dbname) are expected to come from .profile
    . /home/dbadmin/.profile
    export VSQLDIR=/opt/vertica/bin
    export SQLDIR=/home/dbadmin/scripts/dba
    LOGFILE=/home/dbadmin/scripts/dba/log/memory_usage.log
    echo "Execution date - `date`" >> $LOGFILE
    echo "#################################################" >> $LOGFILE
    $VSQLDIR/vsql -U $username -w $password -d $dbname -f $SQLDIR/memory_usage.sql >> $LOGFILE
    echo "#################################################" >> $LOGFILE

     

    and memory_usage.sql would be 

    -- loads the dba.memory_usage table - this table will keep historical data for reporting
    insert /*+direct*/ into dba.memory_usage
    select
        start_time,
        node_name,
        average_memory_usage_percent
    from v_monitor.memory_usage
    -- coalesce handles the very first run, when dba.memory_usage is still empty
    where start_time > (select coalesce(max("time"), '1970-01-01'::timestamptz)
                        from dba.memory_usage);
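Once the table has accumulated some history, a sketch of a peak-hours query (keeping the avg_mem_useage_percent column name exactly as defined in the table above) could look like this:

```sql
-- Average memory utilization per node by hour of day; the highest
-- rows per node indicate the peak loading hours.
SELECT node_name,
       HOUR("time") AS hour_of_day,
       AVG(avg_mem_useage_percent) AS avg_mem_pct
FROM dba.memory_usage
GROUP BY node_name, HOUR("time")
ORDER BY node_name, avg_mem_pct DESC;
```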

    You can then use DbVisualizer to create a graph view of the values.

     

    Something like this:

    (attached chart: Capture.PNG)

     

     

     

     

    This is the poor man's monitoring :) - from before the latest Vconsole version.

     

    Hope this helped you. 

     

    Note - you can do this with all the key metrics and then correlate them to look for bottlenecks.

  • Thanks, Adrian, for this cool feature of DbVisualizer.

    I did not know it had a charting utility.

     

    Regarding system resources: when it comes to measuring hardware bottlenecks, I prefer SAR over the system tables inside Vertica.

     

    You can find many instances where memory/CPU never appears fully utilized even when Vertica is at full throttle.

    Behind the scenes it actually uses all the resources, but does not log this properly in the system tables.

     

    So even if CPU usage seems to be 30% in the Vertica tables, it will be more if you check the top command.

    You can try it yourself.

     

    I developed a SAR report scheduling tool that dumps the SAR files at the end of the day into one of the tables in Vertica, and then I can correlate different metrics like CPU, I/O, network, load average, and memory.
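One possible shape for the load step of such a tool (a sketch only - the table, column, and file names here are hypothetical, not the author's actual implementation): convert the daily `sar -r` output to CSV, COPY it into a staging table, and then join it with dba.memory_usage on timestamp and node.

```sql
-- Hypothetical staging table for "sar -r" output converted to CSV
-- with rows of (sample_time, node_name, mem_used_percent).
CREATE TABLE IF NOT EXISTS dba.sar_memory
(
    sample_time      timestamptz,
    node_name        varchar(128),
    mem_used_percent float
);

-- Bulk-load the day's CSV dump (path is an assumption).
COPY dba.sar_memory
FROM '/home/dbadmin/sar/memory.csv'
DELIMITER ',' DIRECT;
```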

     
