
RESET IN VERTICA

Hello All,
In Teradata I have:
Select PER_ID, CONTACT_ID, START_DT, PREV_CONTACT_ID,
ROW_NUMBER() Over (Partition By PER_ID Order By START_DT RESET WHEN PREV_CONTACT_ID Is Null) As CONTACT_SERIES

Explanation: within a partition, whenever the column PREV_CONTACT_ID is NULL, ROW_NUMBER must restart from 1.
But Vertica has no 'OVER (... RESET WHEN condition)'. How can I write an equivalent condition in Vertica?

I tried CONDITIONAL_TRUE_EVENT and CONDITIONAL_CHANGE_EVENT, but they do not work that way.
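The usual workaround for Teradata's RESET WHEN is a two-pass window query: flag each reset row, turn the flags into group IDs with a running SUM, then number rows within each group. A minimal sketch of that pattern using Python's bundled sqlite3 (SQLite 3.25+ window functions; the sample table and data here are made up for illustration, and the Vertica SQL is analogous):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE contacts (per_id INT, start_dt TEXT, prev_contact_id INT);
INSERT INTO contacts VALUES
  (1, '2015-03-02', 100),
  (1, '2015-03-03', NULL),   -- reset condition is true here
  (1, '2015-03-04', 102),
  (1, '2015-03-05', 103);
""")

# Pass 1: flag reset rows (prev_contact_id IS NULL).
# Pass 2: running SUM of the flag yields a group id within each per_id;
#         the group changes exactly at each reset row.
# Pass 3: ROW_NUMBER within (per_id, grp) restarts the numbering at 1.
rows = con.execute("""
WITH flagged AS (
  SELECT *, CASE WHEN prev_contact_id IS NULL THEN 1 ELSE 0 END AS is_reset
  FROM contacts
), grouped AS (
  SELECT *, SUM(is_reset) OVER (PARTITION BY per_id ORDER BY start_dt
                                ROWS UNBOUNDED PRECEDING) AS grp
  FROM flagged
)
SELECT per_id, start_dt, prev_contact_id,
       ROW_NUMBER() OVER (PARTITION BY per_id, grp ORDER BY start_dt) AS contact_series
FROM grouped
ORDER BY per_id, start_dt
""").fetchall()
for r in rows:
    print(r)
```

The SUM-of-flags trick is portable to any database with standard window functions, including Vertica.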

Answers

  • Jim_Knicely Administrator
    edited August 30

    Blast from the past!

    See:
    http://www.vertica-forums.com/viewtopic.php?t=2095

    Same answer today?

    dbadmin=> SELECT * FROM some_table ORDER BY per_id, start_dt;
     PER_ID | CONTACT_ID |  START_DT  | PREV_CONTACT_ID
    --------+------------+------------+-----------------
          1 |         10 | 2015-03-02 |             100
          1 |         11 | 2015-03-03 |             101
          1 |         12 | 2015-03-04 |             102
          1 |         13 | 2015-03-05 |
          2 |         23 | 2015-02-25 |             201
          2 |         24 | 2015-02-26 |
          2 |         23 | 2015-02-27 |             203
          2 |         22 | 2015-02-28 |             202
          2 |         21 | 2015-03-01 |             201
          2 |         20 | 2015-03-02 |             200
    (10 rows)
    
    dbadmin=> SELECT per_id, contact_id, start_dt, prev_contact_id, ROW_NUMBER() OVER (PARTITION BY NVL2(prev_contact_id, per_id, per_id*100) ORDER BY start_dt) AS contact_series
    dbadmin->   FROM some_table
    dbadmin->  ORDER BY per_id, start_dt;
     per_id | contact_id |  start_dt  | prev_contact_id | contact_series
    --------+------------+------------+-----------------+----------------
          1 |         10 | 2015-03-02 |             100 |              1
          1 |         11 | 2015-03-03 |             101 |              2
          1 |         12 | 2015-03-04 |             102 |              3
          1 |         13 | 2015-03-05 |                 |              1
          2 |         23 | 2015-02-25 |             201 |              1
          2 |         24 | 2015-02-26 |                 |              1
          2 |         23 | 2015-02-27 |             203 |              2
          2 |         22 | 2015-02-28 |             202 |              3
          2 |         21 | 2015-03-01 |             201 |              4
          2 |         20 | 2015-03-02 |             200 |              5
    (10 rows)
    
  • Nice, Jim!
    @hdk0310,
    It would be very helpful if you could provide a small executable sample with CREATE TABLE statements and data. Based on your query, I am making assumptions about your data; for example, I assumed per_id is the primary key. Can start_dt be NULL or repeat?
    A more verbose alternative is below:

    dbadmin=> drop table scratchpad.null_contract_reset;
    DROP TABLE
    dbadmin=> select sysdate, version();
              sysdate           |              version
    ----------------------------+------------------------------------
     2019-08-30 09:57:47.699307 | Vertica Analytic Database v9.2.0-0
    (1 row)
    
    dbadmin=> create table scratchpad.null_contract_reset
    dbadmin-> (per_id identity,
    dbadmin(> contract_id int,
    dbadmin(> start_dt date,
    dbadmin(> prev_contract_id int)
    dbadmin-> ;
    CREATE TABLE
    dbadmin=> \o /dev/null
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(1,sysdate-20,null);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(2,sysdate-19,99);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(3,sysdate-18,98);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(4,sysdate-17,99);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(5,sysdate-16,null);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(6,sysdate-15,13);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(7,sysdate-14,13);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(8,sysdate-13,14);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(9,sysdate-12,17);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(10,sysdate-11,null);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(11,sysdate-10,121);
    dbadmin=> insert into scratchpad.null_contract_reset(contract_id ,start_dt ,prev_contract_id) values(12,sysdate-9,122);
    dbadmin=> commit;
    dbadmin=> \o
    dbadmin=> select * from scratchpad.null_contract_reset order by start_dt,per_id;
     per_id | contract_id |  start_dt  | prev_contract_id
    --------+-------------+------------+------------------
          1 |           1 | 2019-08-10 |
          2 |           2 | 2019-08-11 |               99
          3 |           3 | 2019-08-12 |               98
          4 |           4 | 2019-08-13 |               99
          5 |           5 | 2019-08-14 |
          6 |           6 | 2019-08-15 |               13
          7 |           7 | 2019-08-16 |               13
          8 |           8 | 2019-08-17 |               14
          9 |           9 | 2019-08-18 |               17
         10 |          10 | 2019-08-19 |
         11 |          11 | 2019-08-20 |              121
         12 |          12 | 2019-08-21 |              122
    (12 rows)
    
    dbadmin=> with a as
    dbadmin-> (select t1.*, case when prev_contract_id is null then per_id else null end per_grp from scratchpad.null_contract_reset t1 order by start_dt, per_id)
    dbadmin-> ,b as
    dbadmin-> (select a.*, last_value(per_grp ignore nulls) over(order by start_dt, per_id) per_grp_full from a order by start_dt, per_id)
    dbadmin-> select b.* , row_number() over(partition by per_grp_full order by start_dt, per_id) contract_series
    dbadmin-> from b order by start_dt, per_id
    dbadmin-> ;
     per_id | contract_id |  start_dt  | prev_contract_id | per_grp | per_grp_full | contract_series
    --------+-------------+------------+------------------+---------+--------------+-----------------
          1 |           1 | 2019-08-10 |                  |       1 |            1 |               1
          2 |           2 | 2019-08-11 |               99 |         |            1 |               2
          3 |           3 | 2019-08-12 |               98 |         |            1 |               3
          4 |           4 | 2019-08-13 |               99 |         |            1 |               4
          5 |           5 | 2019-08-14 |                  |       5 |            5 |               1
          6 |           6 | 2019-08-15 |               13 |         |            5 |               2
          7 |           7 | 2019-08-16 |               13 |         |            5 |               3
          8 |           8 | 2019-08-17 |               14 |         |            5 |               4
          9 |           9 | 2019-08-18 |               17 |         |            5 |               5
         10 |          10 | 2019-08-19 |                  |      10 |           10 |               1
         11 |          11 | 2019-08-20 |              121 |         |           10 |               2
         12 |          12 | 2019-08-21 |              122 |         |           10 |               3
    (12 rows)
    
    dbadmin=> drop table scratchpad.null_contract_reset;
    DROP TABLE
    dbadmin=>
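    The query above is the classic gaps-and-islands pattern: stamp each reset row with its per_id, carry the stamp forward with LAST_VALUE(... IGNORE NULLS), then ROW_NUMBER within the carried group. A sketch of the same idea in Python's sqlite3 (SQLite has no IGNORE NULLS, so a running MAX stands in for it; that substitution only works here because per_id increases with start_dt in this sample data):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE t (per_id INT, start_dt TEXT, prev_contract_id INT);
INSERT INTO t VALUES
  (1, '2019-08-10', NULL), (2, '2019-08-11', 99), (3, '2019-08-12', 98),
  (4, '2019-08-13', 99),   (5, '2019-08-14', NULL), (6, '2019-08-15', 13);
""")

# Stamp reset rows (NULL prev_contract_id) with their per_id, then carry
# the stamp forward. Running MAX emulates LAST_VALUE(... IGNORE NULLS)
# because per_id is monotonically increasing in start_dt order.
rows = con.execute("""
WITH a AS (
  SELECT *, CASE WHEN prev_contract_id IS NULL THEN per_id END AS per_grp
  FROM t
), b AS (
  SELECT *, MAX(per_grp) OVER (ORDER BY start_dt, per_id
                               ROWS UNBOUNDED PRECEDING) AS per_grp_full
  FROM a
)
SELECT per_id, per_grp_full,
       ROW_NUMBER() OVER (PARTITION BY per_grp_full
                          ORDER BY start_dt, per_id) AS contract_series
FROM b
ORDER BY start_dt, per_id
""").fetchall()
print(rows)
```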
    
  • hdk0310
    edited September 3

    Hi @Jim_Knicely and Sudhakar,
    Thanks for your responses. I tried both solutions, but neither achieves my objective because my data set is different.

    Data set:
    PER_ID, Global_PER_ID, START_DT (timestamp), PREV_CONTACT (possible values: 'A', 'B', NULL)

    Teradata logic (works in Teradata):
    ,sum(case when PREV_CONTACT is not null then 1 else 0 end) OVER (PARTITION BY PER_ID, Global_PER_ID ORDER BY PER_ID, Global_PER_ID, START_DT RESET WHEN PREV_CONTACT is null ROWS UNBOUNDED PRECEDING ) as ConsecutiveIntervals

    Vertica (attempted alternative):
    ,CONDITIONAL_TRUE_EVENT(PREV_CONTACT is not null ) over (PARTITION by PER_ID, Global_PER_ID ORDER BY PER_ID, Global_PER_ID,START_DT) AS ConsecutiveIntervals

    Desired ConsecutiveIntervals output:

    Current_SysTime Per_ID Global_Per_ID STart_DT Prev_Contact ConsecutiveIntervals
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 9:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 9:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 9:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 8:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 8:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 8:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 8:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 8:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 8:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 7:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 7:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 7:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 7:20 A 3
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 7:10 A 2
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 7:00 A 1
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 6:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 6:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 6:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 6:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 6:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 6:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 5:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 5:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 5:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 5:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 5:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 5:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 4:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 4:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 4:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 4:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 4:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 4:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 3:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 3:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 3:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 3:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 3:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 3:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 2:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 2:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 2:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 2:20 A 1
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 2:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 2:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 1:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 1:40 A 1
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 1:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 1:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 1:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 1:00 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 0:50 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 0:40 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 0:30 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 0:20 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 0:10 NULL 0
    2019-08-30 09:23:21+00:00 50000 1500 30/08/19 0:00 NULL 0

    Thanks

  • Jim_Knicely Administrator
    edited September 3

    How about this?

    SELECT Per_ID,
           Global_Per_ID,
           STart_DT,
           Prev_Contact,
           NVL2(change, ROW_NUMBER() OVER (PARTITION BY change ORDER BY STart_DT), 0) ConsecutiveIntervals
      FROM (SELECT Per_ID,
                   Global_Per_ID,
                   STart_DT,
                   Prev_Contact,
                   NVL2(Prev_Contact, CONDITIONAL_CHANGE_EVENT(Prev_Contact) OVER (PARTITION BY Per_ID ORDER BY start_dt), NULL) change
              FROM AllPDisks) foo
     ORDER BY Per_ID, STart_DT DESC;
    

    Example:

    dbadmin=> SELECT Per_ID,
    dbadmin->        Global_Per_ID,
    dbadmin->        STart_DT,
    dbadmin->        Prev_Contact,
    dbadmin->        NVL2(change, ROW_NUMBER() OVER (PARTITION BY change ORDER BY STart_DT), 0) ConsecutiveIntervals
    dbadmin->   FROM (SELECT Per_ID,
    dbadmin(>                Global_Per_ID,
    dbadmin(>                STart_DT,
    dbadmin(>                Prev_Contact,
    dbadmin(>                NVL2(Prev_Contact, CONDITIONAL_CHANGE_EVENT(Prev_Contact) OVER (PARTITION BY Per_ID ORDER BY start_dt), NULL) change
    dbadmin(>           FROM AllPDisks) foo
    dbadmin->  ORDER BY Per_ID, STart_DT DESC;
     Per_ID | Global_Per_ID |      STart_DT       | Prev_Contact | ConsecutiveIntervals
    --------+---------------+---------------------+--------------+----------------------
      50000 |          1500 | 2019-08-30 09:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 09:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 09:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 08:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 08:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 08:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 08:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 08:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 08:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 07:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 07:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 07:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 07:20:00 | A            |                    3
      50000 |          1500 | 2019-08-30 07:10:00 | A            |                    2
      50000 |          1500 | 2019-08-30 07:00:00 | A            |                    1
      50000 |          1500 | 2019-08-30 06:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 06:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 06:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 06:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 06:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 06:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 05:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 05:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 05:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 05:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 05:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 05:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 04:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 04:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 04:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 04:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 04:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 04:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 03:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 03:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 03:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 03:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 03:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 03:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 02:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 02:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 02:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 02:20:00 | A            |                    1
      50000 |          1500 | 2019-08-30 02:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 02:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 01:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 01:40:00 | A            |                    1
      50000 |          1500 | 2019-08-30 01:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 01:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 01:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 01:00:00 |              |                    0
      50000 |          1500 | 2019-08-30 00:50:00 |              |                    0
      50000 |          1500 | 2019-08-30 00:40:00 |              |                    0
      50000 |          1500 | 2019-08-30 00:30:00 |              |                    0
      50000 |          1500 | 2019-08-30 00:20:00 |              |                    0
      50000 |          1500 | 2019-08-30 00:10:00 |              |                    0
      50000 |          1500 | 2019-08-30 00:00:00 |              |                    0
    (57 rows)
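    Stripped of the windowing, what the query above computes per (Per_ID, Global_Per_ID) partition is just a running counter over the rows in ascending start_dt order: reset to 0 on a NULL Prev_Contact, otherwise increment. A plain-Python sketch of that logic (the input values are hypothetical):

```python
def consecutive_intervals(prev_contacts):
    """prev_contacts: Prev_Contact values in ascending start_dt order.
    Returns, for each row, 0 if the value is NULL (None), otherwise the
    1-based position within the current consecutive run of non-NULL values."""
    out, run = [], 0
    for v in prev_contacts:
        run = run + 1 if v is not None else 0
        out.append(run)
    return out

# Rows in ascending start_dt order (the table above prints them descending).
print(consecutive_intervals([None, None, "A", "A", "A", None, "A", None]))
```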
    
  • Thanks, @Jim_Knicely. It worked as expected.

  • Jim_Knicely Administrator

    Cool!
