
I have a table like this:

p_id | store |      createdat      | device  | deviceserial | application
------+-------+---------------------+---------+--------------+-------------
      | z10   | 2020-09-02 08:02:39 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 08:08:18 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 08:10:10 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 08:20:10 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 10:40:11 | IOS     | 6625839827   | app-b
      | z10   | 2020-09-02 10:45:11 | IOS     | 6625839827   | app-b
      | z10   | 2020-09-02 10:50:11 | IOS     | 6625839827   | app-b
      | z11   | 2020-09-02 08:47:10 | Android | 636363636891 | app-a
      | z11   | 2020-09-02 08:55:10 | Android | 636363636891 | app-a
      | z11   | 2020-09-02 08:59:10 | Android | 636363636891 | app-a
      | z11   | 2020-09-02 13:01:11 | IOS     | 6625839828   | app-b
      | z11   | 2020-09-02 13:15:11 | IOS     | 6625839828   | app-b
      | z10   | 2020-09-02 12:03:10 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 12:09:10 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 12:12:10 | Android | 636363636890 | app-a
      | z10   | 2020-09-02 15:15:11 | IOS     | 6625839827   | app-b
      | z10   | 2020-09-02 15:20:11 | IOS     | 6625839827   | app-b
      | z11   | 2020-09-02 10:25:10 | Android | 636363636891 | app-a
      | z11   | 2020-09-02 10:35:10 | Android | 636363636891 | app-a

I am trying to insert the results of a query into another table (device_usage_test1). This is that table:

create table if not exists device_usage_test1(
    id SERIAL,
    deviceserial VARCHAR(50) UNIQUE,
    device VARCHAR(50),
    deviceusage DOUBLE PRECISION
);

And this is my insert command:

insert into device_usage_test1(deviceserial, device, deviceusage)
select deviceserial, device, sum(deviceusage) as deviceusage
from (
    select deviceserial,
           device,
           extract(epoch from (max(createdat)::timestamp - min(createdat)::timestamp)) as deviceusage,
           date_trunc('hour', createdat) +
           (((date_part('minute', createdat)::integer / 10::integer) * 10::integer) || ' minutes')::interval as hr
    from datatable
    group by deviceserial, hr, device
) t
group by deviceserial, device;

Later I will run an upsert query on device_usage_test1, so my deviceserial has to be unique. But when I try the insert with deviceserial declared unique, it gives me this error: ERROR: duplicate key value violates unique constraint "device_usage_deviceserial_key"

DETAIL: Key (deviceserial)=(636363636890) already exists.

1 Answer

You have a data issue. Your select is returning at least 2 devices with the same deviceserial (1004573GT7). Try running the following:

select deviceserial
      ,count(distinct device) as device_count
  from (select deviceserial
              ,device
              ,extract(epoch from (max(createdat)::timestamp - min(createdat)::timestamp)) as deviceusage
              ,date_trunc('hour', createdat) +
               (((date_part('minute', createdat)::integer / 10::integer) * 10::integer) || ' minutes')::interval as hr
          from datatable
         where createdat > now() - interval '1 day'
         group by deviceserial, hr, device
       ) s
 group by deviceserial
having count(distinct device) > 1
 order by deviceserial;

NOTE: No table definitions nor sample data were given; the above is not tested.


Second:
You cannot have a unique constraint on deviceserial and at the same time have multiple devices for one deviceserial. Having multiple devices per deviceserial is, by definition, not unique on deviceserial. You can instead define your unique key on both (deviceserial, device).

drop table device_usage_test1; 
create table device_usage_test1(id serial
                               ,deviceserial varchar(50) 
                               ,device varchar(50)
                               ,deviceusage double precision
                               ,constraint device_usage_test1_pk 
                                           primary key (id)
                               ,constraint device_usage_test1_bk
                                           unique (deviceserial,device)
                               );   

However, this still allows your initial bad data, with both "android" and "Android", to co-exist. You may want to restrict capitalization on device. For example:

alter table device_usage_test1
        add constraint device_initcap_check 
            check (device = initcap(device));
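
As for the upsert mentioned in the question: with the unique key on (deviceserial, device), that pair becomes the ON CONFLICT target. A minimal sketch, reusing the question's aggregation and assuming a re-run should simply overwrite the stored usage; switch the DO UPDATE expression if you want to accumulate instead.

-- conflict target matches the unique constraint on (deviceserial, device);
-- overwriting deviceusage on conflict is an assumption, accumulate instead if preferred
insert into device_usage_test1 (deviceserial, device, deviceusage)
select deviceserial, device, sum(deviceusage) as deviceusage
from (
    select deviceserial,
           device,
           extract(epoch from (max(createdat)::timestamp - min(createdat)::timestamp)) as deviceusage,
           date_trunc('hour', createdat) +
           (((date_part('minute', createdat)::integer / 10::integer) * 10::integer) || ' minutes')::interval as hr
    from datatable
    group by deviceserial, hr, device
) t
group by deviceserial, device
on conflict (deviceserial, device)
do update set deviceusage = excluded.deviceusage;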

5 Comments

Thank you, @Belayer. Your query is giving me an error. I added sample data.
Saying it gives an error is useless. What error? But in this case that is not necessary. The first 2 rows of your sample data are sufficient to detect the initial error. You group by deviceserial, device but have a unique constraint on deviceserial. Since "android" does not equal "Android", you have 2 groups with deviceserial 636363636890, thus violating the unique constraint. It would seem you have a few options: #1, and preferred, clean up your data; #2, just group by deviceserial; #3, group by deviceserial, initcap(device), although it is unpredictable which device is chosen.
Thanks, @Belayer. Somehow I did not notice it. It is working now with the sample data, but in the real data there is no such issue of capital and small letters. If I remove device from my query and group by deviceserial, it gives a successful result, but if I add device to my query, it gives me the error. I do need the device column value for the corresponding deviceserial, though. I am pretty new to Postgres. Any idea how I can select device without putting it in the group by?
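
A minimal sketch of option #2 from the comment above: group only by deviceserial and pick a single device label with an aggregate. min(device) here is an arbitrary, assumed choice; any aggregate that yields one value per serial would do.

-- sketch: one row per deviceserial; min(device) picks a single device label (assumed choice)
select deviceserial,
       min(device) as device,
       sum(deviceusage) as deviceusage
from (
    select deviceserial,
           device,
           extract(epoch from (max(createdat)::timestamp - min(createdat)::timestamp)) as deviceusage,
           date_trunc('hour', createdat) +
           (((date_part('minute', createdat)::integer / 10::integer) * 10::integer) || ' minutes')::interval as hr
    from datatable
    group by deviceserial, hr, device
) t
group by deviceserial;
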
See revised answer. It changes the unique key.
Thank you, @Belayer. It worked perfectly; that's a big help. If I want to update the table with a Postgres upsert query, which constraint should I target? ON CONFLICT (deviceserial, device)? This table (device_usage_test1) will be updated every day with different times (createdAt).
