83

I do not understand what is wrong with this query. The query tool will not create a table in PostgreSQL.

CREATE TABLE article (
article_id bigint(20) NOT NULL auto_increment,
article_name varchar(20) NOT NULL,
article_desc text NOT NULL,
date_added datetime default NULL,
PRIMARY KEY (article_id)
);
2
  • ERROR: syntax error at or near "(" SQL state: 42601 Character: 41 Commented Mar 22, 2012 at 16:51
  • 1
    I also get ERROR: syntax error at or near "(" LINE 2: article_id bigint(20) NOT NULL auto_increment, Commented Mar 22, 2012 at 16:55

6 Answers

156

First, bigint(20) NOT NULL auto_increment will not work in PostgreSQL; simply use bigserial primary key instead. Also, datetime is called timestamp in PostgreSQL. All in all:

CREATE TABLE article (
    article_id bigserial primary key,
    article_name varchar(20) NOT NULL,
    article_desc text NOT NULL,
    date_added timestamp default NULL
);

3 Comments

As to why: auto_increment is a MySQL feature. Postgres uses serial columns for the same purpose.
@BradKoch: Yes and no. PostgreSQL offers serial and bigserial. Since the Q contains bigint(20) I had chosen bigserial in my answer. This is a better match in this case.
Given OP's apparent use case as a blog article engine, a maximum serial size of 2 billion entries (serial) should be sufficient unless he is extremely prolific. ;) But, still, bigserial is a closer match to bigint.
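For reference, bigserial is not a true type but shorthand that PostgreSQL expands into a bigint column backed by a sequence. A rough sketch of what the accepted answer's column definition expands to (sequence name follows PostgreSQL's tablename_columnname_seq convention):

```sql
-- Roughly equivalent to "article_id bigserial primary key"
CREATE SEQUENCE article_article_id_seq;

CREATE TABLE article (
    article_id bigint NOT NULL DEFAULT nextval('article_article_id_seq'),
    article_name varchar(20) NOT NULL,
    article_desc text NOT NULL,
    date_added timestamp default NULL,
    PRIMARY KEY (article_id)
);

-- Tie the sequence's lifetime to the column, as bigserial does
ALTER SEQUENCE article_article_id_seq OWNED BY article.article_id;
```

This is why there is no auto_increment keyword to reach for: the auto-incrementing behavior lives in the sequence and the column's DEFAULT.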
8
-- Table: "user"

-- DROP TABLE "user";

CREATE TABLE "user"
(
  id bigserial NOT NULL,
  name text NOT NULL,
  email character varying(20) NOT NULL,
  password text NOT NULL,
  CONSTRAINT user_pkey PRIMARY KEY (id)
)
WITH (
  OIDS=FALSE
);
ALTER TABLE "user"
  OWNER TO postgres;

Comments

4

Replace bigint(20) not null auto_increment with bigserial not null, and datetime with timestamp.

1 Comment

Then create table will succeed.
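Applying those two substitutions to the question's statement, and keeping its separate PRIMARY KEY clause, gives:

```sql
CREATE TABLE article (
    article_id bigserial NOT NULL,
    article_name varchar(20) NOT NULL,
    article_desc text NOT NULL,
    date_added timestamp DEFAULT NULL,
    PRIMARY KEY (article_id)
);
```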
0

To create N tables with a common prefix, use this script. The code uses a FOR loop and a variable to create 10 tables with the prefix 'sbtest', namely sbtest1, sbtest2 ... sbtest10.

create_table.sql

do $$
    DECLARE myvar integer;
begin
    for myvar in 1..10 loop
        EXECUTE format('CREATE TABLE sbtest%s (
        id SERIAL NOT NULL,
        k INTEGER NOT NULL,
        c CHAR(120) NOT NULL,
        pad CHAR(60) NOT NULL,
        PRIMARY KEY (id))', myvar);
    end loop;
end; $$;

Command to run: psql -U user_name -d database_name -f create_table.sql

Command: SELECT * FROM sbtest1;

 id | k | c | pad
----+---+---+-----
(0 rows)

Command: \d+ sbtest1

                                                   Table "public.sbtest1"
 Column |      Type      | Collation | Nullable |               Default               | Storage  | Stats target | Description
--------+----------------+-----------+----------+-------------------------------------+----------+--------------+-------------
 id     | integer        |           | not null | nextval('sbtest1_id_seq'::regclass) | plain    |              |
 k      | integer        |           | not null |                                     | plain    |              |
 c      | character(120) |           | not null |                                     | extended |              |
 pad    | character(60)  |           | not null |                                     | extended |              |
Indexes:
    "sbtest1_pkey" PRIMARY KEY, btree (id)
Access method: heap

Comments

0

For PostgreSQL 10 and above, prefer an identity column (please avoid using serial):

CREATE TABLE article (
  article_id bigint generated always as identity primary key,
  article_name varchar(20) NOT NULL,
  article_desc text NOT NULL,
  date_added timestamp default NULL
);
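For context on how this behaves in practice: with GENERATED ALWAYS you normally omit the id and let the system assign it, and an explicit id is rejected unless you override. A short sketch against the table above:

```sql
-- The identity column fills itself in; list only the other columns
INSERT INTO article (article_name, article_desc, date_added)
VALUES ('Hello', 'First post', now());

-- GENERATED ALWAYS rejects an explicit article_id...
-- INSERT INTO article (article_id, article_name, article_desc)
-- VALUES (100, 'Manual', 'fails with an error');

-- ...unless you explicitly override the system value
INSERT INTO article (article_id, article_name, article_desc)
OVERRIDING SYSTEM VALUE
VALUES (100, 'Manual', 'allowed with the override clause');
```

This stricter behavior is one reason identity columns are recommended over serial, which silently accepts explicit values.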

Comments

-6

Please try this:

CREATE TABLE article (
  article_id bigint(20) NOT NULL serial,
  article_name varchar(20) NOT NULL,
  article_desc text NOT NULL,
  date_added datetime default NULL,
  PRIMARY KEY (article_id)
);

1 Comment

bigint(20) is invalid in Postgres. Additionally, you can't specify bigint and serial at the same time. See the accepted answer for the correct syntax.
