
Because of a migration from FileGDB to PostGIS with FME, we have several tables in our PostGIS database with a lot of empty columns (due to the PostGIS writer in FME).

I would like to have a function to detect (and thereafter drop) these empty columns per schema or database. This is something that could be useful in the future too. I started with a PL/pgSQL function to detect whether a given column is empty. Then I could write a script to drop every column within a schema for which the first function returns TRUE, roughly along the lines of the sketch below.
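The second step could then be a simple DO block that loops over information_schema.columns and drops whatever the check function flags as empty. Just a sketch, assuming the check function ends up working; the schema name 'ref' is only an example, and on a completely empty table this would try to drop every column, so it needs a safeguard for that case:

DO $$
DECLARE
    rec record;
BEGIN
    FOR rec IN
        SELECT table_schema, table_name, column_name
        FROM information_schema.columns
        WHERE table_schema = 'ref'   -- schema to clean up (example)
    LOOP
        -- explicit casts because information_schema returns sql_identifier, not varchar
        IF admin.check_empty_column(rec.table_schema::varchar,
                                    rec.table_name::varchar,
                                    rec.column_name::varchar) THEN
            -- identifiers have to be interpolated safely with %I
            EXECUTE format('ALTER TABLE %I.%I DROP COLUMN %I',
                           rec.table_schema, rec.table_name, rec.column_name);
        END IF;
    END LOOP;
END;
$$;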

Now I'm stuck with the first one; this is what I've got so far:

CREATE OR REPLACE FUNCTION admin.check_empty_column(
inputschemaname character varying,
inputtablename character varying,
inputfieldname character varying)
RETURNS boolean
LANGUAGE 'plpgsql'
COST 100
VOLATILE 
AS $BODY$
BEGIN
IF not exists (select inputfieldname from inputschemaname.inputtablename where inputfieldname is not null)
    THEN RETURN true;
else
    return false;
END IF;
END;
$BODY$;

ALTER FUNCTION admin.check_empty_column(character varying, character varying, character varying)
OWNER TO postgres;

If I run the function like this:

select admin.check_empty_column('ref','be_gem','numac')

I get this error:

"ERROR:  relation "inputschemaname.inputtablename" does not exist
LINE 1: SELECT not exists (select inputfieldname from inputschemanam..."

Any idea how to work around it?

  • You need dynamic SQL for that. Commented May 4, 2018 at 12:26
  • This problem has been solved using format(). It is now rewritten to … Commented May 7, 2018 at 7:48
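A version of the function rewritten with format() and EXECUTE could look roughly like this (a sketch; the comment above does not show the final code, so this is only one way to do it):

CREATE OR REPLACE FUNCTION admin.check_empty_column(
    inputschemaname character varying,
    inputtablename  character varying,
    inputfieldname  character varying)
RETURNS boolean
LANGUAGE plpgsql
AS $BODY$
DECLARE
    has_values boolean;
BEGIN
    -- Table and column names cannot be passed as parameters to static SQL;
    -- build the statement with format() (%I quotes identifiers) and run it with EXECUTE.
    EXECUTE format(
        'SELECT EXISTS (SELECT 1 FROM %I.%I WHERE %I IS NOT NULL)',
        inputschemaname, inputtablename, inputfieldname)
    INTO has_values;
    RETURN NOT has_values;
END;
$BODY$;

The key point is that identifiers (schema, table, column) can only be injected into a query via dynamic SQL, which is why the original static query fails with "relation inputschemaname.inputtablename does not exist".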
