Re: Getting rid of regression test input/ and output/ files

From: Tom Lane
Subject: Re: Getting rid of regression test input/ and output/ files
Date:
Msg-id: 2207463.1639948087@sss.pgh.pa.us
In response to: Getting rid of regression test input/ and output/ files  (Tom Lane <tgl@sss.pgh.pa.us>)
Responses: Re: Getting rid of regression test input/ and output/ files  (Corey Huinker <corey.huinker@gmail.com>)
List: pgsql-hackers
I wrote:
> This led me to wonder why we couldn't get rid of that entire
> mechanism in favor of some less-painful way of getting that
> information into the scripts.  If we had the desired values in
> psql variables, we could do what we need easily, for example ...

Here's some fleshed-out patches for this.

0001 adds the \getenv command to psql; now with documentation
and a simple regression test.

0002 tweaks pg_regress to export the needed values as environment
variables, and modifies the test scripts to use those variables.
(For ease of review, this patch modifies the scripts in-place,
and then 0003 will move them.)  A few comments on this:

* I didn't see any value in exporting @testtablespace@ as a separate
variable; we might as well let the test script know how to construct
that path name.
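
For illustration, a script can rebuild that path itself from the exported
build directory.  A minimal sketch (variable names as exported by the
patched pg_regress; the tablespace name and path layout are assumed to
match what the old @testtablespace@ substitution produced):

```sql
-- pull the build directory out of the environment, then derive the path
\getenv ABS_BUILDDIR ABS_BUILDDIR
\set testtablespace :ABS_BUILDDIR '/testtablespace'
CREATE TABLESPACE regress_tblspace LOCATION :'testtablespace';
```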

* I concluded that the right way to handle the concatenation issue
is *not* to rely on SQL literal concatenation, but to use psql's
\set command to concatenate parts of a string.  In particular this
gives us a clean way to handle quoting/escaping rules in the places
where a pathname has to be embedded in some larger string, such as
a function body.  The golden rule for that seems to be "use one \set
per level of quoting".  I believe this code is now fairly proof
against situations that would completely break the existing way of
doing things, such as pathnames with quotes or backslashes in them.
(It's hard to test the embedded-quote case, because that breaks the
Makefiles too, but I did get through the regression tests with a
path including a backslash.)
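
Concretely (this is drawn from the dblink hunk below): the bare path gets
its own \set, and a second \set interpolates it, via :'path' which supplies
the single quotes and any needed escaping, into the function body:

```sql
-- one \set per level of quoting:
-- level 1: the path itself
\set path :ABS_SRCDIR '/'
-- level 2: the function body, with the path embedded as a quoted literal
\set fnbody 'SELECT setenv(''PGSERVICEFILE'', ' :'path' ' || $1)'
CREATE FUNCTION set_pgservicefile(text) RETURNS void LANGUAGE SQL
    AS :'fnbody';
```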

* There are a couple of places where the existing tests involve
substituting a path name into expected query output or error messages.
This technique cannot handle that, but we have plenty of prior art for
dealing with such cases.  I changed file_fdw to use a filter function
to hide the pathnames in EXPLAIN output, and tweaked create_function_0
to show only an edited version of an error message (this is based on a
similar case in infinite_recurse.sql).
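
The filter itself is just a set-returning wrapper around regexp_replace;
its effect on a "Foreign File:" line can be checked in any session (the
example path here is invented, the pattern is the one used in the patch):

```sql
-- same pattern explain_filter applies to each line of EXPLAIN output
SELECT regexp_replace(
         'Foreign File: /home/postgres/src/test/regress/data/agg.csv',
         'Foreign File: .*/([a-z.]+)$',
         'Foreign File: .../\1');
-- yields: Foreign File: .../agg.csv
```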

0003 simply "git mv"'s the scripts and output files into place as
normal not-requiring-editing files.  Be careful to "make clean"
before applying this, else you may have conflicts with the target
files already being present.  Also, while you can run the tests
between 0003 and 0004, don't do "make clean" in this state or the
hacky EXTRA_CLEAN rules in dblink and file_fdw will remove files
you want.

0004 finally removes the no-longer-needed infrastructure in
pg_regress and the makefiles.  (BTW, as far as I can find, the
MSVC scripts have no provisions for cleaning these generated files?)

There's some refactoring that could be done afterwards, for example
there seems little reason for dblink's paths.sql to continue to exist
as a separate script.  But it seemed best for this patch series to
convert the scripts as mechanically as possible.

I'm fairly pleased with how this came out.  I think these scripts
will be *much* easier to maintain in this form.  Updating the
output/*.source files was always a major pain in the rear, since
you couldn't just copy results/ files to them.

Comments?

            regards, tom lane

diff --git a/doc/src/sgml/ref/psql-ref.sgml b/doc/src/sgml/ref/psql-ref.sgml
index 48248f750e..ae38d3dcc3 100644
--- a/doc/src/sgml/ref/psql-ref.sgml
+++ b/doc/src/sgml/ref/psql-ref.sgml
@@ -2237,6 +2237,28 @@ Tue Oct 26 21:40:57 CEST 1999
       </varlistentry>


+      <varlistentry>
+        <term><literal>\getenv <replaceable class="parameter">psql_var</replaceable> <replaceable class="parameter">env_var</replaceable></literal></term>
+
+        <listitem>
+        <para>
+         Gets the value of the environment
+         variable <replaceable class="parameter">env_var</replaceable>
+         and assigns it to the <application>psql</application>
+         variable <replaceable class="parameter">psql_var</replaceable>.
+         If <replaceable class="parameter">env_var</replaceable> is
+         not defined in the <application>psql</application> process's
+         environment, <replaceable class="parameter">psql_var</replaceable>
+         is not changed.  Example:
+<programlisting>
+=> <userinput>\getenv home HOME</userinput>
+=> <userinput>\echo :home</userinput>
+/home/postgres
+</programlisting></para>
+        </listitem>
+      </varlistentry>
+
+
       <varlistentry>
         <term><literal>\gexec</literal></term>

diff --git a/src/bin/psql/command.c b/src/bin/psql/command.c
index ccd7b48108..fb3bab9494 100644
--- a/src/bin/psql/command.c
+++ b/src/bin/psql/command.c
@@ -98,6 +98,8 @@ static backslashResult process_command_g_options(char *first_option,
                                                  bool active_branch,
                                                  const char *cmd);
 static backslashResult exec_command_gdesc(PsqlScanState scan_state, bool active_branch);
+static backslashResult exec_command_getenv(PsqlScanState scan_state, bool active_branch,
+                                           const char *cmd);
 static backslashResult exec_command_gexec(PsqlScanState scan_state, bool active_branch);
 static backslashResult exec_command_gset(PsqlScanState scan_state, bool active_branch);
 static backslashResult exec_command_help(PsqlScanState scan_state, bool active_branch);
@@ -348,6 +350,8 @@ exec_command(const char *cmd,
         status = exec_command_g(scan_state, active_branch, cmd);
     else if (strcmp(cmd, "gdesc") == 0)
         status = exec_command_gdesc(scan_state, active_branch);
+    else if (strcmp(cmd, "getenv") == 0)
+        status = exec_command_getenv(scan_state, active_branch, cmd);
     else if (strcmp(cmd, "gexec") == 0)
         status = exec_command_gexec(scan_state, active_branch);
     else if (strcmp(cmd, "gset") == 0)
@@ -1481,6 +1485,43 @@ exec_command_gdesc(PsqlScanState scan_state, bool active_branch)
     return status;
 }

+/*
+ * \getenv -- set variable from environment variable
+ */
+static backslashResult
+exec_command_getenv(PsqlScanState scan_state, bool active_branch,
+                    const char *cmd)
+{
+    bool        success = true;
+
+    if (active_branch)
+    {
+        char       *myvar = psql_scan_slash_option(scan_state,
+                                                   OT_NORMAL, NULL, false);
+        char       *envvar = psql_scan_slash_option(scan_state,
+                                                    OT_NORMAL, NULL, false);
+
+        if (!myvar || !envvar)
+        {
+            pg_log_error("\\%s: missing required argument", cmd);
+            success = false;
+        }
+        else
+        {
+            char       *envval = getenv(envvar);
+
+            if (envval && !SetVariable(pset.vars, myvar, envval))
+                success = false;
+        }
+        free(myvar);
+        free(envvar);
+    }
+    else
+        ignore_slash_options(scan_state);
+
+    return success ? PSQL_CMD_SKIP_LINE : PSQL_CMD_ERROR;
+}
+
 /*
  * \gexec -- send query and execute each field of result
  */
diff --git a/src/test/regress/expected/psql.out b/src/test/regress/expected/psql.out
index 930ce8597a..6428ebc507 100644
--- a/src/test/regress/expected/psql.out
+++ b/src/test/regress/expected/psql.out
@@ -282,6 +282,18 @@ select '2000-01-01'::date as party_over
 (1 row)

 \unset FETCH_COUNT
+-- \setenv, \getenv
+-- ensure MYVAR isn't set
+\setenv MYVAR
+-- in which case, reading it doesn't change the target
+\getenv res MYVAR
+\echo :res
+:res
+-- now set it
+\setenv MYVAR 'environment value'
+\getenv res MYVAR
+\echo :res
+environment value
 -- show all pset options
 \pset
 border                   1
diff --git a/src/test/regress/sql/psql.sql b/src/test/regress/sql/psql.sql
index e9d504baf2..d4e4fdbbb7 100644
--- a/src/test/regress/sql/psql.sql
+++ b/src/test/regress/sql/psql.sql
@@ -141,6 +141,18 @@ select 'drop table gexec_test', 'select ''2000-01-01''::date as party_over'

 \unset FETCH_COUNT

+-- \setenv, \getenv
+
+-- ensure MYVAR isn't set
+\setenv MYVAR
+-- in which case, reading it doesn't change the target
+\getenv res MYVAR
+\echo :res
+-- now set it
+\setenv MYVAR 'environment value'
+\getenv res MYVAR
+\echo :res
+
 -- show all pset options
 \pset

diff --git a/contrib/dblink/input/paths.source b/contrib/dblink/input/paths.source
index 881a65314f..895a5e11e0 100644
--- a/contrib/dblink/input/paths.source
+++ b/contrib/dblink/input/paths.source
@@ -1,14 +1,23 @@
 -- Initialization that requires path substitution.

+-- directory paths and DLSUFFIX are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
+
 CREATE FUNCTION setenv(text, text)
    RETURNS void
-   AS '@libdir@/regress@DLSUFFIX@', 'regress_setenv'
+   AS :'regresslib', 'regress_setenv'
    LANGUAGE C STRICT;

 CREATE FUNCTION wait_pid(int)
    RETURNS void
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;

+\set path :ABS_SRCDIR '/'
+\set fnbody 'SELECT setenv(''PGSERVICEFILE'', ' :'path' ' || $1)'
 CREATE FUNCTION set_pgservicefile(text) RETURNS void LANGUAGE SQL
-    AS $$SELECT setenv('PGSERVICEFILE', '@abs_srcdir@/' || $1)$$;
+    AS :'fnbody';
diff --git a/contrib/dblink/output/paths.source b/contrib/dblink/output/paths.source
index 8ed95e1f78..9bbbeebf1c 100644
--- a/contrib/dblink/output/paths.source
+++ b/contrib/dblink/output/paths.source
@@ -1,11 +1,18 @@
 -- Initialization that requires path substitution.
+-- directory paths and DLSUFFIX are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
 CREATE FUNCTION setenv(text, text)
    RETURNS void
-   AS '@libdir@/regress@DLSUFFIX@', 'regress_setenv'
+   AS :'regresslib', 'regress_setenv'
    LANGUAGE C STRICT;
 CREATE FUNCTION wait_pid(int)
    RETURNS void
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;
+\set path :ABS_SRCDIR '/'
+\set fnbody 'SELECT setenv(''PGSERVICEFILE'', ' :'path' ' || $1)'
 CREATE FUNCTION set_pgservicefile(text) RETURNS void LANGUAGE SQL
-    AS $$SELECT setenv('PGSERVICEFILE', '@abs_srcdir@/' || $1)$$;
+    AS :'fnbody';
diff --git a/contrib/file_fdw/input/file_fdw.source b/contrib/file_fdw/input/file_fdw.source
index 45b728eeb3..553e3a4e3e 100644
--- a/contrib/file_fdw/input/file_fdw.source
+++ b/contrib/file_fdw/input/file_fdw.source
@@ -2,6 +2,9 @@
 -- Test foreign-data wrapper file_fdw.
 --

+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+
 -- Clean up in case a prior regression run failed
 SET client_min_messages TO 'warning';
 DROP ROLE IF EXISTS regress_file_fdw_superuser, regress_file_fdw_user, regress_no_priv_user;
@@ -14,6 +17,22 @@ CREATE ROLE regress_no_priv_user LOGIN;                 -- has priv but no user
 -- Install file_fdw
 CREATE EXTENSION file_fdw;

+-- create function to filter unstable results of EXPLAIN
+CREATE FUNCTION explain_filter(text) RETURNS setof text
+LANGUAGE plpgsql AS
+$$
+declare
+    ln text;
+begin
+    for ln in execute $1
+    loop
+        -- Remove the path portion of foreign file names
+        ln := regexp_replace(ln, 'Foreign File: .*/([a-z.]+)$', 'Foreign File: .../\1');
+        return next ln;
+    end loop;
+end;
+$$;
+
 -- regress_file_fdw_superuser owns fdw-related objects
 SET ROLE regress_file_fdw_superuser;
 CREATE SERVER file_server FOREIGN DATA WRAPPER file_fdw;
@@ -61,33 +80,39 @@ CREATE FOREIGN TABLE tbl () SERVER file_server OPTIONS (format 'csv', null '
 ');       -- ERROR
 CREATE FOREIGN TABLE tbl () SERVER file_server;  -- ERROR

+\set filename :ABS_SRCDIR '/data/agg.data'
 CREATE FOREIGN TABLE agg_text (
     a    int2 CHECK (a >= 0),
     b    float4
 ) SERVER file_server
-OPTIONS (format 'text', filename '@abs_srcdir@/data/agg.data', delimiter '    ', null '\N');
+OPTIONS (format 'text', filename :'filename', delimiter '    ', null '\N');
 GRANT SELECT ON agg_text TO regress_file_fdw_user;
+
+\set filename :ABS_SRCDIR '/data/agg.csv'
 CREATE FOREIGN TABLE agg_csv (
     a    int2,
     b    float4
 ) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/agg.csv', header 'true', delimiter ';', quote '@', escape '"', null '');
+OPTIONS (format 'csv', filename :'filename', header 'true', delimiter ';', quote '@', escape '"', null '');
 ALTER FOREIGN TABLE agg_csv ADD CHECK (a >= 0);
+
+\set filename :ABS_SRCDIR '/data/agg.bad'
 CREATE FOREIGN TABLE agg_bad (
     a    int2,
     b    float4
 ) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/agg.bad', header 'true', delimiter ';', quote '@', escape '"', null '');
+OPTIONS (format 'csv', filename :'filename', header 'true', delimiter ';', quote '@', escape '"', null '');
 ALTER FOREIGN TABLE agg_bad ADD CHECK (a >= 0);

 -- per-column options tests
+\set filename :ABS_SRCDIR '/data/text.csv'
 CREATE FOREIGN TABLE text_csv (
     word1 text OPTIONS (force_not_null 'true'),
     word2 text OPTIONS (force_not_null 'off'),
     word3 text OPTIONS (force_null 'true'),
     word4 text OPTIONS (force_null 'off')
 ) SERVER file_server
-OPTIONS (format 'text', filename '@abs_srcdir@/data/text.csv', null 'NULL');
+OPTIONS (format 'text', filename :'filename', null 'NULL');
 SELECT * FROM text_csv; -- ERROR
 ALTER FOREIGN TABLE text_csv OPTIONS (SET format 'csv');
 \pset null _null_
@@ -119,7 +144,7 @@ SELECT * FROM agg_bad;               -- ERROR

 -- misc query tests
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv');
 \t off
 PREPARE st(int) AS SELECT * FROM agg_csv WHERE a = $1;
 EXECUTE st(100);
@@ -143,12 +168,12 @@ COPY agg_csv FROM STDIN;

 -- constraint exclusion tests
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0');
 \t off
 SELECT * FROM agg_csv WHERE a < 0;
 SET constraint_exclusion = 'on';
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0');
 \t off
 SELECT * FROM agg_csv WHERE a < 0;
 RESET constraint_exclusion;
@@ -170,14 +195,17 @@ DROP TABLE agg;
 -- declarative partitioning tests
 SET ROLE regress_file_fdw_superuser;
 CREATE TABLE pt (a int, b text) partition by list (a);
+\set filename :ABS_SRCDIR '/data/list1.csv'
 CREATE FOREIGN TABLE p1 partition of pt for values in (1) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/list1.csv', delimiter ',');
+OPTIONS (format 'csv', filename :'filename', delimiter ',');
 CREATE TABLE p2 partition of pt for values in (2);
 SELECT tableoid::regclass, * FROM pt;
 SELECT tableoid::regclass, * FROM p1;
 SELECT tableoid::regclass, * FROM p2;
-COPY pt FROM '@abs_srcdir@/data/list2.bad' with (format 'csv', delimiter ','); -- ERROR
-COPY pt FROM '@abs_srcdir@/data/list2.csv' with (format 'csv', delimiter ',');
+\set filename :ABS_SRCDIR '/data/list2.bad'
+COPY pt FROM :'filename' with (format 'csv', delimiter ','); -- ERROR
+\set filename :ABS_SRCDIR '/data/list2.csv'
+COPY pt FROM :'filename' with (format 'csv', delimiter ',');
 SELECT tableoid::regclass, * FROM pt;
 SELECT tableoid::regclass, * FROM p1;
 SELECT tableoid::regclass, * FROM p2;
@@ -190,8 +218,9 @@ SELECT tableoid::regclass, * FROM p2;
 DROP TABLE pt;

 -- generated column tests
+\set filename :ABS_SRCDIR '/data/list1.csv'
 CREATE FOREIGN TABLE gft1 (a int, b text, c text GENERATED ALWAYS AS ('foo') STORED) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/list1.csv', delimiter ',');
+OPTIONS (format 'csv', filename :'filename', delimiter ',');
 SELECT a, c FROM gft1;
 DROP FOREIGN TABLE gft1;

@@ -204,7 +233,7 @@ SET ROLE regress_no_priv_user;
 SELECT * FROM agg_text ORDER BY a;   -- ERROR
 SET ROLE regress_file_fdw_user;
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_text WHERE a > 0;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_text WHERE a > 0');
 \t off
 -- file FDW allows foreign tables to be accessed without user mapping
 DROP USER MAPPING FOR regress_file_fdw_user SERVER file_server;
diff --git a/contrib/file_fdw/output/file_fdw.source b/contrib/file_fdw/output/file_fdw.source
index 52b4d5f1df..da3e5eac2c 100644
--- a/contrib/file_fdw/output/file_fdw.source
+++ b/contrib/file_fdw/output/file_fdw.source
@@ -1,6 +1,8 @@
 --
 -- Test foreign-data wrapper file_fdw.
 --
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
 -- Clean up in case a prior regression run failed
 SET client_min_messages TO 'warning';
 DROP ROLE IF EXISTS regress_file_fdw_superuser, regress_file_fdw_user, regress_no_priv_user;
@@ -10,6 +12,21 @@ CREATE ROLE regress_file_fdw_user LOGIN;                -- has priv and user map
 CREATE ROLE regress_no_priv_user LOGIN;                 -- has priv but no user mapping
 -- Install file_fdw
 CREATE EXTENSION file_fdw;
+-- create function to filter unstable results of EXPLAIN
+CREATE FUNCTION explain_filter(text) RETURNS setof text
+LANGUAGE plpgsql AS
+$$
+declare
+    ln text;
+begin
+    for ln in execute $1
+    loop
+        -- Remove the path portion of foreign file names
+        ln := regexp_replace(ln, 'Foreign File: .*/([a-z.]+)$', 'Foreign File: .../\1');
+        return next ln;
+    end loop;
+end;
+$$;
 -- regress_file_fdw_superuser owns fdw-related objects
 SET ROLE regress_file_fdw_superuser;
 CREATE SERVER file_server FOREIGN DATA WRAPPER file_fdw;
@@ -77,32 +94,36 @@ CREATE FOREIGN TABLE tbl () SERVER file_server OPTIONS (format 'csv', null '
 ERROR:  COPY null representation cannot use newline or carriage return
 CREATE FOREIGN TABLE tbl () SERVER file_server;  -- ERROR
 ERROR:  either filename or program is required for file_fdw foreign tables
+\set filename :ABS_SRCDIR '/data/agg.data'
 CREATE FOREIGN TABLE agg_text (
     a    int2 CHECK (a >= 0),
     b    float4
 ) SERVER file_server
-OPTIONS (format 'text', filename '@abs_srcdir@/data/agg.data', delimiter '    ', null '\N');
+OPTIONS (format 'text', filename :'filename', delimiter '    ', null '\N');
 GRANT SELECT ON agg_text TO regress_file_fdw_user;
+\set filename :ABS_SRCDIR '/data/agg.csv'
 CREATE FOREIGN TABLE agg_csv (
     a    int2,
     b    float4
 ) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/agg.csv', header 'true', delimiter ';', quote '@', escape '"', null '');
+OPTIONS (format 'csv', filename :'filename', header 'true', delimiter ';', quote '@', escape '"', null '');
 ALTER FOREIGN TABLE agg_csv ADD CHECK (a >= 0);
+\set filename :ABS_SRCDIR '/data/agg.bad'
 CREATE FOREIGN TABLE agg_bad (
     a    int2,
     b    float4
 ) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/agg.bad', header 'true', delimiter ';', quote '@', escape '"', null '');
+OPTIONS (format 'csv', filename :'filename', header 'true', delimiter ';', quote '@', escape '"', null '');
 ALTER FOREIGN TABLE agg_bad ADD CHECK (a >= 0);
 -- per-column options tests
+\set filename :ABS_SRCDIR '/data/text.csv'
 CREATE FOREIGN TABLE text_csv (
     word1 text OPTIONS (force_not_null 'true'),
     word2 text OPTIONS (force_not_null 'off'),
     word3 text OPTIONS (force_null 'true'),
     word4 text OPTIONS (force_null 'off')
 ) SERVER file_server
-OPTIONS (format 'text', filename '@abs_srcdir@/data/text.csv', null 'NULL');
+OPTIONS (format 'text', filename :'filename', null 'NULL');
 SELECT * FROM text_csv; -- ERROR
 ERROR:  COPY force not null available only in CSV mode
 ALTER FOREIGN TABLE text_csv OPTIONS (SET format 'csv');
@@ -176,10 +197,10 @@ ERROR:  invalid input syntax for type real: "aaa"
 CONTEXT:  COPY agg_bad, line 3, column b: "aaa"
 -- misc query tests
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv');
  Foreign Scan on public.agg_csv
    Output: a, b
-   Foreign File: @abs_srcdir@/data/agg.csv
+   Foreign File: .../agg.csv

 \t off
 PREPARE st(int) AS SELECT * FROM agg_csv WHERE a = $1;
@@ -226,11 +247,11 @@ COPY agg_csv FROM STDIN;
 ERROR:  cannot insert into foreign table "agg_csv"
 -- constraint exclusion tests
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0');
  Foreign Scan on public.agg_csv
    Output: a, b
    Filter: (agg_csv.a < 0)
-   Foreign File: @abs_srcdir@/data/agg.csv
+   Foreign File: .../agg.csv

 \t off
 SELECT * FROM agg_csv WHERE a < 0;
@@ -240,7 +261,7 @@ SELECT * FROM agg_csv WHERE a < 0;

 SET constraint_exclusion = 'on';
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_csv WHERE a < 0');
  Result
    Output: a, b
    One-Time Filter: false
@@ -295,8 +316,9 @@ DROP TABLE agg;
 -- declarative partitioning tests
 SET ROLE regress_file_fdw_superuser;
 CREATE TABLE pt (a int, b text) partition by list (a);
+\set filename :ABS_SRCDIR '/data/list1.csv'
 CREATE FOREIGN TABLE p1 partition of pt for values in (1) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/list1.csv', delimiter ',');
+OPTIONS (format 'csv', filename :'filename', delimiter ',');
 CREATE TABLE p2 partition of pt for values in (2);
 SELECT tableoid::regclass, * FROM pt;
  tableoid | a |  b
@@ -317,10 +339,12 @@ SELECT tableoid::regclass, * FROM p2;
 ----------+---+---
 (0 rows)

-COPY pt FROM '@abs_srcdir@/data/list2.bad' with (format 'csv', delimiter ','); -- ERROR
+\set filename :ABS_SRCDIR '/data/list2.bad'
+COPY pt FROM :'filename' with (format 'csv', delimiter ','); -- ERROR
 ERROR:  cannot insert into foreign table "p1"
 CONTEXT:  COPY pt, line 2: "1,qux"
-COPY pt FROM '@abs_srcdir@/data/list2.csv' with (format 'csv', delimiter ',');
+\set filename :ABS_SRCDIR '/data/list2.csv'
+COPY pt FROM :'filename' with (format 'csv', delimiter ',');
 SELECT tableoid::regclass, * FROM pt;
  tableoid | a |  b
 ----------+---+-----
@@ -376,8 +400,9 @@ SELECT tableoid::regclass, * FROM p2;

 DROP TABLE pt;
 -- generated column tests
+\set filename :ABS_SRCDIR '/data/list1.csv'
 CREATE FOREIGN TABLE gft1 (a int, b text, c text GENERATED ALWAYS AS ('foo') STORED) SERVER file_server
-OPTIONS (format 'csv', filename '@abs_srcdir@/data/list1.csv', delimiter ',');
+OPTIONS (format 'csv', filename :'filename', delimiter ',');
 SELECT a, c FROM gft1;
  a |   c
 ---+--------
@@ -412,11 +437,11 @@ SELECT * FROM agg_text ORDER BY a;   -- ERROR
 ERROR:  permission denied for foreign table agg_text
 SET ROLE regress_file_fdw_user;
 \t on
-EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_text WHERE a > 0;
+SELECT explain_filter('EXPLAIN (VERBOSE, COSTS FALSE) SELECT * FROM agg_text WHERE a > 0');
  Foreign Scan on public.agg_text
    Output: a, b
    Filter: (agg_text.a > 0)
-   Foreign File: @abs_srcdir@/data/agg.data
+   Foreign File: .../agg.data

 \t off
 -- file FDW allows foreign tables to be accessed without user mapping
diff --git a/src/pl/plpgsql/src/input/plpgsql_copy.source b/src/pl/plpgsql/src/input/plpgsql_copy.source
index b7bcbb7d17..35e1c6af3f 100644
--- a/src/pl/plpgsql/src/input/plpgsql_copy.source
+++ b/src/pl/plpgsql/src/input/plpgsql_copy.source
@@ -1,3 +1,11 @@
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+
+-- set up file names to use
+\set srcfilename :ABS_SRCDIR '/data/copy1.data'
+\set destfilename :ABS_BUILDDIR '/results/copy1.data'
+
 CREATE TABLE copy1 (a int, b float);

 -- COPY TO/FROM not authorized from client.
@@ -24,38 +32,26 @@ $$;

 -- Valid cases
 -- COPY FROM
-DO LANGUAGE plpgsql $$
-BEGIN
-  COPY copy1 FROM '@abs_srcdir@/data/copy1.data';
-END;
-$$;
+\set dobody 'BEGIN COPY copy1 FROM ' :'srcfilename' '; END'
+DO LANGUAGE plpgsql :'dobody';
 SELECT * FROM copy1 ORDER BY 1;
 TRUNCATE copy1;
-DO LANGUAGE plpgsql $$
-BEGIN
-  EXECUTE 'COPY copy1 FROM ''@abs_srcdir@/data/copy1.data''';
-END;
-$$;
+\set cmd 'COPY copy1 FROM ' :'srcfilename'
+\set dobody 'BEGIN EXECUTE ' :'cmd' '; END'
+DO LANGUAGE plpgsql :'dobody';
 SELECT * FROM copy1 ORDER BY 1;

 -- COPY TO
 -- Copy the data externally once, then process it back to the table.
-DO LANGUAGE plpgsql $$
-BEGIN
-  COPY copy1 TO '@abs_builddir@/results/copy1.data';
-END;
-$$;
+\set dobody 'BEGIN COPY copy1 TO ' :'destfilename' '; END'
+DO LANGUAGE plpgsql :'dobody';
 TRUNCATE copy1;
-DO LANGUAGE plpgsql $$
-BEGIN
-  COPY copy1 FROM '@abs_builddir@/results/copy1.data';
-END;
-$$;
-DO LANGUAGE plpgsql $$
-BEGIN
-  EXECUTE 'COPY copy1 FROM ''@abs_builddir@/results/copy1.data''';
-END;
-$$;
+\set dobody 'BEGIN COPY copy1 FROM ' :'destfilename' '; END'
+DO LANGUAGE plpgsql :'dobody';
+
+\set cmd 'COPY copy1 FROM ' :'destfilename'
+\set dobody 'BEGIN EXECUTE ' :'cmd' '; END'
+DO LANGUAGE plpgsql :'dobody';

 SELECT * FROM copy1 ORDER BY 1;

diff --git a/src/pl/plpgsql/src/output/plpgsql_copy.source b/src/pl/plpgsql/src/output/plpgsql_copy.source
index 86e833d055..58b080eaaf 100644
--- a/src/pl/plpgsql/src/output/plpgsql_copy.source
+++ b/src/pl/plpgsql/src/output/plpgsql_copy.source
@@ -1,3 +1,9 @@
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+-- set up file names to use
+\set srcfilename :ABS_SRCDIR '/data/copy1.data'
+\set destfilename :ABS_BUILDDIR '/results/copy1.data'
 CREATE TABLE copy1 (a int, b float);
 -- COPY TO/FROM not authorized from client.
 DO LANGUAGE plpgsql $$
@@ -30,11 +36,8 @@ ERROR:  cannot COPY to/from client in PL/pgSQL
 CONTEXT:  PL/pgSQL function inline_code_block line 3 at EXECUTE
 -- Valid cases
 -- COPY FROM
-DO LANGUAGE plpgsql $$
-BEGIN
-  COPY copy1 FROM '@abs_srcdir@/data/copy1.data';
-END;
-$$;
+\set dobody 'BEGIN COPY copy1 FROM ' :'srcfilename' '; END'
+DO LANGUAGE plpgsql :'dobody';
 SELECT * FROM copy1 ORDER BY 1;
  a |  b
 ---+-----
@@ -44,11 +47,9 @@ SELECT * FROM copy1 ORDER BY 1;
 (3 rows)

 TRUNCATE copy1;
-DO LANGUAGE plpgsql $$
-BEGIN
-  EXECUTE 'COPY copy1 FROM ''@abs_srcdir@/data/copy1.data''';
-END;
-$$;
+\set cmd 'COPY copy1 FROM ' :'srcfilename'
+\set dobody 'BEGIN EXECUTE ' :'cmd' '; END'
+DO LANGUAGE plpgsql :'dobody';
 SELECT * FROM copy1 ORDER BY 1;
  a |  b
 ---+-----
@@ -59,22 +60,14 @@ SELECT * FROM copy1 ORDER BY 1;

 -- COPY TO
 -- Copy the data externally once, then process it back to the table.
-DO LANGUAGE plpgsql $$
-BEGIN
-  COPY copy1 TO '@abs_builddir@/results/copy1.data';
-END;
-$$;
+\set dobody 'BEGIN COPY copy1 TO ' :'destfilename' '; END'
+DO LANGUAGE plpgsql :'dobody';
 TRUNCATE copy1;
-DO LANGUAGE plpgsql $$
-BEGIN
-  COPY copy1 FROM '@abs_builddir@/results/copy1.data';
-END;
-$$;
-DO LANGUAGE plpgsql $$
-BEGIN
-  EXECUTE 'COPY copy1 FROM ''@abs_builddir@/results/copy1.data''';
-END;
-$$;
+\set dobody 'BEGIN COPY copy1 FROM ' :'destfilename' '; END'
+DO LANGUAGE plpgsql :'dobody';
+\set cmd 'COPY copy1 FROM ' :'destfilename'
+\set dobody 'BEGIN EXECUTE ' :'cmd' '; END'
+DO LANGUAGE plpgsql :'dobody';
 SELECT * FROM copy1 ORDER BY 1;
  a |  b
 ---+-----
diff --git a/src/test/regress/input/constraints.source b/src/test/regress/input/constraints.source
index 6bb7648321..e6737f72bc 100644
--- a/src/test/regress/input/constraints.source
+++ b/src/test/regress/input/constraints.source
@@ -8,6 +8,9 @@
 --  - EXCLUDE clauses
 --

+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+
 --
 -- DEFAULT syntax
 --
@@ -239,11 +242,13 @@ CREATE TABLE COPY_TBL (x INT, y TEXT, z INT,
     CONSTRAINT COPY_CON
     CHECK (x > 3 AND y <> 'check failed' AND x < 7 ));

-COPY COPY_TBL FROM '@abs_srcdir@/data/constro.data';
+\set filename :ABS_SRCDIR '/data/constro.data'
+COPY COPY_TBL FROM :'filename';

 SELECT * FROM COPY_TBL;

-COPY COPY_TBL FROM '@abs_srcdir@/data/constrf.data';
+\set filename :ABS_SRCDIR '/data/constrf.data'
+COPY COPY_TBL FROM :'filename';

 SELECT * FROM COPY_TBL;

diff --git a/src/test/regress/input/copy.source b/src/test/regress/input/copy.source
index 8acb516801..8bc1379695 100644
--- a/src/test/regress/input/copy.source
+++ b/src/test/regress/input/copy.source
@@ -2,65 +2,90 @@
 -- COPY
 --

+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+
 -- CLASS POPULATION
 --    (any resemblance to real life is purely coincidental)
 --
-COPY aggtest FROM '@abs_srcdir@/data/agg.data';
+\set filename :ABS_SRCDIR '/data/agg.data'
+COPY aggtest FROM :'filename';

-COPY onek FROM '@abs_srcdir@/data/onek.data';
+\set filename :ABS_SRCDIR '/data/onek.data'
+COPY onek FROM :'filename';

-COPY onek TO '@abs_builddir@/results/onek.data';
+\set filename :ABS_BUILDDIR '/results/onek.data'
+COPY onek TO :'filename';

 DELETE FROM onek;

-COPY onek FROM '@abs_builddir@/results/onek.data';
+COPY onek FROM :'filename';

-COPY tenk1 FROM '@abs_srcdir@/data/tenk.data';
+\set filename :ABS_SRCDIR '/data/tenk.data'
+COPY tenk1 FROM :'filename';

-COPY slow_emp4000 FROM '@abs_srcdir@/data/rect.data';
+\set filename :ABS_SRCDIR '/data/rect.data'
+COPY slow_emp4000 FROM :'filename';

-COPY person FROM '@abs_srcdir@/data/person.data';
+\set filename :ABS_SRCDIR '/data/person.data'
+COPY person FROM :'filename';

-COPY emp FROM '@abs_srcdir@/data/emp.data';
+\set filename :ABS_SRCDIR '/data/emp.data'
+COPY emp FROM :'filename';

-COPY student FROM '@abs_srcdir@/data/student.data';
+\set filename :ABS_SRCDIR '/data/student.data'
+COPY student FROM :'filename';

-COPY stud_emp FROM '@abs_srcdir@/data/stud_emp.data';
+\set filename :ABS_SRCDIR '/data/stud_emp.data'
+COPY stud_emp FROM :'filename';

-COPY road FROM '@abs_srcdir@/data/streets.data';
+\set filename :ABS_SRCDIR '/data/streets.data'
+COPY road FROM :'filename';

-COPY real_city FROM '@abs_srcdir@/data/real_city.data';
+\set filename :ABS_SRCDIR '/data/real_city.data'
+COPY real_city FROM :'filename';

-COPY hash_i4_heap FROM '@abs_srcdir@/data/hash.data';
+\set filename :ABS_SRCDIR '/data/hash.data'
+COPY hash_i4_heap FROM :'filename';

-COPY hash_name_heap FROM '@abs_srcdir@/data/hash.data';
+COPY hash_name_heap FROM :'filename';

-COPY hash_txt_heap FROM '@abs_srcdir@/data/hash.data';
+COPY hash_txt_heap FROM :'filename';

-COPY hash_f8_heap FROM '@abs_srcdir@/data/hash.data';
+COPY hash_f8_heap FROM :'filename';

-COPY test_tsvector FROM '@abs_srcdir@/data/tsearch.data';
+\set filename :ABS_SRCDIR '/data/tsearch.data'
+COPY test_tsvector FROM :'filename';

-COPY testjsonb FROM '@abs_srcdir@/data/jsonb.data';
+\set filename :ABS_SRCDIR '/data/jsonb.data'
+COPY testjsonb FROM :'filename';

 -- the data in this file has a lot of duplicates in the index key
 -- fields, leading to long bucket chains and lots of table expansion.
 -- this is therefore a stress test of the bucket overflow code (unlike
 -- the data in hash.data, which has unique index keys).
 --
--- COPY hash_ovfl_heap FROM '@abs_srcdir@/data/hashovfl.data';
+-- \set filename :ABS_SRCDIR '/data/hashovfl.data'
+-- COPY hash_ovfl_heap FROM :'filename';

-COPY bt_i4_heap FROM '@abs_srcdir@/data/desc.data';
+\set filename :ABS_SRCDIR '/data/desc.data'
+COPY bt_i4_heap FROM :'filename';

-COPY bt_name_heap FROM '@abs_srcdir@/data/hash.data';
+\set filename :ABS_SRCDIR '/data/hash.data'
+COPY bt_name_heap FROM :'filename';

-COPY bt_txt_heap FROM '@abs_srcdir@/data/desc.data';
+\set filename :ABS_SRCDIR '/data/desc.data'
+COPY bt_txt_heap FROM :'filename';

-COPY bt_f8_heap FROM '@abs_srcdir@/data/hash.data';
+\set filename :ABS_SRCDIR '/data/hash.data'
+COPY bt_f8_heap FROM :'filename';

-COPY array_op_test FROM '@abs_srcdir@/data/array.data';
+\set filename :ABS_SRCDIR '/data/array.data'
+COPY array_op_test FROM :'filename';

-COPY array_index_op_test FROM '@abs_srcdir@/data/array.data';
+\set filename :ABS_SRCDIR '/data/array.data'
+COPY array_index_op_test FROM :'filename';

 -- analyze all the data we just loaded, to ensure plan consistency
 -- in later tests
@@ -100,11 +125,12 @@ insert into copytest values('Unix',E'abc\ndef',2);
 insert into copytest values('Mac',E'abc\rdef',3);
 insert into copytest values(E'esc\\ape',E'a\\r\\\r\\\n\\nb',4);

-copy copytest to '@abs_builddir@/results/copytest.csv' csv;
+\set filename :ABS_BUILDDIR '/results/copytest.csv'
+copy copytest to :'filename' csv;

 create temp table copytest2 (like copytest);

-copy copytest2 from '@abs_builddir@/results/copytest.csv' csv;
+copy copytest2 from :'filename' csv;

 select * from copytest except select * from copytest2;

@@ -112,9 +138,9 @@ truncate copytest2;

 --- same test but with an escape char different from quote char

-copy copytest to '@abs_builddir@/results/copytest.csv' csv quote '''' escape E'\\';
+copy copytest to :'filename' csv quote '''' escape E'\\';

-copy copytest2 from '@abs_builddir@/results/copytest.csv' csv quote '''' escape E'\\';
+copy copytest2 from :'filename' csv quote '''' escape E'\\';

 select * from copytest except select * from copytest2;

@@ -153,16 +179,17 @@ insert into parted_copytest select x,1,'One' from generate_series(1,1000) x;
 insert into parted_copytest select x,2,'Two' from generate_series(1001,1010) x;
 insert into parted_copytest select x,1,'One' from generate_series(1011,1020) x;

-copy (select * from parted_copytest order by a) to '@abs_builddir@/results/parted_copytest.csv';
+\set filename :ABS_BUILDDIR '/results/parted_copytest.csv'
+copy (select * from parted_copytest order by a) to :'filename';

 truncate parted_copytest;

-copy parted_copytest from '@abs_builddir@/results/parted_copytest.csv';
+copy parted_copytest from :'filename';

 -- Ensure COPY FREEZE errors for partitioned tables.
 begin;
 truncate parted_copytest;
-copy parted_copytest from '@abs_builddir@/results/parted_copytest.csv' (freeze);
+copy parted_copytest from :'filename' (freeze);
 rollback;

 select tableoid::regclass,count(*),sum(a) from parted_copytest
@@ -182,7 +209,7 @@ create trigger part_ins_trig
     for each row
     execute procedure part_ins_func();

-copy parted_copytest from '@abs_builddir@/results/parted_copytest.csv';
+copy parted_copytest from :'filename';

 select tableoid::regclass,count(*),sum(a) from parted_copytest
 group by tableoid order by tableoid::regclass::name;
@@ -257,7 +284,8 @@ bill    20    (11,10)    1000    sharon

 -- Generate COPY FROM report with FILE, with some excluded tuples.
 truncate tab_progress_reporting;
-copy tab_progress_reporting from '@abs_srcdir@/data/emp.data'
+\set filename :ABS_SRCDIR '/data/emp.data'
+copy tab_progress_reporting from :'filename'
     where (salary < 2000);

 drop trigger check_after_tab_progress_reporting on tab_progress_reporting;
diff --git a/src/test/regress/input/create_function_0.source b/src/test/regress/input/create_function_0.source
index f47f635789..c4fcaf13dc 100644
--- a/src/test/regress/input/create_function_0.source
+++ b/src/test/regress/input/create_function_0.source
@@ -2,70 +2,78 @@
 -- CREATE_FUNCTION_0
 --

+-- directory path and DLSUFFIX are passed to us in environment variables
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+
+\set autoinclib :LIBDIR '/autoinc' :DLSUFFIX
+\set refintlib :LIBDIR '/refint' :DLSUFFIX
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
+
 -- Create a bunch of C functions that will be used by later tests:

 CREATE FUNCTION check_primary_key ()
     RETURNS trigger
-    AS '@libdir@/refint@DLSUFFIX@'
+    AS :'refintlib'
     LANGUAGE C;

 CREATE FUNCTION check_foreign_key ()
     RETURNS trigger
-    AS '@libdir@/refint@DLSUFFIX@'
+    AS :'refintlib'
     LANGUAGE C;

 CREATE FUNCTION autoinc ()
     RETURNS trigger
-    AS '@libdir@/autoinc@DLSUFFIX@'
+    AS :'autoinclib'
     LANGUAGE C;

 CREATE FUNCTION trigger_return_old ()
         RETURNS trigger
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C;

 CREATE FUNCTION ttdummy ()
         RETURNS trigger
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C;

 CREATE FUNCTION set_ttdummy (int4)
         RETURNS int4
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C STRICT;

 CREATE FUNCTION make_tuple_indirect (record)
         RETURNS record
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C STRICT;

 CREATE FUNCTION test_atomic_ops()
     RETURNS bool
-    AS '@libdir@/regress@DLSUFFIX@'
+    AS :'regresslib'
     LANGUAGE C;

 CREATE FUNCTION test_fdw_handler()
     RETURNS fdw_handler
-    AS '@libdir@/regress@DLSUFFIX@', 'test_fdw_handler'
+    AS :'regresslib', 'test_fdw_handler'
     LANGUAGE C;

 CREATE FUNCTION test_support_func(internal)
     RETURNS internal
-    AS '@libdir@/regress@DLSUFFIX@', 'test_support_func'
+    AS :'regresslib', 'test_support_func'
     LANGUAGE C STRICT;

 CREATE FUNCTION test_opclass_options_func(internal)
     RETURNS void
-    AS '@libdir@/regress@DLSUFFIX@', 'test_opclass_options_func'
+    AS :'regresslib', 'test_opclass_options_func'
     LANGUAGE C;

 CREATE FUNCTION test_enc_conversion(bytea, name, name, bool, validlen OUT int, result OUT bytea)
-    AS '@libdir@/regress@DLSUFFIX@', 'test_enc_conversion'
+    AS :'regresslib', 'test_enc_conversion'
     LANGUAGE C STRICT;

 CREATE FUNCTION binary_coercible(oid, oid)
     RETURNS bool
-    AS '@libdir@/regress@DLSUFFIX@', 'binary_coercible'
+    AS :'regresslib', 'binary_coercible'
     LANGUAGE C STRICT STABLE PARALLEL SAFE;

 -- Things that shouldn't work:
@@ -88,8 +96,13 @@ CREATE FUNCTION test1 (int) RETURNS int LANGUAGE SQL
 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE C
     AS 'nosuchfile';

+-- To produce stable regression test output, we have to filter the name
+-- of the regresslib file out of the error message in this test.
+\set VERBOSITY sqlstate
 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE C
-    AS '@libdir@/regress@DLSUFFIX@', 'nosuchsymbol';
+    AS :'regresslib', 'nosuchsymbol';
+\set VERBOSITY default
+SELECT regexp_replace(:'LAST_ERROR_MESSAGE', 'file ".*"', 'file "..."');

 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE internal
     AS 'nosuch';
diff --git a/src/test/regress/input/create_function_1.source b/src/test/regress/input/create_function_1.source
index 79a41562bb..814f999cf6 100644
--- a/src/test/regress/input/create_function_1.source
+++ b/src/test/regress/input/create_function_1.source
@@ -2,24 +2,30 @@
 -- CREATE_FUNCTION_1
 --

+-- directory path and DLSUFFIX are passed to us in environment variables
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
+
 -- Create C functions needed by create_type.sql

 CREATE FUNCTION widget_in(cstring)
    RETURNS widget
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;

 CREATE FUNCTION widget_out(widget)
    RETURNS cstring
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;

 CREATE FUNCTION int44in(cstring)
    RETURNS city_budget
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;

 CREATE FUNCTION int44out(city_budget)
    RETURNS cstring
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;
diff --git a/src/test/regress/input/create_function_2.source b/src/test/regress/input/create_function_2.source
index 9e6d2942ec..a908d8410f 100644
--- a/src/test/regress/input/create_function_2.source
+++ b/src/test/regress/input/create_function_2.source
@@ -1,6 +1,14 @@
 --
 -- CREATE_FUNCTION_2
 --
+
+-- directory path and DLSUFFIX are passed to us in environment variables
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
+
+
 CREATE FUNCTION hobbies(person)
    RETURNS setof hobbies_r
    AS 'select * from hobbies_r where person = $1.name'
@@ -64,25 +72,25 @@ CREATE FUNCTION equipment_named_ambiguous_2b(hobby text)

 CREATE FUNCTION pt_in_widget(point, widget)
    RETURNS bool
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;

 CREATE FUNCTION overpaid(emp)
    RETURNS bool
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;

 CREATE FUNCTION interpt_pp(path, path)
    RETURNS point
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;

 CREATE FUNCTION reverse_name(name)
    RETURNS name
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;

 --
 -- Function dynamic loading
 --
-LOAD '@libdir@/regress@DLSUFFIX@';
+LOAD :'regresslib';
diff --git a/src/test/regress/input/largeobject.source b/src/test/regress/input/largeobject.source
index b1e7ae9909..a86086f6be 100644
--- a/src/test/regress/input/largeobject.source
+++ b/src/test/regress/input/largeobject.source
@@ -2,6 +2,10 @@
 -- Test large object support
 --

+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+
 -- ensure consistent test output regardless of the default bytea format
 SET bytea_output TO escape;

@@ -124,16 +128,13 @@ BEGIN;
 SELECT lo_open(loid, x'40000'::int) from lotest_stash_values;
 ABORT;

-DO $$
-DECLARE
-  loid oid;
-BEGIN
-  SELECT tbl.loid INTO loid FROM lotest_stash_values tbl;
-  PERFORM lo_export(loid, '@abs_builddir@/results/invalid/path');
-EXCEPTION
-  WHEN UNDEFINED_FILE THEN RAISE NOTICE 'could not open file, as expected';
-END;
-$$;
+\set filename :ABS_BUILDDIR '/results/invalid/path'
+\set dobody 'DECLARE loid oid; BEGIN '
+\set dobody :dobody 'SELECT tbl.loid INTO loid FROM lotest_stash_values tbl; '
+\set dobody :dobody 'PERFORM lo_export(loid, ' :'filename' '); '
+\set dobody :dobody 'EXCEPTION WHEN UNDEFINED_FILE THEN '
+\set dobody :dobody 'RAISE NOTICE ''could not open file, as expected''; END'
+DO :'dobody';

 -- Test truncation.
 BEGIN;
@@ -183,7 +184,8 @@ SELECT lo_unlink(loid) from lotest_stash_values;

 TRUNCATE lotest_stash_values;

-INSERT INTO lotest_stash_values (loid) SELECT lo_import('@abs_srcdir@/data/tenk.data');
+\set filename :ABS_SRCDIR '/data/tenk.data'
+INSERT INTO lotest_stash_values (loid) SELECT lo_import(:'filename');

 BEGIN;
 UPDATE lotest_stash_values SET fd=lo_open(loid, CAST(x'20000' | x'40000' AS integer));
@@ -212,14 +214,16 @@ SELECT loread(fd, 36) FROM lotest_stash_values;
 SELECT lo_close(fd) FROM lotest_stash_values;
 END;

-SELECT lo_export(loid, '@abs_builddir@/results/lotest.txt') FROM lotest_stash_values;
+\set filename :ABS_BUILDDIR '/results/lotest.txt'
+SELECT lo_export(loid, :'filename') FROM lotest_stash_values;

-\lo_import '@abs_builddir@/results/lotest.txt'
+\lo_import :filename

 \set newloid :LASTOID

 -- just make sure \lo_export does not barf
-\lo_export :newloid '@abs_builddir@/results/lotest2.txt'
+\set filename :ABS_BUILDDIR '/results/lotest2.txt'
+\lo_export :newloid :filename

 -- This is a hack to test that export/import are reversible
 -- This uses knowledge about the inner workings of large object mechanism
@@ -234,7 +238,8 @@ TRUNCATE lotest_stash_values;

 \lo_unlink :newloid

-\lo_import '@abs_builddir@/results/lotest.txt'
+\set filename :ABS_BUILDDIR '/results/lotest.txt'
+\lo_import :filename

 \set newloid_1 :LASTOID

diff --git a/src/test/regress/input/misc.source b/src/test/regress/input/misc.source
index b1dbc573c9..a778e7e7f4 100644
--- a/src/test/regress/input/misc.source
+++ b/src/test/regress/input/misc.source
@@ -2,6 +2,10 @@
 -- MISC
 --

+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+
 --
 -- BTREE
 --
@@ -51,25 +55,27 @@ DROP TABLE tmp;
 --
 -- copy
 --
-COPY onek TO '@abs_builddir@/results/onek.data';
+\set filename :ABS_BUILDDIR '/results/onek.data'
+COPY onek TO :'filename';

 DELETE FROM onek;

-COPY onek FROM '@abs_builddir@/results/onek.data';
+COPY onek FROM :'filename';

 SELECT unique1 FROM onek WHERE unique1 < 2 ORDER BY unique1;

 DELETE FROM onek2;

-COPY onek2 FROM '@abs_builddir@/results/onek.data';
+COPY onek2 FROM :'filename';

 SELECT unique1 FROM onek2 WHERE unique1 < 2 ORDER BY unique1;

-COPY BINARY stud_emp TO '@abs_builddir@/results/stud_emp.data';
+\set filename :ABS_BUILDDIR '/results/stud_emp.data'
+COPY BINARY stud_emp TO :'filename';

 DELETE FROM stud_emp;

-COPY BINARY stud_emp FROM '@abs_builddir@/results/stud_emp.data';
+COPY BINARY stud_emp FROM :'filename';

 SELECT * FROM stud_emp;

diff --git a/src/test/regress/input/tablespace.source b/src/test/regress/input/tablespace.source
index c133e73499..1cf9a84307 100644
--- a/src/test/regress/input/tablespace.source
+++ b/src/test/regress/input/tablespace.source
@@ -1,6 +1,11 @@
+-- directory paths are passed to us in environment variables
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+
+\set testtablespace :ABS_BUILDDIR '/testtablespace'
+
 -- create a tablespace using WITH clause
-CREATE TABLESPACE regress_tblspacewith LOCATION '@testtablespace@' WITH (some_nonexistent_parameter = true); -- fail
-CREATE TABLESPACE regress_tblspacewith LOCATION '@testtablespace@' WITH (random_page_cost = 3.0); -- ok
+CREATE TABLESPACE regress_tblspacewith LOCATION :'testtablespace' WITH (some_nonexistent_parameter = true); -- fail
+CREATE TABLESPACE regress_tblspacewith LOCATION :'testtablespace' WITH (random_page_cost = 3.0); -- ok

 -- check to see the parameter was used
 SELECT spcoptions FROM pg_tablespace WHERE spcname = 'regress_tblspacewith';
@@ -9,7 +14,7 @@ SELECT spcoptions FROM pg_tablespace WHERE spcname = 'regress_tblspacewith';
 DROP TABLESPACE regress_tblspacewith;

 -- create a tablespace we can use
-CREATE TABLESPACE regress_tblspace LOCATION '@testtablespace@';
+CREATE TABLESPACE regress_tblspace LOCATION :'testtablespace';

 -- try setting and resetting some properties for the new tablespace
 ALTER TABLESPACE regress_tblspace SET (random_page_cost = 1.0, seq_page_cost = 1.1);
diff --git a/src/test/regress/output/constraints.source b/src/test/regress/output/constraints.source
index eff793cc3d..2fbac21188 100644
--- a/src/test/regress/output/constraints.source
+++ b/src/test/regress/output/constraints.source
@@ -7,6 +7,8 @@
 --  - UNIQUE clauses
 --  - EXCLUDE clauses
 --
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
 --
 -- DEFAULT syntax
 --
@@ -346,7 +348,8 @@ SELECT * FROM INSERT_TBL;
 CREATE TABLE COPY_TBL (x INT, y TEXT, z INT,
     CONSTRAINT COPY_CON
     CHECK (x > 3 AND y <> 'check failed' AND x < 7 ));
-COPY COPY_TBL FROM '@abs_srcdir@/data/constro.data';
+\set filename :ABS_SRCDIR '/data/constro.data'
+COPY COPY_TBL FROM :'filename';
 SELECT * FROM COPY_TBL;
  x |       y       | z
 ---+---------------+---
@@ -354,7 +357,8 @@ SELECT * FROM COPY_TBL;
  6 | OK            | 4
 (2 rows)

-COPY COPY_TBL FROM '@abs_srcdir@/data/constrf.data';
+\set filename :ABS_SRCDIR '/data/constrf.data'
+COPY COPY_TBL FROM :'filename';
 ERROR:  new row for relation "copy_tbl" violates check constraint "copy_con"
 DETAIL:  Failing row contains (7, check failed, 6).
 CONTEXT:  COPY copy_tbl, line 2: "7    check failed    6"
diff --git a/src/test/regress/output/copy.source b/src/test/regress/output/copy.source
index 25bdec6c60..dfc7006f8e 100644
--- a/src/test/regress/output/copy.source
+++ b/src/test/regress/output/copy.source
@@ -1,40 +1,64 @@
 --
 -- COPY
 --
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
 -- CLASS POPULATION
 --    (any resemblance to real life is purely coincidental)
 --
-COPY aggtest FROM '@abs_srcdir@/data/agg.data';
-COPY onek FROM '@abs_srcdir@/data/onek.data';
-COPY onek TO '@abs_builddir@/results/onek.data';
+\set filename :ABS_SRCDIR '/data/agg.data'
+COPY aggtest FROM :'filename';
+\set filename :ABS_SRCDIR '/data/onek.data'
+COPY onek FROM :'filename';
+\set filename :ABS_BUILDDIR '/results/onek.data'
+COPY onek TO :'filename';
 DELETE FROM onek;
-COPY onek FROM '@abs_builddir@/results/onek.data';
-COPY tenk1 FROM '@abs_srcdir@/data/tenk.data';
-COPY slow_emp4000 FROM '@abs_srcdir@/data/rect.data';
-COPY person FROM '@abs_srcdir@/data/person.data';
-COPY emp FROM '@abs_srcdir@/data/emp.data';
-COPY student FROM '@abs_srcdir@/data/student.data';
-COPY stud_emp FROM '@abs_srcdir@/data/stud_emp.data';
-COPY road FROM '@abs_srcdir@/data/streets.data';
-COPY real_city FROM '@abs_srcdir@/data/real_city.data';
-COPY hash_i4_heap FROM '@abs_srcdir@/data/hash.data';
-COPY hash_name_heap FROM '@abs_srcdir@/data/hash.data';
-COPY hash_txt_heap FROM '@abs_srcdir@/data/hash.data';
-COPY hash_f8_heap FROM '@abs_srcdir@/data/hash.data';
-COPY test_tsvector FROM '@abs_srcdir@/data/tsearch.data';
-COPY testjsonb FROM '@abs_srcdir@/data/jsonb.data';
+COPY onek FROM :'filename';
+\set filename :ABS_SRCDIR '/data/tenk.data'
+COPY tenk1 FROM :'filename';
+\set filename :ABS_SRCDIR '/data/rect.data'
+COPY slow_emp4000 FROM :'filename';
+\set filename :ABS_SRCDIR '/data/person.data'
+COPY person FROM :'filename';
+\set filename :ABS_SRCDIR '/data/emp.data'
+COPY emp FROM :'filename';
+\set filename :ABS_SRCDIR '/data/student.data'
+COPY student FROM :'filename';
+\set filename :ABS_SRCDIR '/data/stud_emp.data'
+COPY stud_emp FROM :'filename';
+\set filename :ABS_SRCDIR '/data/streets.data'
+COPY road FROM :'filename';
+\set filename :ABS_SRCDIR '/data/real_city.data'
+COPY real_city FROM :'filename';
+\set filename :ABS_SRCDIR '/data/hash.data'
+COPY hash_i4_heap FROM :'filename';
+COPY hash_name_heap FROM :'filename';
+COPY hash_txt_heap FROM :'filename';
+COPY hash_f8_heap FROM :'filename';
+\set filename :ABS_SRCDIR '/data/tsearch.data'
+COPY test_tsvector FROM :'filename';
+\set filename :ABS_SRCDIR '/data/jsonb.data'
+COPY testjsonb FROM :'filename';
 -- the data in this file has a lot of duplicates in the index key
 -- fields, leading to long bucket chains and lots of table expansion.
 -- this is therefore a stress test of the bucket overflow code (unlike
 -- the data in hash.data, which has unique index keys).
 --
--- COPY hash_ovfl_heap FROM '@abs_srcdir@/data/hashovfl.data';
-COPY bt_i4_heap FROM '@abs_srcdir@/data/desc.data';
-COPY bt_name_heap FROM '@abs_srcdir@/data/hash.data';
-COPY bt_txt_heap FROM '@abs_srcdir@/data/desc.data';
-COPY bt_f8_heap FROM '@abs_srcdir@/data/hash.data';
-COPY array_op_test FROM '@abs_srcdir@/data/array.data';
-COPY array_index_op_test FROM '@abs_srcdir@/data/array.data';
+-- \set filename :ABS_SRCDIR '/data/hashovfl.data'
+-- COPY hash_ovfl_heap FROM :'filename';
+\set filename :ABS_SRCDIR '/data/desc.data'
+COPY bt_i4_heap FROM :'filename';
+\set filename :ABS_SRCDIR '/data/hash.data'
+COPY bt_name_heap FROM :'filename';
+\set filename :ABS_SRCDIR '/data/desc.data'
+COPY bt_txt_heap FROM :'filename';
+\set filename :ABS_SRCDIR '/data/hash.data'
+COPY bt_f8_heap FROM :'filename';
+\set filename :ABS_SRCDIR '/data/array.data'
+COPY array_op_test FROM :'filename';
+\set filename :ABS_SRCDIR '/data/array.data'
+COPY array_index_op_test FROM :'filename';
 -- analyze all the data we just loaded, to ensure plan consistency
 -- in later tests
 ANALYZE aggtest;
@@ -68,9 +92,10 @@ insert into copytest values('DOS',E'abc\r\ndef',1);
 insert into copytest values('Unix',E'abc\ndef',2);
 insert into copytest values('Mac',E'abc\rdef',3);
 insert into copytest values(E'esc\\ape',E'a\\r\\\r\\\n\\nb',4);
-copy copytest to '@abs_builddir@/results/copytest.csv' csv;
+\set filename :ABS_BUILDDIR '/results/copytest.csv'
+copy copytest to :'filename' csv;
 create temp table copytest2 (like copytest);
-copy copytest2 from '@abs_builddir@/results/copytest.csv' csv;
+copy copytest2 from :'filename' csv;
 select * from copytest except select * from copytest2;
  style | test | filler
 -------+------+--------
@@ -78,8 +103,8 @@ select * from copytest except select * from copytest2;

 truncate copytest2;
 --- same test but with an escape char different from quote char
-copy copytest to '@abs_builddir@/results/copytest.csv' csv quote '''' escape E'\\';
-copy copytest2 from '@abs_builddir@/results/copytest.csv' csv quote '''' escape E'\\';
+copy copytest to :'filename' csv quote '''' escape E'\\';
+copy copytest2 from :'filename' csv quote '''' escape E'\\';
 select * from copytest except select * from copytest2;
  style | test | filler
 -------+------+--------
@@ -110,13 +135,14 @@ alter table parted_copytest attach partition parted_copytest_a2 for values in(2)
 insert into parted_copytest select x,1,'One' from generate_series(1,1000) x;
 insert into parted_copytest select x,2,'Two' from generate_series(1001,1010) x;
 insert into parted_copytest select x,1,'One' from generate_series(1011,1020) x;
-copy (select * from parted_copytest order by a) to '@abs_builddir@/results/parted_copytest.csv';
+\set filename :ABS_BUILDDIR '/results/parted_copytest.csv'
+copy (select * from parted_copytest order by a) to :'filename';
 truncate parted_copytest;
-copy parted_copytest from '@abs_builddir@/results/parted_copytest.csv';
+copy parted_copytest from :'filename';
 -- Ensure COPY FREEZE errors for partitioned tables.
 begin;
 truncate parted_copytest;
-copy parted_copytest from '@abs_builddir@/results/parted_copytest.csv' (freeze);
+copy parted_copytest from :'filename' (freeze);
 ERROR:  cannot perform COPY FREEZE on a partitioned table
 rollback;
 select tableoid::regclass,count(*),sum(a) from parted_copytest
@@ -138,7 +164,7 @@ create trigger part_ins_trig
     before insert on parted_copytest_a2
     for each row
     execute procedure part_ins_func();
-copy parted_copytest from '@abs_builddir@/results/parted_copytest.csv';
+copy parted_copytest from :'filename';
 select tableoid::regclass,count(*),sum(a) from parted_copytest
 group by tableoid order by tableoid::regclass::name;
       tableoid      | count |  sum
@@ -213,7 +239,8 @@ copy tab_progress_reporting from stdin;
 INFO:  progress: {"type": "PIPE", "command": "COPY FROM", "relname": "tab_progress_reporting", "has_bytes_total":
false,"tuples_excluded": 0, "tuples_processed": 3, "has_bytes_processed": true} 
 -- Generate COPY FROM report with FILE, with some excluded tuples.
 truncate tab_progress_reporting;
-copy tab_progress_reporting from '@abs_srcdir@/data/emp.data'
+\set filename :ABS_SRCDIR '/data/emp.data'
+copy tab_progress_reporting from :'filename'
     where (salary < 2000);
 INFO:  progress: {"type": "FILE", "command": "COPY FROM", "relname": "tab_progress_reporting", "has_bytes_total":
true,"tuples_excluded": 1, "tuples_processed": 2, "has_bytes_processed": true} 
 drop trigger check_after_tab_progress_reporting on tab_progress_reporting;
diff --git a/src/test/regress/output/create_function_0.source b/src/test/regress/output/create_function_0.source
index 342bc40e11..3ff4fa9870 100644
--- a/src/test/regress/output/create_function_0.source
+++ b/src/test/regress/output/create_function_0.source
@@ -1,57 +1,63 @@
 --
 -- CREATE_FUNCTION_0
 --
+-- directory path and DLSUFFIX are passed to us in environment variables
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+\set autoinclib :LIBDIR '/autoinc' :DLSUFFIX
+\set refintlib :LIBDIR '/refint' :DLSUFFIX
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
 -- Create a bunch of C functions that will be used by later tests:
 CREATE FUNCTION check_primary_key ()
     RETURNS trigger
-    AS '@libdir@/refint@DLSUFFIX@'
+    AS :'refintlib'
     LANGUAGE C;
 CREATE FUNCTION check_foreign_key ()
     RETURNS trigger
-    AS '@libdir@/refint@DLSUFFIX@'
+    AS :'refintlib'
     LANGUAGE C;
 CREATE FUNCTION autoinc ()
     RETURNS trigger
-    AS '@libdir@/autoinc@DLSUFFIX@'
+    AS :'autoinclib'
     LANGUAGE C;
 CREATE FUNCTION trigger_return_old ()
         RETURNS trigger
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C;
 CREATE FUNCTION ttdummy ()
         RETURNS trigger
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C;
 CREATE FUNCTION set_ttdummy (int4)
         RETURNS int4
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C STRICT;
 CREATE FUNCTION make_tuple_indirect (record)
         RETURNS record
-        AS '@libdir@/regress@DLSUFFIX@'
+        AS :'regresslib'
         LANGUAGE C STRICT;
 CREATE FUNCTION test_atomic_ops()
     RETURNS bool
-    AS '@libdir@/regress@DLSUFFIX@'
+    AS :'regresslib'
     LANGUAGE C;
 CREATE FUNCTION test_fdw_handler()
     RETURNS fdw_handler
-    AS '@libdir@/regress@DLSUFFIX@', 'test_fdw_handler'
+    AS :'regresslib', 'test_fdw_handler'
     LANGUAGE C;
 CREATE FUNCTION test_support_func(internal)
     RETURNS internal
-    AS '@libdir@/regress@DLSUFFIX@', 'test_support_func'
+    AS :'regresslib', 'test_support_func'
     LANGUAGE C STRICT;
 CREATE FUNCTION test_opclass_options_func(internal)
     RETURNS void
-    AS '@libdir@/regress@DLSUFFIX@', 'test_opclass_options_func'
+    AS :'regresslib', 'test_opclass_options_func'
     LANGUAGE C;
 CREATE FUNCTION test_enc_conversion(bytea, name, name, bool, validlen OUT int, result OUT bytea)
-    AS '@libdir@/regress@DLSUFFIX@', 'test_enc_conversion'
+    AS :'regresslib', 'test_enc_conversion'
     LANGUAGE C STRICT;
 CREATE FUNCTION binary_coercible(oid, oid)
     RETURNS bool
-    AS '@libdir@/regress@DLSUFFIX@', 'binary_coercible'
+    AS :'regresslib', 'binary_coercible'
     LANGUAGE C STRICT STABLE PARALLEL SAFE;
 -- Things that shouldn't work:
 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE SQL
@@ -80,9 +86,19 @@ ERROR:  only one AS item needed for language "sql"
 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE C
     AS 'nosuchfile';
 ERROR:  could not access file "nosuchfile": No such file or directory
+-- To produce stable regression test output, we have to filter the name
+-- of the regresslib file out of the error message in this test.
+\set VERBOSITY sqlstate
 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE C
-    AS '@libdir@/regress@DLSUFFIX@', 'nosuchsymbol';
-ERROR:  could not find function "nosuchsymbol" in file "@libdir@/regress@DLSUFFIX@"
+    AS :'regresslib', 'nosuchsymbol';
+ERROR:  42883
+\set VERBOSITY default
+SELECT regexp_replace(:'LAST_ERROR_MESSAGE', 'file ".*"', 'file "..."');
+                    regexp_replace
+------------------------------------------------------
+ could not find function "nosuchsymbol" in file "..."
+(1 row)
+
 CREATE FUNCTION test1 (int) RETURNS int LANGUAGE internal
     AS 'nosuch';
 ERROR:  there is no built-in function named "nosuch"
diff --git a/src/test/regress/output/create_function_1.source b/src/test/regress/output/create_function_1.source
index 616b610e86..f43662547a 100644
--- a/src/test/regress/output/create_function_1.source
+++ b/src/test/regress/output/create_function_1.source
@@ -1,26 +1,30 @@
 --
 -- CREATE_FUNCTION_1
 --
+-- directory path and DLSUFFIX are passed to us in environment variables
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
 -- Create C functions needed by create_type.sql
 CREATE FUNCTION widget_in(cstring)
    RETURNS widget
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;
 NOTICE:  type "widget" is not yet defined
 DETAIL:  Creating a shell type definition.
 CREATE FUNCTION widget_out(widget)
    RETURNS cstring
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;
 NOTICE:  argument type widget is only a shell
 CREATE FUNCTION int44in(cstring)
    RETURNS city_budget
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;
 NOTICE:  type "city_budget" is not yet defined
 DETAIL:  Creating a shell type definition.
 CREATE FUNCTION int44out(city_budget)
    RETURNS cstring
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT IMMUTABLE;
 NOTICE:  argument type city_budget is only a shell
diff --git a/src/test/regress/output/create_function_2.source b/src/test/regress/output/create_function_2.source
index ac9a7f5cf8..e6f036c0d1 100644
--- a/src/test/regress/output/create_function_2.source
+++ b/src/test/regress/output/create_function_2.source
@@ -1,6 +1,10 @@
 --
 -- CREATE_FUNCTION_2
 --
+-- directory path and DLSUFFIX are passed to us in environment variables
+\getenv LIBDIR LIBDIR
+\getenv DLSUFFIX DLSUFFIX
+\set regresslib :LIBDIR '/regress' :DLSUFFIX
 CREATE FUNCTION hobbies(person)
    RETURNS setof hobbies_r
    AS 'select * from hobbies_r where person = $1.name'
@@ -49,21 +53,21 @@ CREATE FUNCTION equipment_named_ambiguous_2b(hobby text)
    LANGUAGE SQL;
 CREATE FUNCTION pt_in_widget(point, widget)
    RETURNS bool
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;
 CREATE FUNCTION overpaid(emp)
    RETURNS bool
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;
 CREATE FUNCTION interpt_pp(path, path)
    RETURNS point
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;
 CREATE FUNCTION reverse_name(name)
    RETURNS name
-   AS '@libdir@/regress@DLSUFFIX@'
+   AS :'regresslib'
    LANGUAGE C STRICT;
 --
 -- Function dynamic loading
 --
-LOAD '@libdir@/regress@DLSUFFIX@';
+LOAD :'regresslib';
diff --git a/src/test/regress/output/largeobject.source b/src/test/regress/output/largeobject.source
index 91d33b4d0c..848f02e4f2 100644
--- a/src/test/regress/output/largeobject.source
+++ b/src/test/regress/output/largeobject.source
@@ -1,6 +1,9 @@
 --
 -- Test large object support
 --
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
 -- ensure consistent test output regardless of the default bytea format
 SET bytea_output TO escape;
 -- Load a file
@@ -161,16 +164,13 @@ SELECT lo_open(loid, x'40000'::int) from lotest_stash_values;
 (1 row)

 ABORT;
-DO $$
-DECLARE
-  loid oid;
-BEGIN
-  SELECT tbl.loid INTO loid FROM lotest_stash_values tbl;
-  PERFORM lo_export(loid, '@abs_builddir@/results/invalid/path');
-EXCEPTION
-  WHEN UNDEFINED_FILE THEN RAISE NOTICE 'could not open file, as expected';
-END;
-$$;
+\set filename :ABS_BUILDDIR '/results/invalid/path'
+\set dobody 'DECLARE loid oid; BEGIN '
+\set dobody :dobody 'SELECT tbl.loid INTO loid FROM lotest_stash_values tbl; '
+\set dobody :dobody 'PERFORM lo_export(loid, ' :'filename' '); '
+\set dobody :dobody 'EXCEPTION WHEN UNDEFINED_FILE THEN '
+\set dobody :dobody 'RAISE NOTICE ''could not open file, as expected''; END'
+DO :'dobody';
 NOTICE:  could not open file, as expected
 -- Test truncation.
 BEGIN;
@@ -327,7 +327,8 @@ SELECT lo_unlink(loid) from lotest_stash_values;
 (1 row)

 TRUNCATE lotest_stash_values;
-INSERT INTO lotest_stash_values (loid) SELECT lo_import('@abs_srcdir@/data/tenk.data');
+\set filename :ABS_SRCDIR '/data/tenk.data'
+INSERT INTO lotest_stash_values (loid) SELECT lo_import(:'filename');
 BEGIN;
 UPDATE lotest_stash_values SET fd=lo_open(loid, CAST(x'20000' | x'40000' AS integer));
 -- verify length of large object
@@ -390,16 +391,18 @@ SELECT lo_close(fd) FROM lotest_stash_values;
 (1 row)

 END;
-SELECT lo_export(loid, '@abs_builddir@/results/lotest.txt') FROM lotest_stash_values;
+\set filename :ABS_BUILDDIR '/results/lotest.txt'
+SELECT lo_export(loid, :'filename') FROM lotest_stash_values;
  lo_export
 -----------
          1
 (1 row)

-\lo_import '@abs_builddir@/results/lotest.txt'
+\lo_import :filename
 \set newloid :LASTOID
 -- just make sure \lo_export does not barf
-\lo_export :newloid '@abs_builddir@/results/lotest2.txt'
+\set filename :ABS_BUILDDIR '/results/lotest2.txt'
+\lo_export :newloid :filename
 -- This is a hack to test that export/import are reversible
 -- This uses knowledge about the inner workings of large object mechanism
 -- which should not be used outside it.  This makes it a HACK
@@ -418,7 +421,8 @@ SELECT lo_unlink(loid) FROM lotest_stash_values;

 TRUNCATE lotest_stash_values;
 \lo_unlink :newloid
-\lo_import '@abs_builddir@/results/lotest.txt'
+\set filename :ABS_BUILDDIR '/results/lotest.txt'
+\lo_import :filename
 \set newloid_1 :LASTOID
 SELECT lo_from_bytea(0, lo_get(:newloid_1)) AS newloid_2
 \gset
diff --git a/src/test/regress/output/largeobject_1.source b/src/test/regress/output/largeobject_1.source
index cb910e2eef..aed406b182 100644
--- a/src/test/regress/output/largeobject_1.source
+++ b/src/test/regress/output/largeobject_1.source
@@ -1,6 +1,9 @@
 --
 -- Test large object support
 --
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
 -- ensure consistent test output regardless of the default bytea format
 SET bytea_output TO escape;
 -- Load a file
@@ -161,16 +164,13 @@ SELECT lo_open(loid, x'40000'::int) from lotest_stash_values;
 (1 row)

 ABORT;
-DO $$
-DECLARE
-  loid oid;
-BEGIN
-  SELECT tbl.loid INTO loid FROM lotest_stash_values tbl;
-  PERFORM lo_export(loid, '@abs_builddir@/results/invalid/path');
-EXCEPTION
-  WHEN UNDEFINED_FILE THEN RAISE NOTICE 'could not open file, as expected';
-END;
-$$;
+\set filename :ABS_BUILDDIR '/results/invalid/path'
+\set dobody 'DECLARE loid oid; BEGIN '
+\set dobody :dobody 'SELECT tbl.loid INTO loid FROM lotest_stash_values tbl; '
+\set dobody :dobody 'PERFORM lo_export(loid, ' :'filename' '); '
+\set dobody :dobody 'EXCEPTION WHEN UNDEFINED_FILE THEN '
+\set dobody :dobody 'RAISE NOTICE ''could not open file, as expected''; END'
+DO :'dobody';
 NOTICE:  could not open file, as expected
 -- Test truncation.
 BEGIN;
@@ -327,7 +327,8 @@ SELECT lo_unlink(loid) from lotest_stash_values;
 (1 row)

 TRUNCATE lotest_stash_values;
-INSERT INTO lotest_stash_values (loid) SELECT lo_import('@abs_srcdir@/data/tenk.data');
+\set filename :ABS_SRCDIR '/data/tenk.data'
+INSERT INTO lotest_stash_values (loid) SELECT lo_import(:'filename');
 BEGIN;
 UPDATE lotest_stash_values SET fd=lo_open(loid, CAST(x'20000' | x'40000' AS integer));
 -- verify length of large object
@@ -390,16 +391,18 @@ SELECT lo_close(fd) FROM lotest_stash_values;
 (1 row)

 END;
-SELECT lo_export(loid, '@abs_builddir@/results/lotest.txt') FROM lotest_stash_values;
+\set filename :ABS_BUILDDIR '/results/lotest.txt'
+SELECT lo_export(loid, :'filename') FROM lotest_stash_values;
  lo_export
 -----------
          1
 (1 row)

-\lo_import '@abs_builddir@/results/lotest.txt'
+\lo_import :filename
 \set newloid :LASTOID
 -- just make sure \lo_export does not barf
-\lo_export :newloid '@abs_builddir@/results/lotest2.txt'
+\set filename :ABS_BUILDDIR '/results/lotest2.txt'
+\lo_export :newloid :filename
 -- This is a hack to test that export/import are reversible
 -- This uses knowledge about the inner workings of large object mechanism
 -- which should not be used outside it.  This makes it a HACK
@@ -418,7 +421,8 @@ SELECT lo_unlink(loid) FROM lotest_stash_values;

 TRUNCATE lotest_stash_values;
 \lo_unlink :newloid
-\lo_import '@abs_builddir@/results/lotest.txt'
+\set filename :ABS_BUILDDIR '/results/lotest.txt'
+\lo_import :filename
 \set newloid_1 :LASTOID
 SELECT lo_from_bytea(0, lo_get(:newloid_1)) AS newloid_2
 \gset
diff --git a/src/test/regress/output/misc.source b/src/test/regress/output/misc.source
index b9595cc239..f6da489185 100644
--- a/src/test/regress/output/misc.source
+++ b/src/test/regress/output/misc.source
@@ -1,6 +1,9 @@
 --
 -- MISC
 --
+-- directory paths are passed to us in environment variables
+\getenv ABS_SRCDIR ABS_SRCDIR
+\getenv ABS_BUILDDIR ABS_BUILDDIR
 --
 -- BTREE
 --
@@ -41,9 +44,10 @@ DROP TABLE tmp;
 --
 -- copy
 --
-COPY onek TO '@abs_builddir@/results/onek.data';
+\set filename :ABS_BUILDDIR '/results/onek.data'
+COPY onek TO :'filename';
 DELETE FROM onek;
-COPY onek FROM '@abs_builddir@/results/onek.data';
+COPY onek FROM :'filename';
 SELECT unique1 FROM onek WHERE unique1 < 2 ORDER BY unique1;
  unique1
 ---------
@@ -52,7 +56,7 @@ SELECT unique1 FROM onek WHERE unique1 < 2 ORDER BY unique1;
 (2 rows)

 DELETE FROM onek2;
-COPY onek2 FROM '@abs_builddir@/results/onek.data';
+COPY onek2 FROM :'filename';
 SELECT unique1 FROM onek2 WHERE unique1 < 2 ORDER BY unique1;
  unique1
 ---------
@@ -60,9 +64,10 @@ SELECT unique1 FROM onek2 WHERE unique1 < 2 ORDER BY unique1;
        1
 (2 rows)

-COPY BINARY stud_emp TO '@abs_builddir@/results/stud_emp.data';
+\set filename :ABS_BUILDDIR '/results/stud_emp.data'
+COPY BINARY stud_emp TO :'filename';
 DELETE FROM stud_emp;
-COPY BINARY stud_emp FROM '@abs_builddir@/results/stud_emp.data';
+COPY BINARY stud_emp FROM :'filename';
 SELECT * FROM stud_emp;
  name  | age |  location  | salary | manager | gpa | percent
 -------+-----+------------+--------+---------+-----+---------
diff --git a/src/test/regress/output/tablespace.source b/src/test/regress/output/tablespace.source
index 1bbe7e0323..54e7018574 100644
--- a/src/test/regress/output/tablespace.source
+++ b/src/test/regress/output/tablespace.source
@@ -1,7 +1,10 @@
+-- directory paths are passed to us in environment variables
+\getenv ABS_BUILDDIR ABS_BUILDDIR
+\set testtablespace :ABS_BUILDDIR '/testtablespace'
 -- create a tablespace using WITH clause
-CREATE TABLESPACE regress_tblspacewith LOCATION '@testtablespace@' WITH (some_nonexistent_parameter = true); -- fail
+CREATE TABLESPACE regress_tblspacewith LOCATION :'testtablespace' WITH (some_nonexistent_parameter = true); -- fail
 ERROR:  unrecognized parameter "some_nonexistent_parameter"
-CREATE TABLESPACE regress_tblspacewith LOCATION '@testtablespace@' WITH (random_page_cost = 3.0); -- ok
+CREATE TABLESPACE regress_tblspacewith LOCATION :'testtablespace' WITH (random_page_cost = 3.0); -- ok
 -- check to see the parameter was used
 SELECT spcoptions FROM pg_tablespace WHERE spcname = 'regress_tblspacewith';
        spcoptions
@@ -12,7 +15,7 @@ SELECT spcoptions FROM pg_tablespace WHERE spcname = 'regress_tblspacewith';
 -- drop the tablespace so we can re-use the location
 DROP TABLESPACE regress_tblspacewith;
 -- create a tablespace we can use
-CREATE TABLESPACE regress_tblspace LOCATION '@testtablespace@';
+CREATE TABLESPACE regress_tblspace LOCATION :'testtablespace';
 -- try setting and resetting some properties for the new tablespace
 ALTER TABLESPACE regress_tblspace SET (random_page_cost = 1.0, seq_page_cost = 1.1);
 ALTER TABLESPACE regress_tblspace SET (some_nonexistent_parameter = true);  -- fail
diff --git a/src/test/regress/pg_regress.c b/src/test/regress/pg_regress.c
index 2c8a600bad..b19b39aa09 100644
--- a/src/test/regress/pg_regress.c
+++ b/src/test/regress/pg_regress.c
@@ -746,6 +746,14 @@ initialize_environment(void)
      */
     setenv("PGAPPNAME", "pg_regress", 1);

+    /*
+     * Set variables that the test scripts may need to refer to.
+     */
+    setenv("ABS_SRCDIR", inputdir, 1);
+    setenv("ABS_BUILDDIR", outputdir, 1);
+    setenv("LIBDIR", dlpath, 1);
+    setenv("DLSUFFIX", DLSUFFIX, 1);
+
     if (nolocale)
     {
         /*
diff --git a/contrib/dblink/output/paths.source b/contrib/dblink/expected/paths.out
similarity index 100%
rename from contrib/dblink/output/paths.source
rename to contrib/dblink/expected/paths.out
diff --git a/contrib/dblink/input/paths.source b/contrib/dblink/sql/paths.sql
similarity index 100%
rename from contrib/dblink/input/paths.source
rename to contrib/dblink/sql/paths.sql
diff --git a/contrib/file_fdw/output/file_fdw.source b/contrib/file_fdw/expected/file_fdw.out
similarity index 100%
rename from contrib/file_fdw/output/file_fdw.source
rename to contrib/file_fdw/expected/file_fdw.out
diff --git a/contrib/file_fdw/input/file_fdw.source b/contrib/file_fdw/sql/file_fdw.sql
similarity index 100%
rename from contrib/file_fdw/input/file_fdw.source
rename to contrib/file_fdw/sql/file_fdw.sql
diff --git a/src/pl/plpgsql/src/output/plpgsql_copy.source b/src/pl/plpgsql/src/expected/plpgsql_copy.out
similarity index 100%
rename from src/pl/plpgsql/src/output/plpgsql_copy.source
rename to src/pl/plpgsql/src/expected/plpgsql_copy.out
diff --git a/src/pl/plpgsql/src/input/plpgsql_copy.source b/src/pl/plpgsql/src/sql/plpgsql_copy.sql
similarity index 100%
rename from src/pl/plpgsql/src/input/plpgsql_copy.source
rename to src/pl/plpgsql/src/sql/plpgsql_copy.sql
diff --git a/src/test/regress/output/constraints.source b/src/test/regress/expected/constraints.out
similarity index 100%
rename from src/test/regress/output/constraints.source
rename to src/test/regress/expected/constraints.out
diff --git a/src/test/regress/output/copy.source b/src/test/regress/expected/copy.out
similarity index 100%
rename from src/test/regress/output/copy.source
rename to src/test/regress/expected/copy.out
diff --git a/src/test/regress/output/create_function_0.source b/src/test/regress/expected/create_function_0.out
similarity index 100%
rename from src/test/regress/output/create_function_0.source
rename to src/test/regress/expected/create_function_0.out
diff --git a/src/test/regress/output/create_function_1.source b/src/test/regress/expected/create_function_1.out
similarity index 100%
rename from src/test/regress/output/create_function_1.source
rename to src/test/regress/expected/create_function_1.out
diff --git a/src/test/regress/output/create_function_2.source b/src/test/regress/expected/create_function_2.out
similarity index 100%
rename from src/test/regress/output/create_function_2.source
rename to src/test/regress/expected/create_function_2.out
diff --git a/src/test/regress/output/largeobject.source b/src/test/regress/expected/largeobject.out
similarity index 100%
rename from src/test/regress/output/largeobject.source
rename to src/test/regress/expected/largeobject.out
diff --git a/src/test/regress/output/largeobject_1.source b/src/test/regress/expected/largeobject_1.out
similarity index 100%
rename from src/test/regress/output/largeobject_1.source
rename to src/test/regress/expected/largeobject_1.out
diff --git a/src/test/regress/output/misc.source b/src/test/regress/expected/misc.out
similarity index 100%
rename from src/test/regress/output/misc.source
rename to src/test/regress/expected/misc.out
diff --git a/src/test/regress/output/tablespace.source b/src/test/regress/expected/tablespace.out
similarity index 100%
rename from src/test/regress/output/tablespace.source
rename to src/test/regress/expected/tablespace.out
diff --git a/src/test/regress/input/constraints.source b/src/test/regress/sql/constraints.sql
similarity index 100%
rename from src/test/regress/input/constraints.source
rename to src/test/regress/sql/constraints.sql
diff --git a/src/test/regress/input/copy.source b/src/test/regress/sql/copy.sql
similarity index 100%
rename from src/test/regress/input/copy.source
rename to src/test/regress/sql/copy.sql
diff --git a/src/test/regress/input/create_function_0.source b/src/test/regress/sql/create_function_0.sql
similarity index 100%
rename from src/test/regress/input/create_function_0.source
rename to src/test/regress/sql/create_function_0.sql
diff --git a/src/test/regress/input/create_function_1.source b/src/test/regress/sql/create_function_1.sql
similarity index 100%
rename from src/test/regress/input/create_function_1.source
rename to src/test/regress/sql/create_function_1.sql
diff --git a/src/test/regress/input/create_function_2.source b/src/test/regress/sql/create_function_2.sql
similarity index 100%
rename from src/test/regress/input/create_function_2.source
rename to src/test/regress/sql/create_function_2.sql
diff --git a/src/test/regress/input/largeobject.source b/src/test/regress/sql/largeobject.sql
similarity index 100%
rename from src/test/regress/input/largeobject.source
rename to src/test/regress/sql/largeobject.sql
diff --git a/src/test/regress/input/misc.source b/src/test/regress/sql/misc.sql
similarity index 100%
rename from src/test/regress/input/misc.source
rename to src/test/regress/sql/misc.sql
diff --git a/src/test/regress/input/tablespace.source b/src/test/regress/sql/tablespace.sql
similarity index 100%
rename from src/test/regress/input/tablespace.source
rename to src/test/regress/sql/tablespace.sql
diff --git a/contrib/dblink/Makefile b/contrib/dblink/Makefile
index b008c8c4c4..6bb3ece38c 100644
--- a/contrib/dblink/Makefile
+++ b/contrib/dblink/Makefile
@@ -13,7 +13,6 @@ PGFILEDESC = "dblink - connect to other PostgreSQL databases"

 REGRESS = paths dblink
 REGRESS_OPTS = --dlpath=$(top_builddir)/src/test/regress
-EXTRA_CLEAN = sql/paths.sql expected/paths.out

 ifdef USE_PGXS
 PG_CONFIG = pg_config
diff --git a/contrib/dblink/expected/.gitignore b/contrib/dblink/expected/.gitignore
deleted file mode 100644
index d9c7942c64..0000000000
--- a/contrib/dblink/expected/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-/paths.out
diff --git a/contrib/dblink/sql/.gitignore b/contrib/dblink/sql/.gitignore
deleted file mode 100644
index d17507846d..0000000000
--- a/contrib/dblink/sql/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-/paths.sql
diff --git a/contrib/file_fdw/Makefile b/contrib/file_fdw/Makefile
index 4da9f2d697..885459d3c1 100644
--- a/contrib/file_fdw/Makefile
+++ b/contrib/file_fdw/Makefile
@@ -8,8 +8,6 @@ PGFILEDESC = "file_fdw - foreign data wrapper for files"

 REGRESS = file_fdw

-EXTRA_CLEAN = sql/file_fdw.sql expected/file_fdw.out
-
 ifdef USE_PGXS
 PG_CONFIG = pg_config
 PGXS := $(shell $(PG_CONFIG) --pgxs)
diff --git a/contrib/file_fdw/expected/.gitignore b/contrib/file_fdw/expected/.gitignore
deleted file mode 100644
index a464ad144f..0000000000
--- a/contrib/file_fdw/expected/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-/file_fdw.out
diff --git a/contrib/file_fdw/sql/.gitignore b/contrib/file_fdw/sql/.gitignore
deleted file mode 100644
index ebf16fed94..0000000000
--- a/contrib/file_fdw/sql/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-/file_fdw.sql
diff --git a/src/interfaces/ecpg/test/pg_regress_ecpg.c b/src/interfaces/ecpg/test/pg_regress_ecpg.c
index 15f588a802..9465ba7845 100644
--- a/src/interfaces/ecpg/test/pg_regress_ecpg.c
+++ b/src/interfaces/ecpg/test/pg_regress_ecpg.c
@@ -166,9 +166,14 @@ ecpg_start_test(const char *testname,
     snprintf(inprg, sizeof(inprg), "%s/%s", inputdir, testname);
     snprintf(insource, sizeof(insource), "%s.c", testname);

+    /* make a version of the test name that has dashes in place of slashes */
     initStringInfo(&testname_dash);
     appendStringInfoString(&testname_dash, testname);
-    replace_string(&testname_dash, "/", "-");
+    for (char *c = testname_dash.data; *c != '\0'; c++)
+    {
+        if (*c == '/')
+            *c = '-';
+    }

     snprintf(expectfile_stdout, sizeof(expectfile_stdout),
              "%s/expected/%s.stdout",
diff --git a/src/pl/plpgsql/src/Makefile b/src/pl/plpgsql/src/Makefile
index 9946abbc1d..f7eb42d54f 100644
--- a/src/pl/plpgsql/src/Makefile
+++ b/src/pl/plpgsql/src/Makefile
@@ -41,11 +41,6 @@ TOOLSDIR = $(top_srcdir)/src/tools
 GEN_KEYWORDLIST = $(PERL) -I $(TOOLSDIR) $(TOOLSDIR)/gen_keywordlist.pl
 GEN_KEYWORDLIST_DEPS = $(TOOLSDIR)/gen_keywordlist.pl $(TOOLSDIR)/PerfectHash.pm

-# Test input and expected files.  These are created by pg_regress itself, so we
-# don't have a rule to create them.  We do need rules to clean them however.
-input_files = $(patsubst $(srcdir)/input/%.source,sql/%.sql, $(wildcard $(srcdir)/input/*.source))
-output_files := $(patsubst $(srcdir)/output/%.source,expected/%.out, $(wildcard $(srcdir)/output/*.source))
-
 all: all-lib

 # Shared library stuff
@@ -116,7 +111,6 @@ distprep: pl_gram.h pl_gram.c plerrcodes.h pl_reserved_kwlist_d.h pl_unreserved_
 # are not cleaned here.
 clean distclean: clean-lib
     rm -f $(OBJS)
-    rm -f $(output_files) $(input_files)
     rm -rf $(pg_regress_clean_files)

 maintainer-clean: distclean
diff --git a/src/pl/plpgsql/src/expected/.gitignore b/src/pl/plpgsql/src/expected/.gitignore
deleted file mode 100644
index 13e5918721..0000000000
--- a/src/pl/plpgsql/src/expected/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-/plpgsql_copy.out
diff --git a/src/pl/plpgsql/src/sql/.gitignore b/src/pl/plpgsql/src/sql/.gitignore
deleted file mode 100644
index 210bee188e..0000000000
--- a/src/pl/plpgsql/src/sql/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-/plpgsql_copy.sql
diff --git a/src/test/regress/GNUmakefile b/src/test/regress/GNUmakefile
index fe6e0c98aa..330eca2b83 100644
--- a/src/test/regress/GNUmakefile
+++ b/src/test/regress/GNUmakefile
@@ -69,19 +69,12 @@ all: all-lib
 # Ensure parallel safety if a build is started in this directory
 $(OBJS): | submake-libpgport submake-generated-headers

-# Test input and expected files.  These are created by pg_regress itself, so we
-# don't have a rule to create them.  We do need rules to clean them however.
-input_files = $(patsubst $(srcdir)/input/%.source,sql/%.sql, $(wildcard $(srcdir)/input/*.source))
-output_files := $(patsubst $(srcdir)/output/%.source,expected/%.out, $(wildcard $(srcdir)/output/*.source))
-

 # not installed by default

 regress_data_files = \
-    $(filter-out $(addprefix $(srcdir)/,$(output_files)),$(wildcard $(srcdir)/expected/*.out)) \
-    $(wildcard $(srcdir)/input/*.source) \
-    $(wildcard $(srcdir)/output/*.source) \
-    $(filter-out $(addprefix $(srcdir)/,$(input_files)),$(wildcard $(srcdir)/sql/*.sql)) \
+    $(wildcard $(srcdir)/sql/*.sql) \
+    $(wildcard $(srcdir)/expected/*.out) \
     $(wildcard $(srcdir)/data/*.data) \
     $(srcdir)/parallel_schedule $(srcdir)/resultmap

@@ -162,6 +155,5 @@ clean distclean maintainer-clean: clean-lib
     rm -f $(OBJS) refint$(DLSUFFIX) autoinc$(DLSUFFIX)
     rm -f pg_regress_main.o pg_regress.o pg_regress$(X)
 # things created by various check targets
-    rm -f $(output_files) $(input_files)
     rm -rf testtablespace
     rm -rf $(pg_regress_clean_files)
diff --git a/src/test/regress/expected/.gitignore b/src/test/regress/expected/.gitignore
deleted file mode 100644
index b99caf5f40..0000000000
--- a/src/test/regress/expected/.gitignore
+++ /dev/null
@@ -1,10 +0,0 @@
-/constraints.out
-/copy.out
-/create_function_0.out
-/create_function_1.out
-/create_function_2.out
-/largeobject.out
-/largeobject_1.out
-/misc.out
-/security_label.out
-/tablespace.out
diff --git a/src/test/regress/pg_regress.c b/src/test/regress/pg_regress.c
index b19b39aa09..6d6b686a22 100644
--- a/src/test/regress/pg_regress.c
+++ b/src/test/regress/pg_regress.c
@@ -438,155 +438,6 @@ string_matches_pattern(const char *str, const char *pattern)
     return false;
 }

-/*
- * Replace all occurrences of "replace" in "string" with "replacement".
- * The StringInfo will be suitably enlarged if necessary.
- *
- * Note: this is optimized on the assumption that most calls will find
- * no more than one occurrence of "replace", and quite likely none.
- */
-void
-replace_string(StringInfo string, const char *replace, const char *replacement)
-{
-    int            pos = 0;
-    char       *ptr;
-
-    while ((ptr = strstr(string->data + pos, replace)) != NULL)
-    {
-        /* Must copy the remainder of the string out of the StringInfo */
-        char       *suffix = pg_strdup(ptr + strlen(replace));
-
-        /* Truncate StringInfo at start of found string ... */
-        string->len = ptr - string->data;
-        /* ... and append the replacement (this restores the trailing '\0') */
-        appendStringInfoString(string, replacement);
-        /* Next search should start after the replacement */
-        pos = string->len;
-        /* Put back the remainder of the string */
-        appendStringInfoString(string, suffix);
-        free(suffix);
-    }
-}
-
-/*
- * Convert *.source found in the "source" directory, replacing certain tokens
- * in the file contents with their intended values, and put the resulting files
- * in the "dest" directory, replacing the ".source" prefix in their names with
- * the given suffix.
- */
-static void
-convert_sourcefiles_in(const char *source_subdir, const char *dest_dir, const char *dest_subdir, const char *suffix)
-{
-    char        testtablespace[MAXPGPATH];
-    char        indir[MAXPGPATH];
-    char        outdir_sub[MAXPGPATH];
-    char      **name;
-    char      **names;
-    int            count = 0;
-
-    snprintf(indir, MAXPGPATH, "%s/%s", inputdir, source_subdir);
-
-    /* Check that indir actually exists and is a directory */
-    if (!directory_exists(indir))
-    {
-        /*
-         * No warning, to avoid noise in tests that do not have these
-         * directories; for example, ecpg, contrib and src/pl.
-         */
-        return;
-    }
-
-    names = pgfnames(indir);
-    if (!names)
-        /* Error logged in pgfnames */
-        exit(2);
-
-    /* Create the "dest" subdirectory if not present */
-    snprintf(outdir_sub, MAXPGPATH, "%s/%s", dest_dir, dest_subdir);
-    if (!directory_exists(outdir_sub))
-        make_directory(outdir_sub);
-
-    /* We might need to replace @testtablespace@ */
-    snprintf(testtablespace, MAXPGPATH, "%s/testtablespace", outputdir);
-
-    /* finally loop on each file and do the replacement */
-    for (name = names; *name; name++)
-    {
-        char        srcfile[MAXPGPATH];
-        char        destfile[MAXPGPATH];
-        char        prefix[MAXPGPATH];
-        FILE       *infile,
-                   *outfile;
-        StringInfoData line;
-
-        /* reject filenames not finishing in ".source" */
-        if (strlen(*name) < 8)
-            continue;
-        if (strcmp(*name + strlen(*name) - 7, ".source") != 0)
-            continue;
-
-        count++;
-
-        /* build the full actual paths to open */
-        snprintf(prefix, strlen(*name) - 6, "%s", *name);
-        snprintf(srcfile, MAXPGPATH, "%s/%s", indir, *name);
-        snprintf(destfile, MAXPGPATH, "%s/%s/%s.%s", dest_dir, dest_subdir,
-                 prefix, suffix);
-
-        infile = fopen(srcfile, "r");
-        if (!infile)
-        {
-            fprintf(stderr, _("%s: could not open file \"%s\" for reading: %s\n"),
-                    progname, srcfile, strerror(errno));
-            exit(2);
-        }
-        outfile = fopen(destfile, "w");
-        if (!outfile)
-        {
-            fprintf(stderr, _("%s: could not open file \"%s\" for writing: %s\n"),
-                    progname, destfile, strerror(errno));
-            exit(2);
-        }
-
-        initStringInfo(&line);
-
-        while (pg_get_line_buf(infile, &line))
-        {
-            replace_string(&line, "@abs_srcdir@", inputdir);
-            replace_string(&line, "@abs_builddir@", outputdir);
-            replace_string(&line, "@testtablespace@", testtablespace);
-            replace_string(&line, "@libdir@", dlpath);
-            replace_string(&line, "@DLSUFFIX@", DLSUFFIX);
-            fputs(line.data, outfile);
-        }
-
-        pfree(line.data);
-        fclose(infile);
-        fclose(outfile);
-    }
-
-    /*
-     * If we didn't process any files, complain because it probably means
-     * somebody neglected to pass the needed --inputdir argument.
-     */
-    if (count <= 0)
-    {
-        fprintf(stderr, _("%s: no *.source files found in \"%s\"\n"),
-                progname, indir);
-        exit(2);
-    }
-
-    pgfnames_cleanup(names);
-}
-
-/* Create the .sql and .out files from the .source files, if any */
-static void
-convert_sourcefiles(void)
-{
-    convert_sourcefiles_in("input", outputdir, "sql", "sql");
-    convert_sourcefiles_in("output", outputdir, "expected", "out");
-}
-
 /*
  * Clean out the test tablespace dir, or create it if it doesn't exist.
  *
@@ -936,7 +787,6 @@ initialize_environment(void)
             printf(_("(using postmaster on Unix socket, default port)\n"));
     }

-    convert_sourcefiles();
     load_resultmap();
 }

diff --git a/src/test/regress/pg_regress.h b/src/test/regress/pg_regress.h
index c6d015c840..ad91dfb858 100644
--- a/src/test/regress/pg_regress.h
+++ b/src/test/regress/pg_regress.h
@@ -65,6 +65,4 @@ int            regression_main(int argc, char *argv[],

 void        add_stringlist_item(_stringlist **listhead, const char *str);
 PID_TYPE    spawn_process(const char *cmdline);
-void        replace_string(struct StringInfoData *string,
-                           const char *replace, const char *replacement);
 bool        file_exists(const char *file);
diff --git a/src/test/regress/sql/.gitignore b/src/test/regress/sql/.gitignore
deleted file mode 100644
index fe14af6ae7..0000000000
--- a/src/test/regress/sql/.gitignore
+++ /dev/null
@@ -1,9 +0,0 @@
-/constraints.sql
-/copy.sql
-/create_function_0.sql
-/create_function_1.sql
-/create_function_2.sql
-/largeobject.sql
-/misc.sql
-/security_label.sql
-/tablespace.sql
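
For anyone skimming the patch rather than applying it, the quoting pattern used throughout can be summarized as a standalone psql sketch. This is not part of the patch; the OID and path below are placeholders, and the rule is the one stated above: one \set per level of quoting.

```sql
-- Fetch a directory path exported by pg_regress into a psql variable
\getenv ABS_BUILDDIR ABS_BUILDDIR

-- Level 0: plain \set concatenation of variables and literals builds the path
\set filename :ABS_BUILDDIR '/results/example.txt'

-- Level 1: :'filename' interpolates the value as a correctly quoted SQL
-- literal, so quotes or backslashes in the path are escaped automatically
SELECT lo_export(1234, :'filename');

-- Level 2: to embed that literal inside a larger string (e.g. a DO body),
-- use another \set, again letting :'filename' handle the quoting
\set dobody 'BEGIN PERFORM lo_export(1234, ' :'filename' '); END'
DO :'dobody';
```

Each additional level of nesting gets its own \set, so the escaping is always done by psql's variable interpolation rather than by hand.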
