On Sun, Sep 15, 2019 at 09:47:52PM -0700, Andres Freund wrote:
> On 2019-09-15 15:14:50 -0700, Noah Misch wrote:
> > --- a/src/test/regress/regress.c
> > +++ b/src/test/regress/regress.c
> > @@ -670,6 +670,16 @@ test_atomic_flag(void)
> > pg_atomic_clear_flag(&flag);
> > }
> >
> > +#define EXPECT(result_expr, expected_expr) \
> > + do { \
> > + uint32 result = (result_expr); \
> > + uint32 expected = (expected_expr); \
> > + if (result != expected) \
> > + elog(ERROR, \
> > + "%s yielded %u, expected %s in file \"%s\" line %u", \
> > + #result_expr, result, #expected_expr, __FILE__, __LINE__); \
> > + } while (0)
> > +
> Unfortunately we can't easily make this type independent. The local
> variables are needed to avoid multiple evaluation. While we could infer
> their type using compiler-specific magic (__typeof__() or C++), we'd
> still need to print them. We could, however, remove the local variables
> and rely purely on stringification of the arguments for printing the
> error.
> I'd name it EXPECT_EQ_U32 or such, but otherwise I think this is a clear
> improvement.
EXPECT_EQ_U32 works for me; I mildly prefer that to a type-independent macro
that doesn't print the unexpected value. Attached.