Discussion: Parametised query wraps everything in single quotes?


Parametised query wraps everything in single quotes?

From: Victor Hooi
Date:
Hi,

I'm attempting to pass parameters through to a COPY statement that's running against Amazon Redshift.

The statement looks a bit like this:

    COPY my_table (
        my_field_1,
        my_field_2,
        my_field_3
    )
    FROM 's3://my_bucket/my_filename'
    CREDENTIALS 'aws_access_key=my_key;aws_secret_key=my_secret_key'
    ...

where I'm passing in values for my_bucket, my_filename, etc.

Originally, I tried using Psycopg2's parameterised queries; however, this wrapped each value in single quotes, so the final result was something like:

    FROM 's3://'my_bucket'/'my_filename''

which obviously doesn't work, since the actual bucket name doesn't contain quotes.

From reading this:


I get the impression that this is a known limitation of Psycopg2, and there isn't any way around it.

So I'm using Python's .format() method to insert the values into the SQL statement now.

However, I'm wondering whether there's another way around this: one where I can still pass values through to the SQL statement without being exposed to SQL injection attacks?
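One way to keep parameterisation, sketched under the assumption that Redshift accepts a bound parameter in the FROM clause (worth verifying): build the full S3 URL in Python first, so psycopg2 renders it as a single quoted literal, rather than placing placeholders inside a quoted string and getting the nested quotes shown above. Table and column names here are illustrative.

```python
def build_copy_statement(bucket, filename):
    """Build the COPY statement plus its single-parameter S3 URL.

    psycopg2 renders each %s placeholder as one complete quoted literal,
    so putting %s *inside* an SQL string literal ('s3://%s/%s') produces
    nested quotes; binding the whole URL as one value avoids that.
    """
    s3_url = "s3://{0}/{1}".format(bucket, filename)
    sql = ("COPY my_table (my_field_1, my_field_2, my_field_3) "
           "FROM %s CREDENTIALS %s")
    return sql, s3_url

# With a live cursor you would then run:
#   sql, s3_url = build_copy_statement("my_bucket", "my_filename")
#   cur.execute(sql, (s3_url, credentials))
```

If Redshift rejects a bound parameter in that position, properly escaped string interpolation is the fallback.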

Cheers,
Victor

Re: Parametised query wraps everything in single quotes?

From: David Niergarth
Date:
This will work in a pinch for simple values that you want to add to a query via .format() or string interpolation:

    import psycopg2.extensions  # QuotedString lives here

    escape = lambda value: psycopg2.extensions.QuotedString(str(value)).getquoted()

Manually wrap the values you're concerned about in escape(), as in

    sql_query.format(my_filename=escape(my_filename))
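To show roughly what escape() produces: SQL string literals represent an embedded single quote by doubling it, which is the core of what QuotedString does. The sketch below, with a hypothetical naive_sql_quote helper, is an illustration only; the real QuotedString also handles backslashes and connection encodings, so prefer it in real code. Note also that on Python 3, getquoted() returns bytes, so you may need to .decode() the result before passing it to .format().

```python
def naive_sql_quote(value):
    # Illustration only: double embedded single quotes and wrap the
    # result in quotes, which is the essence of QuotedString's output.
    # Use psycopg2's QuotedString (as above) in real code.
    return "'" + str(value).replace("'", "''") + "'"

# A value containing a quote survives intact:
print(naive_sql_quote("o'brien.csv"))  # 'o''brien.csv'
```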

--David



On Mon, Nov 11, 2013 at 1:36 AM, Victor Hooi <victorhooi@yahoo.com> wrote: