Re: Proposal: Adding json logging

From David Arnold
Subject Re: Proposal: Adding json logging
Date
Msg-id CAH6vsWLsmmafzLp=UBf-RxrJi2V8j7o1AFX5nJDvaZhY-vX-1w@mail.gmail.com
In response to Re: Proposal: Adding json logging  ("Daniel Verite" <daniel@manitou-mail.org>)
Responses Re: Proposal: Adding json logging  ("Daniel Verite" <daniel@manitou-mail.org>)
List pgsql-hackers
> In CSV a line break inside a field is easy to process for
> a parser, because (per https://tools.ietf.org/html/rfc4180):
> "Fields containing line breaks (CRLF), double quotes, and commas should be enclosed in double-quotes"

Interesting. Does that implicitly mean the whole log event would get transmitted as one "line" (terminated by CRLF) in CSV? I don't know how to confirm or falsify that...

In the affirmative scenario, this would then work for a true streaming aggregator (if CSV were supported).
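A quick way to check the "one record per event" behavior locally (my own sketch, not something from the thread; the log content is made up) is to feed an RFC 4180-style string with an embedded newline to a standard CSV parser and see whether it comes back as a single record:

```python
# Sketch: does a quoted field with an embedded newline still arrive as
# ONE record? Python's csv module implements RFC 4180-style quoting.
import csv
import io

# Hypothetical two-column log: a timestamp and a multi-line message.
log = 'ts,message\n"2018-04-16","multi\nline error"\n'

rows = list(csv.reader(io.StringIO(log)))
# rows[1] is one record of two fields; the newline stayed inside the
# "message" field instead of starting a new record.
print(rows)
```

If this holds for the CSV a server emits, a streaming consumer would indeed receive the whole multi-line event as one logical record.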

Wouldn't it still be resource-expensive if, for some reason, someone wants to keep tailing a log file, which as I understand it has line-by-line semantics?

I do not plan to do that myself, but there are people who prefer to also keep a local copy of their log file, just in case.

But if CSV, as emitted by postgres, were supported by fluentbit, I would be equally happy with that solution.

The second order problem, after all is just that: of second order...

On Mon, Apr 16, 2018, 1:28 PM, Daniel Verite <daniel@manitou-mail.org> wrote:
        David Arnold wrote:

> Not claiming this assumption does imply parsing of a *rolling* set
> of log lines with *previously unknown cardinality*. That's expensive
> on computing resources. I don't have actual numbers, but it doesn't
> seem too far-fetched, either.
> I filed a question to the author of fluent-bit to that extent which
> you can consult here:
> https://github.com/fluent/fluent-bit/issues/564 Let's see what
> Eduardo has to inform us about this...

fluent-bit does not appear to support CSV, as mentioned in
https://github.com/fluent/fluent-bit/issues/459
which got flagged as an enhancement request some time ago.

In CSV a line break inside a field is easy to process for
a parser, because (per https://tools.ietf.org/html/rfc4180):

  "Fields containing line breaks (CRLF), double quotes, and commas
    should be enclosed in double-quotes"

So there is no lookahead to do. In a character-by-character loop,
when encountering a line break, either the current field started with
a double quote and the line break is part of the content, or it did
not and the line break ends the current record.

What doesn't quite work is parsing CSV with a regex; this is
discussed in some detail here, for instance:
https://softwareengineering.stackexchange.com/questions/166454/can-the-csv-format-be-defined-by-a-regex
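To make the pitfall concrete (my own sketch, with made-up data): a naive newline split, which is effectively what a line-oriented regex tailer does, cuts a quoted field in half, while a real CSV parser keeps the record whole:

```python
# Sketch: naive line splitting vs. a CSV-aware parser on a record
# containing a quoted embedded newline.
import csv
import io

data = 'a,"x\ny",b\n'

naive = data.rstrip("\n").split("\n")          # breaks mid-field
proper = next(csv.reader(io.StringIO(data)))   # one record, three fields

print(naive)   # the quoted field is split across two bogus "lines"
print(proper)  # the newline survives inside the middle field
```

This is why a line-by-line consumer has to reassemble records with unknown cardinality, whereas a quote-aware parser never needs to look ahead.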


Best regards,
--
Daniel Vérité
PostgreSQL-powered mailer: http://www.manitou-mail.org
Twitter: @DanielVerite
