Re: Proposal: Adding json logging
From: David Arnold
Subject: Re: Proposal: Adding json logging
Date:
Msg-id: CAH6vsWKScK=vhStOmL1r1wBKayyw5LmWByMeAS30iRWV1XjYTQ@mail.gmail.com
In response to: Re: Proposal: Adding json logging (David Arnold <dar@xoe.solutions>)
List: pgsql-hackers
> Why? The newlines aren't meaningfully different from other characters
> you need to parse? The data isn't actually stored in a newline separated
> fashion, that's just one byte with that meaning.
I may be missing some details, but I believe stdout is usually parsed and streamed simply line by line. Like in:
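A minimal sketch of what that line-by-line consumption implies; the sample entry is illustrative and hypothetical, not verbatim PostgreSQL output:

```python
import io

# Hypothetical multi-line PostgreSQL-style stderr entry; the wrapped
# STATEMENT spills onto a second physical line.
raw = (
    'ERROR:  relation "foo" does not exist\n'
    "STATEMENT:  SELECT *\n"
    "\tFROM foo;\n"
)

# A line-by-line streamer emits one event per newline, so this single
# logical entry is split into three unrelated events.
events = [line.rstrip("\n") for line in io.StringIO(raw)]
print(len(events))  # 3 events for 1 logical log entry
```

The continuation line (`FROM foo;`) arrives as an event with no level, timestamp, or context of its own, which is exactly what makes it hard to reassemble downstream.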
On Sun, Apr 15, 2018 at 13:08, David Arnold (<dar@xoe.solutions>) wrote:
>This would appear to solve multiline issues within Fluent.....
>https://docs.fluentd.org/v0.12/articles/parser_multilineI definitely looked at that, but what guarantees do I have that the sequence is always ERROR/STATEMENT/DETAIL? And not the other way round?And it only works with tail logging from log file so I cannot use a native docker logging driver which streams event by event.This again prohibits me the usage of host-global docker logging driver configuration as my standard option for host provisioning.>Is the issue that there are line breaks in things like lines 7-9?No way to parse that with a line by line regex cleanly. Fluent-multi line is no real multi line regex, it's just some logic to emulate multi line regex. I believe the reason might be that true multi line regex would be way to resource demanding to run intercalatingly on a moving set of lines of unknown cardinality.--El dom., 15 abr. 2018 a las 13:00, David Arnold (<dar@xoe.solutions>) escribió:>It looks like the thread skipped over the problem space for the solution space pretty fastOK, I apologize, it seemed to me from the feedback that the problem was already uncontested. To verify/falsify that was the objective of my previous mail :)>Can you elaborate?Sure.CSV Logs: https://pastebin.com/uwfmRdU7CSV shows line breaks, STDOUT shows ERROR/FATAL and detail on different lines, not an easy problem to stream-parse reliably (without some kind of a buffer, etc)...--El dom., 15 abr. 2018 a las 12:46, Christophe Pettus (<xof@thebuild.com>) escribió:
> On Apr 15, 2018, at 10:39, David Arnold <dar@xoe.solutions> wrote:
>
> In light of the specific use case / problem that gave rise to this thread, what exactly would you suggest?
It looks like the thread skipped over the problem space for the solution space pretty fast; I see your note:
> I have reviewed some log samples and all DO contain some kind of multi line logs which are very uncomfortable to parse reliably in a log streamer.
... but I don't see any actual examples of those. Can you elaborate?
--
-- Christophe Pettus
xof@thebuild.com
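For what it's worth, the buffering that Fluentd's multiline parser emulates can be sketched as follows; the sample lines and the continuation-tag list are illustrative assumptions, not actual server output or Fluentd code:

```python
import re

# Hypothetical sample mirroring the thread's concern: ERROR on one line,
# DETAIL/STATEMENT continuations on the following lines.
lines = [
    'ERROR:  duplicate key value violates unique constraint "t_pkey"',
    'DETAIL:  Key (id)=(1) already exists.',
    'STATEMENT:  INSERT INTO t VALUES (1);',
    'LOG:  checkpoint starting: time',
]

# A "firstline" heuristic in the spirit of format_firstline: a new record
# starts on any line NOT beginning with a continuation tag.
continuation = re.compile(r'^(DETAIL|HINT|STATEMENT|CONTEXT|QUERY):')

records, buf = [], []
for line in lines:
    if buf and continuation.match(line):
        buf.append(line)   # glue continuation onto the current record
    else:
        if buf:
            records.append("\n".join(buf))
        buf = [line]       # start a new record
if buf:
    records.append("\n".join(buf))

print(len(records))  # 2: one three-line error entry, one one-line LOG entry
```

Note the fragility the thread points out: this only works if continuation lines always follow their ERROR line in a fixed set of tags; a continuation arriving first, or interleaved entries from concurrent backends, would be grouped wrongly.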
DAVID ARNOLD
General Manager | xoe.solutions
dar@xoe.solutions
+57 (315) 304 13 68

Confidentiality Note: This email may contain confidential and/or private information. If you received this email in error, please delete it and notify the sender. Environmental Consideration: Please avoid printing this email on paper, unless really necessary.