Discussion: application for postgres Log
Hello and namaste
I am on a project that uses PostgreSQL v9.0. We are developing an application in C# to parse the PG server activity log on Windows 2003/XP or higher.
Our application will:
Scan the log for a given search text and report the rows found
Produce statistics on the SQL queries tracked in a time frame
Allow selection of the log columns that can be displayed in a DataGrid
I need some help, or even a simple link, related to this subject. I have already been searching Google for the Postgres log, though it mostly gives results from the Postgres documentation.
Thanks
arvind
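For the statistics requirement (queries tracked in a time frame), one possible starting point is bucketing parsed log timestamps by hour. This is a sketch in Python rather than C#, purely for illustration; the timestamp format is assumed to match the csvlog log_time field.

```python
from collections import Counter
from datetime import datetime

def queries_per_hour(timestamps):
    """Count tracked queries per one-hour bucket.

    `timestamps` are strings shaped like the csvlog log_time field
    ('2011-11-11 12:41:31.484 IST'); fractional seconds and the zone
    suffix are ignored by slicing off the first 19 characters.
    """
    buckets = Counter()
    for ts in timestamps:
        dt = datetime.strptime(ts[:19], "%Y-%m-%d %H:%M:%S")
        buckets[dt.strftime("%Y-%m-%d %H:00")] += 1
    return buckets

# usage: two entries in the 12:00 hour, one in the 13:00 hour
print(queries_per_hour(["2011-11-11 12:41:31.484 IST",
                        "2011-11-11 12:55:02.100 IST",
                        "2011-11-11 13:01:00.000 IST"]))
```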
On 09/09/12 11:19 PM, Arvind Singh wrote:
> I am in a project that uses PostGresSql v9.0. We are developing an
> application in CSharp to parse the PG server activity log installed on
> Windows 2003/XP or higher.
>
> Our application will :
> Scan the Log for a given search text and Post rows found
> Produce statistics related to SQL queries tracked in a time frame
> Will allow selection of log columns, that can be displayed in a DataGrid
>
> I need some help or even a simple link that is related to this
> subject. I am already checking Goog query of Postgres Log, though it
> gives mostly results for the postgres Documentation.

The log isn't in a column-oriented format, although you can define a custom log prefix of things like timestamp, database user, connection IP, database name, and process ID, and those could be treated as columns according to whatever separators you specify as part of your custom log prefix. Past that, the log is plain text.

By default only errors and a few key system events are logged (startup, shutdown), but you can increase the amount of logging via the various options in postgresql.conf, including logging all statements, all connections, etc.

Whatever you do, it's going to be a lot of plain-text parsing. If it were my job, the first thing I would do is generate some sample logs and look at them to see how to proceed.

--
john r pierce                            N 37, W 122
santa cruz ca                         mid-left coast
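John's idea of treating log_line_prefix fields as pseudo-columns can be sketched as follows. The prefix assumed here, '%t [%p] %u@%d ', and the sample line are hypothetical; the regular expression must be adapted to whatever prefix is actually configured in postgresql.conf.

```python
import re

# Assumes a hypothetical log_line_prefix = '%t [%p] %u@%d ' -- adjust the
# pattern to whatever prefix your postgresql.conf actually specifies.
LINE_RE = re.compile(
    r'^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} \S+) '
    r'\[(?P<pid>\d+)\] '
    r'(?P<user>\S+)@(?P<database>\S+) '
    r'(?P<level>[A-Z]+): (?P<message>.*)$'
)

def parse_line(line):
    """Return a dict of pseudo-columns, or None for continuation lines
    (multi-line messages carry the prefix only on their first line)."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

sample = "2011-11-11 12:41:31 IST [524] agent1@pem LOG: statement: SELECT 1"
print(parse_line(sample))
```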
Arvind Singh wrote:
> I am in a project that uses PostGresSql v9.0. We are developing an application in CSharp to parse the
> PG server activity log installed on Windows 2003/XP or higher.
>
> Our application will :
> Scan the Log for a given search text and Post rows found
> Produce statistics related to SQL queries tracked in a time frame
> Will allow selection of log columns, that can be displayed in a DataGrid
>
> I need some help or even a simple link that is related to this subject. I am already checking Goog
> query of Postgres Log, though it gives mostly results for the postgres Documentation.

Maybe pgreplay can give you some inspiration:
http://pgreplay.projects.postgresql.org/

Check out parse.c, which parses a log file.

One bit of advice I'd like to give you is a recommendation to use the CSV log format. The regular error log is much harder to parse, and moreover everything that gets printed to stderr by the server will end up there. This could be dumps when the server runs out of memory, or output that loaded modules produce. All this will screw up your log file parsing.

Yours,
Laurenz Albe
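Following the CSV advice, the "scan for a search text" requirement can be sketched with Python's csv module (Python used here only for brevity; the real application is C#). Using a proper CSV parser rather than splitting on commas matters because message fields can contain embedded commas and newlines. The message field index (13) follows the 9.0 csvlog column order in the documentation; verify it against your own files.

```python
import csv
import io

def grep_csvlog(csv_file, search_text):
    """Yield csvlog rows whose message field contains search_text.

    The csv module keeps quoted multi-line fields (e.g. QUERY STATISTICS
    output) together as one logical row. In the 9.0 csvlog column order,
    the message is field index 13.
    """
    for row in csv.reader(csv_file):
        if len(row) > 13 and search_text in row[13]:
            yield row

# usage with an in-memory sample (real code would open the .csv log file)
sample = io.StringIO(
    '2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",'
    '4ebccaa2.20c,754,"idle","2011-11-11 12:41:30 IST","2/308",0,LOG,00000,'
    '"statement: INSERT INTO pemdata.settings VALUES (1)",,,,,,,,'
    '"exec_simple_query",""\n'
)
for row in grep_csvlog(sample, "INSERT"):
    print(row[0], row[13])
```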
On 2012-09-10 08:19, Arvind Singh wrote:
> I need some help or even a simple link that is related to this subject

You might be interested in reading this:
http://bucardo.org/wiki/Tail_n_mail
Thanks for the reply,
I have enabled the csvlog option.
There seem to be 24 columns in the log. I have difficulty understanding the following columns:
- virtual_transaction_id
- transaction_id
- hint
- internal_query
- internal_query_pos
- context
I have correlated the column names with the log structure described in the documentation.
arvind
> Subject: RE: [GENERAL] application for postgres Log
> Date: Mon, 10 Sep 2012 09:31:29 +0200
> From: laurenz.albe@wien.gv.at
> To: arvindps@hotmail.com; pgsql-general@postgresql.org
>
> Arvind Singh wrote:
> > I am in a project that uses PostGresSql v9.0. We are developing an
> application in CSharp to parse the
> > PG server activity log installed on Windows 2003/XP or higher.
> >
> > Our application will :
> > Scan the Log for a given search text and Post rows found
> > Produce statistics related to SQL queries tracked in a time frame
> > Will allow selection of log columns, that can be displayed in a
> DataGrid
> >
> > I need some help or even a simple link that is related to this
> subject. I am already checking Goog
> > query of Postgres Log, though it gives mostly results for the postgres
> Documentation.
>
> Maybe pgreplay can give you some inspiration:
> http://pgreplay.projects.postgresql.org/
>
> Check out parse.c which parses a log file.
>
> One bit of advice I'd like to give you is a recommendation
> to use the CSV log format. The regular error log is much
> harder to parse, and moreover everything that gets printed
> to stderr by the server will end up there. This could be
> dumps when the server runs out of memory or stuff that loaded
> modules produce. All this will screw up your log file
> parsing.
>
> Yours,
> Laurenz Albe
Arvind Singh wrote:
> I have enabled the csvlog option.
>
> There seem to be 24 columns in the Log. I have difficulty in understanding the following columns
>
> - virtual_transaction_id

Such a transaction ID is given to every transaction. It can be used to track transactions in the log.

> - transaction_id

Such a transaction ID is given to a transaction that modifies the database.

> - hint

Part of the error message (like "detail" and "context").

> - internal_query
> - internal_query_pos

These are used a handful of times in the code; I'm not 100% sure what for. You may never encounter one of those.

> - context

Describes where the error occurred.

Yours,
Laurenz Albe
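For reference, the 9.0 documentation's CREATE TABLE postgres_log example lists the csvlog columns in order; zipping those names onto each parsed row gives named access. The list below is taken from that documentation (23 names in 9.0) and should be verified against your own log files.

```python
import csv
import io

# Column names as listed for the 9.0 csvlog format in the documentation
# (the CREATE TABLE postgres_log example); verify against your own files.
CSVLOG_COLUMNS = [
    "log_time", "user_name", "database_name", "process_id",
    "connection_from", "session_id", "session_line_num", "command_tag",
    "session_start_time", "virtual_transaction_id", "transaction_id",
    "error_severity", "sql_state_code", "message", "detail", "hint",
    "internal_query", "internal_query_pos", "context", "query",
    "query_pos", "location", "application_name",
]

def rows_as_dicts(csv_file):
    """Pair each csvlog field with its documented column name."""
    for row in csv.reader(csv_file):
        yield dict(zip(CSVLOG_COLUMNS, row))

# usage: a shortened sample row (real rows carry all 23 fields);
# zip() simply maps however many fields are present
sample = io.StringIO("2011-11-11 12:41:31.484 IST,agent1,pem,524\n")
print(next(rows_as_dicts(sample)))
```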
Thanks again,
I have come across a very peculiar situation.
We have a Postgres 9.0 installation. It was installed last year,
but we only started implementing on it recently,
hence the need to develop a log parser application.
During our preliminary parsing, what we discovered is just beyond the grasp of my knowledge.
It seems that during a certain period last year, in November, it created a session entry that holds more than
fifty thousand records for a SINGLE SESSION (4ebccaa2.20c). Yes, that is 5 with five zeros.
One does wonder about that dawn of November when such an occurrence took place.
Is it divine intervention? We never parsed the log in those days and therefore cannot ascertain the reasons for such verbosity.
It has never recurred, and luckily we had the csv option on during that period.
Where should I report such findings?
I have uploaded that part of the log at http://dl.dropbox.com/u/71964910/pg_log_with_lot_of_session.zip
arvind
------------------------------------------------------
The woods are lovely, dark, and deep,
But I have promises to keep,
And miles to go before I sleep,
And miles to go before I sleep.
(Robert Frost, Stopping by Woods on a Snowy Evening)
> Subject: RE: [GENERAL] application for postgres Log
> Date: Fri, 14 Sep 2012 13:16:33 +0200
> From: laurenz.albe@wien.gv.at
> To: arvindps@hotmail.com; pgsql-general@postgresql.org
>
> Arvind Singh wrote:
> > I have enabled the csvlog option.
> >
> > There seem to be 24 columns in the Log. I have difficulty in
> understanding the following columns
> >
> > - virtual_transaction_id
>
> Such a transaction ID is given to every transaction.
> Can be used to track transactions in the log.
>
> > - transaction_id
>
> Such a transaction ID is given to a transaction
> that modifies the database.
>
> > - hint
>
> Part of the error message (like "detail" and "context").
>
> > - internal_query
> > - internal_query_pos
>
> These are used a handful of times in the code, I'm not
> 100% sure what for.
> You may never encounter one of those.
>
> > - context
>
> Describes where the error occurred.
>
> Yours,
> Laurenz Albe
Arvind Singh wrote:
> I have come across a very peculiar situation.
>
> We have a postgres installation 9.0. It was installed last year.
> But we started implementation on it, just recently.
> And therefore the need to develop a Logparser application.
>
> During our preliminary parsing, what we discovered is just beyond the grasp of my knowledge.
> It seems that during a certain period last year in November, it created a Session entry that holds more
> than fifty thousand records for a SINGLE SESSION (4ebccaa2.20c). Yes that is 5 with five zeros
>
> One does look at wonder of the Dawn of November when such occurance holds place.
> Is it divine intervention, because we never parsed the log in those days and therefore cannot
> ascertain the reasons for such verbosity.

It would indeed be divine intervention if fifty thousand had five zeros.

Other than that, I don't see anything special about that. A session can last pretty long. Maybe you can solve your mystery by looking at the log entries. They should tell you what was going on.

> Although it never reoccurs and luckily we had csv option on during that period.
>
> Where should i report such findings

I don't think there is anything wrong. At least nothing database related.

> I have uploaded that Part of Log at http://dl.dropbox.com/u/71964910/pg_log_with_lot_of_session.zip

That looks like somebody turned on "log_statement_stats" for a spell.

Yours,
Laurenz Albe
Thanks, Laurenz.
But our CSV log contains a lot of statements like the following three lines. They appear exactly one after the other,
and they number in the thousands for a session (more than ten thousand):
2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,754,"idle",2011-11-11 12:41:30 IST,2/308,0,LOG,00000,"statement: INSERT INTO pemdata.settings (""name"", ""setting"", ""unit"", ""server_id"") VALUES ('xmlbinary', 'base64', NULL, '1')",,,,,,,,"exec_simple_query, .\src\backend\tcop\postgres.c:900",""
2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,755,"INSERT",2011-11-11 12:41:30 IST,2/0,0,LOG,00000,"duration: 0.000 ms",,,,,,,,"exec_simple_query, .\src\backend\tcop\postgres.c:1128",""
2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,756,"INSERT",2011-11-11 12:41:30 IST,2/0,0,LOG,00000,"QUERY STATISTICS","! system usage stats:
! 0.000000 elapsed 0.000000 user 0.000000 system sec
! [0.250000 user 0.156250 sys total]",,,,,"INSERT INTO pemdata.settings (""name"", ""setting"", ""unit"", ""server_id"") VALUES ('xmlbinary', 'base64', NULL, '1')",,"ShowUsage, .\src\backend\tcop\postgres.c:4305",""
Is there anything that we enabled? These entries don't appear after that particular session.
The log file is uploaded at http://dl.dropbox.com/u/71964910/pg_log_with_lot_of_session.zip
arvind ps
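The three repeated entry kinds can be told apart by their message prefixes; a rough sketch follows. The prefixes are taken from the sample lines above, not from any formal specification, so treat them as assumptions.

```python
def classify_entry(message):
    """Rough classification of the three repeated csvlog message kinds
    seen in the sample lines; prefixes inferred from those samples."""
    if message.startswith("statement: "):
        return "statement"   # produced when log_statement is active
    if message.startswith("duration: "):
        return "duration"    # produced by log_duration / log_min_duration_statement
    if message.startswith("QUERY STATISTICS"):
        return "stats"       # produced by log_statement_stats
    return "other"

print(classify_entry("statement: INSERT INTO pemdata.settings ..."))
print(classify_entry("duration: 0.000 ms"))
```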
But
Our CSV Log contains lot of statements like the following THREE lines. They appear exactly one after the other.
And they number in thousands for a Session (more than ten thousand)
2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,754,"idle",2011-11-11 12:41:30 IST,2/308,0,LOG,00000,"statement: INSERT INTO pemdata.settings (""name"", ""setting"", ""unit"", ""server_id"") VALUES ('xmlbinary', 'base64', NULL, '1')",,,,,,,,"exec_simple_query, .\src\backend\tcop\postgres.c:900",""
2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,755,"INSERT",2011-11-11 12:41:30 IST,2/0,0,LOG,00000,"duration: 0.000 ms",,,,,,,,"exec_simple_query, .\src\backend\tcop\postgres.c:1128",""
2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,756,"INSERT",2011-11-11 12:41:30 IST,2/0,0,LOG,00000,"QUERY STATISTICS","! system usage stats:
! 0.000000 elapsed 0.000000 user 0.000000 system sec
! [0.250000 user 0.156250 sys total]",,,,,"INSERT INTO pemdata.settings (""name"", ""setting"", ""unit"", ""server_id"") VALUES ('xmlbinary', 'base64', NULL, '1')",,"ShowUsage, .\src\backend\tcop\postgres.c:4305",""
Is there anything that we enabled, because they dont appear after that particular session.
The log file is uploaded at http://dl.dropbox.com/u/71964910/pg_log_with_lot_of_session.zip
arvind ps
> It would indeed be divine intervention if fifty thousand had
> five zeros.
>
> Other than that, I don't see anything special about that.
> A session can last pretty long.
> Maybe you can solve your mystery by looking at the log entries.
> They should tell you what was going on.
>
> > Although it never reoccurs and luckily we had csv option on during
> that period.
> >
> > Where should i report such findings
>
> I don't think there is anything wrong.
> At least nothing database related.
>
> > I have uploaded that Part of Log at
> http://dl.dropbox.com/u/71964910/pg_log_with_lot_of_session.zip
>
> That looks like somebody turned on "log_statement_stats" for a spell.
>
> Yours,
> Laurenz Albe
Arvind Singh wrote:
> Our CSV Log contains lot of statements like the following THREE lines. They appear exactly one after
> the other.
>
> And they number in thousands for a Session (more than ten thousand)
>
> 2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,754,"idle",2011-11-11 12:41:30 IST,2/308,0,LOG,00000,"statement: INSERT INTO pemdata.settings (""name"", ""setting"", ""unit"", ""server_id"") VALUES ('xmlbinary', 'base64', NULL, '1')",,,,,,,,"exec_simple_query, .\src\backend\tcop\postgres.c:900",""

You get that if log_statement='all'

> 2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,755,"INSERT",2011-11-11 12:41:30 IST,2/0,0,LOG,00000,"duration: 0.000 ms",,,,,,,,"exec_simple_query, .\src\backend\tcop\postgres.c:1128",""

You get that if log_duration=on or log_min_duration_statement=0

> 2011-11-11 12:41:31.484 IST,"agent1","pem",524,"localhost:2141",4ebccaa2.20c,756,"INSERT",2011-11-11 12:41:30 IST,2/0,0,LOG,00000,"QUERY STATISTICS","! system usage stats:
> ! 0.000000 elapsed 0.000000 user 0.000000 system sec
> ! [0.250000 user 0.156250 sys total]",,,,,"INSERT INTO pemdata.settings (""name"", ""setting"", ""unit"", ""server_id"") VALUES ('xmlbinary', 'base64', NULL, '1')",,"ShowUsage, .\src\backend\tcop\postgres.c:4305",""

You get that if log_statement_stats=on

> Is there anything that we enabled, because they dont appear after that particular session.

Yes, as indicated above, these options must have been enabled during that session. I wouldn't worry about it.

Yours,
Laurenz Albe
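The settings named in the reply correspond to the following postgresql.conf fragment (shown for illustration only). Note that a superuser can also enable these for a single session with SET (e.g. SET log_statement_stats = on;), which would explain why only one session in the log shows the extra entries.

```
# postgresql.conf -- settings that would produce the three entry kinds above;
# reload the server configuration after editing.
log_statement = 'all'       # logs every statement ("statement: ..." entries)
log_duration = on           # logs a "duration: ..." line per statement
log_statement_stats = on    # logs "QUERY STATISTICS" blocks
```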