Thread: [pgadmin-hackers] Acceptance Tests against a browser (WIP)


[pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: George Gelashvili
Date:
Hi there,

We are working on browser-automation-based acceptance tests that exercise pgAdmin4 the way a user might.

The first "connect to database" test works, but at the moment depends on Chrome and chromedriver. We would appreciate feedback on any possible license or code style issues at this point, as well as any thoughts on adding this sort of test to the codebase.

Thanks!
George and Tira
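The shape of such a test might look like the following minimal sketch. The URL, port, and `FakeDriver` are stand-ins for illustration; the real test would inject `selenium.webdriver.Chrome()` and requires chromedriver on the PATH:

```python
import unittest


class FakeDriver(object):
    """Stand-in for selenium.webdriver.Chrome() so this sketch runs
    without a browser; it pretends the app loaded successfully."""
    title = ""

    def get(self, url):
        # A real driver would navigate Chrome to `url`.
        self.title = "pgAdmin 4"


class ConnectsToDatabaseTest(unittest.TestCase):
    """Open the app in a browser and check the window title."""
    driver = FakeDriver()  # the real suite would inject a Chrome driver

    def runTest(self):
        self.driver.get("http://localhost:5050")  # port is a placeholder
        self.assertEqual("pgAdmin 4", self.driver.title)
```

Swapping `FakeDriver` for the Chrome webdriver turns this into the actual browser test.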

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: George Gelashvili
Date:
Here's the patch we forgot to attach. Also, you can see work on our branch at: https://github.com/pivotalsoftware/pgadmin4/tree/pivotal/acceptance-tests


Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: Dave Page
Date:
Hi

On Thu, Jan 12, 2017 at 10:41 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
> here's the patch we forgot to attach. Also, you can see work on our branch
> at:
> https://github.com/pivotalsoftware/pgadmin4/tree/pivotal/acceptance-tests
>
> On Thu, Jan 12, 2017 at 5:26 PM, George Gelashvili <ggelashvili@pivotal.io>
> wrote:
>>
>> Hi there,
>>
>> We are working on browser-automation-based acceptance tests that exercise
>> pgAdmin4 the way a user might.

Nice!

>> The first "connect to database" test works, but at the moment depends on
>> Chrome and chromedriver. We would appreciate feedback on any possible
>> license or code style issues at this point, as well as any thoughts on
>> adding this sort of test to the codebase.

A few thoughts:

- If these tests are to run as part of the regression suite, the
framework for them should live under that directory.

- Are any of the tests likely to be module-specific? If so, they
should really be part of the relevant module as the regression tests
are. If they're more general/less tightly coupled, then I don't see a
problem with them residing where they are.

- Please take care not to include changes to .gitignore files that
aren't relevant to the rest of us.

- The port number is hard-coded in the test.

- You've hard-coded the string "pgAdmin 4". We've tried to keep that
title as a config option in config.py, so you should pull the string
from there rather than hard-coding it.

- The connect test fails for me (Mac, Python 2.7). I have a suspicion
that this may be because when the test starts chromedriver, OS X
prompts the user about whether a listening port should be opened, but
the tests don't wait (though, I tested with 3 servers configured and
it failed with the same error on the second and third as well, long
after I clicked OK on the prompt):

Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/acceptance/test_connects_to_database.py",
line 32, in runTest
    self.assertEqual("pgAdmin 4", self.driver.title)
AssertionError: 'pgAdmin 4' != u'localhost'

- Please keep tests in the pgadmin. namespace (pgadmin.acceptance.??).

- It looks like running a single test won't work yet (because of
TestsGeneratorRegistry.load_generators('pgadmin.%s.tests' %
arguments['pkg']))
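Dave's config points (the hard-coded title and port) could be addressed along these lines. `config.APP_NAME` is a real option in pgAdmin's config.py; the port option name and the stub module below are assumptions for illustration:

```python
import types

# Stand-in for pgAdmin's web/config.py so this sketch is self-contained;
# APP_NAME is the real option, DEFAULT_SERVER_PORT is assumed here.
config = types.SimpleNamespace(APP_NAME="pgAdmin 4", DEFAULT_SERVER_PORT=5050)


def app_base_url(cfg=config):
    """Build the URL under test from config rather than hard-coding it."""
    return "http://localhost:%d" % cfg.DEFAULT_SERVER_PORT


def expected_title(cfg=config):
    """Pull the expected window title from config.APP_NAME."""
    return cfg.APP_NAME
```

The test then asserts against `expected_title()` instead of the literal string "pgAdmin 4".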

Thanks!

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: Atira Odhner
Date:
Thanks for your feedback, Dave!

We can put the tests under the regression directory. I think that makes sense. 
I'm not picturing these tests being module-specific, but we may want to enable running them as a separate suite of tests. 

Thanks for the callout about the port and title. We'll make sure those are pulled from config or that the pgAdmin server is spun up by the test with specific values. 

I have a couple ideas about why the test might not have been running for you. I think the patch we attached didn't spin up its own pgAdmin yet and it definitely doesn't fill in username/password if your app is running that way. That's part of the WIP-ness :-P

-Tira


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: George Gelashvili
Date:

Here is an updated patch which starts the server up when the test starts and uses the values from config.py for server name etc. It still requires installing chromedriver before running. Should we add something to the readme about that?
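The start-up step described above can be sketched as a helper that launches the server as a subprocess and blocks until its TCP port accepts connections; the command and port below are placeholders, not the patch's actual code:

```python
import socket
import subprocess
import sys
import time


def start_app_and_wait(cmd, port, timeout=30.0):
    """Launch the app server as a subprocess and block until its TCP
    port accepts connections, or terminate it and fail after `timeout`."""
    process = subprocess.Popen(cmd)
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            # Probe the port; socket.error keeps this Python 2/3 compatible.
            conn = socket.create_connection(("127.0.0.1", port), timeout=1.0)
            conn.close()
            return process
        except socket.error:
            time.sleep(0.5)
    process.terminate()
    raise RuntimeError("server did not open port %d within %ss" % (port, timeout))
```

A test's setUp could call e.g. `start_app_and_wait([sys.executable, "pgAdmin4.py"], 5050)` (paths hypothetical) and terminate the process in tearDown.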



Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: George Gelashvili
Date:
Here's an updated patch which polls to wait for the app to start instead of sleeping for 10 seconds.
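A polling wait of this kind amounts to a helper like the following; the names are hypothetical, not the patch's actual code:

```python
import time


def wait_for(description, condition, timeout=10.0, interval=0.5):
    """Poll `condition` until it returns a truthy value, instead of a
    fixed sleep; raise if the timeout elapses first."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise AssertionError("timed out waiting for " + description)
```

The test can then write e.g. `wait_for("app to start", lambda: driver.title == config.APP_NAME)` rather than sleeping for a fixed 10 seconds.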




Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: Dave Page
Date:
Hi

On Thu, Jan 19, 2017 at 11:15 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
> Here's an updated patch which polls to wait for the app to start instead of
> sleeping for 10 seconds.

I see the browser opening now, and it immediately loads pgAdmin. Then,
it prompts me for a reload whenever the test is run, which fails no
matter how quickly I hit the prompt from what I can see:

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.test_connects_to_database.ConnectsToDatabase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/test_connects_to_database.py",
line 41, in setUp
    self._wait_for_app()
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/test_connects_to_database.py",
line 109, in _wait_for_app
    self.__wait_for("app to start", page_shows_app)
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/test_connects_to_database.py",
line 117, in __wait_for
    result = condition_met_function()
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/test_connects_to_database.py",
line 107, in page_shows_app
    return self.driver.title == config.APP_NAME
  File "/Users/dpage/.virtualenvs/pgadmin4/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py",
line 257, in title
    resp = self.execute(Command.GET_TITLE)
  File "/Users/dpage/.virtualenvs/pgadmin4/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py",
line 236, in execute
    self.error_handler.check_response(response)
  File "/Users/dpage/.virtualenvs/pgadmin4/lib/python2.7/site-packages/selenium/webdriver/remote/errorhandler.py",
line 192, in check_response
    raise exception_class(message, screen, stacktrace)
UnexpectedAlertPresentException: Alert Text: None
Message: unexpected alert open: {Alert text : Are you sure you wish to
close the pgAdmin 4 browser?}
  (Session info: chrome=55.0.2883.95)
  (Driver info: chromedriver=2.27.440174
(e97a722caafc2d3a8b807ee115bfb307f7d2cfd9),platform=Mac OS X 10.12.1
x86_64)

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: Dave Page
Date:
On Thu, Jan 19, 2017 at 10:07 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
>
> Here is an updated patch which starts the server up when the test starts and
> uses the values from config.py for server name etc. It still requires
> installing chromedriver before running. Should we add something to the
> readme about that?

Yes, we definitely should (including download site URL)

> On Tue, Jan 17, 2017 at 11:09 AM, Atira Odhner <aodhner@pivotal.io> wrote:
>>
>> Thanks for your feedback, Dave!
>>
>> We can put the tests under the regression directory. I think that makes
>> sense.
>> I'm not picturing these tests being module specific, but we may want to
>> enable running it as a separate suite of tests.
>>
>> Thanks for the callout about the port and title. We'll make sure those are
>> pulled from config or that the pgAdmin server is spun up by the test with
>> specific values.
>>
>> I have a couple ideas about why the test might not have been running for
>> you. I think the patch we attached didn't spin up its own pgAdmin yet and it
>> definitely doesn't fill in username/password if your app is running that
>> way. That's part of the WIP-ness :-P
>>
>> -Tira
>>
>> Hi
>>
>> On Thu, Jan 12, 2017 at 10:41 PM, George Gelashvili
>> <ggelashvili(at)pivotal(dot)io> wrote:
>> > here's the patch we forgot to attach. Also, you can see work on our
>> > branch
>> > at:
>> >
>> > https://github.com/pivotalsoftware/pgadmin4/tree/pivotal/acceptance-tests
>> >
>> > On Thu, Jan 12, 2017 at 5:26 PM, George Gelashvili
>> > <ggelashvili(at)pivotal(dot)io>
>> > wrote:
>> >>
>> >> Hi there,
>> >>
>> >> We are working on browser-automation-based acceptance tests that
>> >> exercise
>> >> pgAdmin4 the way a user might.
>>
>> Nice!
>>
>> >> The first "connect to database" test works, but at the moment depends
>> >> on
>> >> Chrome and chromedriver. We would appreciate feedback on any possible
>> >> license or code style issues at this point, as well as any thoughts on
>> >> adding this sort of test to the codebase.
>>
>> A few thoughts:
>>
>> - If these tests are to run as part of the regression suite, the
>> framework for them should live under that directory.
>>
>> - Are any of the tests likely to be module-specific? If so, they
>> should really be part of the relevant module as the regression tests
>> are. If they're more general/less tightly coupled, then I don't see a
>> problem with them residing where they are.
>>
>> - Please take care not to include changes to .gitignore files that
>> aren't relevant to the rest of us.
>>
>> - The port number is hard-coded in the test.
>>
>> - You've hard-coded the string "pgAdmin 4". We've tried to keep that
>> title as a config option in config.py, so you should pull the string
>> from there rather than hard-coding it.
>>
>> - The connect test fails for me (Mac, Python 2.7). I have a suspicion
>> that this may be because when the test starts chromedriver, OS X
>> prompts the user about whether a listening port should be opened, but
>> the tests don't wait (though, I tested with 3 servers configured and
>> it failed with the same error on the second and third as well, long
>> after I clicked OK on the prompt):
>>
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/acceptance/test_connects_to_database.py",
>> line 32, in runTest
>>     self.assertEqual("pgAdmin 4", self.driver.title)
>> AssertionError: 'pgAdmin 4' != u'localhost'
>>
>> - Please keep tests in the pgadmin. namespace (pgadmin.acceptance.??).
>>
>> - It looks like running a single test won't work yet (because of
>> TestsGeneratorRegistry.load_generators('pgadmin.%s.tests' %
>> arguments['pkg']))
>>
>> Thanks!
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>>
>>
>
>
>
> --
> Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgadmin-hackers
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: George Gelashvili
Date:
Thanks for bringing that to our attention! Here's the latest patch


Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: Dave Page
Date:
On Fri, Jan 20, 2017 at 5:33 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
> Thanks for bringing that to our attention! Here's the latest patch

piranha:pgadmin4 dpage$ git apply
~/Downloads/acceptance-tests-with-server-start-and-polling.diff
error: patch failed: web/regression/test_utils.py:69
error: web/regression/test_utils.py: patch does not apply

:-(

> On Fri, Jan 20, 2017 at 10:38 AM, Dave Page <dpage@pgadmin.org> wrote:
>>
>> On Thu, Jan 19, 2017 at 10:07 PM, George Gelashvili
>> <ggelashvili@pivotal.io> wrote:
>> >
>> > Here is an updated patch which starts the server up when the test starts
>> > and
>> > uses the values from config.py for server name etc. It still requires
>> > installing chromedriver before running. Should we add something to the
>> > readme about that?
>>
>> Yes, we definitely should (including download site URL)
>>
>> > On Tue, Jan 17, 2017 at 11:09 AM, Atira Odhner <aodhner@pivotal.io>
>> > wrote:
>> >>
>> >> Thanks for your feedback, Dave!
>> >>
>> >> We can put the tests under the regression directory. I think that makes
>> >> sense.
>> >> I'm not picturing these tests being module specific, but we may want to
>> >> enable running it as a separate suite of tests.
>> >>
>> >> Thanks for the callout about the port and title. We'll make sure those
>> >> are
>> >> pulled from config or that the pgAdmin server is spun up by the test
>> >> with
>> >> specific values.
>> >>
>> >> I have a couple ideas about why the test might not have been running
>> >> for
>> >> you. I think the patch we attached didn't spin up its own pgAdmin yet
>> >> and it
>> >> definitely doesn't fill in username/password if your app is running
>> >> that
>> >> way. That's part of the WIP-ness :-P
>> >>
>> >> -Tira
>> >>
>> >> Hi
>> >>
>> >> On Thu, Jan 12, 2017 at 10:41 PM, George Gelashvili
>> >> <ggelashvili(at)pivotal(dot)io> wrote:
>> >> > here's the patch we forgot to attach. Also, you can see work on our
>> >> > branch
>> >> > at:
>> >> >
>> >> >
>> >> > https://github.com/pivotalsoftware/pgadmin4/tree/pivotal/acceptance-tests
>> >> >
>> >> > On Thu, Jan 12, 2017 at 5:26 PM, George Gelashvili
>> >> > <ggelashvili(at)pivotal(dot)io>
>> >> > wrote:
>> >> >>
>> >> >> Hi there,
>> >> >>
>> >> >> We are working on browser-automation-based acceptance tests that
>> >> >> exercise
>> >> >> pgAdmin4 the way a user might.
>> >>
>> >> Nice!
>> >>
>> >> >> The first "connect to database" test works, but at the moment
>> >> >> depends
>> >> >> on
>> >> >> Chrome and chromedriver. We would appreciate feedback on any
>> >> >> possible
>> >> >> license or code style issues at this point, as well as any thoughts
>> >> >> on
>> >> >> adding this sort of test to the codebase.
>> >>
>> >> A few thoughts:
>> >>
>> >> - If these tests are to run as part of the regression suite, the
>> >> framework for them should live under that directory.
>> >>
>> >> - Are any of the tests likely to be module-specific? If so, they
>> >> should really be part of the relevant module as the regression tests
>> >> are. If they're more general/less tightly coupled, then I don't see a
>> >> problem with them residing where they are.
>> >>
>> >> - Please take care not to include changes to .gitignore files that
>> >> aren't relevant to the rest of us.
>> >>
>> >> - The port number is hard-coded in the test.
>> >>
>> >> - You've hard-coded the string "pgAdmin 4". We've tried to keep that
>> >> title as a config option in config.py, so you should pull the string
>> >> from there rather than hard-coding it.
>> >>
>> >> - The connect test fails for me (Mac, Python 2.7). I have a suspicion
>> >> that this may be because when the test starts chromedriver, OS X
>> >> prompts the user about whether a listening port should be opened, but
>> >> the tests don't wait (though, I tested with 3 servers configured and
>> >> it failed with the same error on the second and third as well, long
>> >> after I clicked OK on the prompt):
>> >>
>> >> Traceback (most recent call last):
>> >>   File
>> >>
>> >> "/Users/dpage/git/pgadmin4/web/acceptance/test_connects_to_database.py",
>> >> line 32, in runTest
>> >>     self.assertEqual("pgAdmin 4", self.driver.title)
>> >> AssertionError: 'pgAdmin 4' != u'localhost'
>> >>
>> >> - Please keep tests in the pgadmin. namespace (pgadmin.acceptance.??).
>> >>
>> >> - It looks like running a single test won't work yet (because of
>> >> TestsGeneratorRegistry.load_generators('pgadmin.%s.tests' %
>> >> arguments['pkg']))
>> >>
>> >> Thanks!
>> >>
>> >> --
>> >> Dave Page
>> >> Blog: http://pgsnake.blogspot.com
>> >> Twitter: @pgsnake
>> >>
>> >> EnterpriseDB UK: http://www.enterprisedb.com
>> >> The Enterprise PostgreSQL Company
>> >>
>> >>
>> >
>> >
>> >
>> > --
>> > Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
>> > To make changes to your subscription:
>> > http://www.postgresql.org/mailpref/pgadmin-hackers
>> >
>>
>>
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From: George Gelashvili
Date:
Ah, that diff was generated before the Python 3 patch was applied. This one should work against master.

Cheers,
George

>> >> problem with them residing where they are.
>> >>
>> >> - Please take care not to include changes to .gitignore files that
>> >> aren't relevant to the rest of us.
>> >>
>> >> - The port number is hard-coded in the test.
>> >>
>> >> - You've hard-coded the string "pgAdmin 4". We've tried to keep that
>> >> title as a config option in config.py, so you should pull the string
>> >> from there rather than hard-coding it.
>> >>
>> >> - The connect test fails for me (Mac, Python 2.7). I have a suspicion
>> >> that this may be because when the test starts chromedriver, OS X
>> >> prompts the user about whether a listening port should be opened, but
>> >> the tests don't wait (though, I tested with 3 servers configured and
>> >> it failed with the same error on the second and third as well, long
>> >> after I clicked OK on the prompt):
>> >>
>> >> Traceback (most recent call last):
>> >>   File
>> >>
>> >> "/Users/dpage/git/pgadmin4/web/acceptance/test_connects_to_database.py",
>> >> line 32, in runTest
>> >>     self.assertEqual("pgAdmin 4", self.driver.title)
>> >> AssertionError: 'pgAdmin 4' != u'localhost'
>> >>
>> >> - Please keep tests in the pgadmin. namespace (pgadmin.acceptance.??).
>> >>
>> >> - It looks like running a single test won't work yet (because of
>> >> TestsGeneratorRegistry.load_generators('pgadmin.%s.tests' %
>> >> arguments['pkg']))
>> >>
>> >> Thanks!
>> >>
>> >> --
>> >> Dave Page
>> >> Blog: http://pgsnake.blogspot.com
>> >> Twitter: @pgsnake
>> >>
>> >> EnterpriseDB UK: http://www.enterprisedb.com
>> >> The Enterprise PostgreSQL Company
>> >>
>> >>
>> >
>> >
>> >
>> > --
>> > Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
>> > To make changes to your subscription:
>> > http://www.postgresql.org/mailpref/pgadmin-hackers
>> >
>>
>>
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
George Gelashvili
Date:
Instead of that patch, please use this no-zombies version, which kills the entire started process group rather than just the parent PID.
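For reference, the process-group teardown the patch describes can be sketched as follows. This is a minimal sketch assuming a POSIX system; the sleep command is a placeholder standing in for starting the pgAdmin server:

```python
import os
import signal
import subprocess
import sys

# Launch the server in its own session (and therefore its own process
# group), so that any children it forks share the same group id.
process = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(60)"],  # placeholder for pgAdmin
    start_new_session=True,  # POSIX-only; puts the child in a new process group
)

# Killing the whole group, rather than just process.pid, is what avoids
# leaving zombie children behind.
os.killpg(os.getpgid(process.pid), signal.SIGTERM)
process.wait()
```

On Windows a different mechanism (e.g. `CREATE_NEW_PROCESS_GROUP` plus `CTRL_BREAK_EVENT`) would be needed; the thread only discusses Mac/Linux runs.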

On Wed, Jan 25, 2017 at 6:31 PM, George Gelashvili <ggelashvili@pivotal.io> wrote:
Ah, that diff was generated before the Python 3 patch was applied. This one should work against master.

Cheers,
George

On Tue, Jan 24, 2017 at 4:43 AM, Dave Page <dpage@pgadmin.org> wrote:
On Fri, Jan 20, 2017 at 5:33 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
> Thanks for bringing that to our attention! Here's the latest patch

piranha:pgadmin4 dpage$ git apply
~/Downloads/acceptance-tests-with-server-start-and-polling.diff
error: patch failed: web/regression/test_utils.py:69
error: web/regression/test_utils.py: patch does not apply

:-(

> On Fri, Jan 20, 2017 at 10:38 AM, Dave Page <dpage@pgadmin.org> wrote:
>>
>> On Thu, Jan 19, 2017 at 10:07 PM, George Gelashvili
>> <ggelashvili@pivotal.io> wrote:
>> >
>> > Here is an updated patch which starts the server up when the test starts
>> > and
>> > uses the values from config.py for server name etc. It still requires
>> > installing chromedriver before running. Should we add something to the
>> > readme about that?
>>
>> Yes, we definitely should (including download site URL)
>>
>> > On Tue, Jan 17, 2017 at 11:09 AM, Atira Odhner <aodhner@pivotal.io>
>> > wrote:
>> >>
>> >> Thanks for your feedback, Dave!
>> >>
>> >> We can put the tests under the regression directory. I think that makes
>> >> sense.
>> >> I'm not picturing these tests being module specific, but we may want to
>> >> enable running it as a separate suite of tests.
>> >>
>> >> Thanks for the callout about the port and title. We'll make sure those
>> >> are
>> >> pulled from config or that the pgAdmin server is spun up by the test
>> >> with
>> >> specific values.
>> >>
>> >> I have a couple ideas about why the test might not have been running
>> >> for
>> >> you. I think the patch we attached didn't spin up its own pgAdmin yet
>> >> and it
>> >> definitely doesn't fill in username/password if your app is running
>> >> that
>> >> way. That's part of the WIP-ness :-P
>> >>
>> >> -Tira
>> >>
>> >> Hi
>> >>
>> >> On Thu, Jan 12, 2017 at 10:41 PM, George Gelashvili
>> >> <ggelashvili(at)pivotal(dot)io> wrote:
>> >> > here's the patch we forgot to attach. Also, you can see work on our
>> >> > branch
>> >> > at:
>> >> >
>> >> >
>> >> > https://github.com/pivotalsoftware/pgadmin4/tree/pivotal/acceptance-tests
>> >> >
>> >> > On Thu, Jan 12, 2017 at 5:26 PM, George Gelashvili
>> >> > <ggelashvili(at)pivotal(dot)io>
>> >> > wrote:
>> >> >>
>> >> >> Hi there,
>> >> >>
>> >> >> We are working on browser-automation-based acceptance tests that
>> >> >> exercise
>> >> >> pgAdmin4 the way a user might.
>> >>
>> >> Nice!
>> >>
>> >> >> The first "connect to database" test works, but at the moment
>> >> >> depends
>> >> >> on
>> >> >> Chrome and chromedriver. We would appreciate feedback on any
>> >> >> possible
>> >> >> license or code style issues at this point, as well as any thoughts
>> >> >> on
>> >> >> adding this sort of test to the codebase.
>> >>
>> >> A few thoughts:
>> >>
>> >> - If these tests are to run as part of the regression suite, the
>> >> framework for them should live under that directory.
>> >>
>> >> - Are any of the tests likely to be module-specific? If so, they
>> >> should really be part of the relevant module as the regression tests
>> >> are. If they're more general/less tightly coupled, then I don't see a
>> >> problem with them residing where they are.
>> >>
>> >> - Please take care not to include changes to .gitignore files that
>> >> aren't relevant to the rest of us.
>> >>
>> >> - The port number is hard-coded in the test.
>> >>
>> >> - You've hard-coded the string "pgAdmin 4". We've tried to keep that
>> >> title as a config option in config.py, so you should pull the string
>> >> from there rather than hard-coding it.
>> >>
>> >> - The connect test fails for me (Mac, Python 2.7). I have a suspicion
>> >> that this may be because when the test starts chromedriver, OS X
>> >> prompts the user about whether a listening port should be opened, but
>> >> the tests don't wait (though, I tested with 3 servers configured and
>> >> it failed with the same error on the second and third as well, long
>> >> after I clicked OK on the prompt):
>> >>
>> >> Traceback (most recent call last):
>> >>   File
>> >>
>> >> "/Users/dpage/git/pgadmin4/web/acceptance/test_connects_to_database.py",
>> >> line 32, in runTest
>> >>     self.assertEqual("pgAdmin 4", self.driver.title)
>> >> AssertionError: 'pgAdmin 4' != u'localhost'
>> >>
>> >> - Please keep tests in the pgadmin. namespace (pgadmin.acceptance.??).
>> >>
>> >> - It looks like running a single test won't work yet (because of
>> >> TestsGeneratorRegistry.load_generators('pgadmin.%s.tests' %
>> >> arguments['pkg']))
>> >>
>> >> Thanks!
>> >>
>> >> --
>> >> Dave Page
>> >> Blog: http://pgsnake.blogspot.com
>> >> Twitter: @pgsnake
>> >>
>> >> EnterpriseDB UK: http://www.enterprisedb.com
>> >> The Enterprise PostgreSQL Company
>> >>
>> >>
>> >
>> >
>> >
>> > --
>> > Sent via pgadmin-hackers mailing list (pgadmin-hackers@postgresql.org)
>> > To make changes to your subscription:
>> > http://www.postgresql.org/mailpref/pgadmin-hackers
>> >
>>
>>
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
Dave Page
Date:
On Thu, Jan 26, 2017 at 10:40 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
> instead of that patch, please use this no-zombies version that kills the
> started process group instead of pid-only.

Very cool :-). The only minor annoyance for me is that my Mac pops up
a message asking me if I want pgAdmin to accept connections, but
there's nothing you can do about that of course.

At this point I think there are a couple of things left to do:

- Add more tests!

- Add command line options to runtests.py to allow users to run either
the existing tests or the acceptance tests (or both, which should be
the default). Of course, it should still be possible to just run any
single test.

Thanks!

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
Dave Page
Date:
On Fri, Jan 27, 2017 at 4:11 PM, Dave Page <dpage@pgadmin.org> wrote:
> On Thu, Jan 26, 2017 at 10:40 PM, George Gelashvili
> <ggelashvili@pivotal.io> wrote:
>> instead of that patch, please use this no-zombies version that kills the
>> started process group instead of pid-only.
>
> Very cool :-). The only minor annoyance for me is that my Mac pops up
> a message asking me if I want pgAdmin to accept connections, but
> there's nothing you can do about that of course.
>
> At this point I think there are a couple of things left to do;
>
> - Add more tests!
>
> - Add command line options to runtests.py to allow users to run either
> the existing tests or the acceptance tests (or both, which should be
> the default). Of course, it should still be possible to just run any
> single test.

Please add:

- Proper cleanup. I just noticed the tests have left an
"acceptable_test_db" database behind.

Thanks.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
George Gelashvili
Date:
So, it sounds like you're saying our acceptable_test_db is unacceptable :-P

Here's a patch that adds an "--exclude" flag (see the README) and ensures the databases the tests create are cleaned up afterwards.

On Fri, Jan 27, 2017 at 11:28 AM, Dave Page <dpage@pgadmin.org> wrote:
On Fri, Jan 27, 2017 at 4:11 PM, Dave Page <dpage@pgadmin.org> wrote:
> On Thu, Jan 26, 2017 at 10:40 PM, George Gelashvili
> <ggelashvili@pivotal.io> wrote:
>> instead of that patch, please use this no-zombies version that kills the
>> started process group instead of pid-only.
>
> Very cool :-). The only minor annoyance for me is that my Mac pops up
> a message asking me if I want pgAdmin to accept connections, but
> there's nothing you can do about that of course.
>
> At this point I think there are a couple of things left to do;
>
> - Add more tests!
>
> - Add command line options to runtests.py to allow users to run either
> the existing tests or the acceptance tests (or both, which should be
> the default). Of course, it should still be possible to just run any
> single test.

Please add:

- Proper cleanup. I just noticed the tests have left an
"acceptable_test_db" database behind.

Thanks.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
Atira Odhner
Date:
Here's the patch with one more fix -- cleaning up the connections that get created in pgAdmin.


On Mon, Jan 30, 2017 at 2:28 PM, George Gelashvili <ggelashvili@pivotal.io> wrote:
So, it sounds like you're saying our acceptable_test_db is unacceptable :-P

Here's a patch that adds an "--exclude" flag (see the README) and ensures the databases the tests create are cleaned up afterwards.

On Fri, Jan 27, 2017 at 11:28 AM, Dave Page <dpage@pgadmin.org> wrote:
On Fri, Jan 27, 2017 at 4:11 PM, Dave Page <dpage@pgadmin.org> wrote:
> On Thu, Jan 26, 2017 at 10:40 PM, George Gelashvili
> <ggelashvili@pivotal.io> wrote:
>> instead of that patch, please use this no-zombies version that kills the
>> started process group instead of pid-only.
>
> Very cool :-). The only minor annoyance for me is that my Mac pops up
> a message asking me if I want pgAdmin to accept connections, but
> there's nothing you can do about that of course.
>
> At this point I think there are a couple of things left to do;
>
> - Add more tests!
>
> - Add command line options to runtests.py to allow users to run either
> the existing tests or the acceptance tests (or both, which should be
> the default). Of course, it should still be possible to just run any
> single test.

Please add:

- Proper cleanup. I just noticed the tests have left an
"acceptable_test_db" database behind.

Thanks.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
Dave Page
Date:
Hi

On Mon, Jan 30, 2017 at 9:23 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Here's the patch with one more fix -- cleaning up the connections that get
> created in pgAdmin.

Hmm, I had trouble with this one. I noticed a few issues:

- The tests started pgAdmin listening on the default port (5050),
however, I already had an instance running on there;
    a) It should have detected that something else was running on the port
    b) Shouldn't we just use a random, unused port?

- Errors were given because I already had an acceptance_test_db on a
number of servers, and that contained the test table. Obviously the
code now cleans up after itself, but I think we should use a random
database name as the main regression tests do (they append a random
number to the name iirc).

- Some of the tests just seemed to time out. I *think* this might be
because the test browser window opens quite narrowly, and it looks
like the tests are probably trying to do things with nodes that aren't
actually visible.
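One way to reduce such timeouts is to wait for an element to be displayed, not merely present in the DOM, before interacting with it. A generic polling helper in the spirit of the `_wait_for` shown in the tracebacks from pgadmin_page.py might look like this (the name, default timeout, and poll interval are assumptions, not the actual implementation):

```python
import time

def wait_until(condition, timeout=5.0, poll_interval=0.1):
    """Poll `condition` until it returns a truthy value, then return it.

    For visibility checks, `condition` could be something like
    `lambda: element.is_displayed() and element`.  Raises RuntimeError
    if `timeout` seconds elapse without a truthy result.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise RuntimeError("timed out waiting for condition")
```

With Selenium this pattern is also available out of the box via `WebDriverWait` together with `expected_conditions.visibility_of_element_located`, which waits for displayed-ness rather than mere existence.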

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
line 69, in tearDown
    self.app_starter.stop_app()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/app_starter.py",
line 27, in stop_app
    os.killpg(os.getpgid(self.pgadmin_process.pid), signal.SIGTERM)
OSError: [Errno 3] No such process

======================================================================
ERROR: runTest
(pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
line 37, in runTest
    self.page.find_by_xpath("//*[@id='tree']//*[@class='aciTreeText'
and .='Trigger Functions']").click()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 45, in find_by_xpath
    return self.wait_for_element(lambda:
self.driver.find_element_by_xpath(xpath))
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 72, in wait_for_element
    return self._wait_for("element to exist", element_if_it_exists)
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 106, in _wait_for
    raise RuntimeError("timed out waiting for " + waiting_for_message)
RuntimeError: timed out waiting for element to exist

======================================================================
ERROR: runTest
(pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
line 60, in tearDown
    self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 45, in find_by_xpath
    return self.wait_for_element(lambda:
self.driver.find_element_by_xpath(xpath))
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 72, in wait_for_element
    return self._wait_for("element to exist", element_if_it_exists)
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 106, in _wait_for
    raise RuntimeError("timed out waiting for " + waiting_for_message)
RuntimeError: timed out waiting for element to exist

----------------------------------------------------------------------

Thanks.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
George Gelashvili
Date:
Hi Dave,

We agree that a random port would be a nice addition. Our concern with randomized test database names is that they can litter servers with leftover databases whenever cleanup fails for any reason (e.g. a test errors out); we already see this happen with the randomized test databases you mention. We agree there should probably be one strategy across the whole test suite: we could use randomized names together with a more general cleanup step that removes all databases of the form "test_...".
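A random, unused port can be obtained by asking the OS to bind to port 0. A minimal sketch (Python 3; the helper name is ours, not from the patch):

```python
import socket

def find_free_port():
    """Bind to port 0 so the OS assigns an unused ephemeral port,
    then report which port it chose."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("127.0.0.1", 0))
        return sock.getsockname()[1]
```

Note there is a small race window between closing this socket and the pgAdmin server binding the port, so a retry on bind failure would still be prudent.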

Dave, did you see those errors after shutting down your application on :5050 and doing a fresh run of the tests? If not, could you please do a clean run? The second error could be related to viewport size as you suggested, but the first looks like the test simply failed to spin up its own server.

Thanks,
George & Tira

On Tue, Jan 31, 2017 at 9:41 AM, Dave Page <dpage@pgadmin.org> wrote:
Hi

On Mon, Jan 30, 2017 at 9:23 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Here's the patch with one more fix -- cleaning up the connections that get
> created in pgAdmin.

Hmm, I had trouble with this one. I noticed a few issues:

- The tests started pgAdmin listening on the default port (5050),
however, I already had an instance running on there;
    a) It should have detected that something else was running on the port
    b) Shouldn't we just use a random, unused port?

- Errors were given because I already had an acceptance_test_db on a
number of servers, and that contained the test table. Obviously the
code now cleans up after itself, but I think we should use a random
database name as the main regression tests do (they append a random
number to the name iirc).

- Some of the tests just seemed to time out. I *think* this might be
because the test browser window opens quite narrowly, and it looks
like the tests are probably trying to do things with nodes that aren't
actually visible.

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
line 69, in tearDown
    self.app_starter.stop_app()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/app_starter.py",
line 27, in stop_app
    os.killpg(os.getpgid(self.pgadmin_process.pid), signal.SIGTERM)
OSError: [Errno 3] No such process

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
line 37, in runTest
    self.page.find_by_xpath("//*[@id='tree']//*[@class='aciTreeText'
and .='Trigger Functions']").click()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 45, in find_by_xpath
    return self.wait_for_element(lambda:
self.driver.find_element_by_xpath(xpath))
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 72, in wait_for_element
    return self._wait_for("element to exist", element_if_it_exists)
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 106, in _wait_for
    raise RuntimeError("timed out waiting for " + waiting_for_message)
RuntimeError: timed out waiting for element to exist

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
line 60, in tearDown
    self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 45, in find_by_xpath
    return self.wait_for_element(lambda:
self.driver.find_element_by_xpath(xpath))
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 72, in wait_for_element
    return self._wait_for("element to exist", element_if_it_exists)
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 106, in _wait_for
    raise RuntimeError("timed out waiting for " + waiting_for_message)
RuntimeError: timed out waiting for element to exist

----------------------------------------------------------------------

Thanks.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
Dave Page
Date:
Hi

On Tue, Jan 31, 2017 at 2:54 PM, George Gelashvili
<ggelashvili@pivotal.io> wrote:
> Hi Dave,
>
> We agree that a random port would be a nice addition. We think having
> randomized test database names can lead to polluting with lots of extra
> databases left around in the event that cleanup fails for whatever reason
> (e.g. a test errors out).  We see this happen already with the randomized
> test databases you mention. We agree that there should probably be one
> strategy across the test suite. We could use randomized names and have a
> more general cleanup step that removes all databases of the form "test_...".

I'm very wary about doing things like that. We had an early version of
the suite that managed to delete all databases :-/. Maybe we could use
a patterned name, but only delete databases that also have a comment
with some text in it that we can verify?
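The comment-verification idea could work by fetching each database's shared comment (e.g. with `SELECT datname, shobj_description(oid, 'pg_database') FROM pg_database;`) and dropping only names that match both the pattern and a marker. A sketch of the filtering step; the marker text is hypothetical:

```python
# Hypothetical marker the suite would set via COMMENT ON DATABASE at creation.
TEST_DB_MARKER = "created by the pgAdmin4 regression suite"

def databases_safe_to_drop(rows):
    """Filter (datname, comment) rows so only databases matching BOTH
    the test naming pattern AND the marker comment are dropped; a user
    database that merely happens to be named test_... survives."""
    return [
        name for name, comment in rows
        if name.startswith("test_") and comment == TEST_DB_MARKER
    ]
```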

> Dave, are those errors you saw when you shut down your application on :5050
> and did a fresh run of the tests? If not, could you please do a clean run?
> It's possible that the second error could be related to viewport size as you
> suggested, but the first error just looks like a problem with the test not
> being able to spin up its own server.

That was on a second run of the tests, yes. I just did a careful
cleanup of left-over test databases, double-checked my server wasn't
running and re-ran the tests - I got the same results.

>
> Thanks,
> George & Tira
>
> On Tue, Jan 31, 2017 at 9:41 AM, Dave Page <dpage@pgadmin.org> wrote:
>>
>> Hi
>>
>> On Mon, Jan 30, 2017 at 9:23 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>> > Here's the patch with one more fix -- cleaning up the connections that
>> > get
>> > created in pgAdmin.
>>
>> Hmm, I had trouble with this one. I noticed a few issues:
>>
>> - The tests started pgAdmin listening on the default port (5050),
>> however, I already had an instance running on there;
>>     a) It should have detected that something else was running on the port
>>     b) Shouldn't we just use a random, unused port?
>>
>> - Errors were given because I already had an acceptance_test_db on a
>> number of servers, and that contained the test table. Obviously the
>> code now cleans up after itself, but I think we should use a random
>> database name as the main regression tests do (they append a random
>> number to the name iirc).
>>
>> - Some of the tests just seemed to time out. I *think* this might be
>> because the test browser window opens quite narrowly, and it looks
>> like the tests are probably trying to do things with nodes that aren't
>> actually visible.
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
>> line 69, in tearDown
>>     self.app_starter.stop_app()
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/app_starter.py",
>> line 27, in stop_app
>>     os.killpg(os.getpgid(self.pgadmin_process.pid), signal.SIGTERM)
>> OSError: [Errno 3] No such process
>>
>> ======================================================================
>> ERROR: runTest
>>
(pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>>
"/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
>> line 37, in runTest
>>     self.page.find_by_xpath("//*[@id='tree']//*[@class='aciTreeText'
>> and .='Trigger Functions']").click()
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 45, in find_by_xpath
>>     return self.wait_for_element(lambda:
>> self.driver.find_element_by_xpath(xpath))
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 72, in wait_for_element
>>     return self._wait_for("element to exist", element_if_it_exists)
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 106, in _wait_for
>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>> RuntimeError: timed out waiting for element to exist
>>
>> ======================================================================
>> ERROR: runTest
>>
(pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>>
"/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
>> line 60, in tearDown
>>     self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 45, in find_by_xpath
>>     return self.wait_for_element(lambda:
>> self.driver.find_element_by_xpath(xpath))
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 72, in wait_for_element
>>     return self._wait_for("element to exist", element_if_it_exists)
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 106, in _wait_for
>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>> RuntimeError: timed out waiting for element to exist
>>
>> ----------------------------------------------------------------------
>>
>> Thanks.
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From:
Dave Page
Date:
Hi George,

I just tried to do some debugging of pgAdmin, and found that I
couldn't start it. On further investigation, I found that I had an
instance running in the background on my system. I'm assuming this was
started by the acceptance tests, but not shutdown. I killed it off,
and re-ran the tests only to see failures because the database and
table used in the acceptance tests were still present. When the tests
completed, pgAdmin was again left running in the background.

I've just re-run the tests, having first killed the backgrounded
pgAdmin and then manually cleaned up the test objects. This time I do
indeed only get the two errors below when it tests the first of 3
servers I have configured. The second and third servers get three
errors each, and pgAdmin is left running in the background again.

So, you were right that I had another instance of pgAdmin running...
but it was the tests that caused it :-p



On Tue, Jan 31, 2017 at 3:10 PM, Dave Page <dpage@pgadmin.org> wrote:
> Hi
>
> On Tue, Jan 31, 2017 at 2:54 PM, George Gelashvili
> <ggelashvili@pivotal.io> wrote:
>> Hi Dave,
>>
>> We agree that a random port would be a nice addition. However, randomized
>> test database names can leave lots of extra databases polluting the server
>> whenever cleanup fails for any reason (e.g. a test errors out). We see this
>> happen already with the randomized
>> test databases you mention. We agree that there should probably be one
>> strategy across the test suite. We could use randomized names and have a
>> more general cleanup step that removes all databases of the form "test_...".
>
> I'm very wary about doing things like that. We had an early version of
> the suite that managed to delete all databases :-/. Maybe we could use
> a patterned name, but only delete databases that also have a comment
> with some text in it that we can verify?
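Dave's suggestion above — drop only databases that match a naming pattern *and* carry a verifiable comment — can be sketched as a pure filter over rows fetched from `pg_database` (the `shobj_description(oid, 'pg_database')` call is the real PostgreSQL way to read a `COMMENT ON DATABASE`; the marker text and function name below are hypothetical):

```python
import re

TEST_DB_PATTERN = re.compile(r"^test_\w+$")
MARKER_COMMENT = "pgadmin4-regression-test-object"  # hypothetical marker text


def databases_safe_to_drop(rows):
    """Given (datname, comment) pairs -- e.g. from
    SELECT datname, shobj_description(oid, 'pg_database') FROM pg_database --
    return only names that match the test pattern AND carry the marker
    comment, so a user's own database named test_accounts is never touched."""
    return [name for name, comment in rows
            if TEST_DB_PATTERN.match(name) and comment == MARKER_COMMENT]


rows = [
    ("test_1234", "pgadmin4-regression-test-object"),
    ("test_accounts", None),          # user data: pattern matches, no marker
    ("production", "important stuff"),
]
print(databases_safe_to_drop(rows))  # only test_1234 qualifies
```

The suite would set the marker via `COMMENT ON DATABASE ... IS '...'` when it creates a database, making the pattern-plus-comment check the gate for any bulk cleanup.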
>
>> Dave, are those errors you saw when you shut down your application on :5050
>> and did a fresh run of the tests? If not, could you please do a clean run?
>> It's possible that the second error could be related to viewport size as you
>> suggested, but the first error just looks like a problem with the test not
>> being able to spin up its own server.
>
> That was on a second run of the tests, yes. I just did a careful
> cleanup of left-over test databases, double-checked my server wasn't
> running and re-ran the tests - I got the same results.
>
>>
>> Thanks,
>> George & Tira
>>
>> On Tue, Jan 31, 2017 at 9:41 AM, Dave Page <dpage@pgadmin.org> wrote:
>>>
>>> Hi
>>>
>>> On Mon, Jan 30, 2017 at 9:23 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>>> > Here's the patch with one more fix -- cleaning up the connections that
>>> > get
>>> > created in pgAdmin.
>>>
>>> Hmm, I had trouble with this one. I noticed a few issues:
>>>
>>> - The tests started pgAdmin listening on the default port (5050),
>>> however, I already had an instance running on there;
>>>     a) It should have detected that something else was running on the port
>>>     b) Shouldn't we just use a random, unused port?
>>>
>>> - Errors were given because I already had an acceptance_test_db on a
>>> number of servers, and that contained the test table. Obviously the
>>> code now cleans up after itself, but I think we should use a random
>>> database name as the main regression tests do (they append a random
>>> number to the name iirc).
>>>
>>> - Some of the tests just seemed to time out. I *think* this might be
>>> because the test browser window opens quite narrowly, and it looks
>>> like the tests are probably trying to do things with nodes that aren't
>>> actually visible.
>>>
>>> ======================================================================
>>> ERROR: runTest
>>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>>> ----------------------------------------------------------------------
>>> Traceback (most recent call last):
>>>   File
>>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
>>> line 69, in tearDown
>>>     self.app_starter.stop_app()
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/app_starter.py",
>>> line 27, in stop_app
>>>     os.killpg(os.getpgid(self.pgadmin_process.pid), signal.SIGTERM)
>>> OSError: [Errno 3] No such process
>>>
>>> ======================================================================
>>> ERROR: runTest
>>> (pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
>>> ----------------------------------------------------------------------
>>> Traceback (most recent call last):
>>>   File
>>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
>>> line 37, in runTest
>>>     self.page.find_by_xpath("//*[@id='tree']//*[@class='aciTreeText'
>>> and .='Trigger Functions']").click()
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 45, in find_by_xpath
>>>     return self.wait_for_element(lambda:
>>> self.driver.find_element_by_xpath(xpath))
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 72, in wait_for_element
>>>     return self._wait_for("element to exist", element_if_it_exists)
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 106, in _wait_for
>>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>>> RuntimeError: timed out waiting for element to exist
>>>
>>> ======================================================================
>>> ERROR: runTest
>>> (pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
>>> ----------------------------------------------------------------------
>>> Traceback (most recent call last):
>>>   File
>>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
>>> line 60, in tearDown
>>>     self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 45, in find_by_xpath
>>>     return self.wait_for_element(lambda:
>>> self.driver.find_element_by_xpath(xpath))
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 72, in wait_for_element
>>>     return self._wait_for("element to exist", element_if_it_exists)
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 106, in _wait_for
>>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>>> RuntimeError: timed out waiting for element to exist
>>>
>>> ----------------------------------------------------------------------
>>>
>>> Thanks.
>>>
>>> --
>>> Dave Page
>>> Blog: http://pgsnake.blogspot.com
>>> Twitter: @pgsnake
>>>
>>> EnterpriseDB UK: http://www.enterprisedb.com
>>> The Enterprise PostgreSQL Company
>>
>>
>
>
>
> --
> Dave Page
> Blog: http://pgsnake.blogspot.com
> Twitter: @pgsnake
>
> EnterpriseDB UK: http://www.enterprisedb.com
> The Enterprise PostgreSQL Company



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:
Hi Dave,

Here is a new patch which includes the following:
- randomized ports
- delete the acceptance_test_db database in setup in case a prior run failed
- fixed size browser window
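For the randomized-ports item, the usual trick is to bind to port 0 and let the OS hand back an unused port; a minimal sketch (a small race window remains between releasing the port and the app claiming it):

```python
import socket


def find_free_port():
    """Ask the OS for an unused TCP port, then release it so the
    application under test can bind it."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("127.0.0.1", 0))          # port 0 = "pick any free port"
    port = s.getsockname()[1]         # the port the OS actually chose
    s.close()
    return port


port = find_free_port()
print(port)
```

The test runner would pass this port to the pgAdmin config before launching, instead of assuming 5050 is free.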

Cheers,
Tira & George

On Tue, Jan 31, 2017 at 11:25 AM, Dave Page <dpage@pgadmin.org> wrote:
Hi George,

I just tried to do some debugging of pgAdmin, and found that I
couldn't start it. On further investigation, I found that I had an
instance running in the background on my system. I'm assuming this was
started by the acceptance tests, but not shut down. I killed it off,
and re-ran the tests only to see failures because the database and
table used in the acceptance tests were still present. When the tests
completed, pgAdmin was again left running in the background.

I've just re-run the tests, having first killed the backgrounded
pgAdmin and then manually cleaned up the test objects. This time I do
indeed only get the two errors below when it tests the first of 3
servers I have configured. The second and third servers get three
errors each, and pgAdmin is left running in the background again.

So, you were right that I had another instance of pgAdmin running...
but it was the tests that caused it :-p



On Tue, Jan 31, 2017 at 3:10 PM, Dave Page <dpage@pgadmin.org> wrote:
> Hi
>
> On Tue, Jan 31, 2017 at 2:54 PM, George Gelashvili
> <ggelashvili@pivotal.io> wrote:
>> Hi Dave,
>>
>> We agree that a random port would be a nice addition. However, randomized
>> test database names can leave lots of extra databases polluting the server
>> whenever cleanup fails for any reason (e.g. a test errors out). We see this
>> happen already with the randomized
>> test databases you mention. We agree that there should probably be one
>> strategy across the test suite. We could use randomized names and have a
>> more general cleanup step that removes all databases of the form "test_...".
>
> I'm very wary about doing things like that. We had an early version of
> the suite that managed to delete all databases :-/. Maybe we could use
> a patterned name, but only delete databases that also have a comment
> with some text in it that we can verify?
>
>> Dave, are those errors you saw when you shut down your application on :5050
>> and did a fresh run of the tests? If not, could you please do a clean run?
>> It's possible that the second error could be related to viewport size as you
>> suggested, but the first error just looks like a problem with the test not
>> being able to spin up its own server.
>
> That was on a second run of the tests, yes. I just did a careful
> cleanup of left-over test databases, double-checked my server wasn't
> running and re-ran the tests - I got the same results.
>
>>
>> Thanks,
>> George & Tira
>>
>> On Tue, Jan 31, 2017 at 9:41 AM, Dave Page <dpage@pgadmin.org> wrote:
>>>
>>> Hi
>>>
>>> On Mon, Jan 30, 2017 at 9:23 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>>> > Here's the patch with one more fix -- cleaning up the connections that
>>> > get
>>> > created in pgAdmin.
>>>
>>> Hmm, I had trouble with this one. I noticed a few issues:
>>>
>>> - The tests started pgAdmin listening on the default port (5050),
>>> however, I already had an instance running on there;
>>>     a) It should have detected that something else was running on the port
>>>     b) Shouldn't we just use a random, unused port?
>>>
>>> - Errors were given because I already had an acceptance_test_db on a
>>> number of servers, and that contained the test table. Obviously the
>>> code now cleans up after itself, but I think we should use a random
>>> database name as the main regression tests do (they append a random
>>> number to the name iirc).
>>>
>>> - Some of the tests just seemed to time out. I *think* this might be
>>> because the test browser window opens quite narrowly, and it looks
>>> like the tests are probably trying to do things with nodes that aren't
>>> actually visible.
>>>
>>> ======================================================================
>>> ERROR: runTest
>>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>>> ----------------------------------------------------------------------
>>> Traceback (most recent call last):
>>>   File
>>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
>>> line 69, in tearDown
>>>     self.app_starter.stop_app()
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/app_starter.py",
>>> line 27, in stop_app
>>>     os.killpg(os.getpgid(self.pgadmin_process.pid), signal.SIGTERM)
>>> OSError: [Errno 3] No such process
>>>
>>> ======================================================================
>>> ERROR: runTest
>>> (pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
>>> ----------------------------------------------------------------------
>>> Traceback (most recent call last):
>>>   File
>>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
>>> line 37, in runTest
>>>     self.page.find_by_xpath("//*[@id='tree']//*[@class='aciTreeText'
>>> and .='Trigger Functions']").click()
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 45, in find_by_xpath
>>>     return self.wait_for_element(lambda:
>>> self.driver.find_element_by_xpath(xpath))
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 72, in wait_for_element
>>>     return self._wait_for("element to exist", element_if_it_exists)
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 106, in _wait_for
>>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>>> RuntimeError: timed out waiting for element to exist
>>>
>>> ======================================================================
>>> ERROR: runTest
>>> (pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest)
>>> ----------------------------------------------------------------------
>>> Traceback (most recent call last):
>>>   File
>>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/sql_template_selection_by_postgres_version_works_feature_test.py",
>>> line 60, in tearDown
>>>     self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 45, in find_by_xpath
>>>     return self.wait_for_element(lambda:
>>> self.driver.find_element_by_xpath(xpath))
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 72, in wait_for_element
>>>     return self._wait_for("element to exist", element_if_it_exists)
>>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>>> line 106, in _wait_for
>>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>>> RuntimeError: timed out waiting for element to exist
>>>
>>> ----------------------------------------------------------------------
>>>
>>> Thanks.
>>>
>>> --
>>> Dave Page
>>> Blog: http://pgsnake.blogspot.com
>>> Twitter: @pgsnake
>>>
>>> EnterpriseDB UK: http://www.enterprisedb.com
>>> The Enterprise PostgreSQL Company
>>
>>
>
>
>
> --
> Dave Page
> Blog: http://pgsnake.blogspot.com
> Twitter: @pgsnake
>
> EnterpriseDB UK: http://www.enterprisedb.com
> The Enterprise PostgreSQL Company



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Dave Page
Date:
Hi

On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hi Dave,
>
> Here is a new patch which includes the following:
> - randomized ports
> - delete the acceptance_test_db database in setup in case a prior run failed
> - fixed size browser window

Definitely getting there :-). A couple of thoughts/questions:

- Now there are 2 tests in there, it's clear that both the Python
server and browser session are restarted for each test. Can this be
avoided? It'll really slow down test execution as more and more are
added.

- We've got a new monster name:

pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
(which on disk is
sql_template_select_by_postgres_version_works_feature_test.py). Names
like that really must be shortened to something more sane and
manageable.

- I'm a little confused by why the tests cannot be run in server mode.
The error says it's because the username/password is unknown -
however, both the pgAdmin and database server usernames and passwords
are in test_config.json.

Thanks!

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:
I agree that we should rename the test. We've renamed it to "template_selection_feature_test".
Your other suggestions are captured in our backlog as future improvements. We definitely can and should do those things, but I think it would be valuable to go ahead and get this suite in, and give other devs a chance to use and iterate on this work.

Thanks,

Tira & George

On Mon, Feb 6, 2017 at 5:32 AM, Dave Page <dpage@pgadmin.org> wrote:
Hi

On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hi Dave,
>
> Here is a new patch which includes the following:
> - randomized ports
> - delete the acceptance_test_db database in setup in case a prior run failed
> - fixed size browser window

Definitely getting there :-). A couple of thoughts/questions:

- Now there are 2 tests in there, it's clear that both the Python
server and browser session are restarted for each test. Can this be
avoided? It'll really slow down test execution as more and more are
added.

- We've got a new monster name:
pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
(which on disk is
sql_template_select_by_postgres_version_works_feature_test.py). Names
like that really must be shortened to something more sane and
manageable.

- I'm a little confused by why the tests cannot be run in server mode.
The error says it's because the username/password is unknown -
however, both the pgAdmin and database server usernames and passwords
are in test_config.json.

Thanks!

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:
Hey Dave,

We re-used one of the test helpers from the 'fix-greenplum-show-tables.diff' patch, so here is an updated patch that does not add that helper again, on the assumption that you apply the show-tables patch first. Also, we saw some strange test behavior yesterday where form fields weren't being filled in correctly, so we changed the way input fields get filled to be more reliable.

In short these need to be applied in this order:
git apply fix-greenplum-show-tables.diff
git apply acceptance-tests-minus-create-table-helper-with-fixed-inputs.diff

We also moved the --exclude flag changes out to a separate patch.

On our side we are still dealing with these as 20 separate commits. What is the best way for us to send you these patches? Do you prefer having them all squashed into a single patch, or smaller patches?



On Mon, Feb 6, 2017 at 9:54 AM, Atira Odhner <aodhner@pivotal.io> wrote:
I agree that we should rename the test. We've renamed it to "template_selection_feature_test".
Your other suggestions are captured in our backlog as future improvements. We definitely can and should do those things, but I think it would be valuable to go ahead and get this suite in, and give other devs a chance to use and iterate on this work.

Thanks,

Tira & George

On Mon, Feb 6, 2017 at 5:32 AM, Dave Page <dpage@pgadmin.org> wrote:
Hi

On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hi Dave,
>
> Here is a new patch which includes the following:
> - randomized ports
> - delete the acceptance_test_db database in setup in case a prior run failed
> - fixed size browser window

Definitely getting there :-). A couple of thoughts/questions:

- Now there are 2 tests in there, it's clear that both the Python
server and browser session are restarted for each test. Can this be
avoided? It'll really slow down test execution as more and more are
added.

- We've got a new monster name:
pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
(which on disk is
sql_template_select_by_postgres_version_works_feature_test.py). Names
like that really must be shortened to something more sane and
manageable.

- I'm a little confused by why the tests cannot be run in server mode.
The error says it's because the username/password is unknown -
however, both the pgAdmin and database server usernames and passwords
are in test_config.json.

Thanks!

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Dave Page
Date:
Hi

I get the following crash when running with Python 3.4 or 3.5:

(pgadmin4-py34) piranha:pgadmin4 dpage$ python web/regression/runtests.py
pgAdmin 4 - Application Initialisation
======================================


The configuration database - '/Users/dpage/.pgadmin/test_pgadmin4.db'
does not exist.
Entering initial setup mode...
NOTE: Configuring authentication for DESKTOP mode.

The configuration database has been created at
/Users/dpage/.pgadmin/test_pgadmin4.db

=============Running the test cases for 'Regression - PG 9.4'=============
runTest (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
... Traceback (most recent call last):
  File "web/regression/runtests.py", line 276, in <module>
    verbosity=2).run(suite)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/runner.py",
line 168, in run
    test(result)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
line 84, in __call__
    return self.run(*args, **kwds)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
line 122, in run
    test(result)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
line 628, in __call__
    return self.run(*args, **kwds)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
line 588, in run
    self._feedErrorsToResult(result, outcome.errors)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
line 515, in _feedErrorsToResult
    if issubclass(exc_info[0], self.failureException):
TypeError: issubclass() arg 2 must be a class or tuple of classes

With Python 2.7, it initially opens Chrome with the URL "data:,"
(without the quotes), and then spits out:

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
line 41, in setUp
    test_utils.create_table(self.server, "acceptance_test_db", "test_table")
AttributeError: 'module' object has no attribute 'create_table'

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
line 36, in runTest
    test_utils.create_table(self.server, "acceptance_test_db", "test_table")
AttributeError: 'module' object has no attribute 'create_table'

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
line 66, in tearDown
    self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 46, in find_by_xpath
    return self.wait_for_element(lambda:
self.driver.find_element_by_xpath(xpath))
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 86, in wait_for_element
    return self._wait_for("element to exist", element_if_it_exists)
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 120, in _wait_for
    raise RuntimeError("timed out waiting for " + waiting_for_message)
RuntimeError: timed out waiting for element to exist

----------------------------------------------------------------------
Ran 149 tests in 59.258s

FAILED (errors=3, skipped=12)
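The recurring "timed out waiting for element to exist" errors come from the hand-rolled polling helper in `pgadmin_page.py`. Its shape, judging by the tracebacks, is roughly the following self-contained sketch (the helper name matches `_wait_for` from the traceback; the timing parameters are assumptions):

```python
import time


def wait_for(condition, timeout=5, poll=0.2, waiting_for_message="condition"):
    """Retry `condition` until it returns a truthy value or the deadline
    passes, then raise with a descriptive message -- the same message
    format seen in the tracebacks above."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll)  # back off briefly instead of busy-spinning
    raise RuntimeError("timed out waiting for " + waiting_for_message)


# Demo: a condition that becomes true after roughly half a second.
state = {"ready_at": time.time() + 0.5}
value = wait_for(lambda: time.time() >= state["ready_at"], timeout=2)
```

With Selenium specifically, `selenium.webdriver.support.ui.WebDriverWait` with an expected condition is the library-provided equivalent and also handles stale-element retries.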


On Wed, Feb 8, 2017 at 10:15 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hey Dave,
>
> We re-used one of the test helpers for the 'fix-greenplum-show-tables.diff'
> patch, so here is an updated patch which does not include adding that test
> helper in case you apply the show-tables patch first. Also, we saw some
> strange test behavior yesterday where form fields weren't being filled in
> correctly so we changed the way that input fields get filled to be more
> reliable.
>
> In short these need to be applied in this order:
>>
>> git apply fix-greenplum-show-tables.diff
>>
>> git apply
>> acceptance-tests-minus-create-table-helper-with-fixed-inputs.diff
>
>
> We also moved the --exclude flag changes out to a separate patch.
>
> On our side we are still dealing with these as 20 separate commits. What is
> the best way for us to send you these patches? Do you prefer having them all
> squashed down to a single patch or to have smaller patches?
>
>
>
> On Mon, Feb 6, 2017 at 9:54 AM, Atira Odhner <aodhner@pivotal.io> wrote:
>>
>> I agree that we should rename the test. We've renamed it to
>> "template_selection_feature_test".
>> Your other suggestions are captured in our backlog as future improvements.
>> We definitely can and should do those things but I think it would be
>> valuable to go ahead and get this suite in and give other devs a chance to
>> use and iterate on this work.
>>
>> Thanks,
>>
>> Tira & George
>>
>> On Mon, Feb 6, 2017 at 5:32 AM, Dave Page <dpage@pgadmin.org> wrote:
>>>
>>> Hi
>>>
>>> On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>>> > Hi Dave,
>>> >
>>> > Here is a new patch which includes the following:
>>> > - randomized ports
>>> > - delete the acceptance_test_db database in setup in case a prior run
>>> > failed
>>> > - fixed size browser window
>>>
>>> Definitely getting there :-). A couple of thoughts/questions:
>>>
>>> - Now there are 2 tests in there, it's clear that both the Python
>>> server and browser session are restarted for each test. Can this be
>>> avoided? It'll really slow down test execution as more and more are
>>> added.
>>>
>>> - We've got a new monster name:
>>>
>>> pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
>>> (which on disk is
>>> sql_template_select_by_postgres_version_works_feature_test.py). Names
>>> like that really must be shortened to something more sane and
>>> manageable.
>>>
>>> - I'm a little confused by why the tests cannot be run in server mode.
>>> The error says it's because the username/password is unknown -
>>> however, both the pgAdmin and database server usernames and passwords
>>> are in test_config.json.
>>>
>>> Thanks!
>>>
>>> --
>>> Dave Page
>>> Blog: http://pgsnake.blogspot.com
>>> Twitter: @pgsnake
>>>
>>> EnterpriseDB UK: http://www.enterprisedb.com
>>> The Enterprise PostgreSQL Company
>>
>>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:

create_table is the change we pulled into the other patch, which needs to be applied first.

On Thu, Feb 9, 2017, 7:47 AM Dave Page <dpage@pgadmin.org> wrote:
Hi

I get the following crash when running with Python 3.4 or 3.5:

(pgadmin4-py34) piranha:pgadmin4 dpage$ python web/regression/runtests.py
pgAdmin 4 - Application Initialisation
======================================


The configuration database - '/Users/dpage/.pgadmin/test_pgadmin4.db'
does not exist.
Entering initial setup mode...
NOTE: Configuring authentication for DESKTOP mode.

The configuration database has been created at
/Users/dpage/.pgadmin/test_pgadmin4.db

=============Running the test cases for 'Regression - PG 9.4'=============
runTest (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
... Traceback (most recent call last):
  File "web/regression/runtests.py", line 276, in <module>
    verbosity=2).run(suite)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/runner.py",
line 168, in run
    test(result)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
line 84, in __call__
    return self.run(*args, **kwds)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
line 122, in run
    test(result)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
line 628, in __call__
    return self.run(*args, **kwds)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
line 588, in run
    self._feedErrorsToResult(result, outcome.errors)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
line 515, in _feedErrorsToResult
    if issubclass(exc_info[0], self.failureException):
TypeError: issubclass() arg 2 must be a class or tuple of classes

With Python 2.7, it initially opens Chrome with the URL "data:,"
(without the quotes), and then spits out:

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
line 41, in setUp
    test_utils.create_table(self.server, "acceptance_test_db", "test_table")
AttributeError: 'module' object has no attribute 'create_table'

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
line 36, in runTest
    test_utils.create_table(self.server, "acceptance_test_db", "test_table")
AttributeError: 'module' object has no attribute 'create_table'

======================================================================
ERROR: runTest (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
line 66, in tearDown
    self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 46, in find_by_xpath
    return self.wait_for_element(lambda:
self.driver.find_element_by_xpath(xpath))
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 86, in wait_for_element
    return self._wait_for("element to exist", element_if_it_exists)
  File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
line 120, in _wait_for
    raise RuntimeError("timed out waiting for " + waiting_for_message)
RuntimeError: timed out waiting for element to exist

----------------------------------------------------------------------
Ran 149 tests in 59.258s

FAILED (errors=3, skipped=12)


On Wed, Feb 8, 2017 at 10:15 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hey Dave,
>
> We re-used one of the test helpers for the 'fix-greenplum-show-tables.diff'
> patch, so here is an updated patch which does not include adding that test
> helper in case you apply the show-tables patch first. Also, we saw some
> strange test behavior yesterday where form fields weren't being filled in
> correctly so we changed the way that input fields get filled to be more
> reliable.
>
> In short these need to be applied in this order:
>>
>> git apply fix-greenplum-show-tables.diff
>>
>> git apply
>> acceptance-tests-minus-create-table-helper-with-fixed-inputs.diff
>
>
> We also moved the --exclude flag changes out to a separate patch.
>
> On our side we are still dealing with these as 20 separate commits. What is
> the best way for us to send you these patches? Do you prefer having them all
> squashed down to a single patch or to have smaller patches?
>
>
>
> On Mon, Feb 6, 2017 at 9:54 AM, Atira Odhner <aodhner@pivotal.io> wrote:
>>
>> I agree that we should rename the test. We've renamed it to
>> "template_selection_feature_test".
>> Your other suggestions are captured in our backlog as future improvements.
>> We definitely can and should do those things but I think it would be
>> valuable to go ahead and get this suite in and give other devs a chance to
>> use and iterate on this work.
>>
>> Thanks,
>>
>> Tira & George
>>
>> On Mon, Feb 6, 2017 at 5:32 AM, Dave Page <dpage@pgadmin.org> wrote:
>>>
>>> Hi
>>>
>>> On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>>> > Hi Dave,
>>> >
>>> > Here is a new patch which includes the following:
>>> > - randomized ports
>>> > - delete the acceptance_test_db database in setup in case a prior run
>>> > failed
>>> > - fixed size browser window
>>>
>>> Definitely getting there :-). A couple of thoughts/questions:
>>>
>>> - Now there are 2 tests in there, it's clear that both the Python
>>> server and browser session are restarted for each test. Can this be
>>> avoided? It'll really slow down test execution as more and more are
>>> added.
>>>
>>> - We've got a new monster name:
>>>
>>> pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
>>> (which on disk is
>>> sql_template_select_by_postgres_version_works_feature_test.py). Names
>>> like that really must be shortened to something more sane and
>>> manageable.
>>>
>>> - I'm a little confused by why the tests cannot be run in server mode.
>>> The error says it's because the username/password is unknown -
>>> however, both the pgAdmin and database server usernames and passwords
>>> are in test_config.json.
>>>
>>> Thanks!
>>>
>>> --
>>> Dave Page
>>> Blog: http://pgsnake.blogspot.com
>>> Twitter: @pgsnake
>>>
>>> EnterpriseDB UK: http://www.enterprisedb.com
>>> The Enterprise PostgreSQL Company
>>
>>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Dave Page
Date:
OK, well that one was sent back with feedback as well, so please
resubmit when the relevant updates have been made to either and
they've been retested. Given the amount of work you're doing at the
moment, it would be helpful if you could note when one patch is
dependent on another. It's hard to keep track when you're this
productive!

Thanks.

On Thu, Feb 9, 2017 at 1:15 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> create_table is the change we pulled into the other patch which would need
> to be applied first.
>
>
> On Thu, Feb 9, 2017, 7:47 AM Dave Page <dpage@pgadmin.org> wrote:
>>
>> Hi
>>
>> I get the following crash when running with Python 3.4 or 3.5:
>>
>> (pgadmin4-py34) piranha:pgadmin4 dpage$ python web/regression/runtests.py
>> pgAdmin 4 - Application Initialisation
>> ======================================
>>
>>
>> The configuration database - '/Users/dpage/.pgadmin/test_pgadmin4.db'
>> does not exist.
>> Entering initial setup mode...
>> NOTE: Configuring authentication for DESKTOP mode.
>>
>> The configuration database has been created at
>> /Users/dpage/.pgadmin/test_pgadmin4.db
>>
>> =============Running the test cases for 'Regression - PG 9.4'=============
>> runTest
>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>> ... Traceback (most recent call last):
>>   File "web/regression/runtests.py", line 276, in <module>
>>     verbosity=2).run(suite)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/runner.py",
>> line 168, in run
>>     test(result)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
>> line 84, in __call__
>>     return self.run(*args, **kwds)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
>> line 122, in run
>>     test(result)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
>> line 628, in __call__
>>     return self.run(*args, **kwds)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
>> line 588, in run
>>     self._feedErrorsToResult(result, outcome.errors)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
>> line 515, in _feedErrorsToResult
>>     if issubclass(exc_info[0], self.failureException):
>> TypeError: issubclass() arg 2 must be a class or tuple of classes
>>
>> With Python 2.7, it initially opens Chrome with the URL "data:,"
>> (without the quotes), and then spits out:
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
>> line 41, in setUp
>>     test_utils.create_table(self.server, "acceptance_test_db",
>> "test_table")
>> AttributeError: 'module' object has no attribute 'create_table'
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
>> line 36, in runTest
>>     test_utils.create_table(self.server, "acceptance_test_db",
>> "test_table")
>> AttributeError: 'module' object has no attribute 'create_table'
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
>> line 66, in tearDown
>>     self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 46, in find_by_xpath
>>     return self.wait_for_element(lambda:
>> self.driver.find_element_by_xpath(xpath))
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 86, in wait_for_element
>>     return self._wait_for("element to exist", element_if_it_exists)
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 120, in _wait_for
>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>> RuntimeError: timed out waiting for element to exist
>>
>> ----------------------------------------------------------------------
>> Ran 149 tests in 59.258s
>>
>> FAILED (errors=3, skipped=12)
>>
>>
>> On Wed, Feb 8, 2017 at 10:15 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>> > Hey Dave,
>> >
>> > We re-used one of the test helpers for the
>> > 'fix-greenplum-show-tables.diff'
>> > patch, so here is an updated patch which does not include adding that
>> > test
>> > helper in case you apply the show-tables patch first. Also, we saw some
>> > strange test behavior yesterday where form fields weren't being filled
>> > in
>> > correctly so we changed the way that input fields get filled to be more
>> > reliable.
>> >
>> > In short these need to be applied in this order:
>> >>
>> >> git apply fix-greenplum-show-tables.diff
>> >>
>> >> git apply
>> >> acceptance-tests-minus-create-table-helper-with-fixed-inputs.diff
>> >
>> >
>> > We also moved the --exclude flag changes out to a separate patch.
>> >
>> > On our side we are still dealing with these as 20 separate commits. What
>> > is
>> > the best way for us to send you these patches? Do you prefer having them
>> > all
>> > squashed down to a single patch or to have smaller patches?
>> >
>> >
>> >
>> > On Mon, Feb 6, 2017 at 9:54 AM, Atira Odhner <aodhner@pivotal.io> wrote:
>> >>
>> >> I agree that we should rename the test. We've renamed it to
>> >> "template_selection_feature_test".
>> >> Your other suggestions are captured in our backlog as future
>> >> improvements.
>> >> We definitely can and should do those things but I think it would be
>> >> valuable to go ahead and get this suite in and give other devs a chance
>> >> to
>> >> use and iterate on this work.
>> >>
>> >> Thanks,
>> >>
>> >> Tira & George
>> >>
>> >> On Mon, Feb 6, 2017 at 5:32 AM, Dave Page <dpage@pgadmin.org> wrote:
>> >>>
>> >>> Hi
>> >>>
>> >>> On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io>
>> >>> wrote:
>> >>> > Hi Dave,
>> >>> >
>> >>> > Here is a new patch which includes the following:
>> >>> > - randomized ports
>> >>> > - delete the acceptance_test_db database in setup in case a prior
>> >>> > run
>> >>> > failed
>> >>> > - fixed size browser window
>> >>>
>> >>> Definitely getting there :-). A couple of thoughts/questions:
>> >>>
>> >>> - Now there are 2 tests in there, it's clear that both the Python
>> >>> server and browser session are restarted for each test. Can this be
>> >>> avoided? It'll really slow down test execution as more and more are
>> >>> added.
>> >>>
>> >>> - We've got a new monster name:
>> >>>
>> >>>
>> >>>
pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
>> >>> (which on disk is
>> >>> sql_template_select_by_postgres_version_works_feature_test.py). Names
>> >>> like that really must be shortened to something more sane and
>> >>> manageable.
>> >>>
>> >>> - I'm a little confused by why the tests cannot be run in server mode.
>> >>> The error says it's because the username/password is unknown -
>> >>> however, both the pgAdmin and database server usernames and passwords
>> >>> are in test_config.json.
>> >>>
>> >>> Thanks!
>> >>>
>> >>> --
>> >>> Dave Page
>> >>> Blog: http://pgsnake.blogspot.com
>> >>> Twitter: @pgsnake
>> >>>
>> >>> EnterpriseDB UK: http://www.enterprisedb.com
>> >>> The Enterprise PostgreSQL Company
>> >>
>> >>
>> >
>>
>>
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:

Certainly. We did mention the dependency in the email. Would it be better to mention it in the patch name? Is there a better way for us to manage these changes? On other open source projects, I've seen GitHub mirrors set up so that changes can be pulled in like branches rather than applied as patches. That would have avoided this situation, since the parent commit would be pulled in with the same SHA from either pull-request branch and git would not see it as a conflict.

I'm rather new to dealing with patch files like this so I would love some tips.

Thanks,
Tira

On Thu, Feb 9, 2017, 8:27 AM Dave Page <dpage@pgadmin.org> wrote:
OK, well that one was sent back with feedback as well, so please
resubmit when the relevant updates have been made to either and
they've been retested. Given the amount of work you're doing at the
moment, it would be helpful if you could note when one patch is
dependent on another. It's hard to keep track when you're this
productive!

Thanks.

On Thu, Feb 9, 2017 at 1:15 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> create_table is the change we pulled into the other patch which would need
> to be applied first.
>
>
> On Thu, Feb 9, 2017, 7:47 AM Dave Page <dpage@pgadmin.org> wrote:
>>
>> Hi
>>
>> I get the following crash when running with Python 3.4 or 3.5:
>>
>> (pgadmin4-py34) piranha:pgadmin4 dpage$ python web/regression/runtests.py
>> pgAdmin 4 - Application Initialisation
>> ======================================
>>
>>
>> The configuration database - '/Users/dpage/.pgadmin/test_pgadmin4.db'
>> does not exist.
>> Entering initial setup mode...
>> NOTE: Configuring authentication for DESKTOP mode.
>>
>> The configuration database has been created at
>> /Users/dpage/.pgadmin/test_pgadmin4.db
>>
>> =============Running the test cases for 'Regression - PG 9.4'=============
>> runTest
>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>> ... Traceback (most recent call last):
>>   File "web/regression/runtests.py", line 276, in <module>
>>     verbosity=2).run(suite)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/runner.py",
>> line 168, in run
>>     test(result)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
>> line 84, in __call__
>>     return self.run(*args, **kwds)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/suite.py",
>> line 122, in run
>>     test(result)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
>> line 628, in __call__
>>     return self.run(*args, **kwds)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
>> line 588, in run
>>     self._feedErrorsToResult(result, outcome.errors)
>>   File
>> "/opt/local/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/unittest/case.py",
>> line 515, in _feedErrorsToResult
>>     if issubclass(exc_info[0], self.failureException):
>> TypeError: issubclass() arg 2 must be a class or tuple of classes
>>
>> With Python 2.7, it initially opens Chrome with the URL "data:,"
>> (without the quotes), and then spits out:
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.connect_to_server_feature_test.ConnectsToServerFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/connect_to_server_feature_test.py",
>> line 41, in setUp
>>     test_utils.create_table(self.server, "acceptance_test_db",
>> "test_table")
>> AttributeError: 'module' object has no attribute 'create_table'
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
>> line 36, in runTest
>>     test_utils.create_table(self.server, "acceptance_test_db",
>> "test_table")
>> AttributeError: 'module' object has no attribute 'create_table'
>>
>> ======================================================================
>> ERROR: runTest
>> (pgadmin.acceptance.tests.template_selection_feature_test.TemplateSelectionFeatureTest)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>>   File
>> "/Users/dpage/git/pgadmin4/web/pgadmin/acceptance/tests/template_selection_feature_test.py",
>> line 66, in tearDown
>>     self.page.find_by_xpath("//button[contains(.,'Cancel')]").click()
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 46, in find_by_xpath
>>     return self.wait_for_element(lambda:
>> self.driver.find_element_by_xpath(xpath))
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 86, in wait_for_element
>>     return self._wait_for("element to exist", element_if_it_exists)
>>   File "/Users/dpage/git/pgadmin4/web/regression/utils/pgadmin_page.py",
>> line 120, in _wait_for
>>     raise RuntimeError("timed out waiting for " + waiting_for_message)
>> RuntimeError: timed out waiting for element to exist
>>
>> ----------------------------------------------------------------------
>> Ran 149 tests in 59.258s
>>
>> FAILED (errors=3, skipped=12)
>>
>>
>> On Wed, Feb 8, 2017 at 10:15 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>> > Hey Dave,
>> >
>> > We re-used one of the test helpers for the
>> > 'fix-greenplum-show-tables.diff'
>> > patch, so here is an updated patch which does not include adding that
>> > test
>> > helper in case you apply the show-tables patch first. Also, we saw some
>> > strange test behavior yesterday where form fields weren't being filled
>> > in
>> > correctly so we changed the way that input fields get filled to be more
>> > reliable.
>> >
>> > In short these need to be applied in this order:
>> >>
>> >> git apply fix-greenplum-show-tables.diff
>> >>
>> >> git apply
>> >> acceptance-tests-minus-create-table-helper-with-fixed-inputs.diff
>> >
>> >
>> > We also moved the --exclude flag changes out to a separate patch.
>> >
>> > On our side we are still dealing with these as 20 separate commits. What
>> > is
>> > the best way for us to send you these patches? Do you prefer having them
>> > all
>> > squashed down to a single patch or to have smaller patches?
>> >
>> >
>> >
>> > On Mon, Feb 6, 2017 at 9:54 AM, Atira Odhner <aodhner@pivotal.io> wrote:
>> >>
>> >> I agree that we should rename the test. We've renamed it to
>> >> "template_selection_feature_test".
>> >> Your other suggestions are captured in our backlog as future
>> >> improvements.
>> >> We definitely can and should do those things but I think it would be
>> >> valuable to go ahead and get this suite in and give other devs a chance
>> >> to
>> >> use and iterate on this work.
>> >>
>> >> Thanks,
>> >>
>> >> Tira & George
>> >>
>> >> On Mon, Feb 6, 2017 at 5:32 AM, Dave Page <dpage@pgadmin.org> wrote:
>> >>>
>> >>> Hi
>> >>>
>> >>> On Fri, Feb 3, 2017 at 9:56 PM, Atira Odhner <aodhner@pivotal.io>
>> >>> wrote:
>> >>> > Hi Dave,
>> >>> >
>> >>> > Here is a new patch which includes the following:
>> >>> > - randomized ports
>> >>> > - delete the acceptance_test_db database in setup in case a prior
>> >>> > run
>> >>> > failed
>> >>> > - fixed size browser window
>> >>>
>> >>> Definitely getting there :-). A couple of thoughts/questions:
>> >>>
>> >>> - Now there are 2 tests in there, it's clear that both the Python
>> >>> server and browser session are restarted for each test. Can this be
>> >>> avoided? It'll really slow down test execution as more and more are
>> >>> added.
>> >>>
>> >>> - We've got a new monster name:
>> >>>
>> >>>
>> >>> pgadmin.acceptance.tests.sql_template_selection_by_postgres_version_works_feature_test.SQLTemplateSelectionByPostgresVersionWorksFeatureTest
>> >>> (which on disk is
>> >>> sql_template_select_by_postgres_version_works_feature_test.py). Names
>> >>> like that really must be shortened to something more sane and
>> >>> manageable.
>> >>>
>> >>> - I'm a little confused by why the tests cannot be run in server mode.
>> >>> The error says it's because the username/password is unknown -
>> >>> however, both the pgAdmin and database server usernames and passwords
>> >>> are in test_config.json.
>> >>>
>> >>> Thanks!
>> >>>
>> >>> --
>> >>> Dave Page
>> >>> Blog: http://pgsnake.blogspot.com
>> >>> Twitter: @pgsnake
>> >>>
>> >>> EnterpriseDB UK: http://www.enterprisedb.com
>> >>> The Enterprise PostgreSQL Company
>> >>
>> >>
>> >
>>
>>
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Dave Page
Date:
Hi

On Thu, Feb 9, 2017 at 2:20 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Certainly.  We did mention the dependency in the email. Would it be better
> to mention it in the patch name?

I think the problem was that the way you phrased it, it sounded
optional ("an updated patch which does not include adding that test
helper in case you apply the show-tables patch first"). I think a
clear "This patch is dependent on patch Foo" would suffice.

> Is there a better way for us to manage
> these changes? On other open source projects, I've seen github mirrors set
> up so that changes can be pulled in like branches rather then as patch
> applies. That would have avoided this situation since the parent commit
> would be pulled in with the same SHA from either pull request branch and git
> would not see it as a conflict.
>
> I'm rather new to dealing with patch files like this so I would love some
> tips.

The Postgres project in general is quite conservative and stuck in
its ways about how things are done (which is usually a good thing,
considering you trust your data to the resulting code). We're used to
dealing with larger patchsets via the mailing list - typically, as long
as you're clear about any dependencies, it shouldn't be a problem.
Some of us use tools like PyCharm for handling patches and helping
with reviews etc., which I guess replaces most, if not all, of the
GitHub functionality over plain git.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:
Hi Dave,

 I think the problem was that the way you phrased it,

You're right, we totally messed that up. We were talking about making 3 patches and ended up making only 2 and forgot to reword that bit.
Sorry about that.

Here are the two patches for this change, which resolve the AttributeError you were seeing. The first patch is identical to the patch of the same name in the other email thread.

We're used to
dealing with larger patchsets via the mailing list - typically as long
as you're clear about any dependencies, it shouldn't be a problem.
 
Great! We'll try sending patchsets from now on and hopefully that resolves some of the issues we were seeing.

Tira & George

On Thu, Feb 9, 2017 at 9:28 AM, Dave Page <dpage@pgadmin.org> wrote:
Hi

On Thu, Feb 9, 2017 at 2:20 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Certainly.  We did mention the dependency in the email. Would it be better
> to mention it in the patch name?

I think the problem was that the way you phrased it, it sounded
optional ("an updated patch which does not include adding that test
helper in case you apply the show-tables patch first"). I think a
clear "This patch is dependent on patch Foo" would suffice.

> Is there a better way for us to manage
> these changes? On other open source projects, I've seen github mirrors set
> up so that changes can be pulled in like branches rather then as patch
> applies. That would have avoided this situation since the parent commit
> would be pulled in with the same SHA from either pull request branch and git
> would not see it as a conflict.
>
> I'm rather new to dealing with patch files like this so I would love some
> tips.

The Postgres project in general is quite conservative and stuck in
it's ways about how things are done (which is usually a good thing
considering you trust your data to the resulting code). We're used to
dealing with larger patchsets via the mailing list - typically as long
as you're clear about any dependencies, it shouldn't be a problem.
Some of us use tools like PyCharms for handling patches and helping
with reviews etc. which I guess replaces most, if not all of the
GitHub functionality over plain git.

--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Dave Page
Date:
Hi,

I've been playing with this for the last couple of hours, and I just
can't get it to work reliably:

- A good percentage of the time the browser opens with a URL of
"data:," and does nothing more. This appears to happen after a failed
test run, which leaves server processes running in the background.

- The connect_to_server test usually seems to work.

- The template_selection_feature test usually does *not* work. I can't
see an obvious reason, but I suspect it's a race condition. What seems
to happen is that the function definition is entered, but not
registered by the UI, so the mSQL panel just ends up saying
"incomplete definition". Manually checking what was input proves that
everything is correct - and indeed, returning to the SQL tab shows the
expected SQL.

Other issues I noted:

- The template_selection_feature test should just enter BEGIN/END.
What it currently enters is an entire function definition, when only
the body content is expected. E.g.

        self.page.fill_codemirror_area_with(
"""BEGIN

END;
"""
        )

- Screenshots are being taken of failed tests:
  1) I've never actually seen any get saved
  2) They should be saved to the same directory as the test log, not /tmp
  3) They should have guaranteed unique names, and be mentioned in the
test output so the user can reference the image to the failure.

The reason the last two items are important is that I've now got a
test server running the test suite with every supported version of
Python, for every supported database (well, almost, pending a couple
of fixes). I have separate workspaces for each Python version, and a
single test run might run every test 10 times, once for each database
server.
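The unique-naming requirement above could be met with a small path helper saved alongside the test log; function name, argument names, and filename layout here are illustrative only, not existing pgAdmin code:

```python
import os
import time
import uuid


def failure_screenshot_path(log_dir, test_name):
    """Build a unique screenshot path in the test-log directory.

    Addresses the review feedback: save next to the test log (not /tmp)
    and guarantee a unique name, so parallel runs against several Python
    versions and database servers cannot collide. The returned path can
    then be printed in the test output next to the failure.
    """
    filename = "%s-%s-%s.png" % (
        test_name,
        time.strftime("%Y%m%d-%H%M%S"),
        uuid.uuid4().hex[:8],  # random suffix guarantees uniqueness
    )
    return os.path.join(log_dir, filename)
```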

- Please wrap the README at < 80 chars.



On Thu, Feb 9, 2017 at 4:17 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hi Dave,
>
>>  I think the problem was that the way you phrased it,
>
>
> You're right, we totally messed that up. We were talking about making 3
> patches and ended up making only 2 and forgot to reword that bit.
> Sorry about that.
>
> Here are the two patches for this change that resolves the AttributeError
> you were seeing. The first patch is identical to the patch of the same name
> in the other email thread.
>
>> We're used to
>> dealing with larger patchsets via the mailing list - typically as long
>> as you're clear about any dependencies, it shouldn't be a problem.
>
>
> Great! We'll try sending patchsets from now on and hopefully that resolves
> some of the issues we were seeing.
>
> Tira & George
>
> On Thu, Feb 9, 2017 at 9:28 AM, Dave Page <dpage@pgadmin.org> wrote:
>>
>> Hi
>>
>> On Thu, Feb 9, 2017 at 2:20 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>> > Certainly.  We did mention the dependency in the email. Would it be
>> > better
>> > to mention it in the patch name?
>>
>> I think the problem was that the way you phrased it, it sounded
>> optional ("an updated patch which does not include adding that test
>> helper in case you apply the show-tables patch first"). I think a
>> clear "This patch is dependent on patch Foo" would suffice.
>>
>> > Is there a better way for us to manage
>> > these changes? On other open source projects, I've seen github mirrors
>> > set
>> > up so that changes can be pulled in like branches rather then as patch
>> > applies. That would have avoided this situation since the parent commit
>> > would be pulled in with the same SHA from either pull request branch and
>> > git
>> > would not see it as a conflict.
>> >
>> > I'm rather new to dealing with patch files like this so I would love
>> > some
>> > tips.
>>
>> The Postgres project in general is quite conservative and stuck in
>> it's ways about how things are done (which is usually a good thing
>> considering you trust your data to the resulting code). We're used to
>> dealing with larger patchsets via the mailing list - typically as long
>> as you're clear about any dependencies, it shouldn't be a problem.
>> Some of us use tools like PyCharms for handling patches and helping
>> with reviews etc. which I guess replaces most, if not all of the
>> GitHub functionality over plain git.
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company


Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Atira Odhner
Date:
Hi Dave,

We fixed the flakiness issues that we saw (hopefully they are the same ones you were seeing) by tearing down connections to the acceptance_test_db before attempting to drop it at the beginning of the test. Once we have access to the CI pipeline, we can help out there to ensure the flakiness is gone.
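The teardown-then-drop sequence amounts to terminating any backends still connected to the test database before issuing DROP DATABASE, since PostgreSQL refuses to drop a database with live connections. A sketch of the SQL involved (the helper name is made up for illustration, and the database name is assumed to be a trusted, test-controlled identifier, so no quoting is done; `pg_stat_activity.pid` assumes PostgreSQL 9.2 or later):

```python
def terminate_and_drop_sql(db_name):
    """Return the SQL statements to force-drop a test database.

    pg_terminate_backend() kills sessions still connected to the
    database (excluding our own); without this, DROP DATABASE fails
    with "database ... is being accessed by other users".
    """
    return [
        "SELECT pg_terminate_backend(pid) FROM pg_stat_activity "
        "WHERE datname = '%s' AND pid <> pg_backend_pid()" % db_name,
        "DROP DATABASE IF EXISTS %s" % db_name,
    ]
```

The two statements would be run in order on a maintenance connection (e.g. to the `postgres` database) during test setup.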

We wrapped the README at 80 characters, and removed the misleading function definition from the test.

As far as the screenshots go, I'm more inclined to remove the screenshotting than to work on improving it. It currently only works when the failure is due to an AssertionError, since that's what failureException relies on.

We also renamed acceptance to feature_tests since 'acceptance' seemed ambiguous/redundant with 'regression'.

Tira & Sara


On Mon, Feb 13, 2017 at 9:36 AM, Dave Page <dpage@pgadmin.org> wrote:
Hi,

I've been playing with this for the last couple of hours, and I just
can't get it to work reliably;

- A good percentage of the time the browser opens with a URL of
"data:," and does nothing more. This appears to happen if tests fail,
which still leaves server processes running in the background.

- The connect_to_server test usually seems to work.

- The template_selection_feature test usually does *not* work. I can't
see an obvious reason, but I suspect it's a race condition. What seems
to happen is that the function definition is entered, but not
registered by the UI, so the mSQL panel just ends up saying
"incomplete definition". Manually checking what was input proves that
everything is correct - and indeed, returning the SQL tab shows the
expected SQL.

Other issues I noted:

- The template_selection_feature test should just enter BEGIN/END.
What it currently enters is an entire function definition, when only
the body content is expected. E.g.

        self.page.fill_codemirror_area_with(
"""BEGIN

END;
"""
        )

- Screenshots are being taken of failed tests:
  1) I've never actually seen any get saved
  2) They should be saved to the same directory as the test log, not /tmp
  3) They should have guaranteed unique names, and be mentioned in the
test output so the user can reference the image to the failure.

The reason the last two items are important is that I've now got a
test server running the test suite with every supported version of
Python, for every supported database (well, almost, pending a couple
of fixes). I have separate workspaces for each Python version, and a
single test run might run every test 10 times, once for each database
server.

- Please wrap the README at < 80 chars.



On Thu, Feb 9, 2017 at 4:17 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hi Dave,
>
>>  I think the problem was that the way you phrased it,
>
>
> You're right, we totally messed that up. We were talking about making 3
> patches and ended up making only 2 and forgot to reword that bit.
> Sorry about that.
>
> Here are the two patches for this change that resolves the AttributeError
> you were seeing. The first patch is identical to the patch of the same name
> in the other email thread.
>
>> We're used to
>> dealing with larger patchsets via the mailing list - typically as long
>> as you're clear about any dependencies, it shouldn't be a problem.
>
>
> Great! We'll try sending patchsets from now on and hopefully that resolves
> some of the issues we were seeing.
>
> Tira & George
>
> On Thu, Feb 9, 2017 at 9:28 AM, Dave Page <dpage@pgadmin.org> wrote:
>>
>> Hi
>>
>> On Thu, Feb 9, 2017 at 2:20 PM, Atira Odhner <aodhner@pivotal.io> wrote:
>> > Certainly.  We did mention the dependency in the email. Would it be
>> > better
>> > to mention it in the patch name?
>>
>> I think the problem was that the way you phrased it, it sounded
>> optional ("an updated patch which does not include adding that test
>> helper in case you apply the show-tables patch first"). I think a
>> clear "This patch is dependent on patch Foo" would suffice.
>>
>> > Is there a better way for us to manage
>> > these changes? On other open source projects, I've seen github mirrors
>> > set
>> > up so that changes can be pulled in like branches rather than as patch
>> > applies. That would have avoided this situation since the parent commit
>> > would be pulled in with the same SHA from either pull request branch and
>> > git
>> > would not see it as a conflict.
>> >
>> > I'm rather new to dealing with patch files like this so I would love
>> > some
>> > tips.
>>
>> The Postgres project in general is quite conservative and stuck in
>> its ways about how things are done (which is usually a good thing
>> considering you trust your data to the resulting code). We're used to
>> dealing with larger patchsets via the mailing list - typically as long
>> as you're clear about any dependencies, it shouldn't be a problem.
>> Some of us use tools like PyCharm for handling patches and helping
>> with reviews etc. which I guess replaces most, if not all of the
>> GitHub functionality over plain git.
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
>
>



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

Attachments

Re: [pgadmin-hackers] Acceptance Tests against a browser (WIP)

From
Dave Page
Date:
Thanks, patch applied!

On Tue, Feb 21, 2017 at 10:12 PM, Atira Odhner <aodhner@pivotal.io> wrote:
> Hi Dave,
>
> We fixed the flakiness issues that we saw (hopefully they are the same ones
> you were seeing) by tearing down connections to the acceptance_test_db
> before attempting to drop it at the beginning of the test. Once we have
> access to the CI pipeline we can help out there to ensure the flakiness is
> gone.
>
> We wrapped the README at 80 characters, and removed the misleading function
> definition from the test.
>
> As far as the screenshots go, I'm more inclined to remove the screenshotting
> than to work on improving it. It currently only works when the failure is
> due to an AssertionError since that's what failureException relies on.
>
> We also renamed acceptance to feature_tests since 'acceptance' seemed
> ambiguous/redundant with 'regression'.
>
> Tira & Sara
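The teardown Tira describes maps to PostgreSQL's pg_terminate_backend(): end every other session connected to the test database before issuing DROP DATABASE. A sketch of the query builder (the helper itself is illustrative; the `pid` column assumes PostgreSQL 9.2 or later, where `procpid` was renamed):

```python
def terminate_connections_sql(dbname):
    """SQL that disconnects all other sessions from `dbname`, so a
    following DROP DATABASE cannot fail with 'database ... is being
    accessed by other users'."""
    return (
        "SELECT pg_terminate_backend(pid) "
        "FROM pg_stat_activity "
        "WHERE datname = '%s' AND pid <> pg_backend_pid();" % dbname
    )
```

Running this at the start of the suite makes the drop-and-recreate step deterministic even when an earlier failed run left connections open.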



--
Dave Page
Blog: http://pgsnake.blogspot.com
Twitter: @pgsnake

EnterpriseDB UK: http://www.enterprisedb.com
The Enterprise PostgreSQL Company