Discussion: Limits on Tables?
Hi there,

I have an app that has some basic tables. When a new project is created in
the app, it creates 5 new tables (inherited from some base ones). A single
project will never go outside of its tables.

What kind of speed problems will I be seeing when there are a lot of
projects? Say I have 100 projects; that would be 500 tables, plus a couple
of base tables, say 510 tables.

Am I going to start seeing major problems?

Gerald

---
#include <std_disclaimer>

Gerald Brandt                          gbr@hvdc.ca
Manitoba HVDC Research Centre          http://www.hvdc.ca
> Hi there,
>
> I have an app that has some basic tables. When a new project is created
> in the app, it creates 5 new tables (inherited from some base ones). A
> single project will never go outside of its tables.
>
> What kind of speed problems will I be seeing when there are a lot of
> projects? Say I have 100 projects, that would be 500 tables, plus a
> couple of base tables, say 510 tables.
>
> Am I going to start seeing major problems?

Could you share with us the tables you are using as base tables? I suspect
that you may not need this scheme of duplicating tables, but could instead
have a table with a project ID column that you use to segregate your
project data within these tables. At any rate, I would love to see the
tables you have, and then I would know how best to help....james
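The project-ID scheme James suggests can be sketched roughly as follows. All table and column names here are hypothetical, and SQLite stands in for PostgreSQL only to keep the sketch self-contained:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One shared table with a project_id column, instead of
# one set of tables per project.
cur.execute("""
    CREATE TABLE entries (
        project_id INTEGER NOT NULL,
        name       TEXT,
        value      REAL
    )
""")
# An index on project_id keeps per-project queries fast.
cur.execute("CREATE INDEX entries_project_idx ON entries (project_id)")

cur.executemany(
    "INSERT INTO entries (project_id, name, value) VALUES (?, ?, ?)",
    [(1, "a", 1.0), (1, "b", 2.0), (2, "c", 3.0)],
)

# Every query for a given project simply filters on project_id.
rows = cur.execute(
    "SELECT name FROM entries WHERE project_id = ? ORDER BY name", (1,)
).fetchall()
```

The point is that one indexed column replaces the table-per-project split, so the number of tables stays constant no matter how many projects exist.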
Hi James,

Actually, it all COULD be placed into the base tables, and in fact it was
before I separated them. I just thought that I would get a speed increase
by more cleanly separating the data. Then I sat down and thought about it
some more, and now I'm not sure. Hence the question...

Gerald

On 28-Jul-98 James Olin Oden wrote:
> Could you share with us the tables you are using as base tables? I
> suspect that you may not need this scheme of duplicating tables, but
> could instead have a table with a project ID column that you use to
> segregate your project data within these tables.
Gerald Brandt wrote:
> Actually, it all COULD be placed into the base tables, and in fact it was
> before I separated them. I just thought that I would get a speed increase
> by more cleanly separating the data. Then I sat down and thought about it
> some more, and now I'm not sure. Hence the question...

I think I understand now. As far as the original performance question you
asked, I am not in a position to answer it, but as a software tester (one
of the many hats I wear) I would suggest the following. Write a script or
program that loops and generates the SQL to create 500 sets of these
tables. Then write another program to fill the tables with pseudo-random
data. Then write a third program to run various inserts and selects against
this data. Being the devious tester that I am, I would have a shell or Perl
script kick off about a hundred of these, maybe a thousand if the system
can handle it (actually, if it can't, that is good to know too, because you
have found a limit of your system; unless it's a live system...). During
all this, I would log the time it takes to do each of these transactions.
At that point I might write another script to pull the logs together and
see if I could make something of the data... I might even put it in a
PostgreSQL database.

Anyway, that's a lot of work, but it is a sure way to find out just what
will happen before you spend a lot of time developing a polished user app
that bases its internals on this. It would probably even be fun, but... it
would certainly eat up at least an afternoon...james
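The first step of the test James describes, generating the SQL for many per-project tables filled with pseudo-random data, could be sketched like this. The table naming scheme and column layout are invented for illustration:

```python
import random

def generate_test_sql(n_projects, rows_per_table, seed=0):
    """Emit SQL statements that create one table per project and
    fill each with pseudo-random rows (names are hypothetical)."""
    rng = random.Random(seed)  # fixed seed so runs are repeatable
    stmts = []
    for p in range(n_projects):
        table = f"project_{p}_data"
        stmts.append(f"CREATE TABLE {table} (id INTEGER, value REAL);")
        for i in range(rows_per_table):
            stmts.append(
                f"INSERT INTO {table} VALUES ({i}, {rng.random():.6f});"
            )
    return stmts

# 500 projects at 500 rows each would reproduce Gerald's scenario;
# a tiny run shows the shape of the output.
stmts = generate_test_sql(n_projects=3, rows_per_table=2)
```

Piping the emitted statements into psql (and timing the runs, as James suggests) would give hard numbers rather than an impression.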
Hi there,

Okay, I've created a test. The test has 100 projects, and each project has
500 entries. So basically it created 500 new tables and 50,000 records.
The speed difference is negligible. I do not notice a slowdown, although I
have not done actual timing tests.

Today, I will add 50 more projects with 1000 entries each (again 50,000
records), essentially doubling the size of the database without
drastically increasing the number of tables.

Gerald
Hi there,

I've created my second test, mentioned below. I now have 150 projects,
with a total of 100,000 records. That works out to about 760 tables at the
same time. PostgreSQL handles this quantity of tables just fine. My
searches within a particular project are fast, and response is fantastic.

So, my conclusion: a high number of tables in PostgreSQL does not slow
down access, when you limit your queries to a small subset of those
tables.

Gerald

On 31-Jul-98 Gerald Brandt wrote:
> Okay, I've created a test. The test has 100 projects, and each project
> has 500 entries. So basically it created 500 new tables and 50,000
> records. The speed difference is negligible.
Hi, me again,

As a small addendum, the entire database is not that big, only 300 MB. I
will retry the tests later when I get a bigger hard drive, and let you all
know the results, if you are interested.

Gerald