On 11/18/2015 5:10 PM, Jonathan Vanasco wrote:
> As a temporary fix I need to write some uploaded image files to PostgreSQL until a task server can
> read/process/delete them.
>
> The problem I've run into (via server load tests that model our production environment) is that these read/writes
> end up pushing the indexes used by other queries out of memory -- causing them to be re-read from disk. These files
> can be anywhere from 200k to 5MB.
>
> Has anyone dealt with situations like this before and has any suggestions? I could use a dedicated db connection if
> that would introduce any options.
We have a system that accepts batches of uploaded files and queues them
for background processing. We don't load the files into Postgres at all:
we put them in a temp directory and just save the location of each file
to the database. This configuration does have limitations --
post-processing cannot be load-balanced across servers unless the temp
directory is shared.
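In case it helps, here's a minimal sketch of that pattern in Python. The directory handling and function/table names (`stage_upload`, `upload_queue`) are just illustrative, not from our actual system:

```python
import os
import tempfile
import uuid

# In production this would be a fixed (possibly shared) directory;
# a fresh temp dir keeps the sketch self-contained.
UPLOAD_DIR = tempfile.mkdtemp(prefix="uploads-")

def stage_upload(data: bytes) -> str:
    """Write the uploaded bytes to the staging directory and
    return the path to record in the database."""
    path = os.path.join(UPLOAD_DIR, uuid.uuid4().hex)
    with open(path, "wb") as f:
        f.write(data)
    return path

# The web tier then inserts only the path, e.g. with psycopg2:
# cur.execute(
#     "INSERT INTO upload_queue (path, status) VALUES (%s, 'pending')",
#     (stage_upload(image_bytes),),
# )
```

The task server reads the path from the queue table, processes the file, and deletes both the file and the row -- the large binary payload never touches Postgres or its buffer cache.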
I'm sure you'll get more DB centric answers from others on the list.
Roxanne
--
[At other schools] I think the most common fault in general is to teach students how to pass exams instead of teaching
them the science.
Donald Knuth