Hi Martin,
I haven't hit that yet, but I may soon, so I've been wondering about it.
Possible solution:
My sgifaceserver runs as user "nobody". So, give nobody a home directory,
and a shell, and in the .cshrc (or .bashrc or whatever) add a line
"limit datasize unlimited" (that's csh syntax; for sh/bash use "ulimit -d unlimited").
(Actually datasize defaults to unlimited on my Solaris 8 system, see below.
But my sgifaceserver does occasionally crash for lack of file descriptors.)
% limit
cputime unlimited
filesize unlimited
datasize unlimited
stacksize 8192 kbytes
coredumpsize 0 kbytes
descriptors 256
memorysize unlimited
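An alternative to relying on nobody's shell dotfiles is a small wrapper script that raises the limits and then launches the server, so the daemon inherits them however it gets restarted. A sketch below; the server path, database directory, and port are illustrative, not from any real install:

```shell
#!/bin/sh
# start-sgifaceserver.sh -- raise per-process limits, then launch the server.
# Hypothetical paths/port; substitute your own installation's values.

ulimit -n 1024                   # descriptors: the 256 default is easy to exhaust
ulimit -d unlimited 2>/dev/null  # datasize, where the hard limit permits

echo "descriptors now: $(ulimit -n)"

# exec /usr/local/bin/sgifaceserver /usr/local/acedb 20000
```

Note that a non-root process can only raise its soft limits up to the hard limits, so if the system hard limit itself is too low you still need root (or, on Solaris, an /etc/system tweak) to lift it.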
- Dave
> To: bionet-software-acedb at net.bio.net
> From: martin.trick at bbsrc.ac.uk (Martin Trick)
> Subject: sgifaceserver with big databases
> Date: Mon, 11 Nov 2002 16:08:07 +0000 (GMT)
>
> Apologies to all, as I guess this is mainly a UNIX SysAdmin question ...
>
> My ACEDB database is now rather large and I'm experiencing a problem
> with sgifaceserver 4.9f crashing as it tries to grab more memory than the
> user process datasize. I tried to up the datasize limit as root but I
> guess that only lasts as long as the current environment? As soon as the
> server crashes and restarts it inherits the (inadequate) hard limits
> that came with the system.
>
> Is there any way around this apart from rebuilding the kernel :-(
>
> thanks in advance
>
> Martin Trick
---