Re: memory allocation
- Subject: Re: memory allocation
- From: Craig Markwardt <craigmnet(at)cow.physics.wisc.edu>
- Date: 27 Jul 1999 01:36:23 -0500
- Newsgroups: comp.lang.idl-pvwave
- Organization: U. Wisc. Madison Physics -- Compact Objects
- References: <379CC4A9.846FB985@cmrr.umn.edu> <MPG.120682dbfcff9cbf98984c@news.frii.com> <379CDED1.E303E67F@cmrr.umn.edu> <MPG.1206c5d4996f504498984f@news.frii.com>
- Reply-To: craigmnet(at)cow.physics.wisc.edu
- Xref: news.doit.wisc.edu comp.lang.idl-pvwave:15845
davidf@dfanning.com (David Fanning) writes:
>
> Essa Yacoub (yacoub@cmrr.umn.edu) writes:
>
> > I am assigning an array using the complex function
> >
> > rd = complex(mag_regress*cos(phase_regress), mag_regress*sin(phase_regress))
> >
> > >>99% of the time this turns out to be a programming error
> >
> > The code works, because I have successfully used it on several data
> > sets of different sizes. In this particular case, the data size is
> > much larger. I am running this on an SGI Onyx 2 system with 2 GB of
> > RAM, so that should not be a problem. In fact, I can open another
> > session in another window and create bigger arrays with no problem.
> > It actually crashed before the above statement; I zeroed an array I
> > was no longer using and it continued a while longer (using .con)
> > before crashing again at a different line.
>
> I'm going to guess that there are more of these large arrays
> hanging around that you are also not cleaning up. When you
> ...
David makes some good suggestions for being memory efficient in your
IDL programs. However, if you are running on a Unix system, you may be
hitting the memory limit for a single process. Process memory
footprints are typically capped to keep one task from hogging all of
the resources, but sometimes hogging the resources is exactly what you
want!
You can try either limit (csh or tcsh) or ulimit (sh). By itself,
"limit" produces this output for me on an alpha machine:
[33]> limit
...
datasize 131072 kbytes
...
If you get a small number like I did, then you can try increasing it,
like so:
limit datasize 2048M # 2048 megabytes = 2 gigabytes
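If your login shell is sh, ksh, or bash instead, the ulimit equivalent
would look roughly like this (a sketch only; the option letters and
units differ between shells and Unix flavors, so check your shell's
man page):
ulimit -a               # list the current limits; look for the data segment size
ulimit -d 2097152       # raise the data segment limit to ~2 GB (value in kbytes)
ulimit -d unlimited     # or lift it entirely, if the hard limit allows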
Usage may vary, depending on your shell and Unix version. If you
can't increase it yourself, it's time to abase yourself to your
sysadmin.
Good luck,
Craig
--
--------------------------------------------------------------------------
Craig B. Markwardt, Ph.D. EMAIL: craigmnet@cow.physics.wisc.edu
Astrophysics, IDL, Finance, Derivatives | Remove "net" for better response
--------------------------------------------------------------------------