Newsgroups: comp.os.386bsd.questions
Path: sserve!manuel.anu.edu.au!munnari.oz.au!constellation!convex!convex!cs.utexas.edu!wupost!sdd.hp.com!caen!batcomputer!ghost.dsi.unimi.it!univ-lyon1.fr!scsing.switch.ch!ira.uka.de!math.fu-berlin.de!news.netmbx.de!Germany.EU.net!mcsun!news.funet.fi!news.lut.fi!junki
From: junki@lut.fi (Juha Nurmela)
Subject: Re: gcc - large arrays, out of vm - any way to avoid?
Sender: news@lut.fi (Usenet News)
Message-ID: <C3Dr9u.Dpz@lut.fi>
Date: Thu, 4 Mar 1993 19:53:53 GMT
References: <9303022137.AA04169@pizzabox.demon.co.uk>
Nntp-Posting-Host: kobra.cc.lut.fi
Organization: Lappeenranta University of Technology, Lappeenranta, Finland
Lines: 43

In article <9303022137.AA04169@pizzabox.demon.co.uk>, gtoal@gtoal.com (Graham Toal) writes:
|> I'm writing a program which has very little source code, but a whapping
|> big initialised char array at the head of it. Well, I say 'whapping big',
|> but in fact it's only 50K, yet it's running out of virtual memory during the
|> compile (with the error: "prog.c:2683: Virtual memory exhausted.")
|>
|> I've tried making the array static, or putting it inside main as an auto.
|> No help. Any suggestions how to get round this? Do I have to split it
|> up into lots of separate arrays? :-( If it's a solution like that that's
|> needed, I can hack it myself - I'm really more looking for some life-saving
|> flag I can give that'll just make everything work magically... (or even just
|> an explanation of why gcc can't cope with this, to satisfy my curiosity...)
|>
|> This is the gcc that first came out with 386bsd; the machine has 16Mb of
|> RAM and I think 8Mb swap space.
|>
|> Thanks.
|>
|> #include <stdio.h>
|> #include <stdlib.h>
|>
|> char prog[] = {
|> /* The 50915 elements of this array have been removed for brevity */
|> };
|>
|> int main(int argc, char **argv)
|> {
|> /* Prog deleted for brevity too - still went wrong with a null main */
|> return(1);
|> }

I have experienced this too, with both 1.39 and 2.3.3. gcc seems to eat
lots of memory on big { 'x', 'y', 'z', ... } initializers. The default
memory limits are somewhere near 6 M (?); my solution was to raise the
datasize limit:

sulo $ exec csh
% limit datasize 16M
% exec sh

It pages like hell, but eventually compiles...
--
juha nurmela, Adr. 54430 Hujakkala, Finland. Tel. 953 78022
or rather: skinnarilank. 28d10, Tel. 953 26292