We are addressing the same issues on the Loci list as the Perl-XML list is, interestingly. This thread concerns heavy loading of web servers by XML parsing of large files.

David Lapointe
Manager - Research Computing Services
UMass Medical School
Worcester, MA 01655
508/856-5141

"What we obtain too cheap, we esteem too lightly." - T. Paine

-----Original Message-----
From: Tim Bray [mailto:tbray at textuality.com]
Sent: Monday, June 14, 1999 6:13 PM
To: Perl-XML Mailing List
Cc: Perl-XML Mailing List
Subject: Re: CGI performance issues was Re: Xmerge or XSL...

At 11:00 PM 6/14/99 +0000, Matt Sergeant wrote:
>- 30 meg seems *way* high. I don't know what's going on there - but I've
>been seeing about 10 maximum with mod_perl and XML stuff going on. I
>guess our XML files are a lot smaller. Things to look out for: Using
>XML::DOM (sorry Enno!) - it gobbles memory like no tomorrow.

No need to apologize to Enno; it's an axiom that to load N bytes of data
into a tree, you're going to burn somewhere between 5 and 50 times N worth
of memory, depending on how dense the tagging is, how clever your data
structures are, and what programming language you're using.

Basically, if you want to read chunks of XML of unconstrained size off the
network, you simply cannot afford to use DOM-like APIs; that's all there is
to it. Otherwise you're staring down the barrel of a loaded cannon.

-Tim
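
[Editor's note: for readers wanting the stream-based alternative Tim is
pointing at, here is a minimal sketch using the XML::Parser module's
expat-backed handlers. The file name and the element-counting logic are
illustrative assumptions, not anything from the thread; the point is that
no tree is built, so memory stays roughly constant regardless of file size.]

    use strict;
    use XML::Parser;

    my %element_count;

    my $parser = XML::Parser->new(
        Handlers => {
            # Called once per start tag as the document streams past;
            # nothing is retained, so a 30 MB file does not balloon to
            # 150-1500 MB the way a 5-50x DOM tree might.
            Start => sub {
                my ($expat, $element, %attrs) = @_;
                $element_count{$element}++;
            },
        },
    );

    $parser->parsefile('big.xml');    # hypothetical input file

    for my $element (sort keys %element_count) {
        print "$element: $element_count{$element}\n";
    }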