Laurence Payne <NOSPAMlpayne1ATdsl.pipex.com> wrote:
>On 22 Oct 2007 11:06:02 -0400, (Scott Dorsey) wrote:
>>Sure, but data sets are often huge when compared with physical ram. It's
>>easy to make a PT system thrash if you throw enough tracks onto it.
>>No matter how fast and large computers get, users always figure out ways to
>>make them slow.
>Does PT have particularly bad design in this area? Every multitrack
>program I've worked with is designed to stream data to/from disk as
>required. Given ample RAM, the program and/or os can be pretty clever
>about caching, cutting down disk activity if you're continually
>rolling over a particular section of music. But I don't see how
>"thrashing" comes into it?
Ideally you want to cache as much of the data set in memory as possible,
so that you aren't living at the whim of (nondeterministic) disk access
times. But if you cache _too much_, the application's own memory image gets
swapped out to disk and then you're back where you started.
Remember, paging and swapping are done _by the operating system_, without
the application having any control over them. This isn't a realtime system,
this is Windows. So the application has no idea whether it's going to make
its deadline, and there's no way it can ask the OS to find out. Consequently
we just throw hardware at the problem and everything is fine for a while.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."