All of us have heard that if you just add memory to your system, it will run faster. This is generally accepted, and it makes a lot of sense, but what is the math behind it?
About two years ago, the presenter of a webcast I was viewing put these two elements in a perspective that I still relay to others when speaking about memory. Basically, disk access is measured in milliseconds; RAM access is measured in nanoseconds. There are 1 million nanoseconds in 1 millisecond. In other words, the access times differ by a factor of 1 million; that's pretty big. In real human time, I can't distinguish between a nanosecond and a millisecond, but there is a HUGE difference.
Just about everyone can look at a watch or a clock and know how long a second takes. If you sit and stare at that timepiece until 1 million seconds tick by, you will be staring at it for 11.57 days!
1,000,000 seconds / 60 (seconds per minute) / 60 (minutes per hour) / 24 (hours per day) = 11.57 days
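The arithmetic above is easy to check yourself. Here's a small Python sketch that works out both the disk-vs-RAM ratio and the "1 million seconds" conversion, using round illustrative numbers (1 ms per disk access, 1 ns per RAM access) rather than measurements from any particular hardware:

```python
# Illustrative access times (assumed round numbers, not benchmarks):
disk_access_s = 1e-3   # ~1 millisecond per disk access, in seconds
ram_access_s = 1e-9    # ~1 nanosecond per RAM access, in seconds

# The magnitude gap between disk and RAM access times.
ratio = disk_access_s / ram_access_s
print(f"Disk is roughly {ratio:,.0f}x slower than RAM")

# Scale that same factor into human time: 1 million seconds in days.
days = 1_000_000 / 60 / 60 / 24
print(f"1 million seconds = {days:.2f} days")  # 11.57 days
```

Running this prints the factor of a million and the 11.57-day figure, which is the whole point of the watch analogy.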
As you can see, there is a HUGE difference between disk I/O and RAM access speeds. We all knew this inherently, but sometimes an example really helps - especially when budgeting season is imminent.