2 GB minimum, as stated by MS.
However, in my experience 4 GB is probably the practical minimum for Vista; sometimes even my computer's 6 GB isn't enough.
Vista doesn't really consume much more memory, on its own, than XP; the major difference is its file caching. XP's memory cache was often quite small, and the concepts that SuperFetch implements were really only present in an embryonic form (the Prefetch folder), stored as files on disk. SuperFetch basically keeps that prefetch data in RAM instead, along with prefetch data for data files and DLLs as well. Right now, not counting SuperFetch, I'm using around 25% of my physical memory, so 2 GB of my 8 GB, and I'm running a good number of programs too: VB6, APIViewer, Firefox with around 20-30 tabs, Paint Shop Pro, an IRC client, etc.
Basically, the "slowness" one experiences with Vista and 7 on less RAM is not really slowness but rather a lack of extra speed, if that makes sense: it's the "default" speed Vista and 7 run at when SuperFetch has less, or nothing, to work with. The problem is that many people then turn to a "RAM optimizer" such as RAMpage or AnalogX MaxMem. These are great for displaying memory usage, but their actual memory-freeing idea is somewhat misplaced: all they do is push most of what is in physical memory out to virtual memory by allocating as much as possible. This also happens to clear out SuperFetch, since the allocations encroach on its space (the "free" space in RAM, that is).
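To make that concrete, here is a toy simulation of what a "RAM optimizer" effectively does. This is a hypothetical model with made-up page counts, not real Windows memory-manager internals: the optimizer allocates as much as possible, which forces the OS to drop the SuperFetch cache and page other processes out to the swap file, and then frees the allocation. RAM now looks "free", but every other program has to fault its pages back in from disk.

```python
# Toy model of a "RAM optimizer" (hypothetical numbers, not real Windows
# memory-manager behavior).

PHYSICAL_RAM = 100  # pages

def run_optimizer(resident, cache):
    """resident maps process name -> resident pages; cache is SuperFetch pages."""
    demand = PHYSICAL_RAM   # the optimizer allocates all of physical memory
    cache = 0               # the prefetch cache goes first: it is "free-able" space
    paged_out = {}
    in_use = sum(resident.values())
    # Next, other processes' pages get evicted to the swap file, one at a time.
    while in_use + demand > PHYSICAL_RAM and in_use > 0:
        for proc in resident:
            if resident[proc] > 0:
                resident[proc] -= 1
                paged_out[proc] = paged_out.get(proc, 0) + 1
                in_use -= 1
    # Finally the optimizer frees its own allocation: lots of "free" RAM...
    free_ram = PHYSICAL_RAM - sum(resident.values())
    return free_ram, resident, paged_out, cache

free, resident, swapped, cache = run_optimizer({"word": 30, "firefox": 40}, cache=25)
print(free, swapped)  # 100 {'word': 30, 'firefox': 40}
```

The "optimized" result: 100% free RAM, zero cache, and every hot page of Word and Firefox sitting on disk waiting to be faulted back in.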
The main thing that people forget is that a single CPU can only perform a single task at a time; more processes and fewer processors means more context switches between processes. A context switch is not instantaneous, and when the timeslice is too short and there are a lot of processes/threads running, the switching can incur close to 25% overhead.
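A back-of-the-envelope way to see where that overhead figure comes from (the microsecond numbers below are illustrative assumptions, not measured Windows figures): if each switch costs a fixed amount of time and each process only runs for a short quantum between switches, the wasted fraction is switch / (quantum + switch).

```python
# Simple model of context-switch overhead: fraction of CPU time lost to
# switching, given a timeslice (quantum) and a fixed per-switch cost.
# Both numbers are illustrative, not measured.

def switch_overhead(quantum_us, switch_us):
    return switch_us / (quantum_us + switch_us)

# A generous timeslice wastes almost nothing...
print(f"{switch_overhead(20000, 5):.4%}")   # ~0.02%
# ...but a tiny timeslice with many runnable threads approaches 25%.
print(f"{switch_overhead(15, 5):.0%}")      # 25%
```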
In early versions of Windows, multitasking was cooperative; that is, a program needed to actually relinquish control to other applications. This didn't necessarily need to be written into the application explicitly: the default window procedure and GetMessage() would call the kernel "yield" function, which allowed other programs to run their timeslices as well. However, if a long task occurred in a program and that program did not explicitly call yield, or a function that itself called yield (such as the Visual Basic "DoEvents" function), then the entire PC would hang on that process.
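The cooperative model can be sketched with Python generators standing in for programs; this is only an analogy (Win16 yielded via the message loop, not generators), but `yield` here plays the same role as the implicit kernel yield inside GetMessage() or DoEvents:

```python
# Toy cooperative scheduler in the spirit of 16-bit Windows (a sketch, not
# the real Win16 mechanism). A task that never yields would starve all the
# others, exactly like a hung Win16 app froze the whole PC.

log = []

def polite(name):
    """A well-behaved program: does a little work, then yields."""
    for i in range(2):
        log.append(f"{name}:{i}")
        yield               # relinquish control, like calling GetMessage()

def cooperative_scheduler(tasks):
    """Round-robin: run each task until it yields; drop finished tasks."""
    while tasks:
        task = tasks.pop(0)
        try:
            next(task)      # runs until the task yields...
            tasks.append(task)
        except StopIteration:
            pass            # ...or finishes

cooperative_scheduler([polite("calc"), polite("word")])
print(log)  # ['calc:0', 'word:0', 'calc:1', 'word:1']
```

Replace one `polite` task with a `while True: pass` loop and `cooperative_scheduler` never returns: that is the hung-PC scenario.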
With today's preemptive systems this is not the case. A program can "seem" to take up 100% of the CPU, but really it isn't: the preemptive task switcher is still allowing other processes to run (to allow for things such as window drawing* and mouse movement), just at a reduced timeslice. This is why we are still able to start Task Manager to close the process, or even to see that the process is using pseudo-100%. (Think about it: Task Manager obviously needs some CPU time to keep its listview updated, so for all intents and purposes there is no way you can both see 100% in Task Manager and have a process truly at 100%.) The only time you can actually have a process at 100% is if it manages to hang a driver; this sometimes happens with video drivers. You see it in the form that EVERYTHING stops and you cannot do anything. With Vista, the system can often recover from this state and inform you of the issue, but until it recovers nothing at all happens: since the stuck code is in kernel mode and the preemptive task switching only applies to user mode, no preemptive task switch can occur. (Driver code CAN relinquish control, but there is so much state information that would then need to be saved and restored that it is not really worth it, especially since if the driver hangs you're pretty much boned anyway.)
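You can see preemption in miniature with two Python threads (threads here stand in for processes; the exact scheduling is up to the interpreter and OS, so this only demonstrates the principle): a thread spinning at "100%" cannot stop another thread from getting timeslices.

```python
# Demonstration of preemptive scheduling: a busy-looping thread does not
# monopolize the CPU, because the scheduler preempts it. The other thread
# (our stand-in for Task Manager) still makes progress.
import threading
import time

stop = threading.Event()

def busy_loop():                 # the program "stuck at 100%"
    while not stop.is_set():
        pass

t = threading.Thread(target=busy_loop, daemon=True)
t.start()

# Despite the spinner, this thread still gets timeslices and counts ticks.
ticks = 0
deadline = time.monotonic() + 0.5
while time.monotonic() < deadline:
    ticks += 1
stop.set()
print(ticks > 0)  # True: the busy loop never truly had 100% of the CPU
```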
Footnote: * "to allow for things like window drawing". Many might say: "But BC! Why is it that when Microsoft Word has crashed, and I move the Calculator window over it, I see a lot of little calculators?"
Ahh, this is an artifact of the way Windows draws the desktop. You'll also notice that this behaviour does not happen with Vista's DWM.
With the pre-DWM desktop, every window was told to repaint when another window moved off of it; in this case, moving the Calculator window makes the window manager send the Word window a WM_PAINT message. Under normal circumstances, Word would oblige and repaint its window. With Word hung, however, the message is instead simply left in the window's message queue; if and when Word recovers from whatever was causing it to hang (perhaps an errant VBA script), all these messages are retrieved and handled, and everything returns to normal.
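A minimal sketch of that queue behaviour (a hypothetical model, not the real Win32 message loop; in real Windows, pending WM_PAINT messages are actually coalesced into a single paint region rather than queued individually, which I've ignored here for simplicity):

```python
# Why a hung window piles up paint requests: posting a message never
# blocks, it just enqueues; messages are only handled when the window's
# own message loop runs again.
from collections import deque

class Window:
    def __init__(self):
        self.queue = deque()
        self.hung = False
        self.repaints = 0

    def post(self, msg):
        self.queue.append(msg)        # like PostMessage: always just enqueues

    def pump(self):
        """The window's GetMessage/DispatchMessage loop."""
        if self.hung:
            return                    # stuck in a long task; the queue grows
        while self.queue:
            if self.queue.popleft() == "WM_PAINT":
                self.repaints += 1    # handle the paint request

word = Window()
word.hung = True                      # e.g. an errant VBA script
for _ in range(5):                    # dragging Calculator across Word
    word.post("WM_PAINT")
word.pump()                           # nothing happens; 5 messages queued
word.hung = False                     # Word recovers...
word.pump()                           # ...and drains the backlog
print(word.repaints)  # 5
```

While `hung` is set, the screen region under Calculator never gets repainted, which is exactly the "lots of little calculators" trail.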
Vista and Windows 7's DWM completely changes this architecture. (It may be that the new architecture is partially the cause of the increased base memory usage when Aero is in use: not necessarily the eye candy, which is really just bitmaps, no different from the Luna themes of XP, but rather the increased state information for the composition of the desktop, as well as keeping two copies of every single window surface for compatibility reasons; after all, old applications were not written for, nor could they have predicted, a composited desktop.) The DWM basically takes the textures from every window and composites them together, so nothing visually breaks when a window hangs. Instead, that window's texture is "faded out" compared to the other windows, and, since the application is no longer drawing, its texture stops changing as well. This is a far more intuitive visual cue for an application hang than a bunch of little visual artifacts on the screen, especially since the preemptive multitasking is vastly improved, as is the heuristic detection of a hung program; it is possible, but very unlikely, that the "faded out" effect will be seen on a program that is merely busy rather than one that is actually hung.
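The compositing idea reduces to something like this toy model (a sketch of the concept only; the real DWM works with GPU textures and is far more involved): the compositor keeps the last texture each window rendered, so a hung window simply keeps showing its stale texture, dimmed, instead of leaving artifacts behind.

```python
# Toy desktop compositor: every frame is built from each window's last
# rendered texture, so a hung window still draws correctly; it just gets
# a reduced alpha as the visual "not responding" cue.

def composite(windows):
    """windows: list of (texture, responsive) pairs in back-to-front order."""
    frame = []
    for texture, responsive in windows:
        alpha = 1.0 if responsive else 0.6   # fade out the hung window
        frame.append((texture, alpha))
    return frame

# Word is hung but its last texture is still available to the compositor.
frame = composite([("word-last-texture", False), ("calc-texture", True)])
print(frame)  # [('word-last-texture', 0.6), ('calc-texture', 1.0)]
```

Contrast this with the pre-DWM model, where a hung window contributed nothing and the desktop showed whatever stale pixels were left underneath.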