Jarie makes an interesting point about the monitors themselves consuming a certain amount of VRAM. I've done some very VRAM-dependent machine learning work on a lower-VRAM card in a Mac (a GTX 980 in a cheesegrater) and had to unplug one monitor to free up that little bit of extra memory to avoid running out.
I also observed something else: the act of unplugging and replugging a monitor's DVI cable "clears out" the currently used VRAM. That is to say, imagine you've been working for a while with a bunch of programs open and have, say, 1 GB of VRAM free. Unplug the monitor's DVI cable, plug it back in, and you'll momentarily be back to 3 GB free.
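For anyone who wants to check this on their own machine, here's a minimal sketch using the CUDA runtime's cudaMemGetInfo (assuming an Nvidia card with the CUDA toolkit installed); run it before and after replugging the cable and compare the free figure:

```c
// Minimal free-VRAM check via the CUDA runtime API.
// Compile with: nvcc -o vramcheck vramcheck.cu
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    size_t free_bytes = 0, total_bytes = 0;

    // Query free and total memory on the current device.
    cudaError_t err = cudaMemGetInfo(&free_bytes, &total_bytes);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaMemGetInfo failed: %s\n",
                cudaGetErrorString(err));
        return 1;
    }

    printf("Free VRAM:  %.2f MiB\n", free_bytes / (1024.0 * 1024.0));
    printf("Total VRAM: %.2f MiB\n", total_bytes / (1024.0 * 1024.0));
    return 0;
}
```

(nvidia-smi reports the same numbers on platforms where it's available, but on a Mac with the web driver a small program like this is the more reliable way to see the change.)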
I'd be curious whether the machines that were able to render Brendan's VR project at full res had only one monitor. Or perhaps they had more recent versions of the Nvidia web driver or CUDA driver - if they have Nvidia cards, that is.