I wish people in here would not talk about a subject they do not know about.
People who know better see this the way they'd see Windows 8 requiring a 4GB minimum of system RAM and then Windows 9 demanding a 12GB minimum to do the exact same thing.
The reason we don't need 6GB of VRAM for 1080p is simple: the PC is a NON-UNIFIED memory architecture. The consoles use VRAM as plain storage because they have to; their GPU and CPU share one pool. Anybody who doesn't understand that needs to stop talking when we complain, because we are trying to get developers NOT to do this on PC. We aren't bloody consoles. We generally have anywhere from 8GB to 32GB of accessible system RAM, plus pagefiles, SSDs, PCIe SSD cards, and so on.
Main system RAM is the tier for secondary storage and caching; VRAM is not. Period. End of discussion.
At 1920x1080, the framebuffer takes on the order of 256MB with 4xAA and double buffering; call it roughly 512MB once all the post-processing targets are included. Everything that remains beyond that is up to the developer and the drivers to use.
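To put rough numbers on it, here is a back-of-the-envelope sketch in Python. The format choices are my assumptions (RGBA8 color, 32-bit depth, 4x MSAA, double buffering); real driver allocations vary, but even generous accounting lands well inside that budget:

```python
# Back-of-the-envelope framebuffer math for 1920x1080.
# Assumptions (illustrative, not how any specific driver allocates):
#   - 4 bytes/pixel for color (RGBA8) and 4 bytes/pixel for depth/stencil
#   - 4x MSAA on the color and depth targets
#   - two resolved color buffers (double buffering)
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4          # RGBA8
MSAA_SAMPLES = 4             # 4x AA
MIB = 1024 * 1024

pixels = WIDTH * HEIGHT

msaa_color = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES  # multisampled color target
msaa_depth = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES  # multisampled depth target
resolved   = pixels * BYTES_PER_PIXEL * 2             # front + back buffer

total = msaa_color + msaa_depth + resolved
print(f"MSAA color: {msaa_color / MIB:.1f} MiB")
print(f"MSAA depth: {msaa_depth / MIB:.1f} MiB")
print(f"Resolved:   {resolved / MIB:.1f} MiB")
print(f"Total:      {total / MIB:.1f} MiB")   # roughly 79 MiB
```

Even after you pile on G-buffer and post-processing render targets, you are nowhere near needing multiple gigabytes just to draw the frame.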
Framebuffer memory usage is not a function of the calendar; it does not go up every freaking year. It isn't suddenly "time" to need 2GB+ for 1080p. When the resolution stays the same, there is only so much memory usage can increase without dumping EXTRA crap into VRAM. For comparison's sake: at 16 bytes per pixel (32-bit color with 4xAA), a 2GB framebuffer holds roughly a 130-megapixel image per frame. At 16:9, that is a resolution somewhere around 15,400x8,700.
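You can check that comparison yourself. A quick sketch (the 16-bytes-per-pixel figure and the 16:9 aspect ratio are my assumptions):

```python
import math

# How big an image would it take to fill a 2 GiB framebuffer?
# Assumption: 16 bytes per pixel (32-bit color at 4x AA), 16:9 aspect.
GIB = 1024 ** 3
BYTES_PER_PIXEL = 16

pixels = 2 * GIB / BYTES_PER_PIXEL      # ~134 million pixels
height = math.sqrt(pixels * 9 / 16)     # solve w*h = pixels with w = (16/9)*h
width = height * 16 / 9
print(f"{pixels / 1e6:.0f} megapixels -> roughly {width:.0f} x {height:.0f}")
```

No game is rendering anything close to that at 1080p, which is the whole point.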
When publishers let developers treat the PC like a console, the developers:
1.) Don't have to optimize for a split CPU/GPU memory architecture.
2.) Don't have to prioritize assets.
3.) Don't have to write efficient rendering methods.
A 6GB 780 Ti or 6GB Titan is a waste of money: you are paying for extra storage for cached extras. Those GPUs are incapable of rendering a framebuffer that comes even remotely close to filling that. Let alone the crappy GPUs in these consoles.