It's sort of a constant problem that as computers get bigger and more powerful, programmers get lazier. Programs are a gas: they expand to fill whatever space is available. To some extent that's reasonable; as computers get more powerful, it's not wise to spend so much time on micro-optimization, and you should use that power to get your program done faster and with more features. However, people usually do this the wrong way.
Just because the computer is bigger and faster doesn't mean it's okay to use systems with bad asymptotic behavior. That is, your O() should still be right, but you can be a bit more sloppy about the constant factor in front of it. Bad programmers instead get lazy and do things like use O(N^2) algorithms where O(N log N) would do. Similarly, just because you can use more disk space or more memory doesn't mean it's okay to let your disk cache grow without limit, or to have a memory leak and just never free things. Yes, you can be more bloated, but you can't be just plain wrong.
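To make both points concrete, here's a small sketch in Python (the names and the specific examples are mine, not from the post): a duplicate check done the O(N^2) way versus the O(N log N) way, and a cache with a size cap so it can be generous but never unbounded.

```python
from collections import OrderedDict

def has_duplicates_quadratic(items):
    # O(N^2): compares every pair. The constant factor is tiny,
    # but the growth rate is wrong, and no hardware outruns it.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_nlogn(items):
    # O(N log N): sort a copy, then scan adjacent elements.
    # Sloppier constant factor (copy + sort), correct asymptotics.
    s = sorted(items)
    return any(a == b for a, b in zip(s, s[1:]))

class BoundedCache:
    # An LRU cache with a hard cap: set max_items as high as the
    # machine allows, but never let it grow without limit.
    def __init__(self, max_items):
        self.max_items = max_items
        self._d = OrderedDict()

    def get(self, key):
        if key in self._d:
            self._d.move_to_end(key)  # mark as recently used
            return self._d[key]
        return None

    def put(self, key, value):
        self._d[key] = value
        self._d.move_to_end(key)
        while len(self._d) > self.max_items:
            self._d.popitem(last=False)  # evict least recently used
```

On a list of a few million items the quadratic version takes hours while the sorting version takes seconds; being generous with the cache cap is "bloated", while omitting the eviction loop is "just plain wrong".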