For quite a while, I thought that 32 bits was more than enough for most people. I held that belief in 1999, and I held it right up until yesterday.
Today, I read something that made me change my mind. For some reason, I had some pretty big blinders on to the whole Linux memory system. A 32-bit Linux kernel takes a performance hit when addressing more than about 1 gig of physical memory: it can only keep the lowest ~896 MB permanently mapped, and everything above that ("highmem") has to be mapped in and out on demand. This is pretty serious. It turns out that a lot of the 32-bit OSes don't have a great answer for this (Windows uses a 2 gig user / 2 gig kernel split of the virtual address space, so each process there effectively gets only 2 gigs).
Also, I have had to work with some large in-memory data sets that easily consume 2+ gigs at a time. Linux barely handles this with its 3 gig/1 gig user/kernel split. I've hit 'Out of Memory' errors enough times to consider just moving to a 64-bit architecture.
So, originally I thought the vast majority of users would be OK, but I was ignorant of the kernel's 1 gig physical-memory design choice. I thought a machine with 4 gigs was 4x as good as a machine with 1 gig. That was wrong.
From now on, all of my big-data machines will be 64-bit systems. My next desktop OS will be 64-bit as well. The time has come.