Right, the problem appears to be more an issue of data representation for time than an issue of 32-bit vs. 64-bit architectures. Correct me if I'm wrong, but I think long int existed well before 32-bit chips came around (and long long before 64-bit ones). Does a system scheduler really need to know the number of seconds elapsed since midnight on Jan 1st, 1970? There are only 86400 seconds in a day (31536000 per year, and 2^32 = 4294967296, which seems like plenty), so why not split the time representation in two?

On a side note, I tried setting up a little compute station on my TV about a year ago using an old RasPi I had lying around, and the latest version of raspbian-i386 is pretty rot-gut. I seem to remember it being snappier when I did a similar job a few years prior, and also doing better at recognizing peripherals back then. I guess this is a trend now: if you don't buy the new tech you're toast, and your old stuff is likely kipple at this point. I think the term I'm looking for is planned obsolescence.

A potential light at the end of the tunnel was discovering RISC OS, though the three-button-mouse thing sort of crashed the party, and then I ran out of time. I'm also contemplating SARPi (Slackware) as another contender if I ever get back to the project. Maybe Plan 9 too? It seems kids these days think old computers aren't sexy. Maybe that's fair, but they can be good for the environment (and your wallet).