Hello:
I read the white paper referenced on the Crowd Supply campaign page [1], and I was particularly interested in the section on "good enough" computing. The following paragraphs describe some problems that, as far as I recall, aren't addressed in the rest of the paper:
Now, in 2015, "good enough computing" has been cross-examined, and found wanting - perhaps not for the right reasons, though. The key problem with having a three-to-five-year-old computer is not so much that it can't do the job it was designed to do: if the computer was not connected to the Internet, it could continue to be used for its designated tasks until it suffered major component failure (possibly in 8 to even 15 years' time).
The problem is that the kinds of web sites that most people visit and want to use are being designed with modern computers in mind. Even some recent smartphones are more powerful than high-end desktop computers of a decade ago. The latest version of Google Maps, for example, when using "Street View", overwhelms a recent version of Firefox running on a computer with 8 gigabytes of memory and a dual-core, dual-hyper-threaded 2.4 GHz processor, causing it to reach 100% CPU and lock up the entire machine.
But that's not so much the real problem: the real problem is the inter-dependent nature of software development. Upgrading even just one application often brings in a set of dependencies that can result in the entire operating system needing an upgrade. And the longer it has been since a software upgrade, the less likely it is that a single application can be upgraded without huge impact and inconvenience. With no knowledge of (or convenient way of) upgrading software or hardware, most people pick the simplest solution...
This "upgrade treadmill" has bothered me for a while. Yes, with modular hardware like the EOMA68 cards and housings, the environmental impact is lessened because we only have to discard computer cards, not whole laptops. But unless we can stop the upgrade treadmill, we'll still have to discard our old computer cards when they would otherwise still be functional.
I remember the laptop I used throughout my university education, from 1999 to 2003. It had a 366 MHz mobile Pentium processor and 64 MB of RAM (later upgraded to 192 MB when I had to work on a fairly memory-hungry Java application under Windows). In its original configuration in 1999, that laptop was perfectly comfortable for everything I wanted to do, at least under Linux: Web browsing, email, word processing (including StarOffice), software development, and listening to music. Now I don't know if X would run at all in 64 MB of RAM.
As another illustration of how much waste the upgrade treadmill causes, here's a paraphrased bit of dialogue from the 2012 novel _Off to Be the Wizard_ by Scott Meyer. One character, a time traveler from 1984 whose last computer was a Commodore 64, asks, "What on earth can a person do with 4 gigabytes of RAM?" The other character, from 2012, replies, "Upgrade it immediately." Maybe that was supposed to be funny; the whole book is pretty light-hearted. But to me it's just sad.
So what can we do about this? The only idea I've got is that I and other software developers should do all of our work on the most underpowered computer that will let us get by, rather than the nicest one we can afford. Then maybe, out of necessity, we won't be so wasteful. But then maybe we won't be as productive either, particularly if not being wasteful means we have to write everything in C or C++. And of course, it won't do any good if I'm the only one who chooses to make those sacrifices.
At least with free software, there's always the possibility of forking projects that succumb to the upgrade treadmill. For example, the MATE desktop environment is a fork of GNOME 2, and one of its explicit goals is to run well on non-compositing graphics hardware. I imagine MATE will run quite well on something like the A20 card. But still, we can't live in a forked, time-warped world. We have to interact with mainstream websites, which means using a mainstream browser or at least one of the major rendering engines. In this regard especially, I wonder whether the upgrade treadmill has already left the A20 behind, since we can't use full GPU acceleration. I can certainly understand why the JZ4775 wasn't chosen, though it checks all the other boxes for ethical computing.
Anyone else have any thoughts on this? Sorry if this is too much of a rant or off-topic here. FWIW, I just backed the campaign by ordering an A20 card.
Matt
[1]: http://rhombus-tech.net/whitepapers/ecocomputing_07sep2015/