Hi Luke,
sorry for the few months' delay - I was stuck with moving, family, job applications, etc.
I would like to bring up again the painful topic of a long-sustainable graphical interface in the EOMA68 specification ( http://rhombus-tech.net/whitepapers/ecocomputing_07sep2015/ ), as RGB/TTL is such an utterly bad choice.
RGB/TTL is way too slow and highly limited (on its own, but even more so within the EOMA specification):
a) maximum resolution 1366x768 (even though the current EOMA should be capable of 1440x900)
b) maximum pixel clock of about 148.5 MHz (~71.6 fps max at the mainstream 1920x1080, with visual artifacts and high EMI disqualifying EOMA for mobile devices with radios)
c) maximum color depth of 18 bits (3x6)
d) eats a lot of pins (18) even in this lowest-quality setup
e) requires HW changes (i.e. disallows sustainable plug & play) if EOMA is ever to mitigate the limitations above, since parallel interfaces require the addition of many pins; this is in any case impossible, because all pins are already in use
f) easy (chip-free) conversion to VGA output; conversion chips to all (!) other interfaces (which are modern, serial, and ubiquitous) are needed, and they are more expensive than serial -> RGB/TTL converters because of very low purchaser interest
g) easy implementation in FPGA (a few hundred LUTs)
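For illustration, the ~71.6 fps figure in b) follows directly from the pixel-clock ceiling. A back-of-the-envelope sketch (deliberately ignoring blanking intervals, so real achievable refresh rates are even lower than these optimistic numbers):

```python
# Back-of-the-envelope check of the RGB/TTL limits quoted above.
# Blanking intervals are ignored, which makes these numbers optimistic.

PIXEL_CLOCK_HZ = 148.5e6  # assumed RGB/TTL pixel-clock ceiling

def max_fps(width, height, pixel_clock_hz=PIXEL_CLOCK_HZ):
    """Maximum refresh rate at a given resolution, ignoring blanking."""
    return pixel_clock_hz / (width * height)

print(round(max_fps(1920, 1080), 1))  # -> 71.6
print(round(max_fps(1366, 768), 1))
```

With blanking overhead included (typically 10-25% of the pixel clock, depending on the timing standard), even 60 fps at 1920x1080 becomes tight at 148.5 MHz.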
Yet EOMA does not even allow adding any better display interface, because there are not enough "free" pins for any modern serial interface (MIPI DSI, eDP, HDMI, ...). This effectively prevents manufacturers of EOMA cards from adding SoC <-> MIPI/eDP/HDMI circuitry on an EOMA card to provide at least mainstream (!) resolution output with 24 bit colors for internal displays (not to mention 2017, when the mainstream will supposedly be 4K at 30 fps with 24 bit colors).
In other words, this one fact alone degrades the whole EOMA to the category of yet another toy (as every other libre general-purpose computer HW has failed up until now). By the way, I had to publicly admit this during my talk at the OpenAlt 2016 conference (https://openalt.cz/2016 ; there is a full recording).
After reading through all the important emails on Arm-netbook since 2010 (yeah, 571509 lines of text), many of your posts on various web servers, and watching nearly all your videos, I did some research on display interfaces. A few quick facts based on my findings follow (yes, I focus on the lower-mainstream and mainstream "fat" embedded and mobile segments, not on the total low-end, because there we have zillions of existing PCBs all offering basically the same HW interfaces - some of them even libre).
From panelook.com (size >= 7.0", px density >= 160 PPI):
* LVDS: 382 panels in MP year 2016 (2015: 53) => ratio (the higher the better) 53/382 = 0.138
* MIPI DSI: 239 panels in MP year 2016 (2015: 50) => ratio 50/239 = 0.209
* eDP: 233 panels in MP year 2016 (2015: 66) => ratio 66/233 = 0.283
* RGB/TTL: 15 panels in MP year 2016 (2015: 2) => ratio 2/15 = 0.133
(the ratio indicates how quickly each interface is on the rise)
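To make the metric explicit, the ratio is simply the 2015 panel count divided by the 2016 panel count (a quick sketch with the numbers quoted above):

```python
# Panel counts from panelook.com (size >= 7.0", >= 160 PPI),
# mass-production year 2015 vs. 2016, as quoted above.
panels = {
    "LVDS":     (53, 382),
    "MIPI DSI": (50, 239),
    "eDP":      (66, 233),
    "RGB/TTL":  (2, 15),
}

for iface, (y2015, y2016) in panels.items():
    ratio = y2015 / y2016  # higher = steeper relative growth into 2016
    print(f"{iface:8s} {ratio:.3f}")
```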
Video interfaces from the data sheets of a few tens of the more performant (i.e. having more computing power) mobile SoCs (no AMDs, no Intels) in 2015 & 2016:
* LVDS: nearly nowhere
* MIPI DSI: everywhere (!)
* eDP: nearly nowhere (in contrast to big chips like Intel i3/i5/i7, where eDP is largely prevalent)
* RGB/TTL: nearly everywhere (but especially on smaller SoCs)
We can see a strong trend of LVDS disappearing (though still holding the major position in 2015 & 2016), MIPI DSI and eDP on a fast rise, and RGB/TTL in total decline, having reached its physical limits. Add the fact that MIPI DSI has been present on basically every mobile SoC since circa 2014 (in contrast to 2012, when EOMA was looking for "the ultimate video interface" and RGB/TTL really was the only portable option) and is moreover easily and cheaply convertible to eDP or LVDS, and we have a clear winner. By the way, even Intel recommends external MIPI DSI to LVDS and MIPI DSI to eDP bridges for its SoCs. Based on all that, I'm confident that in the upcoming 10 years the SoC market will use MIPI DSI everywhere as the main standard, with eDP for the few biggest chips.
MIPI DSI also doesn't suffer from the LVDS problem, where the specification did not pin down the chosen width and implementation properties and thus prevented bundling universal conversion chips.
MIPI DSI offers:
a) highest state-of-the-art resolutions (not limited to, but supporting, 4096x2160)
b) highest state-of-the-art refresh rates (not limited to, but supporting, 120 fps at 1920x1200, i.e. a pixel clock of about 276.4 MHz) without visual artifacts and while remaining low-power
c) 24 bit color depth (3x8)
d) eats just 4 pins in the minimal one-lane configuration (in practice 4 lanes are most common, so 10 pins will be needed)
e) requires no HW changes (just increasing frequency) or very minimal ones (adding two pins as a new data lane), since MIPI DSI is a serial interface
f) VGA output (hell, it should die out already!) requires a conversion chip, which is not so expensive, so it shouldn't influence the Micro Desktop PC PCB or notebook PCB price; actually I would not provide VGA at all on these consumer devices, but rather (e)DP or HDMI, because consumers do not want VGA any more, and external eDP -> VGA converters cost about $5 with free world-wide shipping for those who need them or who want to be really eco-friendly and use the few surviving VGA devices at the expense of higher power consumption
g) easy implementation in FPGA (there are fully functional existing implementations with just about 2000 LUTs)
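To see why a 4-lane link handles the 1920x1200 @ 120 fps example in b) comfortably, here is a sketch of the per-lane bit rate (blanking ignored and a plain 24 bits per pixel assumed, so the real figure would be somewhat higher):

```python
# Sketch: per-lane bit rate for 1920x1200 @ 120 fps over a
# 4-lane MIPI DSI link, assuming 24 bpp and ignoring blanking.
width, height, fps = 1920, 1200, 120
bpp, lanes = 24, 4

pixel_clock_hz = width * height * fps          # ~276.5 MHz
lane_bit_rate = pixel_clock_hz * bpp / lanes   # bits per second per lane

print(f"pixel clock:   {pixel_clock_hz / 1e6:.1f} MHz")
print(f"per-lane rate: {lane_bit_rate / 1e9:.2f} Gbit/s")
```

Roughly 1.66 Gbit/s per lane, which modern D-PHY lanes handle with headroom, whereas a parallel RGB/TTL bus would need to toggle all 18+ data pins at the full pixel clock.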
But how to cope with this when there is already an EOMA card in manufacturing?
Let me boldly demonstrate some "thinking outside the box" (kicked off by Luke's email of Sat, 13 Jul 2013 17:22:41 +0100 and supported by Luke's "take advantage of the MIPI / eDP" statement of Thu, 10 Apr 2014 10:48:46 +0100).
Can we provide both interfaces (RGB/TTL + MIPI DSI) on the same pins while having a HW way to choose between them?
Yes we can! EOMA already counts on several types of PC Cards (originally called PCMCIA) - at minimum two: thinner (Type I - 3.3 mm) and thicker (Type II - 5 mm). Let's declare the thicker cards to be high-end, offering only MIPI DSI, while the thinner cards stay low-end with just RGB/TTL. Problem solved!
The "high-end" specification shall then also be extended to allow higher thermal dissipation (5 W is too low - maybe 15 W would be OK, as it's still easy to cool passively) etc., to accommodate "high-end" (actually mainstream, but in this context high-end) requirements.
Speaking of thermal dissipation, I'm not sure this limit should be a fixed one for the high-end card type. I would probably prefer the 15 W value as a strong recommendation rather than a firm requirement, while always requiring (regardless of whether it stays under 15 W) that the maximum dissipation of the particular card under heavy load be quoted readably and fully visibly to the end user, ideally on the outermost coating. Why? Because there will be a manufacturer offering a special super hyper mega powerful card implementing a "turbo" mechanical switch that automatically overclocks the by-default heavily underclocked 8-core beast.
This could then finally be called useful and sustainable for 10 years.
Recap of solved issues:
* No more copyleft-like/religious constraints ("we require you to only use libre and eco-friendly stuff, not the new fast non-libre SoCs with non-refurbished eco-unfriendly displays"), but rather a permissive approach.
* No more issues with non-tiny display panels.
* No issues with users putting a wrong card into the slot.
* No more issues with dissipation.
* No more issues with the libre world being old, slow, ugly-looking, etc. (as I'm often told, because it unfortunately currently holds).
* No more issues with decision makers, who will finally get the freedom to choose from several cards (keep in mind what marketing/portfolio research says: a set of similar but still diversified products/services sells significantly better than a single product/service, regardless of its quality).
* Useful for the upcoming hybrid phone ( http://rhombus-tech.net/community_ideas/hybrid_phone/ ) - lower number of pins, lower consumption.
Enjoy your new life in China with your family, and good luck discovering how real-world HW production and markets actually work,
-- Jan