On 09/08/2018 03:21 PM, enh wrote:
>> I see the plan to move to compiler-rt: Are you familiar with Rich
>> Pennington's http://ellcc.org ? He got a toolchain with no gnu components
>> working a few years ago, including compiler-rt. (He works down the hall
>> from me at my current contract, we keep meaning to sit down some evening
>> and work out an update of ellcc that would let me use his toolchain in my
>> https://github.com/landley/mkroot project but we've both been too busy so
>> far...)
> oh, it basically works, but there are lots of missing pieces and
> corner cases (that differ between architectures). the NDK tends to hit
> more of these than the platform, because we have enough control over
> the platform code to just work around problems when we need to.
Let me know when there's a test build of a complete non-gpl toolchain. :)
> but, no, 32-bit isn't going anywhere any time soon. we *are* insisting
> that folks have a 64-bit version of any 32-bit app so that at some
> point it'll be possible to have a 64-bit-only device, but i assume
> even that will only be another step in the slow decline of 32-bit.
32 bit's alive and well in the embedded space, and x32 remains interesting on 64
bit systems, but there's only 20 years until y2038...
Which is why x32 is a new API. Android probably wants to grow an x32
variant at some point.
> this (or fixing time_t) would be ABI breaks for Android, though, so i
> assume that 32-bit Android will go away, even if 32-bit Linux lives on.
20 years is a long time, and "here's a new API, old one is deprecated" is
probably going to be required for Android at some point between now and then.
Even Microsoft gave up on full backwards compatibility after about 20 years.
That said, 20 years is not _that_ long (1998 to now), and the end of Moore's
law makes it more predictable. I expect all the _processors_ to be 64 bit by
then, but an x32 mode is probably still likely to be useful if memory bus
bandwidth and how much stack space fits in L1 cache remain constraining in
any way.
The s-curve of Moore's Law started bending back down in 2000 (when Intel
recalled its overclocked 1.13ghz pentium III http://www.anandtech.com/show/613
and processor development went wide instead of clocking faster). A couple years
later Gordon Moore gave a speech called "No Exponential Is Forever" basically
warning the industry that the curve was bending down.
https://ethw.org/Archives:No_Exponential_is_Forever then in 2004 Intel cancelled
its 4ghz chip http://www.pcworld.com/article/118603/article.html and went wide
instead, then Core Duo (Intel's first on-die SMP) shipped in January 2006. Going
wide bought another decade, but Moore's speech was something like 16 years ago
now and Spectre and Meltdown feel kinda end-gamey for single-threaded
performance improvements. But the head of the ACM called the end of Moore's Law
even before that: https://www.acm.org/articles/people-of-acm/2016/david-patterson
The big driver for both memory and storage consumption since the DOS days has
been increasing display resolution and video, but a 4k display is the same
resolution as digital cinema blown up on the wall of a theatre
https://en.wikipedia.org/wiki/Digital_cinema and the point of a "retina display"
is it's at the limits of human perception, and frame rate increases have been
kinda meh too (movies have been at 24 frames and TV at 30-ish for most of a
century), and we've _already_ doubled that today, so that's near endgame too...
I'm sure we'll come up with new stuff over the next 20 years but ideas like
"solar powered phone" and "longer battery life" almost call for _less_
performance. (Yes, I say that wanting to turn android phones into self-hosting
development systems, and building the whole of AOSP on a current high-end phone
is probably something like 24 hours. I'm not saying this is what I _want_, it's
what I _expect_.)
That's why 4 gigabytes being sufficiently more than most individual
applications use today seems like it should still be a generally useful
compile time option in 20 years. (The _system_ will probably want (at least)
64 gigs, but there's an app/OS split and then the GPU has its own address
space so your textures don't have to stay mapped...)
*shrug* We'll see. But I expect a 32 bit API to survive y2038.
> my point is that even if your Nexus 5 hasn't had an OS update in a
> couple of years, it can still download new apps. NDK r17 still
> supports targeting >= ICS from 2011, which is several years older than
> the oldest OS release that runs on Nexus 5.
>
> when i say "device lifespan" i mean "years before users stop
> installing apps on it, and app developers stop caring about it", not
> "years it gets OS updates".
Ah, that's why the default is the _oldest_ API level. Makes sense now.
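For anyone following along: the target API level is something NDK builds can pin explicitly rather than take from the default. A sketch using the NDK's CMake toolchain file (the `$NDK` path is a placeholder for wherever your NDK is unpacked):

```shell
# ANDROID_PLATFORM selects the minimum API level the binary targets;
# android-14 (Ice Cream Sandwich, 2011) is the oldest level NDK r17 supports.
cmake -DCMAKE_TOOLCHAIN_FILE="$NDK/build/cmake/android.toolchain.cmake" \
      -DANDROID_ABI=armeabi-v7a \
      -DANDROID_PLATFORM=android-14 \
      ..
```

Targeting the oldest level keeps the resulting app installable on every device still downloading apps, at the cost of not being able to link newer platform APIs directly.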
> (fwiw, Nexus 5 got KitKat, Lollipop, and Marshmallow.)
People have built Oreo for it. But while I'm in Milwaukee most of my internet
access goes through it, so reimaging it would be problematic.
> the Android P version of the CDD
> (https://source.android.com/compatibility/android-cdd) still allows
> shipping new 32-bit devices. so the countdown timer to the NDK
> dropping 32-bit support hasn't even started yet.
There's an Oreo build for the Nexus 5. I should try to image the sucker, I just
want to do so first on a device I _won't_ be significantly inconvenienced by
bricking. (I break everything.)