"All optimizations are a form of caching."
-- Eric Traut, Performance Evangelist, Apple Computer
I've never been entirely sure that Eric's assertion is correct, but I have yet to find a counterexample despite seventeen years of searching for one. Eric should know: he wrote the Dynamic Recompiling Emulator for the PowerPC Classic Mac OS, which enabled one to run 680x0 binaries - even device drivers and interrupt handlers - on PowerPC Macs by translating their 68k machine code to PPC on the fly.
I know all the inner workings of the Dynamic Recompiling Emulator, but due to my Non-Disclosure Agreement with Apple, I have no clue what I can actually tell you about it. But I can tell you this:
On Sun, Jun 17, 2012 at 9:54 AM, Christoph Erhardt <firstname.lastname@example.org> wrote:
>Also, the implementation of puts() is a lot smaller - which might matter
>if the binary is linked statically and code size is important.
I humbly submit that we all ought to tattoo the following on each other's foreheads, so that this important *fact* need never slip the grasp of our tragically limited minds:
Code size is important even if your binary is not statically linked and you have gigabytes of physical memory.
That's because the single greatest contributor to poor software performance, as well as one of the most significant sources of power consumption, is hitting the disk or network file server to page in code or data that your process needs to continue running.
Even if the code and data - and I expect there is a lot of data - used by printf and friends are already in-core, that means some other code or data cannot be. Thus even a call to an already in-core printf will result in more paging than if printf were never or only rarely used.
These same considerations apply to every level of caching in one's entire system, all the way up to whether a register will need to be saved to the stack to make room for the registers required by some subroutine. Thus you can see that simpler subroutines, especially "leaf" routines that call no other subroutines themselves, will save significant runtime and - of far more critical importance these days - power.
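A minimal sketch of that last point, with function names of my own invention: a leaf routine can typically live entirely in caller-saved registers, while a routine that calls it must keep its own state somewhere that survives the call - usually a callee-saved register it must first spill to the stack.

```c
/* Leaf routine: no calls, few locals. On most ABIs the compiler keeps
   everything in registers and may emit no stack frame at all. */
static int clamp(int x, int lo, int hi) {
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}

/* Non-leaf routine: the call to clamp() forces `sum` into a
   callee-saved register, which must be saved to the stack on entry -
   unless the optimizer inlines clamp(), which at -O2 it likely will,
   turning this back into a leaf. */
int clamp_sum(const int *v, int n, int lo, int hi) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += clamp(v[i], lo, hi);
    return sum;
}
```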
I learned the importance of that when Luke Crawford, the owner of the Prgmr.com Xen VPS Hosting Service, pointed out to me that his company's single-greatest cost center is electrical power, not so much because it costs a lot to obtain the current from the power company, but because it costs so much to cool the data centers his servers are in.
If you look over just about any web hosting company's service plans, you will find that the cost of hosting is absurdly high compared to what it would cost to host yourself at your home or place of work, were you to pay your ISP for but one static IP address. For most web hosting services, that premium comes from the cost of data center cooling; for the really, really pricey hosts, it is dominated by the cost of technical support for their end users.
Do Whatever You Can To Optimize The Performance Of Your Servers.
I'll have a lot more to say soon on this topic, but for now: if you operate any manner of database-driven website, the single most effective way to save on electrical power, as well as to dramatically improve the performance of your entire website, is to read and follow the advice of The Data Access Handbook.
The Data Access Handbook: Achieving Optimal Database Application Performance and Scalability by John Goodson and Robert A. Stewart
You'd blow more than that on dinner for two and a bottle of wine.
My degree is in Physics, not Computer Science.
I regard myself as a Software Engineer rather than a Computer Programmer for much the same reason that my father regarded himself as an Electrical Engineer: he had the chops to take out a supersonic North Vietnamese MiG Fighter Jet using nothing but the USS Providence's radar, an Analog Computer - yes, really, not a digital one - and some quick thinking, every single time enabling some other poor bastard to die for his country before that MiG could get within the twenty-mile firing range of its Air-to-Ship missiles. There was no software of any sort, and absolutely all of the electronics involved were hand-soldered from discrete components, for the most part vacuum tube technology from the early 1950s.
I have been researching the issue of software power consumption ever since I was one of the Men in Black - yes, really, "Man in Black" was my official job title - at Sony Ericsson Mobile Communications in Redwood Shores, California. I decided to look into the power management troubles of the Sony Ericsson (now Sony Mobile) Xperia Play as a result of my incredible frustration with my complete inability to flash the firmware in our prototype units without bricking them.
It turned out that, due to the early state of development of both our unit's firmware and hardware, a serious bug in its power management sometimes led it to draw more current during boot than its battery could supply. You all must surely appreciate how delicate today's integrated circuits are. Now consider how brain-damaged a full-featured, 32-bit computing device gets when its supply voltage drops well below specification.
I have completed only one Computer Science course in my entire life, despite having written code since 1976. My colleagues over at Kuro5hin like to make fun of me when I point out that Dr. Carver Mead devoted his entire first lecture of Caltech's CS10 to considering the drift of electrons in doped Silicon Crystals.
However, the final project for that class, which I enrolled in during Spring Quarter 1984, was to write, in Pascal, a full-featured GUI color vector graphics editor on a six-megahertz Motorola MC68000 workstation made by Hewlett-Packard that ran the UCSD P-System. It was well over ten years after Spring 1984 before I so much as laid eyes on any commercially sold vector graphics editor that even remotely approached the capabilities of the editors every single CS10 student was required to write just to pass what was only a Pass/Fail class.
I once devoted three or four solid hours to an in-depth technical argument with Regan Gill - one of my very closest and oldest friends, and one of the very finest Software Engineers who has ever Walked the Earth - over my assertion that it is vitally important that even Java programmers learn the Assembly Code for at least one Instruction Set Architecture before ever being permitted to take paying work writing Java software.
Regan felt I was very, very wrong. Her position was that her company - at the time a rather large Electronic Medical Records Software vendor - wrote absolutely all its code in Java because of the importance of that code being unassailably correct.
"If Java is so cool," I replied, "Why did ClickRebates have to set up a cron job to reboot their e-Commerce server every night when it ran out of swap space?"
That was because ClickRebates' original coders - long before I hired on - bought into the Bill of Goods sold to them by Sun Microsystems: the Urban Legend that programs in garbage-collected languages never suffer from memory leaks. A garbage collector reclaims only unreachable objects; a reference accidentally retained in, say, a long-lived collection keeps its object alive forever. That bald-faced lie eventually resulted in Borland International earning a tidy income from the sales of a Java memory leak detector.
My reasoning for the importance of Assembly Code knowledge to every coder, though, had nothing to do with leaks. It rested on the fundamental point that computers never, ever, ever execute anything other than Machine Code. Therefore it is vitally important to understand, at a deeply intuitive level, what machine code will result whenever you write so much as one line of source code in any language at all.
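A toy illustration of that point - the struct and its sizes are my own invention. One innocuous line of source can expand into a surprising amount of machine code, and a coder who has read assembly listings will feel that cost immediately:

```c
/* 64 bytes of name plus 32 eight-byte doubles: 320 bytes per record. */
struct record {
    char   name[64];
    double scores[32];
};

/* One line of source... but the compiler must emit the equivalent of a
   320-byte memcpy for the assignment, not a single register move. */
void copy_record(struct record *dst, const struct record *src) {
    *dst = *src;
}
```

To a reader who thinks only in source lines, the assignment looks as cheap as `x = y` on two ints; to a reader who thinks in machine code, it plainly is not.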
The Mind Simply Reels.
I was one of Dr. Mead's very best students in CS10 that quarter. The reason I attended every single lecture - on time - and listened with rapt attention was not at all because I found his material so fascinating, but because I was completely dumbstruck with awe that one of the very pioneers of Very Large Scale Integrated Circuit Chip Design - VLSI - delivered every last one of those lectures while clad in the most appalling 1970s-style Leisure Suit.
Many years later, I was moved to subscribe to Forbes Magazine the very instant I saw Dr. Mead on the magazine's cover, wearing that very same synthetic jacket.
He looked just like Leisure Suit Larry!
You see, back in the day a tiny little struggling hardware startup desperately needed some consulting advice from Dr. Mead, but had no money with which to pay him.
"Is there anything else we can do?" they asked.
Dr. Carver Mead, a modestly paid Electrical Engineering Professor at the California Institute of Technology in Pasadena, Earned the Wealth of Croesus when he cheerfully replied:
"Some of Intel's stock would do just fine."