From: Randall L Joiner <rljoiner@m...>
Date: Thu, 31 Jan 2002 21:49:43 -0500
Subject: Re: [OT] colonial weapons (Moore's Theorem)
Since you seem to want credentials, I'll spout a few of mine... (DSW are so much fun.)
Senior Systems Engineer for CNN. My (and my team's) duties include the hardware, OS, software, networking, and every little piece of everything in between. I'm responsible (in part) for over 600 systems, 300 peripherals, etc. etc. etc.
I have more certifications than I care to remember.
I double majored in Comp Sci and Chem.
I've done professional development of complex systems, and am proficient (maybe not speedy, fresh, or current in most) in 10+ separate languages and 25+ variations of those languages.
I've been in the real world of IT for many years. I've been on the net for longer. I still remember the 3 hierarchies of Usenet pre-Great Renaming.
Damn, I feel like I'm applying for a job.
I don't see where your phallus is any bigger than mine.
And if you can take some ribbing meant in good nature, I'll remind you of the cliche, "Those who can't, teach." (If you can't, know that I'm probably sleeping on my couch tonight for having said that... The better half is a teacher.)
While I concede that the Jargon File is not a great source, it is kept as factual as possible, and is one of _the_ sources of hacker cultural history.
Now, with preliminaries and pleasantries out of the way, I'll repeat
myself.
"Moore's Law. It's a tongue-in-cheek 'joke' in hacker/computer jargon"
When referenced as Moore's Law, it's meant to be taken exactly the same way as Murphy's Law. Such "laws" are meant to poke fun at many things, science being but one of them, much like most hacker humor; also like most hacker humor, they have a grain of truth in them; and again, like most hacker humor, they are somewhat good-natured cynicism. Not one of those I mentioned is meant to be taken in a rigorous fashion. At best, when being serious, they're meant as commentary on current (and past) societal problems and events. It seems you didn't really look those other laws up. Gates's is really amusing in a cynical sort of way, given the software world's trend toward bloat.
As to your definition of law, you've forgotten 2 very common and important types of law: socio-political and religious. Neither needs to be proved, and in fact most can't be. (I admit to being geek enough to have spent time amusing myself this evening by trying to write proofs of "Thou shalt not covet thy neighbor's wife" and several US gun laws. Failing, but amusingly.) I won't pretend that Moore's Law is really religious, but there is a hint of it.
Oddly enough, dictionary.com responds to a search for Moore's Law by giving quotes from the Jargon File, but has no entries for Moore's theorem.
On the topic of Moore's theorem, I'll grant you your statements... with the exception that the theorem as first presented is certainly going to be proved dead in a very short time. None of quantum computing, gated/switched light (fiber-optic) computing, nanotech, nor very recent chip-manufacturing techniques involve silicon, which is directly stated in the theorem: "...the logic density of silicon integrated circuits..." The general theorem is applied too liberally, or misapplied due to a misunderstanding (or misuse) of the above Moore's Law. However, the general theorem (computers "double" every 18 months) IS provable as a law under the second of your limited definitions. It's really a limited and specific case of evolution when applied to technology.
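For illustration only (my numbers, not Moore's actual data), the "doubling every 18 months" claim is just compound exponential growth, and can be sketched like so:

```python
# Sketch of the "doubling every 18 months" claim: relative logic
# density grows by a factor of 2**(months / 18). Illustrative
# figures only, not data from Moore's paper.

def density_after(months, start=1.0):
    """Relative density after `months`, assuming an 18-month doubling period."""
    return start * 2 ** (months / 18)

# After 3 years (36 months = two doubling periods), density has quadrupled:
print(density_after(36))  # -> 4.0
```

The same one-liner shows why the curve can't run forever: a few decades of doublings pushes feature sizes down toward atomic scales, which is the physical-limit argument raised earlier in the thread.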
>[Tomb] I'm more than happy to randomly point to a
>hundred instances of an overly complex mechanical
>or electronic system in my daily environment.
Come on... I can point to as many examples of simple mechanical and electrical systems in my daily environment as you can point to complex ones. I'll start by pointing out the fact that complex systems are _OFTEN_ built of simple systems. Which should be enough to prove my comment that "Robust and simple are OFTEN built in the real world." You're right... We do spend a lot of time adding features. By that statement, 2.0 is more complex than 1.0, correct? But that means 1.0 is simple in comparison... My point made. Simplicity is relative.
Modern memory, if treated right, will last much more than 15 years... Frankly, I'd be surprised if it didn't last darn near forever. Memory does not have _any_ planned obsolescence. Or, more to the point, any and all planned obsolescence is external to the memory. It's actually a fairly unusual time in the history of manufacturing. Computers in general have no need for planned obsolescence. The market, technological growth, new tech vectors, and other factors provide all the obsolescence needed for the majority of computer parts. Hard drives and monitors are 2 exceptions, and you will note that they have planned obsolescence built into them. And it makes sense... Hard drives can be carried from computer to computer, and thus the impetus to purchase another is low, yet RAM can be made obsolete simply by changing the socket it plugs into, or by increasing the need for it.
Some practical knowledge from the trenches: any RAM you get, if properly burned in before use, will last much longer than the life of the machine. Failure rates are 5 to 6 nines or better. Burn-in is required only to isolate chips that are defective. How's that for robust? Look at the difference between a PC and a five-nines reliability machine like Sun's high-end E-class machines. They use similar or the same RAM. Believe me, if Sun claims five nines' reliability, they mean it. The need for (and price of) such reliability means that anything requiring it is extremely valuable or dangerous, or both. Literally, the cost of 1 minute of downtime in that situation can mean millions of lives, or billions of dollars lost.
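To put "five nines" in concrete terms, the availability figure converts directly into permitted downtime per year with trivial arithmetic. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope: minutes of downtime per year permitted
# at a given availability level (e.g. 0.99999 = "five nines").
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

def downtime_minutes(availability):
    """Minutes of downtime allowed per year at the given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

print(round(downtime_minutes(0.99999), 2))   # five nines: ~5.26 min/year
print(round(downtime_minutes(0.999999), 2))  # six nines:  ~0.53 min/year
```

So a Sun E-class box claiming five nines is promising barely five minutes of unplanned downtime in a whole year, which is why that class of hardware commands the price it does.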
I'd be surprised if any firearm, not treated properly, would last even 15 years. For that matter, most anything, if abused, tends to not last long. With or without planned obsolescence.
Believe it or not, aside from the odd bug, the most bleeding-edge CPUs are as reliable (if operated correctly) as 8086s, 8088s, 286s, 386s, etc. They're even as robust. Run a modern chip at 8088 speeds, and you'll still see an increase in performance, for the same or less heat and physical resistance.
When I said first run, I was not speaking of prototypes. As to your over/under-engineered comment, I disagree. Most first-run objects are engineered correctly to the specs (I will _NOT_ include modern software here, however). It's almost always the fault of bad specs, which is usually the fault of bad market research, miscommunication, artificial limitations placed upon engineering, misunderstood needs, unforeseen problems arising in product use, etc. True, this doesn't lead to robust and simple machines, but that is not the engineer's fault.
I do agree with you that complexity does not make robust or reliable easy. But it does not preclude them either.
*sigh*
Rand.
At 02:58 AM 1/31/02 -0500, Thomas Barclay wrote:
>Randall said:
>
>*cough* It's Moore's Law. It's a tongue-in-cheek
>"joke" in hacker/computer jargon.
>
>[Tomb] Randall, just FYI: I teach TCP/IP
>programming for Internet applications at our
>college and I have a background in both
>Electronics, Electrical Engineering, and Software
>Development. I have a pretty darn good idea of
>Moore's Law, its origins and whatnot (which is the
>underlying basis of my prior comment). I don't go
>back quite so far as Alan making PDP's in a kitchen,
>but I've wirewrapped my own microprocessor
>system and written code to drive it in assembler
>burn onto ROMs. I have a pretty good idea of who
>Moore is, what he said, the paper he released on
>the subject, etc. And the Jargon File is a
>questionable resource or reference, despite its
>pretensions of grandeur or even adequacy.
>
>Similar to Gate's
>Law, Parkinson's Law of Data, and Murphy's Law.
>
>[Tomb] My point was that not ONE of these justly
>deserves to be called a Law in any rigorous sense.
>At best, a half-assed Theorem.
>
> From dictionary.com, the pertinent parts of the
>definition of Law in the context of science are:
>
>1) A statement describing a relationship
>observed to be invariable between or among
>phenomena for all cases in which the specified
>conditions are met: the law of gravity.
>2) In philosophy and physics: A rule of being,
>operation, or change, so certain and constant that
>it is conceived of as imposed by the will of God
>or by some controlling authority; as, the law of
>gravitation; the laws of motion; the law of heredity;
>the laws of thought; the laws of cause and effect;
>law of self-preservation.
>
>Any of the "alleged" Laws you mention above
>are at best theorems given the fact that they
>cannot be mathematically proven for all cases
>and that we have not undergone sufficient time
>and exhaustive study as to consider them proven.
>
>On the topic particularly of Moore's "Law",
>there have been many allegations about how
>Moore's "Law" will cease to apply in the none
>too distant future (there have, fairly, been
>arguments on the other side). One basis for
>attacking Moore's Law is the basic physical
>limits which will be reached sometime along the
>line which limit the minimum size of a switching
>element (the core of a processor) based on
>certain atomic size limitations.
>
>I'm not even saying Moore's "Law" is a terrible
>theory - it has held out for longer than people
>would have suspected. But to call it a Law is
>more than vaguely insulting to real science as it
>is number based pseudo-science and
>(temporarily supported) conjecture. It is this
>exact slackness that is part of what is keeping
>computer engineering from being properly
>integrated with the rest of the engineering
>profession. (There are other reasons).
>
> >[Tomb] It always takes tech complexity to make
>the state of the art item with state of the art
>efficiency and features. If you are willing to settle
>for a lesser feature set and apply state of the art
>engineering to building something robust and simple
>(rarely done in the real world because
> >people want features and there is no market),
>you CAN produce something far more robust and
>simple.
>
>Not true... Robust and simple are OFTEN built in
>the real world.
>
>[Tomb] I'm more than happy to randomly point to a
>hundred instances of an overly complex mechanical
>or electronic system in my daily environment.
>Feature-rich is a buzzword. It sells. Thus the world
>is full of non-simple devices and systems. If it were
>otherwise, many modern systems would be far
>more reliable. I didn't imply that we never built
>robust, simple stuff. We just spend a lot of time
>making things fancy or feature-rich or snazzy
>looking. Often to the detriment of the end product
>which could have been better enhanced by more
>testing, fewer features, more reliability.
>
> The problem is, these terms are _really_
>subjective terms.
>
>[Tomb] Conceded. However, in the sense of a
>colonial piece of equipment, there is a clear
>absolute standard: how does it do its job, and how
>long does it last.
>
> Simple and robust chips are much more complex
>and unreliable when compared to fire-starting
>equipment like a lighter.
>
>[Tomb] Even this statement is problematic. I think
>modern computer memory could be considered
>simple, but these will last 15 years? Will the
>lighter? And aren't you comparing apples and
>oranges? Wouldn't it make more sense to compare
>a zippo to a modern piezo-electric lighter with an
>element that can put out 2000 degrees? Or to
>compare a 386 chip rated for the space program
>(simple, in this comparison, and robust) to a
>bleeding edge AMD chip? Which is more robust?
>
> Which is much more complex than say a match.
>Most first run products are simple, feature-poor
>commodities. It's the 2.0 and beyond that get nasty.
>
>[Tomb] In an economic sense, most prototypes are
>either underengineered or overengineered
>(probably in equal measure or a bit of both). As
>time goes by, things get made lighter (sometimes
>at the cost of durability - who needs a 286 to last
>20 years? We throw it out in 5), sometimes
>features get added. Sometimes later models are
>refined with the edges taken off (leatherman tools
>are an example) and sometimes they are feature-
>heavy bloaters with dubious reliability.
>
>Simple and robust are not always easier to
>manufacture either. It is a simple procedure you
>can do in the house to create Oxygen and
>Hydrogen, (both fairly unstable in this
>condition). Many other forms of fire-starting
>equipment require massive refineries, complex
>chemical processes, etc. Still, when it comes down
>to it... I'd rather a match to start my fire.
>
>[Tomb] The match is a fairly simple construct.
>Simple, robust, and does the job well. Of course, so
>might a zippo. Robustness is not always easy to
>engineer into an object. Part of this challenge is
>dependent on the nature of the object and its
>function. Simplicity can be hard to engineer in if the
>task it must perform is complex. But the absence of
>simplicity introduces plenty of room for mistakes
>and plenty of places for failures. Complexity is the
>enemy of Robustness. This is ultimately the basic
>lesson of Reliability Engineering. Herein we learn
>that, for systems of high reliability (0.97 or better I
>recall), the optimal number of redundant parallel
>switched backups is 2-3. Beyond that, the
>complexity you are introducing actually reduces
>your overall reliability. Complexity and fancy neato
>gee whiz features sell new cars in a showroom on
>Albion. The ability for a binary propellant rifle to
>function reliably when submersed in muck is much
>more of a selling feature on Slimobia III.
>
>Tomb
>
>---------------------------------------------
>Thomas Barclay
>Co-Creator of http://www.stargrunt.ca
>Stargrunt II and Dirtside II game site
>"In God We Trust... on Cold Steel We Depend."
>---------------------------------------------