[lug] can't make this stuff up, folks... My 2 lines of code, errr, I mean, my 2 cents....

Rob Nagler nagler at bivio.biz
Thu Oct 22 11:06:52 MDT 2009


Jeffrey Haemer writes:
> I think it unseemly to start Wikipedia pages about one's own work.  Call me
> old-fashioned.

That's an interesting point, and one I adhere to as well.  Yet it does
open up some interesting questions about knowledge acquisition.

A friend recently recommended the work of Charles S. Peirce
(pronounced "purse").  Some of you may have heard of him.  He wrote
over 100K pages on logic, math, science, and philosophy (specifically
the logic of science).  In any event, many of his ideas predated other
people's by decades.  For example, (per Wikipedia):

    Discovered in 1880 how that which is now called Boolean
    algebra could be expressed by means of a single binary operation,
    either NAND or its dual, NOR. (See also De Morgan's Laws). This
    discovery anticipated Sheffer by 33 years.

He also invented (discovered?) a third type of inference called
abduction.  I didn't know about abduction, but it's what you do when
you think up a test.  You are trying to come up with the rule before
you have any evidence that indicates the rule should be valid.
Abduction is quite a revolutionary concept, and I'm just trying to
get my head around it.  Indeed, much of Peirce's work was
revolutionary and, sadly, completely unknown during his lifetime.  He
died a pauper and barely known even within the world of science at
the time.

Could he have done a better job at self-promotion?  I think so.  Would
it have ruined his credibility?  I don't know.  Would he have been
happier if his accomplishments were recognized?  Another big "I don't
know", but he might not have been a pauper, getting stale bread from
the town's baker.

Now, let's flip that around.  Take Nagios (please!).  I happen to be
involved with a client on this, and I decided to take a peek under
the hood.  My intuition says that it's more fluff than stuff.  When I
looked at the code, it was *terrifying*.  This thing is supposed to be
a robust platform for detecting and publishing critical system
events.  Yet, the software is riddled with memory leaks, doesn't seem
to have the slightest clue about error handling, and is so
copy-and-pasted it even has two versions of "my_strtok", which is not
only used over-and-over, but implemented differently in the two
versions.

The guy who wrote Nagios did so just after school.  I've been there.
I know how easy it is to think you've discovered something brilliant
after writing some code.  Yet, I didn't go around pronouncing it.
Somehow I knew that I had a lot to learn.  This guy has started a
company around Nagios, and he promotes it heavily.  I consider this
not only irresponsible but dangerous to the software industry, imho.
Obviously, what I have to say is pointless as I have no doubt many of
you are using Nagios.

We run into this question at bivio all the time.  When people see what
we do, they say, "You have to tell people about this."  At the same
time, I have no interest in debates with people who haven't been
through what we've been through.  It's not a discussion.  It's simply
words flying back and forth (sometimes like what happens on this list ;-) 
and that's just a waste of everybody's time.

It's what I imagine Peirce went through when he discovered abduction.
Hearing about it 140 years later, and having done lots of testing, it
makes perfect sense to me, but I've got a wealth of information at my
disposal from which I can make sense of it.  One of Peirce's peers
might have laughed him off as an idiot.

So the question I often ask myself: Am I deluding myself into thinking
what we are doing at bivio is "great", or is it great but before its
time and therefore has to stay that way?

Jeff, I think you have to ask that question about your experience with
testing.  Perhaps you have some great things to offer (I think you
do, if that counts :-), but perhaps people are not ready to accept
your ideas.

Rob


