[lug] Developer abuse

Nate Duehr nate at natetech.com
Fri Nov 12 18:41:21 MST 2004


Evelyn Mitchell wrote:

>One of the biggest factors, I believe, is just the pace of change in the
>industry. In building construction, the estimator is working with a set of
>processes and materials which change on the order of a few percent per
>year. Styles change, but the basic way things get done is pretty stable.
One could argue that many of these standards are forced by building 
code, and therefore are legal requirements overshadowing the 
construction of the building.  The building industry is mature enough 
that building things "to code" is simply taken for granted, and everyone 
knows that a government inspector is coming at regular intervals to 
review the work being accomplished.

Should companies learn to mimic this, accepting only software that 
meets minimum design-safety requirements tailored to their type of 
business, unless they purposefully choose to take risks above and 
beyond that level?  Business already does this as a matter of necessity 
in the medical and aviation software sectors.  Or is the government 
mandate the only thing that keeps construction companies building 
things "to code"?

I can make one prediction -- as long as companies are willing to shell 
out millions for a large, complex system with ZERO visibility into the 
design and build process, bad software disasters will continue to 
happen, at bigger and bigger risk as systems become more and more 
integrated.

>In software development, though, all of the components of the environment,
>from the software tools (languages and libraries) to operating systems,
>computer hardware performance and customer expectations are in flux, with
>changes of >50%/year not uncommon.
But they don't *have* to be in flux, do they?  This is an artifact of 
not having standards that MUST be met or the customer won't take the 
product.  It's also a side-effect of software engineers still thinking 
they need to create "new" things rather than "better" things.  And much 
of that is driven by how computers are marketed -- marketing for 
computers hasn't changed since the glossy Apple ads for the Macintosh 
in 1984.  "New is better" is still the industry mantra, not "we've 
found a simpler, more robust, more intelligent way to do this, and it 
took us years to come up with".  What?!  It runs on DOS?  Oh, that's 
ancient!  We can't buy that!

Software marketing insinuates that software goes "bad" like moldy 
cheese or bread.  I've never seen software rot away (the Windows 
registry notwithstanding... heh) on my machines over the years, but if 
the industry pushes "newer is better" all the time, even the largest 
customers end up believing it to their very core.  Software purchasers 
don't always buy for features; sometimes they buy because they need to 
"keep up with the Joneses".  This is changing, but it has been very 
true on the desktop for the past 10 years or so.

>So, something you did last year is not going to resemble what you need to
>do this year very much, except in broad strokes. And even with a lot of
>experience in the core skills, the problems keep getting more complex
>because the opportunities are wider and expectations are higher.
I tend to disagree that the problems are more complex.  Computing is 
still (ultimately) about one of three things: manipulating custom 
hardware (embedded and systems-integration projects), processing text, 
and performing mathematical calculations.  The marketing hype that says 
you need a 4 GHz machine to word-process has led the industry by the 
nose down the wrong path.  Very wrong.

>If you never do anything more than twice (at the most), how are you
>supposed to learn what your most likely performance is going to be, or your
>range of likely performances?
This was the "promise" of Object Orientation -- that someone could 
learn C++ and re-use code over and over, and the code would get better 
and better.  But the marketers needed something "new" to feed the 
market with, so new languages keep coming out instead of new modules 
for C++.  The whole industry is hog-tied by a business model that makes 
us give the work of improving our products names like "Continuation 
Engineering".  As if continuing a good thing were something icky to be 
avoided.
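
To make that promise concrete, here's a tiny, hypothetical C++ sketch 
(the class names are mine, not from any real product) of what re-use 
was supposed to look like -- write and test a module once, then extend 
it rather than rewrite it:

// The re-use OO promised: a small class, written and tested once...
#include <iostream>
#include <string>

class Logger {
public:
    virtual ~Logger() {}
    virtual void log(const std::string& msg) {
        std::cout << "[log] " << msg << std::endl;
    }
};

// ...then adapted in the next project by extension, not rewritten
// from scratch (or worse, rewritten in this year's "new" language).
class PrefixLogger : public Logger {
public:
    PrefixLogger(const std::string& prefix) : prefix_(prefix) {}
    virtual void log(const std::string& msg) {
        Logger::log(prefix_ + msg);  // re-use the tested base behavior
    }
private:
    std::string prefix_;
};

int main() {
    PrefixLogger logger("billing: ");
    logger.log("re-use, don't rewrite");
    return 0;
}

The point being that PrefixLogger inherits all the testing Logger has 
already survived -- that was the "better and better" part of the 
promise.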

[snippage...]

>But, and this is a big but, that still doesn't give the customer that
>go/no-go number at the right stage of the project (at the beginning). 
>As a businessperson, I know that I need to know whether or not a
>development project makes monetary sense before I commit to it. 
Back to the comments above about aerospace and medical software -- 
*those* companies, under higher levels of constraint and discipline, 
almost always CAN accurately estimate an entire project.  Bids for 
entire aircraft systems are made 10 years before the system hits the 
field in full deployment.  They are forced to work this way by the 
external constraints they accept, and need, for human safety.  Once 
business realizes that human safety includes safely making financial 
transactions, not having half the company down for a "virus" for a 
week, stuff like that... and puts some constraints around the process, 
things will get better.

The psychological problem we're working against is that after a person 
has been scared by something the first time, each time that same thing 
happens to them again, they're less scared of it.  If the original 
virus and the need to deploy virus-protection software didn't scare you 
enough to move away from the OS most susceptible to virii, nothing ever 
will.  And you'll start to accept virii as "normal".  Same with bugs 
and crashes...

>I'd appreciate some discussion of this. As you can probably tell, this is a
>problem I've been pondering for a while.
Me too.  From a different perspective.  I'm a private pilot and I'm sure 
a lot happier flying behind avionics that have been through a much more 
rigorous software process than the software my company uses.  But what's 
funny about that is, as long as I'm not inside a cloud in the airplane, 
if the avionics fail, I can still get back to the airport by using a 
stopwatch, a magnetic compass, and my eyeballs. 

If my company's software fails, it could kill the entire company and, 
thus, my livelihood -- which seems to me to require a higher level of 
professionalism and risk management.  But sometimes I feel that 
software engineers in "regular" businesses don't take their discipline 
that seriously.  And most software engineering management pays lip 
service to processes but has never seen processes like the ones in 
aviation, to take a single example.  They certainly wouldn't go LOOKING 
for things they could learn from that process and apply to their own 
engineers.

Many "coders" treated like these people are in the original article have 
the ability to make or break the company, but they and their bosses 
don't act like it. They also have this impression they're somehow 
"artists" and thus above mundane things like coding up a standard 
library of software that they then FORCE themselves to use and adapt 
into future software they write, and that their management helps create 
them the tools to re-use as much software with known risks in each project

Just some thoughts - I actually have to go fix a broken system now... 
heh, ironic, isn't it?  The day of the "on-call computer guy" should be 
over by now.  But it's not.

Nate



