
Re: Going the distance

Rob,

While I understand that certain aspects of engineering are always
statistical, including the very operation of the transistor and the
laser, I think that designing this way should be avoided where
possible, except where it significantly lowers cost.  Why assign MAC
addresses, for example?  Why not just select random ones?  The chance
of two stations on a LAN having the same address is sufficiently
remote (on the order of 1 in 2^40).  If it cost a huge amount to
distribute those addresses, we probably wouldn't bother.  It doesn't
cost much, and the comfort factor of knowing that the MACs are unique
is worth it.
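
For the curious, the back-of-the-envelope version looks like this (a
sketch only; the 46 random bits assume the two administrative bits of
a 48-bit MAC are pinned, and the station counts are invented):

    import math

    # Birthday-problem approximation: the probability that at least two
    # of n randomly chosen addresses collide in a space of 2**bits
    # values is roughly 1 - exp(-n*(n-1) / (2 * 2**bits)).
    def collision_prob(n_stations, bits=46):
        return 1.0 - math.exp(-n_stations * (n_stations - 1)
                              / (2.0 * 2.0 ** bits))

    for n in (2, 1000, 100000):
        print(f"{n:>7} stations: P(collision) ~ {collision_prob(n):.3e}")

Even at 100,000 stations on a single LAN the collision odds stay
comfortably small.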

If a complete worst-case situation resulted in 10G operating only to
10km, then to determine whether it was "safe enough" to operate over
15km, wouldn't we *at least* need to profile the quality distributions
of each component along the way?  How often do parts hit worst case,
how often do they exceed it, and by how much...  Is this even something
that vendors are willing to share?  ("Sure!  We throw away 95% of what
we build, and the rest is within 2% of worst case.") :)
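
Even granting the data, the exercise looks something like this Monte
Carlo sketch (every number below is invented purely for illustration;
the real penalty distributions are exactly what we'd need from the
vendors):

    import random

    # Hypothetical 15km link: each component's penalty (dB) is drawn
    # from a triangular distribution between its typical and worst-case
    # values -- the "quality profile" in question.
    COMPONENTS = {                      # (typical, worst-case) penalty, dB
        "laser":      (1.0, 2.0),
        "fiber_15km": (3.0, 4.5),
        "connectors": (0.5, 1.5),
        "receiver":   (1.0, 2.5),
    }
    BUDGET_DB = 9.0                     # invented power budget

    def sampled_loss():
        return sum(random.triangular(typ, worst, typ)
                   for typ, worst in COMPONENTS.values())

    trials = 1000000
    failures = sum(sampled_loss() > BUDGET_DB for _ in range(trials))
    print(f"estimated 15km failure rate: {failures / trials:.1e}")

The worst-case stack (10.5 dB) blows the budget, yet the sampled
failure rate comes out tiny -- which is the whole statistical argument,
and it is only as good as those invented distributions.  Note too that
if every distribution collapses onto its worst-case value, the failure
rate snaps back to 100%, which is where the scenario below comes in.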

This also opens up a whole new dimension in manufacturing and test.
Today you check that your measurement is between A and B, and for
process control you probably track how often it falls in various
places.  Would you have to throw away parts that were within spec
because your process wasn't following the "profile"?  If your
manufacturing guy says he can make a part more cheaply because, thanks
to better controls, he can run closer to the limit, wouldn't you want
to be able to do that?
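
That would turn incoming inspection into something like this (a
sketch; the "profile" numbers are invented, and real process control
would use proper distribution tests):

    # Today: accept any part whose measurement m satisfies A <= m <= B.
    # Under a "profile" spec you would also have to police the shape of
    # the distribution, e.g. cap the fraction of parts landing in the
    # worst 10% of the range -- rejecting lots made entirely of
    # in-spec parts.
    A, B = 2.0, 3.0                 # invented spec limits
    MAX_NEAR_LIMIT = 0.05           # invented: <=5% of parts in worst decile

    def lot_ok(measurements):
        if not all(A <= m <= B for m in measurements):
            return False            # the ordinary spec-limit check
        near_limit = sum(m > B - 0.1 * (B - A) for m in measurements)
        return near_limit / len(measurements) <= MAX_NEAR_LIMIT

    # An all-in-spec lot, rejected because too many parts crowd the limit:
    print(lot_ok([2.4, 2.5, 2.6, 2.95]))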

At the extreme, if everyone tuned their processes so that they could
build dirt-cheap parts that hit the worst-case specs 100% of the time,
wouldn't you be back at 10km?  In how many places will it end up
costing more because we don't have a simple answer to the question
"how good is good enough?"  Perhaps this is much easier when you
control all the pieces of a system, rather than bringing pieces from
different vendors together (be it at the box or the network level).  In
past systems I have worked on, for example, I've used the assumption
that one chip cannot be at maximum voltage while the one next to it is
at minimum.  A sure bet within a box; probably not so safe across them.
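
That within-a-box assumption is really a correlation assumption, and
correlation is exactly where worst-case and statistical stacking part
ways.  A toy comparison (the tolerances are made up and have nothing
to do with any real part):

    import math

    # Supply-voltage tolerances (volts) on four parts in one signal path.
    tolerances = [0.10, 0.10, 0.05, 0.05]

    # Worst-case stack: every part sits at its limit simultaneously.
    worst_case = sum(tolerances)

    # Root-sum-square stack: assumes the deviations are independent --
    # plausible inside one box sharing a supply and an ambient, much
    # shakier across boxes, vendors, and installation sites.
    rss = math.sqrt(sum(t * t for t in tolerances))

    print(f"worst case: +/-{worst_case:.2f} V, RSS: +/-{rss:.2f} V")

Whether you assume correlation (as I did within a box) or independence
(as the root-sum-square stack does), the margin you reclaim over worst
case is only as good as that assumption.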

Finally, would we assume that everyone gets equipment of "random"
quality, or would we account for the fact that the more, ahem,
"cost-sensitive" network manager will probably end up with more than
his/her fair share of marginal equipment?  What about other correlating
factors, like temperature, humidity, and age, that affect whole
systems at once?

If you want to use statistics, why not give them to your salespeople?
They can show a nice graph with a minuscule failure rate at 10km (and a
red line labeled "specified limit") that climbs to 0.01% at 15km,
10% at 20km, and so on.  Then the network manager can pick his/her own
cost/benefit point along the curve.  This would also help in another
way -- customers could be more confident that the next technology
(100G?) would have a good-looking curve up to, but maybe not beyond,
that little red line.  Even if the curve looks great out to 300% of the
little red line, as mature Gigabit's might, that would be an important
data point for some to consider.
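
That graph could come straight out of the same sort of margin model
(again, every number is invented; the shape is the point, not the
values):

    import math

    # Assume the link margin at d km is roughly Gaussian: the mean
    # shrinks with distance (invented 0.35 dB/km fiber loss) and the
    # sigma comes from component spread.
    MARGIN_AT_0KM = 9.0    # dB, invented
    LOSS_PER_KM = 0.35     # dB/km, invented
    SIGMA = 0.8            # dB, invented component spread

    def failure_rate(km):
        mean_margin = MARGIN_AT_0KM - LOSS_PER_KM * km
        # P(margin < 0) for a Gaussian = 0.5 * erfc(mean / (sigma*sqrt(2)))
        return 0.5 * math.erfc(mean_margin / (SIGMA * math.sqrt(2.0)))

    for km in (10, 15, 20, 25):
        print(f"{km} km: failure rate ~ {failure_rate(km):.2e}")

One curve, one red line, and every customer gets to decide how lucky
they feel.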

Judging by the silence on this topic, most people agree with one view
or the other, though I honestly can't say which.  I just hope that
everyone working on the next plane I fly on thinks the way I do.

-Simon L. Sabato
-Naive Engineer
-Level One Communications


Rob Marsland wrote:
> 
> In reply to:
> The latter is simply not a tolerable situation
> to me.  It's simple math, if you want 10,000,000 trouble free
> installations, then you're going to have to ensure that the
> one-in-a-million combination of worst-case devices still works.
> 
> Is there really *any* other way to go about this?
> 
> ------------
> Yes there is.  It's called statistics.  Instead of designing for worst case,
> you design for 3 sigma, or 5 sigma, or whatever.  It makes a lot more sense
> even if you are designing atom bombs.  Well, ok, maybe worst case is
> necessary for that one.............
> 
> Rob
> 
> Robert A. Marsland
> Focused Research, Inc. (a New Focus company)
> 555 Science Dr.
> Madison, WI 53711
> (608) 238 2455
> (608) 238 2656 FAX