Guys,

I’ve made a lot of changes to my worst-case analysis presentation. The latest is attached. (David, please replace the pdf on the website with this one.) Basically, I focused more on cost analysis. Page 13 will be particularly interesting, I think. I also incorporated suggestions from several people. Thanks for all the help. See you in Denver.

Steve

_____________________________________________

Hi Steve,

Thanks for the quick analysis. The end result you got makes sense, since the main reason we did classification in 802.3af was to save power supply cost, and your results support that. The 802.3at objectives required enhancements to classification, especially its granularity, since system vendors were not happy with the current situation. I expect (as shown in previous presentations) that Power Management reduces system cost dramatically. We are all aware that we should find the point at which, in 802.3at, we get further improvement for a negligible investment in an increased number of classes. I'll check our database for testing costs and PS costs compared to IEEE 802.3af, and review your analysis, which looks reasonable.

Yair
_____________________________________________

Guys,

During today’s classification adhoc, there was a
lot of discussion about the cost of increasing classification granularity. I think that looking at this from an economic viewpoint
is probably a very good idea. Below is my rough attempt to estimate
costs. I welcome comments and other methods.

1. This is all about PSE costs. PD costs are outside the scope of my analysis.

3. Increasing the number of classes will increase the cost of manufacturing testing.
   e. Let Ntotal = total number of manufacturing test cases for a PSE (under Af). This is the total for PoE and non-PoE tests.
   f. Let Nat = number of class signatures proposed under At. (There are 4 classes under Af, including class 0.)
   i. Therefore, if we let Nat = 30, the cost of the system goes up approx 1.9% to 3.8%. (One possible model behind this step is sketched below.)
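The intermediate steps between f and i did not survive in the archive, so the exact model behind the 1.9% to 3.8% figure is unknown. Here is a minimal sketch of one model that lands in that range, with entirely hypothetical parameters: the Ntotal values, the test-cost share of system cost, and the one-added-test-case-per-class assumption are all mine, not figures from the email.

```python
# Hypothetical reconstruction -- every parameter below is a guess, not a
# figure from the email. Assume each extra class signature adds one
# manufacturing test case, whose cost passes straight through to the system.
def extra_system_cost(n_total, n_af, n_at, test_cost_share):
    growth = (n_at - n_af) / n_total   # relative growth in test-case count
    return test_cost_share * growth    # pass-through to total system cost

# With ~170-200 total test cases and testing at 15-25% of system cost,
# the result spans roughly the 1.9% to 3.8% quoted in item i.
for n_total, share in [(200, 0.15), (170, 0.25)]:
    print(f"{extra_system_cost(n_total, 4, 30, share):.1%}")  # ~1.9%, ~3.8%
```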
4. Increasing the granularity will reduce the cost of the main power supply.
   a. Let Cps = cost of the main power supply in an Af-PSE.
   c. Let Wat = worst-case wasted-power ratio due to granularity with the new scheme (At). For 30 classes (Nat = 30) on an exponential scale covering 2 W to 100 W, I’ve calculated Wat = 1.15. (See the sketch after this list.)
   d. The relative cost of the smaller power supply is approx (Cps/Ctotal)*(Wat/Waf).
   f. Then the cost savings would be approx (50%)*(1.15/2.37) = 24%.

So, about 4% extra test cost could facilitate a 24%
reduction in material cost, or a net reduction of about 20%.
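For what it's worth, the Wat and Waf figures above can be reproduced with a short script. The 802.3af class boundaries (3.84 W and 6.49 W on the PD side) and PSE power budgets (4.0, 7.0, and 15.4 W) are from the standard; reading "worst-case wasted power ratio" as allocated PSE power over the PD power just above a class boundary is my interpretation, chosen because it reproduces the 2.37 used in item f exactly.

```python
# Sanity check of Waf and Wat. "Waste ratio" here = allocated PSE power
# divided by the PD power that just crosses into the class above (my
# reading of the definition; it reproduces the 2.37 in item f).

def worst_case_waste(boundaries, allocations):
    # boundaries[i]: PD power just above which class i+1 applies
    # allocations[i+1]: PSE power budgeted for class i+1
    return max(a / b for b, a in zip(boundaries, allocations[1:]))

# 802.3af: class boundaries at 3.84 W and 6.49 W (PD side); the PSE
# budgets 4.0 / 7.0 / 15.4 W for classes 1 / 2 / 3.
waf = worst_case_waste([3.84, 6.49], [4.0, 7.0, 15.4])
print(f"Waf = {waf:.2f}")   # 2.37 -- matches item f above

# A 30-class exponential (geometric) ladder from 2 W to 100 W:
# adjacent class limits differ by a factor of 50**(1/30).
n = 30
limits = [2 * 50 ** (k / n) for k in range(n + 1)]
wat = worst_case_waste(limits[:-1], limits)
print(f"Wat = {wat:.2f}")   # ~1.14, close to the 1.15 quoted above

# Plugging into items d and f (assuming Cps/Ctotal = 50%):
print(f"savings = {0.50 * wat / waf:.0%}")   # ~24% material-cost reduction
```

The geometric ladder gives 1.14 rather than the quoted 1.15; the small gap suggests slightly different endpoints or interval counts were used, but it barely moves the 24% result.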
Obviously, this doesn’t include the extra material cost associated with the new class protocol (be it ping-pong or time-based). I don’t know if 30 is the optimal number of classes; it’s just one number the adhoc group was throwing around.

I realize this is a crude analysis, but then engineering economics usually is. Can anyone come up with better equations, or more realistic numbers to plug in? Can anyone come up with a reasonable analysis that shows no cost savings by increasing class granularity?

Steve