Tom, Kind of missed everything on flexibility and provisioning and stuff we don’t know yet. Trying to use your words to convey some of what I am trying to get
across: We need optimal power budgets and distortion [channel] models that cover the range of flexibility that will be allowed in the standard. We would like to see stated Functional Requirements that include operating at 1 Gbps [downstream] in 120 MHz [contiguous
spectrum] channel width, with consideration up to 10 Gb/s. I’m still not comfortable with those words, but you can see where my concerns are. Cheers, Mark P.S. Thinking further, the Task Force will also need a system loading model that examines OLT upstream scheduling impact on concatenated packet sizes under
a variety of loads that meets the cable industry’s predicted mix of traffic for varying combinations of business and residential services as well as channel models for business and residential cable topologies for both active and passive networks. From: Tom Staniec [mailto:staniecjt@xxxxxxxxx]
So after conferring by email with John Ulm and reading Kevin’s email and Mark’s just below, I offer the following which I think gets to the desired result: Objective 3: Develop an optimal network power budget and distortion model describing actual coaxial and hybrid fiber coaxial operational parameters for coexistence
with analog, QAM and DOCSIS as a reference baseline when EPoC is included in the network channel load at a minimum of 1 Gb/s with extension up to 10 Gb/s. Kevin, the statement above should give the accuracy you pointed to in your paragraph 1 below and the understanding we are going to need going forward for setting
the baseline. It also gives a point of reference for all parties, especially those not in the discussions, that there is an RF network operational complexity which must be met so as not to impact existing services in any way. Mark, I think this addresses your points. While baseline is still included, ideal is not. Optimal is used as the point where all services in the coaxial network operate in concert with no subscriber impact and comply with FCC, operator and EPoC requirements. That provides the reference baseline point; testing performance above and below that point will determine the impact on coexisting services. Comments? Thoughts? Regards Tom From: Mark Laubach [mailto:laubach@xxxxxxxxxxxx]
My two cents on Objective 4 (new). Kevin, I’m having problems with “baseline” and “ideal” also. Here is the way I think about it in my words. Maybe they’ll suggest some alternative wording
that still meets your objective’s intent without “stirring the pot” too much. Also, apologies for doing this so late before the meeting and after submission time has passed. When a cable operator goes to install this EPoC system, they will provision an RF channel to match their available spectrum and the channel conditions within
that spectrum to achieve a desired MAC data rate. Provisioning will include configuring to the available spectrum (e.g. frequency, width, etc.) and the modulation rate to be used (subject to the flexibility of the PHY), and other configuration “stuff” needed
to fully configure the PHY and the system. Collectively, I’ll refer to any given set of configuration setup as a “provisioning profile”, and assume that each profile will supply a MAC data rate at a given error performance. I believe that one way to view
a motivation for Objective 4 is to establish a “must have” provisioning profile for a 1 Gbps MAC data rate and to require flexibility “around” that provision profile. One of the objective’s profile requirements is a 120MHz channel size. Essentially, it’s
a commitment by the standard that, for a given cable plant, if the operator has 120MHz of spectrum AND IF AND ONLY IF the channel’s performance MEETS OR EXCEEDS this standard’s Functional Requirements (that are yet T.B.D. by the Task Force), then there will
be at least one provisioning profile (configuration) available in the product that provides 1Gbps in 120MHz of spectrum that meets or exceeds required performance at the MAC/PLS interface. Performance includes error rate, delay, delay variation, round trip
time, etc. Also performance likely has to be evaluated differently for business versus residential services. I agree that having a “golden provisioning profile” that produces 1Gbps MAC Data rate using 120MHz channel size is a good stake in the ground.
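As a quick sanity check on that “golden” profile, here is a back-of-the-envelope calculation. This is purely illustrative (the function name and numbers are mine, not from any draft or submission); it ignores FEC and framing overhead:

```python
import math

def required_spectral_efficiency(rate_bps: float, width_hz: float) -> float:
    """Net spectral efficiency (bits/s/Hz) needed for rate_bps in width_hz."""
    return rate_bps / width_hz

# The "golden profile" under discussion: 1 Gbps MAC rate in 120 MHz.
eff = required_spectral_efficiency(1e9, 120e6)
print(round(eff, 2))        # -> 8.33 bits/s/Hz

# Before overhead, that needs at least ceil(8.33) = 9 bits/symbol,
# i.e. 512-QAM; with realistic FEC/framing overhead the modulation
# order would need to be higher still.
print(2 ** math.ceil(eff))  # -> 512
```

The point of the arithmetic is simply that 1 Gbps in 120 MHz already implies a fairly high modulation order, which is why the Functional Requirements on channel performance matter so much.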
Are there other “must” provisioning profiles? Likely, but probably don’t need to be nailed at this time. Being able to flexibly adjust configuration and
achieve other configuration profiles and data rates is a must. Lower data rates, YES! and the Task Force will determine the range. However, this holds regardless of whether conditions permit a higher rate. A cable operator might just have less spectrum
that meets or exceeds Functional Requirements. Up to 10Gbps? Achieving this is dependent on more than just channel conditions and spectrum allocation; it is also based on hardware capability, relative costs, etc. I agree that the Task Force should study implementations up to 10Gbps, but the T.F. will set the “top end” after sufficient diligence and study has been performed. Maybe we get to 10Gbps, but we don’t know yet… Also, the last bullet implies that upstream hardware must be capable of up to 10Gbps.
When will the industry see enough available spectrum to do that? My suggestion is to apply to downstream only at this time and treat upstream separately. HOWEVER be careful what is prioritized!!!!! Choose only what is essential at this time to set some expectations. This is a very complicated and detailed SYSTEM PROBLEM. The following are heavily coupled items and T.B.D. for the Task Force:
+ Flexibility, placement, and other details of minimum and maximum channel sizes (although SG can give some desirable guidance here)
+ The flexibility and granularity of adjusting different modulation rates across the channel
+ The error performance as delivered from the PHY to the MAC for various provisioning profiles
+ Impact of modulation, FEC, interleaving (etc.) on the round-trip time necessary to achieve desired system performance
+ Impact on discovery and auto-negotiation
+ Impact of working around other services (if required by the cable operator)
+ etc.
Not to be understated: EPoC is a LOT OF WORK for the Task Force.
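To make the “provisioning profile” notion concrete, here is a minimal sketch. Everything in it (the `ProvisioningProfile` name, its fields, and the rate formula) is my own hypothetical illustration, not Task Force material; it assumes the symbol rate roughly equals the channel width and ignores framing and pilot overhead:

```python
from dataclasses import dataclass

@dataclass
class ProvisioningProfile:
    """Hypothetical bundle of the configuration 'stuff' described above."""
    center_freq_hz: float   # where the channel sits in the plant's spectrum
    width_hz: float         # channel width, e.g. 120 MHz
    bits_per_symbol: int    # modulation order, e.g. 10 for 1024-QAM
    coding_rate: float      # FEC code rate, 0 < coding_rate <= 1

    def mac_rate_bps(self) -> float:
        # Crude estimate: symbol rate ~ channel width; no framing overhead.
        return self.width_hz * self.bits_per_symbol * self.coding_rate

# A candidate "golden" profile: 120 MHz, 1024-QAM, rate-0.85 FEC.
golden = ProvisioningProfile(center_freq_hz=900e6, width_hz=120e6,
                             bits_per_symbol=10, coding_rate=0.85)
print(golden.mac_rate_bps() / 1e9)  # about 1.02 Gb/s, just over the target
```

Each distinct set of field values is one “provisioning profile” in Mark’s sense, supplying a MAC data rate at some error performance; the standard would then guarantee at least one such profile hitting 1 Gbps in 120 MHz whenever the channel meets the Functional Requirements.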
Respectfully, Mark From: Noll, Kevin
[mailto:kevin.noll@xxxxxxxxxxx]
Tom, Your definition of baseline is what most accurately describes what I am trying to say in this objective. I don't see how the wording we have proposed is self-contradictory – it doesn't set up a situation where we are asking to do no harm at the same time we ask it to do
something that WILL cause harm. I'm open to finding another word to use here. I've searched through the dictionary and thesaurus looking for an alternative, but "ideal" isn't it. --kan-- -- Kevin A. Noll, CCIE Time Warner Cable From:
Tom Staniec <staniecjt@xxxxxxxxx> Hey John, You make a good point which actually alters my thinking on this topic. Using the word “baseline” might be what needs to be questioned. By technical definition, I take baseline to mean: “information that is used as a starting point by which to compare other information.” So if I were designing a network (not an EPoC channel) for operation, I would design it to support “optimal network performance” for all services in support of an analog, QAM, DOCSIS and an EPoC load at a prescribed level, distortion performance and temperature. That would be the point at which I would know testing could demonstrate compliance, much as is done with FCC testing or network proof of performance. That treats performance in all network configurations the same, whether it is a passive network or N+”x” (pick a number), and presents a measurable norm. If the network operates below that level, then performance impact should be expected, to the point where operation effectively ceases because of noise and other
outside factors. If the network operates above that point, then better performance should be expected with the caveat that there is a point of diminishing returns where composite distortions take a significant toll and can render the network useless. In short,
the “bathtub curve.” Moran (Motorola) has demonstrated time and again how many cable networks are not operated properly in the “bathtub curve” for optimal network performance, and that is without EPoC in the picture.
That can’t be done solely by establishing a baseline of operation of a network for analog, QAM and DOCSIS. It has to be done with EPoC factored into the design
equation of the network from the start, and not solely as the addition of an EPoC channel. In a deployed, fully designed and engineered network, without concern for what the actual top-end frequency currently is, when an EPoC channel is added the expected result should be that all service levels will be lowered to operate under the power constraints of the equipment, whether passive or N+”x”. That may, in simple fact, mean the operator has to remove and install lower tap plates to provide a specified level at the home of a customer who is not involved with EPoC at all. Based on how the network was originally designed and engineered, that doesn’t guarantee that FCC yearly performance testing can be met. As the operators know, that has its own implications. So is the purpose of Objectives #3 & 4 to provide a “baseline” channel operation for EPoC, or is it to establish an “optimal network performance for all deployed
services” when EPoC is in the carriage? Considering the operators (Noll et al) already invoked the “Network Hippocratic” Oath: “First, do no harm to my deployed services…” then Objective 3 directly, and Objective 4 because it relies on #3, do not, in my view,
meet the premise. So I might agree to wording similar to yours if the objectives stipulate that the addition of an EPoC channel, at the stated data rates and BER performance,
can be met when there is no performance impact to other services in an optimal network design which becomes the baseline against which all performance is measured: positive or negative. So making the objective reflect that statement allows for directly developing
salient criteria for measuring the success or failure of EPoC in a coax network. How do we state that in the objective? Best regards Tom From: John Ulm
[mailto:julm@xxxxxxxxxxxx] Hi Tom, Like Kevin, I'm not sure I agree with your proposed changes. We want EPoC to be able to operate in conditions worse than baseline, albeit at lower data rates. So the baseline plant conditions are NOT a minimum for operators. Maybe we can wordsmith it to say "set the minimum plant conditions for baseline data rate operation". However, I'm in favor of keeping the wording of Objective 3 as it is. Similarly, I don't think the word "ideal" is appropriate. Taking an off-the-wall example, we could have a passive plant with 3GHz taps in it that gives us a boatload of spectrum. This
is enough spectrum to achieve 10Gbps even though plant conditions are far from ideal. I don't think "ideal" adds to Objective 4 so we shouldn't put it in. On Tue, May 8, 2012 at 2:28 PM, Ron Wolfe <rwolfe@xxxxxxxxxx> wrote: Hi Kevin, Sorry I didn’t get a chance to say hello a couple of weeks ago during the Q&A session. Next time
… To me, the notion of “ideal” should in fact be attainable; however, it is very unlikely that anything defined as ideal would be sustainable over time. Certainly ideal conditions would not be anticipated outside of a lab environment. Ideal conditions would represent that environment where a system could reasonably be expected to perform at its maximum throughput. I think you and Tom are actually saying the same thing when you consider that it is
ideal conditions that represent the conditions
that permit maximum performance. I think either works, though to me “ideal” was a concept I grasped immediately as representing closely
controlled lab conditions, albeit with the understanding that someone else might just as immediately grasp a completely different meaning. Regards, Ron From: Noll,
Kevin [mailto:kevin.noll@xxxxxxxxxxx]
Thanks for the comments, Tom, especially the reminder about the 1/10Gbps. I'll probably stick with "baseline" for now because it better conveys a multi-dimensional thought (plant conditions
are very multi-dimensional). Similarly, "ideal" implies unobtainable. Could you suggest language that leaves room for obtainability? --kan-- -- Kevin A. Noll, CCIE Time Warner Cable From:
Tom Staniec <staniecjt@xxxxxxxxx> Kevin, A couple of quick comments: slide #5, Objective 3 – wording change: Develop a channel model describing a typical real-world
coaxial cable plant to set the minimum baseline plant condition for the EPoC specification.
Slide #6, Objective 4 – wording change to sub-bullet 3: a data rate higher than the baseline data rate of 1 Gb/s and up to 10 Gb/s when transmitting in assigned spectrum in ideal channel conditions that permit;
Remove red words – add blue words and symbols.
To me the first statement sets the expectation of the operator of what a “minimum” performing coax plant must do, while telling the vendor the EPoC equipment must operate to the desired performance in a minimum plant condition.
The second statement indicates that in an “ideal” plant condition (i.e., performance that far exceeds minimum baseline plant conditions) the expectation is performance above 1 Gb/s, up to 10 Gb/s, with measured performance exceeding BER and other standards currently not defined.
Regards
Tom From: Noll,
Kevin [mailto:kevin.noll@xxxxxxxxxxx]
I have updated the objectives based on our last conference call and comments received since then. I have attached
the deck as a PDF. Please review and comment. --kan-- -- Kevin A. Noll, CCIE Time Warner Cable