Kevin

In answer to point one, there are actually both a network and a channel model. Think of it this way: if I look at an amplifier that operates across 50 to 750 MHz of bandwidth, there is a manufacturer-stated nominal output power. If we take that power level and divide it by the actual bandwidth (700,000,000 Hz), we get an important number: power per Hz. That power represents the maximal point of functional operation for the complete network bandwidth without impact. I can't operate every channel at that power level, because the aggregate power of all channels would exceed the manufacturer's nominal operating point by A LOT, generating massive distortion.

So, to determine the channel loading points, which must sit below the network loading point with a protective margin (network headroom), and knowing that different modulation schemes present different channel power loading, the network designer devises the network operating parameters, which then translate into the channel power budget.

The channels with the highest peak power at any instant in time are the analog channels. QAM and DOCSIS channels look like a "noise haystack," so they don't have a peak power, but they do have an "average power." That power cannot be set to the level of the analog channels, so it has to be backed off, depending on loading (number of QAM or DOCSIS channels), by 6 to 10 dB below the peak analog level.

Now we add an EPoC channel, which might be 120 MHz wide with multiple carriers (assuming OFDM), and once again we have the average-power issue, so we have to turn all EPoC carriers down. Let's assume 6 to 10 dB, like DOCSIS and QAM, but we don't stop there. Because we are adding roughly the same digital bandwidth for EPoC as already exists with QAM and DOCSIS, we now have to lower ALL the carrier levels by another 3 dB to stay under the amplifier's nominal operating point for the network, with headroom. That is why I treat it as a network power budget and distortion model.
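[Editor's note] The aggregate-power arithmetic behind the discussion above can be sketched as follows. The carrier counts and levels here are hypothetical, chosen only to illustrate the point; actual loading plans vary by operator. The key facts it demonstrates are that powers sum in linear units (not in dB), and that doubling the digital bandwidth raises the digital aggregate by ~3 dB.

```python
import math

def total_power_dbmv(levels_dbmv):
    """Aggregate power of several carriers: convert each level to linear
    units, sum, and convert back to dB. Powers never add in dB directly."""
    return 10 * math.log10(sum(10 ** (p / 10) for p in levels_dbmv))

# Hypothetical channel lineup: 79 analog carriers at +46 dBmV peak and
# 75 QAM/DOCSIS channels backed off 6 dB (+40 dBmV average each).
analog = [46.0] * 79
digital = [40.0] * 75

# Aggregate network loading -- far above any single carrier's level.
print(round(total_power_dbmv(analog + digital), 1))

# Adding an EPoC block of roughly the same bandwidth as the existing
# digital load doubles the digital power, i.e. +3 dB on the digital
# aggregate -- which is why all carrier levels must come back down to
# stay under the amplifier's nominal operating point with headroom.
print(round(total_power_dbmv(digital * 2) - total_power_dbmv(digital), 2))
```

Running the sketch shows the doubling effect is exactly 10·log10(2) ≈ 3.01 dB, independent of the per-carrier level chosen.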
While not impossible, a single channel operated improperly shouldn't cause problems in the network, but a "channel" of EPoC, which is actually multiple channels representing an aggregate power/Hz equation, operated improperly can render a network useless for all customers. This is a simple explanation which doesn't take into account what someone might do or how they might deploy, but we don't know that at this point, so I go with what I expect.

As for number 2: once we have all of the above, then we can figure out whether it meets the operational requirements of any specific channel, which are largely going to be determined by the vendor components used at critical points in the design (think power amps, receiver sensitivity, etc.).

For instance, where I live I have cable with analog, QAM and DOCSIS carriage in a 750 MHz top-end system. At different points in time (read that as temperature highs and lows), my QAM channels (forward network) exhibit blocking, tiling and freeze frame. Simply stated, the QAM carriers are operating toward the noise floor. Now think about lowering the same network power by 3 dB across the bandwidth to add EPoC. I have potentially made my problem worse for my QAM channels, even if EPoC operates just fine for the few customers using it. So now I have to figure out how to fix my QAM customers.

So, generally, no single channel or channel operation is important to the power equation; it is the aggregation of all the channels comprising the network level and distortion parameters that matters, because it determines how I have to operate the network. Once the network levels and distortion parameters are determined, we can provide the functional specification information Mark L is looking for, because that will determine how the vendors have to build their chip sets. At that point the operators can also simulate an EPoC channel in real networks, following the engineering design determinations, and figure out what impact the rest of our customers will see.
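[Editor's note] The QAM-near-the-noise-floor example above is a margin calculation. The numbers below are hypothetical (assumed CNR and an assumed 256-QAM threshold), but they illustrate how a 3 dB network-wide level reduction can push a marginal channel below its operating threshold.

```python
# Hypothetical margin check for a QAM channel already running close
# to its threshold (all dB values are assumptions for illustration).
cnr_db = 33.0            # measured carrier-to-noise ratio (assumed)
threshold_db = 31.0      # approximate 256-QAM operating threshold (assumed)

margin_before = cnr_db - threshold_db            # margin today
margin_after = (cnr_db - 3.0) - threshold_db     # after a 3 dB backoff

# A positive margin means the channel locks; negative means tiling
# and freeze-frame symptoms like those described above.
print(margin_before, margin_after)
```

With these assumed numbers, a 2 dB margin becomes a 1 dB deficit after the backoff, which is exactly the "made my problem worse" scenario.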
Hopefully there is none, but if there is, we may have to go back to the drawing board.

Tom

From: Noll, Kevin [mailto:kevin.noll@xxxxxxxxxxx]

Unfortunately, changes to the submitted doc will have to wait until next week, but we'll continue working on it. To clarify things for myself: 1) Aren't things like "power budget" and "distortion model" components of a channel model? 2) Similarly, aren't the "operational parameters" part of a channel model?

On the coexistence concern, recall that the objectives are taken together, not independently of one another. Objective 8 addresses the coexistence issue and, when read in context with Objective 3, would direct the TF to develop a model that addresses the coexistence concern. In the context of Objective 4, it would also direct the TF to deal with the 1 Gbps/10 Gbps loading. However, I do understand your concern for those outside our discussions to understand the intent. I would like to do this without being redundant in the objectives.

--kan--
--
Kevin A. Noll, CCIE
Time Warner Cable

From: Tom Staniec <staniecjt@xxxxxxxxx>

So, after conferring by email with John Ulm and reading Kevin's email and Mark's just below, I offer the following, which I think gets to the desired result:

Objective 3: Develop an optimal network power budget and distortion model describing actual coaxial and hybrid fiber coaxial operational parameters for coexistence with analog, QAM and DOCSIS as a reference baseline when EPoC is included in the network channel load at a minimum of 1 Gb/s with extension up to 10 Gb/s.

Kevin, the statement above should give the accuracy you pointed to in your paragraph 1 below, and the understanding we are going to need going forward for setting the baseline. It also gives a point of reference for all parties, especially those not in the discussions, that there is an RF network operational complexity which must be met so as not to impact existing services in any way. Mark, I think this addresses your points.
While baseline is still included, ideal is not. Optimal is used as the point where all services in the coaxial network operate in concert with no subscriber impact and comply with FCC, operator and EPoC requirements. That provides the reference baseline point, where testing performance above and below that point will determine the impact on coexistent services. Comments? Thoughts?

Regards
Tom

From: Mark Laubach [mailto:laubach@xxxxxxxxxxxx]

My two cents on Objective 4 (new). Kevin, I'm having problems with "baseline" and "ideal" also. Here is the way I think about it, in my words. Maybe they'll suggest some alternative wording that still meets your objective's objectives without "stirring the pot" too much. Also, apologies for doing this so late before the meeting and after submission time has passed.

When a cable operator goes to install this EPoC system, they will provision an RF channel to match their available spectrum and the channel conditions within that spectrum, to achieve a desired MAC data rate. Provisioning will include configuring to the available spectrum (e.g. frequency, width, etc.), the modulation rate to be used (subject to the flexibility of the PHY), and other configuration "stuff" needed to fully configure the PHY and the system. Collectively, I'll refer to any given configuration setup as a "provisioning profile," and assume that each profile will supply a MAC data rate at a given error performance.

I believe that one way to view a motivation for Objective 4 is to establish a "must have" provisioning profile for a 1 Gbps MAC data rate and to require flexibility "around" that provisioning profile. One of the objective's profile requirements is a 120 MHz channel size. Essentially, it's a commitment by the standard that, for a given cable plant, if the operator has 120 MHz of spectrum AND IF AND ONLY IF the channel's performance MEETS OR EXCEEDS this standard's Functional Requirements (that are yet T.B.D.
by the Task Force), then there will be at least one provisioning profile (configuration) available in the product that provides 1 Gbps in 120 MHz of spectrum and meets or exceeds the required performance at the MAC/PLS interface. Performance includes error rate, delay, delay variation, round-trip time, etc. Also, performance likely has to be evaluated differently for business versus residential services.

I agree that having a "golden provisioning profile" that produces a 1 Gbps MAC data rate using a 120 MHz channel size is a good stake in the ground. Are there other "must" provisioning profiles? Likely, but they probably don't need to be nailed down at this time. Being able to flexibly adjust configuration and achieve other configuration profiles and data rates is a must. Lower data rates, YES! and the Task Force will determine the range. However, this applies regardless of whether conditions permit a higher rate: a cable operator might just have less spectrum that meets or exceeds the Functional Requirements.

Up to 10 Gbps? Achieving this depends on more than just channel conditions and spectrum allocation; it is also based on hardware capability, relative costs, etc. I agree that the Task Force should study implementations up to 10 Gbps, but the T.F. will set the "top end" after sufficient diligence and study has been performed. Maybe we get to 10 Gbps, but we don't know yet.

Also, the last bullet implies that upstream hardware must be capable of up to 10 Gbps. When will the industry see enough available spectrum to do that? My suggestion is to apply this to downstream only at this time and treat upstream separately. HOWEVER, be careful what is prioritized! Choose only what is essential at this time to set some expectations.

This is a very complicated and detailed SYSTEM PROBLEM. The following are heavily coupled items and T.B.D.
for the Task Force:

+ Flexibility, placement, and other details of minimum and maximum channel sizes (although the SG can give some desirable guidance here)
+ The flexibility and granularity of adjusting different modulation rates across the channel
+ The error performance as delivered from the PHY to the MAC for various provisioning profiles
+ Impact of modulation, FEC, interleaving (etc.) on the round-trip time necessary to achieve desired system performance
+ Impact on discovery and auto-negotiation
+ Impact of working around other services (if required by the cable operator)
+ etc.

Not to be understated: EPoC is a LOT OF WORK for the Task Force.
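[Editor's note] The arithmetic behind the 1 Gbps / 120 MHz "golden profile" discussed above is worth making explicit. The calculation below is an illustration, not part of any objective: it shows the spectral efficiency that profile requires, ignoring FEC and framing overhead.

```python
import math

# The "golden profile" target: 1 Gbps MAC rate in 120 MHz of spectrum.
rate_bps = 1e9
bandwidth_hz = 120e6

# Required spectral efficiency in bits/s/Hz, before any overhead.
eff = rate_bps / bandwidth_hz
print(round(eff, 2))  # ~8.33 bits/s/Hz

# Ignoring overhead, k bits/symbol implies 2**k QAM. 8.33 bits/s/Hz
# already sits above 256-QAM (8 bits/symbol), and FEC/framing
# overhead pushes the required constellation density higher still.
qam_order = 2 ** math.ceil(eff)
print(qam_order)
```

This is one way to see why the channel's performance must meet or exceed the (yet-to-be-defined) Functional Requirements for the profile to be achievable.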
Respectfully,
Mark

From: Noll, Kevin [mailto:kevin.noll@xxxxxxxxxxx]

Tom, your definition of baseline is what most accurately describes what I am trying to say in this objective. I don't see how the wording we have proposed is self-contradictory; it doesn't set up a situation where we are asking it to do no harm at the same time we ask it to do something that WILL cause harm. I'm open to finding another word to use here. I've searched through a dictionary and thesaurus looking for an alternative, but "ideal" isn't it.

--kan--
--
Kevin A. Noll, CCIE
Time Warner Cable

From: Tom Staniec <staniecjt@xxxxxxxxx>

Hey John

You make a good point, which actually alters my thinking on this topic. Using the word "baseline" might be what needs to be questioned. By technical definition, I take baseline to mean: "information that is used as a starting point by which to compare other information." So if I were designing a network (not an EPoC channel) for operation, I would design it to support "optimal network performance" for all services, in support of an analog, QAM, DOCSIS and EPoC load at a prescribed level, distortion performance and temperature. That would be the point where I would know testing could demonstrate compliance, much like is done with FCC testing or network proof of performance. That treats performance in all network configurations the same, whether it is a passive network or N+"x" (pick a number), and presents a measurable norm.

If the network operates below that level, then performance impact should be expected, to the point where operation effectively ceases because of noise and other outside factors. If the network operates above that point, then better performance should be expected, with the caveat that there is a point of diminishing returns where composite distortions take a significant toll and can render the network useless.
In short, the "bathtub curve." Moran (Motorola) has demonstrated time and again how many cable networks are not operated properly within the "bathtub curve" for optimal network performance, and that is without EPoC in the picture.

That can't be done solely by establishing a baseline of operation of a network for analog, QAM and DOCSIS. It has to be done with EPoC factored into the design equation of the network from the start, not solely as the addition of an EPoC channel. In a deployed, fully designed and engineered network, without concern for what the actual top-end frequency currently is, when an EPoC channel is added, the expected result should be that all service levels will be lowered to operate under the power constraints of the equipment, whether passive or N+"x". That may, in simple fact, mean the operator has to remove existing tap plates and install lower ones to provide a specified level in the home of a customer who is not involved with EPoC at all. Based on how the network was originally designed and engineered, that doesn't guarantee that FCC yearly performance testing can be met. As the operators know, that has its own implications.

So is the purpose of Objectives #3 and #4 to provide a "baseline" channel operation for EPoC, or is it to establish "optimal network performance for all deployed services" when EPoC is in the carriage? Considering the operators (Noll et al.) have already invoked the "Network Hippocratic Oath" ("First, do no harm to my deployed services..."), Objective 3 directly, and Objective 4 because it relies on #3, do not, in my view, meet the premise. So I might agree to wording similar to yours if the objectives stipulate that the addition of an EPoC channel, at the stated data rates and BER performance, can be met when there is no performance impact to other services in an optimal network design, which becomes the baseline against which all performance is measured: positive or negative.
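[Editor's note] The "bathtub curve" described above can be illustrated with a toy model. The slopes and starting values are assumptions for illustration only: carrier-to-noise improves roughly dB-for-dB as levels rise, while composite distortion products (CTB/CSO-like terms, modeled here with an assumed 2-for-1 slope) worsen, so usable margin peaks at an optimal operating level and falls off on both sides.

```python
# Toy bathtub model (illustrative assumptions, not plant data).
def usable_margin(level_db, cnr0=30.0, dist0=40.0):
    """Usable margin is limited by whichever is worse at this level:
    noise (improves as level rises) or distortion (worsens as level
    rises, with an assumed 2 dB-per-dB slope)."""
    cnr = cnr0 + level_db          # noise-limited side
    dist = dist0 - 2 * level_db    # distortion-limited side
    return min(cnr, dist)

# Sweep per-channel level offsets around nominal (0 dB):
for lvl in range(-4, 6, 2):
    print(lvl, usable_margin(lvl))
```

The sweep shows margin rising toward an optimum and then being capped by distortion, which is the "point of diminishing returns" mentioned above.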
So making the objective reflect that statement allows us to directly develop salient criteria for measuring the success or failure of EPoC in a coax network. How do we state that in the objective?

Best regards
Tom

From: John Ulm [mailto:julm@xxxxxxxxxxxx]

Hi Tom,

Like Kevin, I'm not sure I agree with your proposed changes. We want EPoC to be able to operate in conditions worse than baseline, albeit at lower data rates. So the baseline plant conditions are NOT a minimum for operators. Maybe we can wordsmith it to say "set the minimum plant conditions for baseline data rate operation". However, I'm in favor of keeping the wording of Objective 3 as it is.

Similarly, I don't think the word "ideal" is appropriate. Taking an off-the-wall example, we could have a passive plant with 3 GHz taps in it that gives us a boatload of spectrum. This is enough spectrum to achieve 10 Gbps even though plant conditions are far from ideal. I don't think "ideal" adds to Objective 4, so we shouldn't put it in.

On Tue, May 8, 2012 at 2:28 PM, Ron Wolfe <rwolfe@xxxxxxxxxx> wrote:

Hi Kevin,

Sorry I didn't get a chance to say hello a couple of weeks ago during the Q&A session. Next time...

To me, the notion of "ideal" should in fact be attainable; however, it is very unlikely that anything defined as ideal would be sustainable over time. Certainly ideal conditions would not be anticipated outside of a lab environment. Ideal conditions would represent the environment where a system could reasonably be expected to perform to its maximum throughput. I think you and Tom are actually saying the same thing, when you consider that it is ideal conditions that permit maximum performance. I think either works, though to me "ideal" was a concept I grasped immediately as representing closely controlled lab conditions, albeit with the understanding that someone else might just as immediately grasp a completely different meaning.
Regards,
Ron

From: Noll, Kevin [mailto:kevin.noll@xxxxxxxxxxx]

Thanks for the comments, Tom, especially the reminder about the 1/10 Gbps. I'll probably stick with "baseline" for now because it better conveys a multi-dimensional thought (plant conditions are very multi-dimensional). Similarly, "ideal" implies unobtainable. Could you suggest language that leaves room for obtainability?

--kan--
--
Kevin A. Noll, CCIE
Time Warner Cable

From: Tom Staniec <staniecjt@xxxxxxxxx>

Kevin

A couple of quick comments.

Slide #5, Objective 3, wording change: Develop a channel model describing a typical real-world coaxial cable plant to set the minimum baseline plant condition for the EPoC specification.

Slide #6, Objective 4, wording change to sub-bullet 3: a data rate higher than the baseline data rate of 1 Gb/s and up to 10 Gb/s when transmitting in assigned spectrum in ideal channel conditions that permit; (remove red words, add blue words and symbols).

To me, the first statement sets the operator's expectation of what a "minimum" performing coax plant must do, while telling the vendor the EPoC equipment must operate to the desired performance in a minimum plant condition. The second statement indicates that in an "ideal" plant condition (i.e., performance that <far> exceeds minimum baseline plant conditions) the expectation is performance above 1 Gb/s, up to 10 Gb/s, with measured performance exceeding BER and other standards currently not defined.

Regards
Tom

From: Noll, Kevin [mailto:kevin.noll@xxxxxxxxxxx]

I have updated the objectives based on our last conference call and comments received since then. I have attached the deck as a PDF. Please review and comment.

--kan--
--
Kevin A. Noll, CCIE
Time Warner Cable