[STDS-802-16] IEEE802.16e - Speed up system level simulation?
Dear All,
I am trying to run a system-level simulation for IEEE 802.16e, but the
performance is extremely slow. Any idea how to speed up the simulation?
Is there any relevant paper that describes a methodology for
system-level simulation of 802.16e? The procedure I use is:
1. At mobile, calculate received power after path loss, shadowing and multipath fading for each subcarrier
2. At mobile, measure the interference power from the first-tier and
second-tier base stations, for each subcarrier, after path loss and
shadowing. Multipath fading from the first-tier and second-tier base
stations is ignored to reduce computation, assuming that it averages to 1.
3. Calculate the SIR for each subcarrier from the received power and interference power on that subcarrier.
4. Compute the equivalent AWGN SIR using Exponential Effective SIR Mapping
(EESM) [1] over the individual subcarrier SIRs used for the data burst.
5. Map Equivalent AWGN SIR to PER (Packet Error Rate).
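For reference, the EESM mapping in step 4 amounts to the following. This is only a minimal sketch in Python/NumPy (an assumed environment, not my actual simulator code); beta is the MCS-dependent calibration factor from [1].

```python
import numpy as np

def eesm_effective_sinr(sinr_db, beta):
    """Exponential Effective SIR Mapping (EESM):
        SINR_eff = -beta * ln( (1/N) * sum_n exp(-SINR_n / beta) )
    Maps the per-subcarrier SINRs of a burst (in dB) to one equivalent
    AWGN SINR (in dB). beta is calibrated per modulation/coding scheme.
    """
    sinr_lin = 10.0 ** (np.asarray(sinr_db, dtype=float) / 10.0)
    eff_lin = -beta * np.log(np.mean(np.exp(-sinr_lin / beta)))
    return 10.0 * np.log10(eff_lin)
```

Because the mapping is dominated by the weakest subcarriers, the effective SINR of a frequency-selective burst is lower than the plain average of the per-subcarrier SINRs.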
Any idea how to reduce the computation? Is there any relevant paper
that describes a suitable methodology for this? I find that multipath
fading (generating the channel coefficients) and checking which
subcarriers are being interfered with by neighbouring base stations
are particularly computationally expensive. And if I were also to
consider multipath fading from the first- and second-tier interfering
base stations, the computational requirement would be even higher!
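One speed-up I have been considering for the channel-coefficient bottleneck is to generate each block-fading realization directly in the frequency domain with a single FFT, rather than looping over subcarriers. A rough NumPy sketch (illustrative only; uniform tap powers and block fading are my simplifying assumptions here):

```python
import numpy as np

def freq_domain_channel(num_taps, num_subcarriers, rng=None):
    """One Rayleigh block-fading realization over all subcarriers.
    Draw i.i.d. complex Gaussian time-domain taps (total power
    normalized to 1), then a single FFT yields every subcarrier
    coefficient in one vectorized call."""
    rng = np.random.default_rng() if rng is None else rng
    taps = (rng.standard_normal(num_taps)
            + 1j * rng.standard_normal(num_taps)) / np.sqrt(2.0 * num_taps)
    return np.fft.fft(taps, n=num_subcarriers)
```

The same trick applies to the interfering links if their fading ever needs to be modelled: one short tap vector and one FFT per link per block, instead of per-subcarrier draws.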
[1] "CINR measurements using the EESM method", http://www.ieee802.org/16/tge/contrib/C80216e-05_141r3.pdf
Regards,
Arthur.