-
No, a random fading value is computed for each RB independently. That comment is only about improving simulation speed, i.e., we would not need to compute fading for RBs that are not used. I think the behavior you observed may be due to the error model used in Simu5G, which computes an error probability for each RB used for transmission. Roughly speaking, the more RBs a transmission uses, the higher the probability that at least one RB is corrupted (and hence the whole transmission).
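To make that last point concrete, here is a minimal sketch of how independent per-RB errors compound into a transmission error. It is not the actual Simu5G error model: the fading distribution and the mapping from fading gain to per-RB error probability are purely illustrative assumptions.

```cpp
// Illustrative sketch only (not Simu5G code): draws an independent fading
// value per RB and combines per-RB error probabilities into a single
// transmission error probability, assuming RB errors are independent.
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::lognormal_distribution<double> fading(0.0, 1.0); // hypothetical fading model

    const int usedRBs = 200;     // RBs actually used by the transmission
    double pAllRBsOk = 1.0;

    for (int rb = 0; rb < usedRBs; ++rb) {
        double fadingGain = fading(rng);   // independent draw for each RB
        // Hypothetical mapping from per-RB channel quality to per-RB error
        // probability; Simu5G derives this from BLER curves and the SINR instead.
        double perRbError = 1.0 / (1.0 + 10.0 * fadingGain);
        pAllRBsOk *= (1.0 - perRbError);
    }

    // The whole transmission is lost if any one of its RBs is corrupted.
    std::cout << "P(transmission error) = " << 1.0 - pAllRBsOk << "\n";
    return 0;
}
```

With 200 used RBs, even a small per-RB error probability already gives a noticeable chance that at least one RB, and therefore the whole transmission, is corrupted.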
-
No, the number of allocated RBs depends on the buffer size and CQI only. However, suppose you send a 40 KB packet: it might happen that this requires allocating all the available RBs (say, 200 RBs) in one TTI, and possibly subsequent TTIs as well, to send the whole content of the buffer. In this case the error model will compute the error probability over 200 RBs.
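As a rough back-of-the-envelope sketch of why a 40 KB buffer can fill every available RB for several TTIs (the per-RB payload below is an assumed number, not taken from Simu5G or from any CQI table):

```cpp
// Back-of-the-envelope sketch (assumed numbers, not Simu5G's MAC scheduler):
// estimates how many RBs and TTIs a 40 KB buffer could occupy, given a
// hypothetical per-RB payload that would follow from the reported CQI.
#include <iostream>

int main() {
    const int bufferBytes  = 40 * 1024;  // 40 KB waiting in the buffer
    const int bytesPerRB   = 100;        // hypothetical payload per RB at the current CQI
    const int availableRBs = 200;        // numBands configured in the simulation

    int rbsNeeded  = (bufferBytes + bytesPerRB - 1) / bytesPerRB;   // ceiling division
    int ttisNeeded = (rbsNeeded + availableRBs - 1) / availableRBs; // ceiling division

    std::cout << "RBs needed:  " << rbsNeeded  << "\n"   // 410 RBs in this example
              << "TTIs needed: " << ttisNeeded << "\n";  // spread over 3 TTIs of 200 RBs
    return 0;
}
```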
-
Hi, thank you for your reply. It seems to answer my concerns.
-
You are correct. It is the expected behavior for the model we decided to implement: the idea is that if there is one RB with very bad quality, the frame cannot be decoded correctly (even if the average quality over the used RBs is good), because you would be missing one part of the frame.
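A small numeric sketch of this effect, with purely illustrative per-RB error probabilities (not values produced by Simu5G):

```cpp
// Sketch (illustrative numbers only): even when the average per-RB error
// probability is low, a single badly faded RB dominates the chance that the
// whole frame is lost, because every RB must be decoded correctly.
#include <iostream>
#include <vector>

int main() {
    // 9 good RBs and 1 RB in a deep fade (hypothetical error probabilities).
    std::vector<double> perRbError(9, 0.01);
    perRbError.push_back(0.9);

    double pFrameOk = 1.0;
    for (double p : perRbError)
        pFrameOk *= (1.0 - p);

    std::cout << "P(frame lost) = " << 1.0 - pFrameOk << "\n"; // ~0.91, driven by the one bad RB
    return 0;
}
```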
-
Thank you for your answers and suggestions. Best regards,
-
Hello there,
I've been encountering unexpected performance issues with my simulations. I have implemented both a MecApp and a UEApp, and I am generating traffic from a UE. I've noticed that as I increase the number of resource blocks (numBands) available, I unexpectedly experience greater packet loss. While the end-to-end delay decreases (as expected), the packet loss rate increases.
Additionally, I have observed that my SINR decreases with more resource blocks available.
In particular, I observed a fading attenuation about 10 times higher when increasing the available RBs from 200 to 2000, accompanied by a rise in the packet loss rate.
This trend seems counterintuitive to me.
I can give more details if needed.
Kind regards
Hakim