Veins delay does not change with beacon frequency or number of nodes

I am trying to simulate a beaconing application in Veins and analyze its performance. Research on the 802.11p standard shows that as the beacon frequency and the number of vehicles increase, latency should increase significantly due to contention at the MAC layer (for 50 vehicles beaconing at 8 Hz, an average latency of about 300 ms).

But when I simulate the application, the delay in Veins hardly varies (it stays in the range of 1 ms to 4 ms). I have been tracing the MAC-layer behaviour, and the channel seems to be idle most of the time. So when a packet reaches the MAC layer, the channel has already been idle for longer than a DIFS, and the packet is sent immediately. I have tried increasing the packet size and decreasing the bit rate; this increases the delay somewhat, but there is no sharp increase in latency due to the deferral (backoff) process. Do you know why this is happening?
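As a sanity check, the offered load in this scenario can be estimated in a few lines of Python. This is only a sketch: 6 Mbit/s is the 802.11p control-channel default, and the 500-byte beacon size is an assumption for illustration.

```python
# Rough offered-load estimate for the scenario described above.
# Assumptions: 500-byte beacons (illustrative), 6 Mbit/s 802.11p control-channel rate.
BITRATE_BPS = 6e6      # control channel bit rate
BEACON_BYTES = 500     # assumed beacon size
VEHICLES = 50
BEACON_HZ = 8

airtime_s = BEACON_BYTES * 8 / BITRATE_BPS   # time on air per beacon
load = VEHICLES * BEACON_HZ * airtime_s      # fraction of each second the channel is busy
print(f"airtime per beacon: {airtime_s * 1e3:.2f} ms")
print(f"offered load: {load:.0%} of channel capacity")
```

With an offered load this far below saturation, most packets find the channel idle for at least a DIFS and are sent right away, which would be consistent with the 1–4 ms delays observed.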





1 answer


Since you are using 802.11p, the default bit rate on the control channel is 6 Mbit/s (source: ETSI EN 302 663).

6 Mbit/s = 750,000 bytes/s



Your beacons contain 500 bytes, so transmitting one beacon takes about 0.0007 seconds. Since you have about 50 cars in your multi-lane scenario and, for example, they send beacons at 10 Hz, transmitting the resulting 500 beacons per second occupies the channel for only about 0.35 seconds out of every second.

In my opinion, that is too few cars to create the effect you mention, because the channel is idle about 65% of the time.
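The back-of-envelope calculation above can be reproduced with a short Python sketch, using the values from this answer (500-byte beacons, 50 cars, the 10 Hz example rate):

```python
# Reproduces the channel-occupancy estimate from the answer.
BITRATE_BPS = 6e6        # 802.11p control channel bit rate (ETSI EN 302 663)
BEACON_BYTES = 500       # beacon size from the question/answer
CARS = 50
BEACON_HZ = 10

airtime_s = BEACON_BYTES * 8 / BITRATE_BPS   # ~0.00067 s on air per beacon
busy = CARS * BEACON_HZ * airtime_s          # busy time per second of simulation
idle = 1 - busy
print(f"busy: {busy:.2f} s per second, idle: {idle:.0%}")
```

Note this ignores MAC overhead (preamble, headers, inter-frame spaces), so the real busy fraction is somewhat higher, but the channel still spends most of its time idle.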
