WARP Project Forums - Wireless Open-Access Research Platform

#1 2008-Jan-25 14:27:02

jiayang
Member
Registered: 2007-Mar-07
Posts: 19

change of frame loss rate

We did some experiments with reference design v4.0 and found that the frame loss rate increased over time. We ran an audio streaming program for around 1 hour, and the frame loss rate rose from about 0.1% to 1%. Is this expected on WARP? And what's the reason? Thanks a lot!


#2 2008-Jan-27 15:33:58

murphpo
Administrator
From: Mango Communications
Registered: 2006-Jul-03
Posts: 5159

Re: change of frame loss rate

We do see variations in performance with time, but I don't think there's a single cause.

A few sources could be:
-Changes in carrier and sampling frequency offsets. The frequencies of the oscillators that provide the sampling and radio reference clocks drift with temperature. In later reference designs, both clocks are sourced by temperature-compensated oscillators on the clock board. In OFDM reference v04, the sampling clock is derived from the 100 MHz non-temperature-compensated oscillator on the FPGA board. The first sketch after this list gives a feel for the magnitudes involved.

-The thermal noise observed at the receiver will definitely change as the devices warm up.

-It's very hard to rule out changes in the environment over long experiments: extra 2.4 GHz interferers, subtle channel changes, etc. In the uncoded OFDM design, a channel variation that causes a null on even a single subcarrier will directly affect the overall packet error rate (the second sketch below shows why).
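
To get a rough feel for the first point, here's a back-of-the-envelope sketch in Python. The ppm drift values and the 40 MHz sampling rate are illustrative assumptions on my part, not measured WARP numbers:

```python
# Back-of-the-envelope: how oscillator drift (in ppm) maps to carrier
# and sampling frequency offsets. The ppm values and the 40 MHz
# sampling rate below are illustrative assumptions, not measurements.

CARRIER_HZ = 2.412e9   # 802.11 channel 1 center frequency
SAMPLE_HZ = 40e6       # assumed converter sampling rate

def offset_hz(nominal_hz, drift_ppm):
    """Frequency error produced by a fractional drift of drift_ppm."""
    return nominal_hz * drift_ppm * 1e-6

for ppm in (0.5, 2.0, 10.0):
    print(f"{ppm:4.1f} ppm drift -> CFO {offset_hz(CARRIER_HZ, ppm):8.0f} Hz, "
          f"sampling offset {offset_hz(SAMPLE_HZ, ppm):5.1f} Hz")
```

Even a couple of ppm of relative drift between two nodes moves the carrier offset by several kHz, which the receiver's CFO estimation and tracking have to absorb continuously.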
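And on the last point, a minimal sketch of why one faded subcarrier dominates uncoded PER. The subcarrier count, packet length, modulation, and BER values are assumed for illustration:

```python
# Why one bad subcarrier dominates uncoded PER: with no coding, a
# packet fails if ANY bit is wrong, including every bit carried by
# the nulled subcarrier. Numbers below are illustrative assumptions.

N_SUBCARRIERS = 48      # data subcarriers per OFDM symbol (802.11-style)
SYMBOLS_PER_PKT = 250   # OFDM symbols per packet (assumed)
BITS_PER_SC = 2         # QPSK: 2 bits per subcarrier per symbol

def per(ber_good, ber_nulled):
    """Packet error rate when one subcarrier sits in a deep fade."""
    bits_good = (N_SUBCARRIERS - 1) * SYMBOLS_PER_PKT * BITS_PER_SC
    bits_bad = SYMBOLS_PER_PKT * BITS_PER_SC
    p_ok = (1 - ber_good) ** bits_good * (1 - ber_nulled) ** bits_bad
    return 1 - p_ok

print(f"All subcarriers at BER 1e-6:      PER = {per(1e-6, 1e-6):.3f}")
print(f"One subcarrier faded to BER 1e-2: PER = {per(1e-6, 1e-2):.3f}")
```

With these numbers a single faded subcarrier pushes the PER from a few percent to near 100%; any reasonable channel coding would largely absorb this, which is why the uncoded design is so sensitive to slow channel changes.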
