The Ofcom (UK regulator) consultation report “Implementing Geolocation”, on TV white space regulatory issues, was released this week. It has been pretty clear for a while that “classic” cognitive radio, i.e. spectrum access based on detecting TV transmitters, is not going to provide any reasonable balance between white space utilization and adequate protection for the primary users (TV viewers). Results basically show that, in realistic scenarios, when low interference probabilities are to be achieved, the utilization of spectrum (the “amount of usable white space”) goes essentially to zero.
So something more is needed, and geolocation is the way regulators, and Ofcom in particular, are going. A model as in the figure to the left has been proposed, where the white space devices (WSDs) have to query a database to determine whether they are allowed to transmit. Given the device's location, the response contains information about whether and for how long a channel can be used, what power level is permitted, and whether additional sensing is required (and at what threshold level).
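To make the query/response exchange concrete, here is a minimal sketch of what the two messages might carry. All field names are my own illustration, not taken from the Ofcom consultation, which leaves the exact message format open.

```python
from dataclasses import dataclass

@dataclass
class WhiteSpaceQuery:
    """What the WSD sends: essentially its identity and position."""
    device_id: str
    latitude: float        # degrees
    longitude: float       # degrees
    antenna_height_m: float

@dataclass
class WhiteSpaceResponse:
    """What the database answers, per channel."""
    channel: int                  # UHF channel number
    allowed: bool                 # may the device transmit at all?
    max_eirp_dbm: float           # permitted transmit power
    valid_seconds: int            # how long the grant remains valid
    sensing_required: bool        # must the device also sense the channel?
    sensing_threshold_dbm: float  # detection threshold if sensing is required
```

A real system would of course need authentication, channel lists rather than single channels, and a protocol for refreshing grants before they expire.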
Will this work? Certainly, but the question is still how well. The design of the algorithms that determine the query response is still an open and non-trivial issue. By adding more information one should, at least in the ideal case, be able to make a better decision than with sensing alone. It is of course easy to provide better protection for the primaries, but will this make us too cautious, so that we fail to exploit all spectrum opportunities? Could such combined schemes also achieve a significantly higher spectrum utilization for a given interference probability? The latter is not obvious, and definitely not necessarily true for any geolocation scheme. One important problem is that geographic proximity/distance is not always a good predictor of signal and interference levels. In wide-area wireless systems (with unobstructed propagation) it may well work fine, but in dense urban/indoor environments, where the deployment of WSDs is most likely, such predictions do not work very well. Adjacent-channel interference indoors, when the WSD is close to a TV set, is another issue. In my mind we have a couple of really interesting open research questions:
- How large improvements in spectrum utilization can we achieve? Is it actually worthwhile adding the increased complexity of geolocation?
- What should the algorithms look like that most effectively “fuse” signal detection and geographic location information?
Or does anyone have the definitive answer?
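As a baseline for the fusion question, the simplest conceivable rule is conjunctive: transmit only if the database grants the channel and, where sensing is mandated, the locally measured power stays below the given threshold. This sketch is my own illustration of that baseline, not a scheme from the consultation; more sophisticated fusion would weigh the two sources of evidence rather than just AND-ing them.

```python
def may_transmit(db_allowed: bool,
                 sensing_required: bool,
                 sensed_power_dbm: float,
                 threshold_dbm: float) -> bool:
    """Conjunctive fusion of database grant and local sensing.

    The database verdict is authoritative: no grant, no transmission.
    When the grant mandates sensing, the measured channel power must
    also stay below the threshold the database specified.
    """
    if not db_allowed:
        return False
    if sensing_required and sensed_power_dbm >= threshold_dbm:
        return False
    return True
```

Note that this rule can only ever be more conservative than either input alone, which is exactly the concern raised above: it protects the primaries well, but may leave spectrum opportunities unused.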
There are TV receivers that get their signal from cable, which in fact is a significant portion in some countries. Are such cable receivers excluded from the database? What does the database look like? Are there also security issues involved in collecting user information?
As I understand it, there is (still) no way of knowing where the TV receivers are and whether they are on cable or using “on-air” reception. That is really my point – the database can only know whether a certain channel is used for TV transmission in a certain area. By avoiding that channel we can definitely bring down the interference probability in the area, but we also miss a lot of opportunities. Even worse, if we are required to protect the adjacent channels as well, the secondary spectrum utilization will be very small. This is particularly bad in areas with high cable or IP-TV penetration, where we waste a lot of spectrum by excessively protecting the very few “on-air” DVB receivers that may still be out there. In rural areas, where there will be many “on-air” receivers, this is less of a problem.
When it comes to flexibility and adaptability to future DTV usage, a database seems to be the only plausible solution. I envisage that there will be generations of white space devices that can exploit more of the opportunities as the generations go on.