
Someday all basestations will be like this

Tue, 05/11/2010 - 9:37am
Rupert Baines, picoChip, www.picochip.com
Quite often, technology starts at the high end (military, expensive lab gear or enterprise) and then, as volumes rise, becomes affordable and moves into the mass market. Think of computers (the famous but perhaps apocryphal quotes: Thomas Watson's "there is a market for perhaps five computers", or Ken Olsen's "Why would anyone want a computer at home?"), GPS mapping, or cell phones (which twenty years ago were the epitome of too-much-money yuppie flashiness, and are now a basic tool for anyone over 11 years old).

But it is not uncommon for technology to move in the other direction too: starting in residential, then migrating to more sensitive or higher-value applications. This trend is often seen in comms.

Ten years ago ADSL was being deployed for residential broadband, but it was viewed as too "quick'n'dirty" to be used for businesses. Enterprises used T1 lines and frame relay, while carriers were all on ATM. But today, most companies happily use ADSL or, more likely, VDSL, and carriers likewise use DSL and even Carrier Ethernet. Although the technology has changed slightly, the volumes of the consumer market give these technologies economies of scale, familiarity, provisioning systems and, as a result, low CapEx and OpEx that a business or carrier can use to improve its own cost structure. The technology thus gets re-cast in a slightly different role.

Another example is Wi-Fi, which started as a residential system but over time has expanded its scope. People like Aruba, Meru and Cisco have taken it into the enterprise, adding sophisticated management, stringent security and the functionality a CIO requires, and integrating the technology with the IP-PBX and the corporate data and voice networks. In another direction, companies like BelAir and Tropos have added smarts to Wi-Fi and made it more rugged, so it can be deployed in metro areas to enable wireless broadband coverage outdoors.

Once more, this took time and some very smart people to develop the new functionality – but leveraging the economies of scale, widespread availability and familiarity of a high-volume mass-market technology is what makes these new opportunities viable.

Femtocells are going through a similar expansion. Although most familiar in residential applications (as in AT&T's 3G Microcell or Vodafone's SureSignal), it is notable how people are looking at broader applications in enterprise, metro and rural contexts. This can dramatically improve the economics of a wireless network: significantly reducing the CapEx and OpEx of basestations, enabling deployment in places that normally could not be addressed, and upsetting the supply chain and business models of the traditional manufacturers – creating winners and losers.

There are hard aspects to femtocells, both in development and in deployment (provisioning and the like).

But once these issues have been addressed for residential, other applications become possible essentially "for free". The cost of the system (CapEx) can be low because the sophisticated basestation functionality is implemented as a cost-effective single-chip device – which is only viable because of the multi-million-unit volume of the consumer market. This is a major difference from the CapEx of a traditional picocell or microcell, which relied on expensive processing developed for that niche and unable to amortize its cost over any significant volume.
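To make the amortization point concrete, here is a back-of-the-envelope sketch in Python. The figures (a $20M one-off development cost, a $20 build cost per chip) are illustrative assumptions of mine, not actual industry numbers:

# Amortizing a fixed development (NRE) cost over production volume.
# All figures are illustrative assumptions, not real industry numbers.

def per_unit_cost(nre, unit_build_cost, volume):
    """Effective cost per chip once NRE is spread across the run."""
    return nre / volume + unit_build_cost

NRE = 20_000_000  # assumed one-off silicon development cost, in dollars

# A niche picocell production run versus a consumer femtocell run:
for volume in (10_000, 5_000_000):
    print(f"{volume:>9,} units -> ${per_unit_cost(NRE, 20, volume):,.2f} per chip")

# Output:
#    10,000 units -> $2,020.00 per chip
# 5,000,000 units -> $24.00 per chip

The niche run never escapes its development cost; the consumer run makes it a rounding error. That is the whole story of the single-chip femtocell in two lines of arithmetic.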

A similar logic applies to OpEx. Conventional basestations require frequency planning and optimization, which is complicated and labor-intensive – hence expensive and slow. Provisioning is likewise a slow, human-based activity: updating databases, loading configuration files, testing and managing are all done on a "one at a time" basis, which requires skilled engineers and results in a high OpEx bill. (If a major operator only has some thousands of basestations, each very expensive, this is just about acceptable.)

Then there is backhaul: basestations have traditionally required high-quality links. Until surprisingly recently this meant ATM (very expensive indeed), and it still means dedicated links and careful QoS control.

For all of these reasons, picocells have never quite succeeded. The advantages in coverage and capacity are clear, and the business opportunities attractive – but the costs, both CapEx and OpEx, from equipment, planning, provisioning and deployment combine to confine the technology largely to niches.

The next size up, the microcell, has similar issues, but at least those high costs can be spread over more users, so the business case can become viable. Essentially, though, there is a 'floor' density below which deployment cannot happen, even if the technology is appropriate and the need for capacity or coverage is clear. (As noted before, this is one of the reasons carriers are suffering such capacity constraints.)
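That 'floor' is just a break-even calculation: a cell pays for itself only if enough subscribers share its total cost of ownership. A minimal sketch, where every cost and margin figure is an illustrative assumption of mine:

# The deployment 'floor' as a break-even: a cell is viable only when
# enough subscribers share its total cost of ownership over its lifetime.
import math

def min_subscribers(capex, annual_opex, years, monthly_margin_per_sub):
    """Smallest subscriber count at which a cell covers its own cost."""
    total_cost = capex + annual_opex * years
    revenue_per_sub = monthly_margin_per_sub * 12 * years
    return math.ceil(total_cost / revenue_per_sub)

# A traditional microcell versus a femto-derived 'new pico', over five years:
micro = min_subscribers(capex=50_000, annual_opex=10_000, years=5, monthly_margin_per_sub=10)
pico = min_subscribers(capex=500, annual_opex=200, years=5, monthly_margin_per_sub=10)
print(f"microcell floor: {micro} subscribers")   # 167
print(f"'new pico' floor: {pico} subscribers")   # 3

With assumed microcell economics, a site needs well over a hundred steady subscribers to justify itself; cut CapEx and OpEx by two orders of magnitude and a handful of users is enough – which is exactly why rural and blackspot deployments suddenly make sense.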

But femtocells can dramatically change this: at a stroke, the technology can radically improve the economics of a wireless network.

Because the core technology already exists and the chip is already designed, a "new pico" can exploit it to enable a very cost-effective solution. Because the femtocell can automatically set its radio parameters to fit into the network without harm (a "Self-Organizing Network", or SON – think "cognitive radio"), there is no need for expensive, labor-intensive frequency planning. Because the provisioning system is already deployed and scales to millions of units (based on TR-069/TR-196 and the 3GPP standards), provisioning becomes automatic and "for free". Because the femtocell is designed to work over standard broadband, the backhaul becomes very cheap and easy.
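To illustrate the SON idea – and only the idea; this is my own minimal sketch, not any vendor's algorithm, and the scan/apply functions are hypothetical placeholders for a femtocell's radio driver – at power-up the cell listens to its neighbors, picks a scrambling code nobody nearby is using, and backs off its transmit power when strong neighbors are audible:

# Minimal SON-style self-configuration sketch. scan_neighbors/apply_config
# are hypothetical hooks into the radio; the point is the logic: listen,
# then choose parameters that fit into the network without causing harm.

def self_configure(scan_neighbors, apply_config, candidate_codes, max_tx_dbm=10):
    """Pick an unused scrambling code and a neighborly power level."""
    neighbors = scan_neighbors()  # e.g. [{'code': 5, 'rssi_dbm': -70}, ...]
    used = {n['code'] for n in neighbors}

    # Prefer a code no audible neighbor is using; otherwise reuse the
    # code of the weakest (most distant) neighbor to minimize clashes.
    free = [c for c in candidate_codes if c not in used]
    code = free[0] if free else min(neighbors, key=lambda n: n['rssi_dbm'])['code']

    # A strong neighbor means we sit close to another cell, so reduce
    # power rather than punch a hole in the macro layer.
    strongest = max((n['rssi_dbm'] for n in neighbors), default=-120)
    tx_dbm = max_tx_dbm if strongest < -80 else max_tx_dbm - 10

    apply_config(code=code, tx_dbm=tx_dbm)
    return code, tx_dbm

No truck roll, no spreadsheet of frequency assignments, no RF engineer on site: the loop that used to be a planning department becomes a few milliseconds at boot.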

While picocells have never quite hit the market, and even microcells face this 'floor' of viability, a new suite of products based on these techniques is now viable and very attractive. Some people call them "super femtos", others "greater femtos", and I have heard "picocell 2.0" or even "picocell done right"!

Operators are starting to deploy these.

A number are rolling out femtocells for enterprises. There are some specific challenges here that I’ll write about another time, but the deployments are happening. And, of course, there are companies like SpiderCloud or ipAccess with products explicitly addressing these applications.

Softbank has announced that one of the application areas for its femtocell launch is sparse rural coverage: places that would not normally get service because usage density is too low can now sensibly be addressed, bringing 3G to regions that would otherwise never have it. This can be extended, not just to rural Japan but also to developing countries.

Vodafone has presented several times on its 'metro zone' concept: using a dense network of small, cheap basestations to deliver high-density wireless broadband in capacity-constrained urban areas. This is often thought of for LTE, but Vodafone stresses that the same logic applies to HSPA+ too, based on the existing femto technology.

So we can expect to see an explosion in the number of basestations, as these 'greater femtos' start being widely deployed in businesses, coverage blackspots, dense urban areas, hotspots and the like.

But the inevitable question is: who wins, who loses?

Consumers win, with faster links, better coverage and reduced fees (lower costs to a carrier mean it can cut tariffs). Carriers win, with better service and happier customers. The femtocell manufacturers win, as they have new opportunities which (even at aggressive cost points) can be very lucrative.

But the traditional suppliers, those who have made the conventional pico and micro basestations, will need to watch out – or someone else will eat their lunch...