There are fewer than 400,000 premises actually connected to the national broadband network. Where can the government go from here? Dr Rob Nicholls, research fellow at Swinburne University of Technology and the Centre for International Finance and Regulation, explains.
The policy rationale behind a national broadband network is two-pronged. The first is broadband infrastructure that ensures Australian homes and businesses have broadband at a level that does not limit national competitiveness relative to Australia’s trading partners. The second is to ensure that this broadband service is universal.
The policy challenge is that this was never the original policy rationale, and that current policy positions on the national broadband network are inconsistent with a deregulatory stance.
The policies associated with the national broadband network are trying to solve a very broad range of problems, and most of these do not fit with delivering a telecommunications network. Even the title of the policy is misleading: the term “national” suggests universality, yet the national broadband network is actually a set of access networks, with the connectivity between those access networks provided by commercial players.
One of the problems with policy clarity is that technology debates can cloud matters. There are two ways in which broadband services can be delivered: fixed line and wireless. In the wireless space, delivery can either use a terrestrial system, in the same way that mobile broadband is delivered, or a satellite. Fixed-line delivery uses a combination of optical fibre and (optionally) copper wire of some type. Current ADSL (which is a mercifully short acronym for asymmetric digital subscriber line) uses the copper pairs that historically delivered telephone services. The closer the fibre runs to the premises, the higher the bit rate that can be delivered, because copper pairs are not good at supporting high bit rate services. Coaxial cable, by contrast, can carry high bit rates over much longer distances than copper pairs.
These issues lead to a set of acronyms and the need for standardisation. The existing Telstra copper network has two segments. The first is between the exchange building and the pillar in the street; the pillar typically provides about 300 copper pairs for about 150 premises. These premises form the “distribution area” (DA). The copper between the exchange and the DA is usually in very good condition, as the cables are kept pressurised with compressed air to prevent water ingress. The copper within the DA may well have moisture issues, and these can limit the data rates that can be delivered. When the fibre runs all the way to the premises, the technology is called “fibre to the premises”, or FTTP. If the fibre runs to a new box of electronics serving a DA, this is “fibre to the node”, or FTTN. If the last part is coaxial, using the same cable that can deliver cable television, then this is a hybrid of fibre and coaxial, or HFC.
An FTTP network uses one fibre feeding a splitter that provides service to up to 32 premises, and is normally designed to deliver to half that number. As a practical matter, co-locating ten splitters per DA, usually in the pavement near the pillar, works well.
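The capacity figures above can be sanity-checked with a little arithmetic. The sketch below uses only the numbers quoted in the text: a DA of about 150 premises, splitters serving up to 32 premises but designed for half that, and ten splitters per DA.

```python
# Sanity check of the FTTP capacity figures quoted in the text.
PREMISES_PER_DA = 150                # typical distribution area (DA) size
SPLITTER_MAX = 32                    # maximum premises served per splitter
SPLITTER_DESIGN = SPLITTER_MAX // 2  # normal design load: 16 premises
SPLITTERS_PER_DA = 10                # splitters co-located near the pillar

design_capacity = SPLITTERS_PER_DA * SPLITTER_DESIGN
max_capacity = SPLITTERS_PER_DA * SPLITTER_MAX

print(design_capacity)  # 160 premises at design load, covering a 150-premises DA
print(max_capacity)     # 320 premises at the absolute maximum split
```

At the design load, ten splitters cover a 150-premises DA with a little headroom, and the 32-way maximum leaves room if take-up turns out higher than planned.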
The FTTN solution requires power to be delivered to the electronics in the node. This is awkward, as there is rarely an electricity meter available at the node.
The original approach to national broadband network planning, sometimes known as NBN Mark I, was for a limited government subsidy of a broadband network that would provide 12 Mbps downstream and 1 Mbps upstream. The requirement was set out in a tender process in 2008.
The likely (and assumed) technology was fibre to the node. The government offer was $4.7 billion and the tender process was designed to be in the form of a least-cost subsidy auction. That is, the winner would provide “best bang for the buck” without the government paying for the whole of the network.
The reviews after the 2013 election changed the fixed-line technology to a mix of FTTP (26%), FTTN (44%) and HFC (30%), with seven per cent of premises still to be served by wireless. This is the current government’s multiple technology mix.
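To get a feel for what those shares mean in premises terms, a simple illustrative calculation helps. The percentages are the figures quoted above (they sum to 100%, so they describe the fixed-line footprint); the eight million fixed-line premises total is a hypothetical round number for illustration only, not a figure from the article.

```python
# Illustrative premises counts for the multiple technology mix.
# ASSUMPTION: total_fixed_premises is a hypothetical round number;
# only the shares (26% / 44% / 30%) come from the text.
fixed_mix = {"FTTP": 0.26, "FTTN": 0.44, "HFC": 0.30}
assert abs(sum(fixed_mix.values()) - 1.0) < 1e-9  # shares cover the fixed footprint

total_fixed_premises = 8_000_000  # hypothetical illustration only
for tech, share in fixed_mix.items():
    print(f"{tech}: {round(share * total_fixed_premises):,} premises")
# FTTP: 2,080,000 premises
# FTTN: 3,520,000 premises
# HFC: 2,400,000 premises
```

Even under a rough total like this, the mix implies millions of premises on each technology, which is why switching the mix mid-rollout is such a slow and expensive exercise.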
The biggest policy challenge is that deploying a different technology mix in response to a new government’s statement of expectations is not a rapid process. The agreements between NBN Co, the government and Telstra, which took more than a year the first time around, became more complicated because both Telstra and Optus were selling their HFC networks to NBN Co. The previous deal had provided a “disconnect fee” when customers were moved from HFC to the national broadband network.
Another issue is that technology marches on. When NBN Mark I was released, the iPhone 3GS was months away and the iPad had not been launched. The consumer expectation of multiple devices using fixed and mobile networks was in its infancy. Now consumers expect to get 25 Mbps from their 4G mobile devices, and for this to remain a target for fixed broadband seems odd.
It’s clear that trying to solve universal service policy matters at the same time as limiting Telstra’s competitive effect, while rolling out a network whose design parameters constantly fall behind the Pareto frontier established by productive efficiency, is problematic. There are areas with poor or no broadband, and that is not tolerable in an advanced economy. But measures that simply transfer consumer wealth to incumbent operators are no solution either. Perhaps it’s time to take another look at what government intervention (financial or regulatory) is required to deliver a tightly specified outcome.
At the end of March 2015, there were 389,000 premises actually connected to the national broadband network. That’s not a lot to show for a process that started seven years earlier.
There are a few approaches that Parliament could take that would actually reflect the rapid changes in technology and the cross-subsidisation issues. The first is that the statement of expectations should not be based on bit rates set in election years. Far better to follow the approach pioneered by David Murray in the Financial System Inquiry and set a target that moves with either the OECD or, and this is a much tougher ask, our major trading partners.
The second is a rationalisation of universal service. It does not make sense to mandate that NBN Co be the wholesaler of last resort in remote areas and Telstra the retailer of last resort, and then to require that the two use different delivery technologies.
The third is to provide those subsidies that are needed for the rationalised universal service from consolidated revenue, rather than in the form of cross-subsidies from metropolitan areas. Once this is done, the final approach is logically to target government subsidies at the under-served suburban and regional areas where current broadband access is poor. In this environment, it will not matter that TPG has a better fibre-to-the-basement solution than NBN Co; there will be no need for special licence conditions. Instead, just let the market do its job. If this means that we end up with FTTP networks deployed by Telstra and others because people want to watch more than one Netflix ultra-high-definition streaming service, then intervention by government will be fruitless.
*This is an edited extract of an article that was originally published at Pearls and Irritations.*